juliageo / NaturalEarth.jl
Julia interface to Natural Earth data
License: MIT License
@time_imports using NaturalEarth
1.2 ms Extents
2.9 ms GeoFormatTypes
3.8 ms GeoInterface
0.3 ms PrecompileTools
5.9 ms RecipesBase
0.4 ms GeoInterfaceRecipes
┌ 0.0 ms Parsers.__init__()
28.4 ms Parsers 36.75% compilation time
3.7 ms StructTypes
19.5 ms JSON3
0.4 ms DataValueInterfaces
0.5 ms DataAPI
0.2 ms IteratorInterfaceExtensions
0.2 ms TableTraits
2.2 ms OrderedCollections
5.2 ms Tables
62.3 ms GeoJSON
0.5 ms Scratch
┌ 2.1 ms NaturalEarth.__init__()
7.4 ms NaturalEarth
julia> @time naturalearth("admin_0_countries", 10)
2.445488 seconds (3.90 M allocations: 229.510 MiB, 1.94% gc time, 38.96% compilation time)
FeatureCollection with 258 Features
julia> @time naturalearth("admin_0_countries", 10)
0.065051 seconds (270.78 k allocations: 38.551 MiB, 10.15% gc time)
FeatureCollection with 258 Features
Probably needs:
This issue is used to trigger TagBot; feel free to unsubscribe. If you haven't already, you should update your TagBot.yml to include issue comment triggers. Please see this post on Discourse for instructions and more details. If you'd like for me to do this for you, comment `TagBot fix` on this issue. I'll open a PR within a few hours, please be patient!
We can't pull the data as Artifacts because Julia expects tarballs for all artifacts. Scratchspaces seem like a better method for this, then.
It doesn't look like there is an automated way to get rasters the way we've obtained vectors, so we'd have to create a manual database of all the links. The rasters are also distributed as zipfiles, meaning we would have to use something like Scratch.jl and ZipFile.jl to unpack them manually and store them in a package-specific scratchspace. Assuming no duplicate names, this would also let us present a list of already-downloaded rasters.
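The Scratch.jl + ZipFile.jl workflow above could be sketched roughly like this (an assumption-laden sketch, not the package's actual API; the raster name and URL would come from the manual link database mentioned above):

```julia
# Sketch only: download a raster zipfile into a package-specific
# scratchspace and unpack it, assuming Scratch.jl and ZipFile.jl.
using Downloads, Scratch, ZipFile

function fetch_raster(name::AbstractString, url::AbstractString)
    dir = @get_scratch!("rasters")      # package-specific scratchspace
    dest = joinpath(dir, name)
    if !isdir(dest)                     # skip rasters we already unpacked
        zippath = Downloads.download(url)
        r = ZipFile.Reader(zippath)
        for f in r.files                # unpack each zip entry manually
            endswith(f.name, "/") && continue
            outpath = joinpath(dest, f.name)
            mkpath(dirname(outpath))
            write(outpath, read(f))
        end
        close(r)
    end
    return dest
end

# Listing already-downloaded rasters is then just a directory read:
downloaded_rasters() = readdir(@get_scratch!("rasters"))
```

Because everything lands under one scratchspace keyed by raster name, the "no duplicate names" assumption is what makes `downloaded_rasters()` a reliable inventory.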
@asinghvi17 Do you have any thoughts about how to structure this repo?
My initial commit is a rough sketch of how I imagine we can maintain this repo:

- `geojson_files.jl` looks up all available datasets from this github directory: link.
- `create_artifacts.jl` does the job of creating `Artifacts.toml`.

I imagine we can hook that up to GitHub CI to periodically check for new datasets or updates to existing datasets.
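The periodic CI check could be a scheduled workflow along these lines (a hypothetical sketch: the workflow name, schedule, and action versions are all assumptions; `create_artifacts.jl` is the script described above):

```yaml
# Hypothetical sketch: regenerate Artifacts.toml on a schedule and
# open a PR when upstream datasets change. Action versions are assumptions.
name: update-artifacts
on:
  schedule:
    - cron: "0 0 * * 0"  # weekly check for new or updated datasets
jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: julia-actions/setup-julia@v1
      - run: julia --project=. create_artifacts.jl
      - uses: peter-evans/create-pull-request@v5
        with:
          title: "Update Artifacts.toml for new Natural Earth datasets"
```

Opening a PR rather than pushing directly would let a maintainer review SHA changes before they land.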
The download URLs currently use `[...]/master/[...]`, but we should probably do this by tags or something to ensure reproducibility. Although, I think Artifacts will fail if a file is updated upstream, since the SHA will change... Anyways -- just some food for thought.

Some of the entries in `Project.toml` are not really necessary and, along with the aforementioned files, should be moved to some subdirectory that is tied to GitHub CI or tests.