
dataRetrieval's Introduction

dataRetrieval


The dataRetrieval package was created to simplify the process of loading hydrologic data into the R environment. It is designed to retrieve the major data types of U.S. Geological Survey (USGS) hydrology data that are available on the Web, as well as data from the Water Quality Portal (WQP), which currently houses water quality data from the Environmental Protection Agency (EPA), U.S. Department of Agriculture (USDA), and USGS. Direct USGS data is obtained from a service called the National Water Information System (NWIS).

For complete tutorial information, see:

https://doi-usgs.github.io/dataRetrieval/

https://waterdata.usgs.gov/blog/dataretrieval/

dataRetrieval Introduction 1

dataRetrieval Introduction 2

⚠️ USGS discrete water-quality data availability and format are changing. Beginning in mid-March 2024, the data obtained from legacy profiles will not include new USGS data or recent updates to existing data. To view the status of changes in data availability and code functionality, visit: https://doi-usgs.github.io/dataRetrieval/articles/Status.html

If you have additional questions about these changes, email [email protected].

Sample Workflow

USGS

library(dataRetrieval)
# Choptank River near Greensboro, MD
siteNumber <- "01491000"
ChoptankInfo <- readNWISsite(siteNumber)
parameterCd <- "00060" # Discharge, cubic feet per second

# Raw daily data:
rawDailyData <- readNWISdv(
  siteNumber, parameterCd,
  "1980-01-01", "2010-01-01"
)

# Parameter code information (name, units):
pCode <- readNWISpCode(parameterCd)

Water Quality Portal

specificCond <- readWQPqw(
  siteNumbers = "WIDNR_WQX-10032762",
  parameterCd = "Specific conductance",
  startDate = "2011-05-01",
  endDate = "2011-09-30"
)

Network Linked Data Index

features <- findNLDI(
  nwis = "01491000",
  nav = "UT",
  find = c("basin", "wqp")
)

Installation of dataRetrieval

To install the dataRetrieval package, you must be using R 3.0 or greater and run the following command:

install.packages("dataRetrieval")

To get cutting-edge changes, install from GitHub using the remotes package:

library(remotes)
install_github("DOI-USGS/dataRetrieval",
               build_vignettes = TRUE, 
               build_opts = c("--no-resave-data",
                              "--no-manual"))

Reporting bugs

Please consider reporting bugs and asking questions on the Issues page: https://github.com/DOI-USGS/dataRetrieval/issues

Citations

Citing the dataRetrieval package

citation(package = "dataRetrieval")
#> To cite dataRetrieval in publications, please use:
#> 
#>   De Cicco, L.A., Hirsch, R.M., Lorenz, D., Watkins, W.D., Johnson, M.,
#>   2024, dataRetrieval: R packages for discovering and retrieving water
#>   data available from Federal hydrologic web services, v.2.7.15,
#>   doi:10.5066/P9X4L3GE
#> 
#> A BibTeX entry for LaTeX users is
#> 
#>   @Manual{,
#>     author = {Laura DeCicco and Robert Hirsch and David Lorenz and David Watkins and Mike Johnson},
#>     title = {dataRetrieval: R packages for discovering and retrieving water data available from U.S. federal hydrologic web services},
#>     publisher = {U.S. Geological Survey},
#>     address = {Reston, VA},
#>     version = {2.7.15},
#>     institution = {U.S. Geological Survey},
#>     year = {2024},
#>     doi = {10.5066/P9X4L3GE},
#>     url = {https://code.usgs.gov/water/dataRetrieval},
#>   }

Citing NWIS data

U.S. Geological Survey, 2023, National Water Information System data available on the World Wide Web (USGS Water Data for the Nation), accessed [April 26, 2023], at http://waterdata.usgs.gov/nwis/. http://dx.doi.org/10.5066/F7P55KJN

This can be done using the create_NWIS_bib function:

dv <- readNWISdv("09010500", "00060")

NWIScitation <- create_NWIS_bib(dv)

NWIScitation
#> U.S. Geological Survey (2024). _National Water Information System data
#> available on the World Wide Web (USGS Water Data for the Nation)_.
#> doi:10.5066/F7P55KJN <https://doi.org/10.5066/F7P55KJN>, Accessed Feb
#> 20, 2024,
#> <https://waterservices.usgs.gov/nwis/dv/?site=09010500&format=waterml,1.1&ParameterCd=00060&StatCd=00003&startDT=1851-01-01>.
print(NWIScitation, style = "Bibtex")
#> @Manual{,
#>   title = {National Water Information System data available on the World Wide Web (USGS Water Data for the Nation)},
#>   author = {{U.S. Geological Survey}},
#>   doi = {10.5066/F7P55KJN},
#>   note = {Accessed Feb 20, 2024},
#>   year = {2024},
#>   url = {https://waterservices.usgs.gov/nwis/dv/?site=09010500&format=waterml,1.1&ParameterCd=00060&StatCd=00003&startDT=1851-01-01},
#> }

Citing WQP data

Citations for specific datasets should use this format:

National Water Quality Monitoring Council, YYYY, Water Quality Portal, accessed mm, dd, yyyy, hyperlink_for_query, https://doi.org/10.5066/P9QRKUVJ.

This can be done using the create_WQP_bib function:

SC <- readWQPqw(siteNumbers = "USGS-05288705",
                parameterCd = "00300")

WQPcitation <- create_WQP_bib(SC)
WQPcitation
#> National Water Quality Monitoring Council (2024). _ Water Quality
#> Portal_. doi:10.5066/P9QRKUVJ <https://doi.org/10.5066/P9QRKUVJ>,
#> Accessed Feb 20, 2024,
#> <https://www.waterqualitydata.us/data/Result/search?siteid=USGS-05288705&pCode=00300&mimeType=tsv&zip=yes>.
print(WQPcitation, style = "Bibtex")
#> @Manual{,
#>   title = { Water Quality Portal},
#>   author = {{National Water Quality Monitoring Council}},
#>   doi = {10.5066/P9QRKUVJ},
#>   note = {Accessed Feb 20, 2024},
#>   year = {2024},
#>   url = {https://www.waterqualitydata.us/data/Result/search?siteid=USGS-05288705&pCode=00300&mimeType=tsv&zip=yes},
#> }

Citing Water Quality Portal itself

General Water Quality Portal citations should use the following:

Water Quality Portal. Washington (DC): National Water Quality Monitoring Council, United States Geological Survey (USGS), Environmental Protection Agency (EPA); 2021. https://doi.org/10.5066/P9QRKUVJ.

Package Support

The Water Mission Area of the USGS supports the development and maintenance of dataRetrieval and is likely to continue doing so into the future. Resources are available primarily for maintenance and responding to user questions. Priorities for the development of new features are determined by the dataRetrieval development team. This software was last released with USGS record: IP-147158.

Disclaimer

This software is preliminary or provisional and is subject to revision. It is being provided to meet the need for timely best science. The software has not received final approval by the U.S. Geological Survey (USGS). No warranty, expressed or implied, is made by the USGS or the U.S. Government as to the functionality of the software and related material nor shall the fact of release constitute any such warranty. The software is provided on the condition that neither the USGS nor the U.S. Government shall be held liable for any damages resulting from the authorized or unauthorized use of the software.

dataRetrieval's People

Contributors

aappling-usgs, agriculturist, alkrall, bmccloskey, dblodgett-usgs, dependabot[bot], doug-friedman, elbeejay, jimhester, jiwalker-usgs, jpadilla-spu, jsta, jwpestrak, katrinleinweber, kevin-m-smith, ldecicco-usgs, lstanish-usgs, mikejohnson51, padilla410, rckwzrd, wdwatkins


dataRetrieval's Issues

can 'whatNWISsites' take multiple pcodes?

whatNWISsites(stateCd="OH",parameterCd=c("00665", "00060"))
URL caused an error: http://waterservices.usgs.gov/nwis/site/?format=mapper&stateCd=OH&parameterCd=0066500060

Use case: I want a list of sites that have param X and param Y
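
A possible workaround until multiple parameter codes are supported in a single call (a sketch, not an official recommendation): query each parameter code separately and intersect the site numbers.

library(dataRetrieval)

# Sites in Ohio with phosphorus (00665) data:
sites_p <- whatNWISsites(stateCd = "OH", parameterCd = "00665")
# Sites in Ohio with discharge (00060) data:
sites_q <- whatNWISsites(stateCd = "OH", parameterCd = "00060")

# Sites that have both parameters:
both <- intersect(sites_p$site_no, sites_q$site_no)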

Bug when multiple codes

The daily streamflow data for "05270500" contain 2 codes for 2014-11-11. An extract of the WML is below.

<ns1:value qualifiers="P" dateTime="2014-11-10T00:00:00.000">315</ns1:value>
<ns1:value qualifiers="P Ice" dateTime="2014-11-11T00:00:00.000">-999999</ns1:value>
<ns1:value qualifiers="P Bkw" dateTime="2014-11-11T00:00:00.000">-999999</ns1:value>
<ns1:value qualifiers="P Ice" dateTime="2014-11-12T00:00:00.000">-999999</ns1:value>
<ns1:value qualifiers="P Ice" dateTime="2014-11-13T00:00:00.000">-999999</ns1:value>
<ns1:value qualifiers="P Ice" dateTime="2014-11-14T00:00:00.000">-999999</ns1:value>

When the data are retrieved through 2014-11-10, or beginning on 2014-11-12, no error/warning is printed and the data look OK. If the retrieval overlaps that period, a weird message is printed and the data are coded as 1/0. See the example retrieval:

readNWISdv("05270500", "00060", "2014-11-10", "2014-11-12")
Aggregation function missing: defaulting to length
agency_cd site_no Date X_00060_00003_cd X_00060_00003
1 USGS 05270500 2014-11-10 1 1
2 USGS 05270500 2014-11-11 2 0
3 USGS 05270500 2014-11-12 1 0

getNWISDaily

From an email:
""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Back in January, I tried using getNWISDaily to pull groundwater water quality data and to my surprise (at the time) it worked! However, in trying this stunt again today, it didn't work, which has me wondering if 1) getNWISDaily has been modified in such a way over the last couple of months to temporarily disable this functionality or 2) it was a fluke that it worked in the first place? For example, gw site 390208119433201 and param code 00631 (nitrate) has data (see screen shot below), but the following script won't seem to pull the data. Is there a new function for pulling gw data?

Thanks for your consideration. Best, Eric

library(EGRET)
library(dataRetrieval)

StartDate <- ""  
EndDate <- ""
siteNumber <- "390208119433201"
QParameterCd <- "00631"
Daily <- readNWISDaily(siteNumber, QParameterCd, StartDate, EndDate)

proof that the data is there:

http://nwis.waterdata.usgs.gov/nwis/qwdata?site_no=390208119433201&agency_cd=USGS&format=serial_rdb
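
A possible explanation and workaround (a sketch using the site and parameter code from the email): readNWISDaily targets daily values such as streamflow, while discrete groundwater water-quality samples come from the qw service, so readNWISqw may be the appropriate reader here.

library(dataRetrieval)

siteNumber <- "390208119433201"
parameterCd <- "00631"  # nitrate plus nitrite

# Discrete water-quality samples for the groundwater site:
gwQW <- readNWISqw(siteNumber, parameterCd)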

cast and dcast error in importWaterML1

Error in dim(ordered) <- ns : 
  dims [product 1] do not match the length of object [0] 
6 cast(data, formula, fun.aggregate, ..., subset = subset, fill = fill, 
    drop = drop, value.var = value.var) 
5 dcast(meltedmergedDF, castFormula, drop = FALSE) 
4 importWaterML1(url, asDateTime = TRUE, tz = tz) 
3 dataRetrieval::readNWISuv(siteNumbers = site, parameterCd = p_code, 
    ...) at get_nwis_df.R#25

Unsupported pcodes

I received an error the other day about "pcodes may be mistyped...". Unfortunately, I did not save the exact error. However, pcodes were retrieved using whatNWISdata so they were not mistyped. I was pulling all data for a site, and the "mistyped" pcodes looked like something from a pesticide schedule.

I will make sure to save the error next time with exact pcodes.
The code I am using to do this data pull is in the QWToolbox package under the publicData function.

readNWISuv siteNumbers only accepts 8-digit USGS site IDs?

Hi there-

First of all, this tool is fantastic. Thank you for making it.

The discovery aspect of this tool is great and critical to my work. But I'm a bit confused about the output of a command like meta <- whatWQPsites(huc='10190005'), which gives me meta$MonitoringLocationIdentifier values like this

...
[565] "NARS_WQX-FW08CO004" ...
[583] "USGS-06725100"      ...
[613] "USGS-395528105133800" ...

in relationship to the use of readNWISuv. Can I only supply the numeric part of the MonitoringLocationIdentifier for rows/sites with identifiers like row 583 as the siteNumbers argument? The documentation is not specific (siteNumbers: character, USGS site number (or multiple sites), usually an 8-digit number) other than possibly limiting the query to USGS data. But queries using other identifiers, even the longer USGS ones, have failed for me. In other words, am I only able to get a small subset of the data listed in meta with readNWISuv, and is there a way to get the other data listed in meta? Maybe I'm missing something.

Also, by the way, is there any access to gages-II data/attributes via these NWIS queries?

Thanks again,
james
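
A possible interpretation (sketch): readNWISuv() only queries the NWIS instantaneous-values service, so it only understands USGS site numbers (the part after the "USGS-" prefix, which may be 8 or 15 digits). Sites from other organizations in the whatWQPsites() listing can instead be queried through the WQP readers using the full MonitoringLocationIdentifier; the characteristic name below is purely illustrative.

library(dataRetrieval)

# Full WQP identifier, organization prefix included (illustrative query):
wqp_data <- readWQPqw(
  siteNumbers = "NARS_WQX-FW08CO004",
  parameterCd = "Specific conductance"  # assumed characteristic, for illustration only
)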

startDateLo argument not working for readWQPdata

wqp_DF1 <- readWQPdata(statecode = "US:20", siteType = "Stream", sampleMedia = "Water", startDateLo = "01-01-2014", characteristicType = "Nutrient")
min(wqp_DF1$ActivityStartDate)
[1] "1916-06-11"

Seems to be an issue since this last round of commits from earlier this week.
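
One debugging step that may help (a sketch; dataRetrieval typically attaches the request URL to the returned data frame as the "url" attribute): inspect the URL that was actually sent to see whether the startDateLo value made it into the request.

# URL of the request that produced wqp_DF1:
attr(wqp_DF1, "url")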

StateCode

It would be fantastic to have a function to convert state codes (or county codes) to character names.
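
In the meantime, a minimal sketch using base R's built-in state vectors (not a dataRetrieval feature) converts two-letter postal abbreviations to full names:

# Map a postal abbreviation to the full state name:
state_abbrev <- "WI"
state_name <- state.name[match(state_abbrev, state.abb)]
state_name
#> [1] "Wisconsin"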

Document that readNWISuv(startDate, EndDate) query dates are in the station's local time zone.

Something is a bit confusing here:

> d = readNWISuv('01670080','62620',startDate='2015-03-19',endDate='2015-03-19')
> head(d)
  agency_cd  site_no            dateTime tz_cd X_62620_00011_cd X_62620_00011
1      USGS 01670080 2015-03-19 04:00:00   EDT                P         -0.86
2      USGS 01670080 2015-03-19 04:06:00   EDT                P         -0.95
3      USGS 01670080 2015-03-19 04:12:00   EDT                P         -1.03
4      USGS 01670080 2015-03-19 04:18:00   EDT                P         -1.10
5      USGS 01670080 2015-03-19 04:24:00   EDT                P         -1.19
6      USGS 01670080 2015-03-19 04:30:00   EDT                P         -1.29
> format(d$dateTime[2],usetz=TRUE)
[1] "2015-03-19 04:06:00 UTC"
> format(d$dateTime[2],tz='America/New_York',usetz=TRUE)
[1] "2015-03-19 00:06:00 EDT"

From the printout, it appears to be doing the request for a GMT-bounded day, but the timestamps on the actual data appear to be for a local-standard-time-bounded day. If it is the latter, then this printout is a bit misleading. Since the time data seem to be translated into UTC, maybe the returned data shouldn't have the timezone column.

In the '?readNWISuv' help, it might be nice to highlight that the startDate and endDate are in the station local-standard time, not the date ranges indicated in http://waterservices.usgs.gov/rest/IV-Service.html#Specifying

For reference:
> time2=as.POSIXct(360, origin = "2015-03-19", tz = "GMT")
> format(time2,tz='GMT',usetz=TRUE)
[1] "2015-03-19 00:06:00 GMT"
> format(time2,tz='America/New_York',usetz=TRUE)
[1] "2015-03-18 20:06:00 EDT"
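
One way to reduce the confusion in the meantime (a sketch): ask readNWISuv() to return times in an explicit time zone via its tz argument rather than relying on the default UTC representation.

library(dataRetrieval)

d_local <- readNWISuv("01670080", "62620",
                      startDate = "2015-03-19",
                      endDate = "2015-03-19",
                      tz = "America/New_York")
head(d_local$dateTime)  # timestamps now print in Eastern time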

Adding handling for 404s etc in the getData* functions?

It would be great to be able to specify how these functions deal with 1) no internet connect 2) no matching site 3) empty returns. Right now I wrap the calls in tryCatch, but what is your take on handling these cases inside the functions?
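
Until such handling is built in, a minimal sketch of the tryCatch wrapper approach (the helper name is hypothetical):

library(dataRetrieval)

get_dv_safely <- function(site, pcode, start, end) {
  tryCatch({
    dv <- readNWISdv(site, pcode, start, end)
    if (nrow(dv) == 0) warning("No data returned for site ", site)
    dv
  },
  error = function(e) {
    message("Retrieval failed for site ", site, ": ", conditionMessage(e))
    NULL  # return NULL on failure (no internet, no matching site, etc.)
  })
}

flow <- get_dv_safely("02177000", "00060", "2009-10-01", "2013-09-30")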

repeated station in output readNWISsite("02054530")

What can be done about this? It seems like a problem on NWIS's side.
It's fairly annoying that this is not 1-to-1... but manageable.

dataRetrieval::readNWISsite("02054530")

  agency_cd  site_no                         station_nm site_tp_cd lat_va long_va dec_lat_va dec_long_va
1      USGS 02054530       ROANOKE RIVER AT GLENVAR, VA         ST 371604  800823   37.26791   -80.13949
2     VA087 02054530 ROANOKE RIVER AT GLENVAR, VIRGINIA         ST 371604  800823   37.26791   -80.13949
  coord_meth_cd coord_acy_cd coord_datum_cd dec_coord_datum_cd district_cd state_cd county_cd country_cd
1             M            U          NAD27              NAD83          51       51       161         US
2             M            U          NAD27              NAD83          51       51       161         US
  land_net_ds  map_nm map_scale_fc alt_va alt_meth_cd alt_acy_va alt_datum_cd   huc_cd basin_cd topo_cd
1             GLENVAR        24000   1100           L       0.01       NGVD29 03010101                 
2             GLEMVAR        24000   1100           L       0.01       NGVD29 03010101                 
                  instruments_cd construction_dt inventory_dt drain_area_va contrib_drain_area_va tz_cd
1 NNNNYNNNNNNNNNNNNNNNNNNNNNNNNN                                        281                    NA   EST
2 NNNNYNNNNNNNNNNNNNNNNNNNNNNNNN                                        281                    NA   EST
  local_time_fg reliability_cd gw_file_cd nat_aqfr_cd aqfr_cd aqfr_type_cd well_depth_va hole_depth_va
1             Y                  NNNNNNNN                                             NA            NA
2             Y                                                                       NA            NA
  depth_src_cd project_no
1                        
2                        
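
A possible workaround on the user side (sketch): keep only the USGS row if the duplicate cooperator record is not wanted.

library(dataRetrieval)

site_info <- readNWISsite("02054530")
# Drop the cooperator (VA087) duplicate, keeping the USGS record:
site_info_usgs <- site_info[site_info$agency_cd == "USGS", ]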

incorrect warning from readNWISqw

I got a warning from

Well <- "364200119420001"
Pcodes <-c("00010", "00095", "00300", "00400")

TestWellQW <- readNWISqw(Well, Pcodes, expanded=FALSE, reshape=TRUE)
It was:
Warning message:
In readNWISqw(Well, Pcodes, expanded = FALSE, reshape = TRUE) :
Reshape can only be used with expanded data. Reshape request will be ignored.

But the data were reshaped and it does in fact make sense to reshape the data when expanded=FALSE (maybe even more so than when expanded is TRUE).

reshape library interferes with readNWISdv

Linked issue from EflowStats issue number 17 .

This works:

library(dataRetrieval)
x_obs <- readNWISdv("02177000", "00060", "2009-10-01", "2013-09-30") 

This does not; it returns NAs:

library(dataRetrieval)
library(reshape)
x_obs <- readNWISdv("02177000", "00060", "2009-10-01", "2013-09-30") 

functionality to look up pcode attributes

Thought pCodeToName would let me do something like pCodeToName('00300'), but it is just a big-ol data.frame.

We are looking for basic support for units(pcode) or something similar. Any help for the user here?
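
readNWISpCode() provides this kind of lookup for one or more parameter codes; a sketch (the units column name below is an assumption and should be verified against the returned data frame):

library(dataRetrieval)

pcode_info <- readNWISpCode("00300")
pcode_info$parameter_units  # assumed column name for the units field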

getNWISData fails on Florida for some reason

On Windows 7, R 3.1.0:

 getNWISData(stateCd='FL', parameterCd="00010")
Error in read.table(file = file, header = header, sep = sep, quote = quote,  : 
  more columns than column names 

R version info

platform       x86_64-w64-mingw32          
arch           x86_64                      
os             mingw32                     
system         x86_64, mingw32             
status                                     
major          3                           
minor          1.0                         
year           2014                        
month          04                          
day            10                          
svn rev        65387                       
language       R                           
version.string R version 3.1.0 (2014-04-10)
nickname       Spring Dance         

How do I retrieve USGS internal data?

EDITED AS OF dataRetrieval 2.4.0 and greater:
Within a USGS firewall, it is possible to retrieve data flagged as 'internal' by using the setAccess function:

library(dataRetrieval)
setAccess('internal')
data <- readNWISuv("373015122071000","00095")

Original message:
Within a USGS firewall, it is possible to retrieve data flagged as 'internal' using "Access=3". Using the url attribute, that can be done as follows:

library(dataRetrieval)
url <- "http://nwis.waterdata.usgs.gov/ca/nwis/uv?cb_00095=on&cb_00095=on&format=rdb&site_no=373015122071000&period=&begin_date=2013-03-06&end_date=2015-07-07&Access=3"
DS <- importRDB1(url, asDateTime=TRUE)

findData <- whatNWISdata("373015122071000")
nrow(findData)
[1] 26
urlToFind <- attr(findData, "url")
urlToFind <- paste0(urlToFind, "&Access=3")
findData2 <- importRDB1(urlToFind)
nrow(findData2)
[1] 31

Depending on which function you were originally using to retrieve the data, you'll want to check whether the import is RDB (tab-delimited) or WaterML1 and replicate what the original convenience function was doing. This can be done either by inspecting the function:

readNWISdv
function (siteNumber,parameterCd,startDate="",endDate="",statCd="00003"){  

  url <- constructNWISURL(siteNumber,parameterCd,startDate,endDate,"dv",statCd=statCd)

  data <- importWaterML1(url, asDateTime=FALSE)
  if(nrow(data)>0){
    data$dateTime <- as.Date(data$dateTime)
    data$tz_cd <- NULL

    names(data)[names(data) == "dateTime"] <- "Date"    
  }


  return (data)
}
<environment: namespace:dataRetrieval>

So, for the readNWISdv function, add the "&Access=3" to the url, call importWaterML1, and convert the dateTime to a Date:

siteNumber <- '04085427'
startDate <- '2012-01-01'
endDate <- '2012-06-30'
pCode <- '00060'
rawDailyQ <- readNWISdv(siteNumber,pCode, startDate, endDate)
newURL <- attr(rawDailyQ, "url")
newURL <- paste0(newURL,"&Access=3")
newData <- importWaterML1(newURL, asDateTime=FALSE)
newData$dateTime <- as.Date(newData$dateTime)
newData$tz_cd <- NULL
names(newData)[names(newData) == "dateTime"] <- "Date"   

Alternatively, the readNWISdata function could be used. For the example above:

siteNumber <- '04085427'
startDate <- '2012-01-01'
endDate <- '2012-06-30'
pCode <- '00060'

readNWISdata(service = "dv", sites=siteNumber, parameterCd=pCode, 
             startDate=startDate, endDate=endDate, Access=3)

Is there a way to merge column output when using readNWISuv?

I have a site where one sensor was discontinued and replaced by another. These were assigned different DD numbers and are returned in separate columns by readNWISuv. Is there an easy way in R to merge the two into one column, filling in the missing values? Here is an example of such a site:

http://waterservices.usgs.gov/nwis/iv/?format=rdb&sites=07144100&startDT=2015-02-01&endDT=2015-04-01&parameterCd=63680

In this case, fill 96_63680 when 44_63680 is missing a value.

Thanks,
Steve
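
A possible approach (sketch): combine the two sensor columns, preferring one and filling its gaps from the other. The column names below are illustrative; the actual names depend on the DD numbers and statistic codes NWIS returns.

library(dataRetrieval)

uv <- readNWISuv("07144100", "63680",
                 startDate = "2015-02-01", endDate = "2015-04-01")

turb_44 <- uv[["X_44_63680_00000"]]  # assumed column name
turb_96 <- uv[["X_96_63680_00000"]]  # assumed column name
# Use the 96_63680 value wherever 44_63680 is missing:
uv$turbidity <- ifelse(is.na(turb_44), turb_96, turb_44)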

object 'parameterCdFile' not found

[screenshot of the error omitted]

How to reproduce: have another package import dataRetrieval and use one of its functions inside that package's functions, but don't have dataRetrieval attached (see the screenshot). If dataRetrieval is attached, all is good.

unify sitename arg name

siteNumber vs. sites appears in different functions. Consider unifying this so it is easy to switch between functions and you have an expectation of the design pattern.

Confusing error when no NWIS data exists

It seems that if there is no NWIS data available, the user gets a confusing error

URL caused an error: http://waterservices.usgs.gov/nwis/dv/?site=10264750&format=rdb,1.0&ParameterCd=00060&StatCd=00003&startDT=1980-01-01&endDT=2014-11-03
Content-Type=text/plain; charset=UTF-8
Error in names(data) <- c("agency", "site", "dateTime", "value", "code") : 
  attempt to set an attribute on NULL

To reproduce:

getNWISDaily("10264750","00060", "1980-01-01", "2014-11-03", convert=TRUE)

Tried it with the fork at @ldecicco-USGS. Using:

 readNWISdv("10264750","00060", "1980-01-01", "2014-11-03")

I get:

Error in timeSeries[[i]] : subscript out of bounds

Not sure if it's the same error or not, but I think it is.

Incorrect input arguments listed for readNWISuv()

Hi,

The function inputs listed for readNWISuv() on the main dataRetrieval page are incorrect. The inputs are listed as "Common 3, parameter code". The help for the function lists the arguments as siteNumbers, parameterCd, startDate = "", endDate = "", tz = "". Please update the main page - I had issues getting the correct data output the first few times I tried.

WQP url parsing

This works:

nutrient <- readWQPdata(statecode="US:55",characteristicType="Nutrient",
                         siteType="Lake,+Reservoir,+Impoundment")

But this doesn't:

nutrient <- readWQPdata(statecode="US:55",characteristicType="Nutrient",
                         siteType="Lake%2C+Reservoir%2C+Impoundment")

which is what you'd copy from the WQP itself.
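
A possible workaround (sketch): decode the percent-encoded value copied from the WQP URL before passing it to readWQPdata(), which applies its own encoding.

# Decode the value copied from the browser URL:
site_type <- URLdecode("Lake%2C+Reservoir%2C+Impoundment")
site_type
#> [1] "Lake,+Reservoir,+Impoundment"

nutrient <- readWQPdata(statecode = "US:55",
                        characteristicType = "Nutrient",
                        siteType = site_type)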

Did NWIS start blocking multiple rapid requests?

I am trying to get the state code for a bunch of sites, so I am looping through them and using readNWISsite(nwis_site). This has always worked fine, but it now gives me a bad connection error after a few hundred requests. Has anyone seen this before?
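
If the service is throttling rapid requests, one simple mitigation (a sketch; the site list is illustrative) is to pause between calls:

library(dataRetrieval)

sites <- c("01491000", "05270500", "02177000")  # illustrative site numbers
site_info_list <- lapply(sites, function(s) {
  Sys.sleep(1)  # wait one second between requests
  readNWISsite(s)
})
site_info <- do.call(rbind, site_info_list)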
