
datareader's Introduction


datareader : read SAS and Stata files in Go

datareader is a pure Go (Golang) package that can read binary SAS format (SAS7BDAT) and Stata format (dta) data files into native Go data structures. For non-Go users, there are command line utilities that convert SAS and Stata files into text/csv and parquet files.

The Stata reader is based on the Stata documentation for the dta file format and supports dta versions 115, 117, and 118.

There is no official documentation for SAS binary format files. The code here is translated from the Python sas7bdat package, which in turn is based on an R package. Also see here for more information about the SAS7BDAT file structure.

This package also provides a simple column-oriented data container called a Series. Both the SAS reader and Stata reader return the data as an array of Series objects, corresponding to the columns of the data file. These can in turn be converted to other formats as needed.

Both the Stata and SAS readers support streaming access to the data (i.e. reading the file in chunks of consecutive records).

SAS

Here is an example of how the SAS reader can be used in a Go program (error handling omitted for brevity):

import (
        "os"

        "github.com/kshedden/datareader"
)

// Create a SAS7BDAT object
f, _ := os.Open("filename.sas7bdat")
sas, _ := datareader.NewSAS7BDATReader(f)

// Read the first 10000 records (rows)
ds, _ := sas.Read(10000)

// If column 0 contains numeric data:
// x is a []float64 containing the data,
// m is a []bool containing missingness indicators
x, m, _ := ds[0].AsFloat64Slice()

// If column 1 contains text data:
// y is a []string containing the data,
// my is a []bool containing missingness indicators
y, my, _ := ds[1].AsStringSlice()
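
Because the reader streams the file, a large dataset can be processed in consecutive chunks rather than with a single Read call. Continuing from the example above, a minimal sketch of that loop follows; the exact end-of-data behavior (whether Read reports an error, an empty result, or a short final chunk) should be confirmed against the GoDoc.

// Process the remaining rows in chunks of 1000.
for {
        chunk, err := sas.Read(1000)
        if len(chunk) > 0 {
                // Convert the columns of this chunk with
                // AsFloat64Slice / AsStringSlice as shown above.
        }
        if err != nil || len(chunk) == 0 {
                break
        }
}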

Stata

Here is an example of how the Stata reader can be used in a Go program (again with no error handling):

import (
        "os"

        "github.com/kshedden/datareader"
)

// Create a StataReader object
f, _ := os.Open("filename.dta")
stata, _ := datareader.NewStataReader(f)

// Read the first 10000 records (rows)
ds, _ := stata.Read(10000)
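
The returned values are the same Series objects as in the SAS example, so the columns can be unpacked the same way (assuming here that column 0 holds numeric data):

// x is a []float64 containing the data,
// m is a []bool containing missingness indicators
x, m, _ := ds[0].AsFloat64Slice()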

CSV

The package includes a CSV reader with type inference for the column data types.

import (
        "os"

        "github.com/kshedden/datareader"
)

f, _ := os.Open("filename.csv")
rt := datareader.NewCSVReader(f)
rt.HasHeader = true
dt, _ := rt.Read(-1)
// obtain data from dt as in the SAS example above

Command line utilities

We provide two command-line utilities that convert SAS and Stata datasets to other formats without using Go directly. Executables for several operating systems and architectures are contained in the bin directory. The script used to cross-compile these binaries is build.sh. To build and install the commands for your local architecture only, run the Makefile (the executables will be copied into your GOBIN directory).

The stattocsv command converts a SAS7BDAT or Stata dta file to a csv file. It can be used as follows:

> stattocsv file.sas7bdat > file.csv
> stattocsv file.dta > file.csv

The columnize command takes the data from either a SAS7BDAT or a Stata dta file, and writes the data from each column into a separate file. Numeric data can be stored in either binary (native 8 byte floats) or text format (binary is considerably faster).

> columnize -in=file.sas7bdat -out=cols -mode=binary
> columnize -in=file.dta -out=cols -mode=text
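
To load a binary column written by columnize back into Go, something like the following sketch can be used. It assumes native little-endian byte order and a hypothetical output file name ("cols/somecolumn"); check the columnize source for the exact layout and naming.

import (
        "encoding/binary"
        "math"
        "os"
)

// Read one binary column (raw 8-byte floats) into a []float64.
// The file name below is hypothetical; columnize derives the actual
// names from the dataset's column names.
buf, _ := os.ReadFile("cols/somecolumn")

vals := make([]float64, len(buf)/8)
for i := range vals {
        bits := binary.LittleEndian.Uint64(buf[8*i : 8*i+8])
        vals[i] = math.Float64frombits(bits)
}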

Parquet conversion

We provide a simple and efficient way to convert a SAS7BDAT file to parquet format, using the parquet-go package. To convert a SAS file called 'mydata.sas7bdat' to Parquet format, begin by running sas_to_parquet as follows:

sas_to_parquet -sasfile=mydata.sas7bdat -outdir=. -structname=MyStruct -pkgname=mypackage

If you want the Parquet file for use outside of Go, you can specify any values for structname and pkgname. The sas_to_parquet command generates a Go program called 'convert_data.go' that you can use to perform the data conversion.
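
The generated converter can then be run with the standard Go toolchain, for example (the exact invocation may differ; see the comments in the generated file):

go run convert_data.go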

The parquet file will be written to the specified destination directory, which in the above example is the current working directory. The parquet file name will be based on the SAS file name, e.g. in the above example it will be 'mydata.parquet'.

To facilitate reading the Parquet file into Go using the parquet-go package, a Go struct definition will be written to the directory specified by 'mypackage' above. See the sas_to_parquet_check.go script for an example of how to read the file into Go using these struct definitions.

Testing

Automated testing is implemented against the Stata files used to test the pandas Stata reader (for versions 115+):

https://github.com/pydata/pandas/tree/master/pandas/io/tests/data

A CSV data file for testing is generated by the gendat.go script. There are scripts make.sas and make.stata in the test directory that generate SAS and Stata files for testing. SAS and Stata software are required to run these scripts. The generated files are provided in the test_files/data directory, so go test can be run without having access to SAS or Stata.

The columnize_test.go and stattocsv_test.go scripts test the commands against stored output.

Feedback

Please file an issue if you encounter a file that is not properly handled. If possible, share the file that causes the problem.

datareader's People

Contributors

gmeixiong, kshedden, mzimmerman


datareader's Issues

First few rows returned by the Data() method have the wrong offset

I am reading a sas7bdat file that represents a table with ~15K rows. The reader gives the correct column formats, names, and labels. For the data itself, however, the first 18 rows are incorrect (the column values appear to be read at the wrong offsets), while the remaining rows are correct.

SAS7BDAT.read_next_page does not handle EOF.

(This is probably true for other file types, but I haven't tested them.)

In sas7bdat.go:818, the code assumes that any non-nil error should void the read, but that is not true for io.EOF. The golang io package says explicitly:
"""
// When Read encounters an error or end-of-file condition after
// successfully reading n > 0 bytes, it returns the number of
// bytes read. It may return the (non-nil) error from the same call
// or return the error (and n == 0) from a subsequent call.
// An instance of this general case is that a Reader returning
// a non-zero number of bytes at the end of the input stream may
// return either err == EOF or err == nil.
"""

Thus, if a File interface chooses to return EOF immediately on the last, positive-byte-sized read rather than waiting to return EOF with a 0-byte read (as the s3 filesystem I am using does), then this breaks the SAS7BDAT reader.

I believe a simple fix would be to change the line (sas7bdat.go:818) to:

	if err != nil && err != io.EOF {

Would that work?
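
For reference, the EOF-tolerant pattern described in the io documentation looks roughly like the sketch below (a generic example, not the actual datareader code): consume the n bytes returned by Read before inspecting the error, and treat io.EOF as normal termination.

import "io"

// readAll drains r, treating io.EOF as end of input rather than a failure.
func readAll(r io.Reader) ([]byte, error) {
        var out []byte
        buf := make([]byte, 4096)
        for {
                n, err := r.Read(buf)
                // Use the n bytes that were read even when err is non-nil.
                out = append(out, buf[:n]...)
                if err == io.EOF {
                        return out, nil
                }
                if err != nil {
                        return out, err
                }
        }
}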

Writer?

Does this library only read stata.dta files? Is it possible to write them?

Thanks!

Support various type conversions.

haven::read_sas is able to correctly parse "numeric" columns into integer or boolean columns. It's still not clear to me how haven::read_sas becomes aware of the underlying type, though if I had to guess it is related to how many bytes are stored for each column value.

It would be nice if datareader were able to similarly infer type and distinguish ints and bools from floats.
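
Until such inference exists in the package, callers can approximate it after reading a column, for example by checking whether every non-missing value is a whole number. A rough sketch of that heuristic:

import "math"

// looksIntegral reports whether every non-missing value in x is a whole
// number, a rough signal that the column could be exposed as integers.
func looksIntegral(x []float64, missing []bool) bool {
        for i, v := range x {
                if missing != nil && missing[i] {
                        continue
                }
                if v != math.Trunc(v) {
                        return false
                }
        }
        return true
}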

Unexpected non-zero end_of_first_byte

Getting two types of errors reading in .sas7bdat files:

  • Unexpected non-zero end_of_first_byte
  • 32 Character byte unknown

Any ideas on what's going on and how to handle?

Panic

goroutine 1 [running]:
github.com/kshedden/datareader.(*SAS7BDAT).processByteArrayWithData(0xc0000e4000, 0x2fe8, 0x90, 0xc0000ead20, 0x40b7c8)
        /afs/umich.edu/user/k/s/kshedden/go/src/github.com/kshedden/datareader/sas7bdat.go:1190 +0x6b3
github.com/kshedden/datareader.(*SAS7BDAT).readline(0xc0000e4000, 0x1f40, 0x1f40, 0xc000108000)
        /afs/umich.edu/user/k/s/kshedden/go/src/github.com/kshedden/datareader/sas7bdat.go:813 +0x31d
github.com/kshedden/datareader.(*SAS7BDAT).Read(0xc0000e4000, 0x3e8, 0x13, 0x13, 0x0, 0x0, 0x1d)
        /afs/umich.edu/user/k/s/kshedden/go/src/github.com/kshedden/datareader/sas7bdat.go:660 +0x389
main.doConversion(0x50e6c0, 0xc0000e4000)
        /afs/umich.edu/user/k/s/kshedden/go/src/github.com/kshedden/datareader/cmd/stattocsv/main.go:31 +0x94c
main.main()
        /afs/umich.edu/user/k/s/kshedden/go/src/github.com/kshedden/datareader/cmd/stattocsv/main.go:144 +0x24a

TrimStrings truncates data

I often use the stattocsv command line utility to work with data using the GNU utilities and I noticed that some of my columns with addresses, e.g., "50 E. Main St" only came out as "50".

I then wrote my own frontend using your library to spit out as CSV and it didn't have that issue.

Debugging some, I found it was the TrimStrings bool value causing the problem but I didn't debug further.
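
Until this is resolved, one workaround when calling the library directly is to leave string trimming disabled. The sketch below assumes TrimStrings is a public field on the SAS7BDAT reader, as this report suggests:

import (
        "os"

        "github.com/kshedden/datareader"
)

f, _ := os.Open("filename.sas7bdat")
sas, _ := datareader.NewSAS7BDATReader(f)

// Leave trimming off so values such as "50 E. Main St" are not cut short
// (TrimStrings is the setting named in this report; treat this as a
// sketch, not a confirmed API).
sas.TrimStrings = false

ds, _ := sas.Read(10000)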

Benchmarks?

Hello!

I just came across this package. I'm aware that Go is generally very fast. Do you happen to know how the read speeds for Stata files with this package compare to Stata or Python? Is the reading multithreaded?

Also, "simple column-oriented data container" caught my eye. I'm especially curious if this data structure is similar to one that can be written by the parquet-go package. Since Parquet is a column-oriented file format, I'm guessing that reading Stata files with your package and writing it with parquet-go could be much faster than my current code to do that in Python.
