
hadoop-snappy's People

Contributors

issay, tomwhite, tucu00


hadoop-snappy's Issues

Hadoop-snappy error

What steps will reproduce the problem?
1. mvn package -Dsnappy.prefix=/usr/local/

What is the expected output? What do you see instead?
Snappy 1.0.5

What version of the product are you using? On what operating system?
Linux Ubuntu 11.10

Please provide any additional information below.

     [exec] config.status: executing libtool commands
     [exec] depbase=`echo src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
     [exec]     /bin/bash ./libtool --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. -I/usr/lib/jvm/java-6-sun/include -I/usr/lib/jvm/java-6-sun/include/linux -I/home/gsd/hadoop-snappy-read-only/src/main/native/src -Isrc/org/apache/hadoop/io/compress/snappy -I/usr/local//include -g -Wall -fPIC -O2 -m64 -g -O2 -MT src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo -MD -MP -MF $depbase.Tpo -c -o src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c &&\
     [exec]     mv -f $depbase.Tpo $depbase.Plo
     [exec] libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I/usr/lib/jvm/java-6-sun/include -I/usr/lib/jvm/java-6-sun/include/linux -I/home/gsd/hadoop-snappy-read-only/src/main/native/src -Isrc/org/apache/hadoop/io/compress/snappy -I/usr/local//include -g -Wall -fPIC -O2 -m64 -g -O2 -MT src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo -MD -MP -MF src/org/apache/hadoop/io/compress/snappy/.deps/SnappyCompressor.Tpo -c src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c  -fPIC -DPIC -o src/org/apache/hadoop/io/compress/snappy/.libs/SnappyCompressor.o
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c: In function 'Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs':
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:64:49: error: expected expression before ',' token
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c: In function 'Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_compressBytesDirect':
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:117:3: warning: passing argument 4 of 'dlsym_snappy_compress' from incompatible pointer type [enabled by default]
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:117:3: note: expected 'size_t *' but argument is of type 'jint *'
     [exec] make: *** [src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo] Error 1

Thanks!

Original issue reported on code.google.com by [email protected] on 15 Mar 2012 at 6:23

problems with snappy integration with Hadoop 0.20.* on Mac OS X

What steps will reproduce the problem?
1. Use Mac OS X. Follow the guide on the Project Home, the step 
http://code.google.com/p/hadoop-snappy/#Install_Hadoop_Snappy_in_Hadoop
2. try to run code that uses snappy (e.g. the simplest case: hadoop in local 
mode, "hadoop fs -text" on a snappy-compressed SequenceFile) 
3. observe "Unknown codec: org.apache.hadoop.io.compress.SnappyCodec" error
4. make sure "hadoop classpath | grep snappy" is empty

What is the expected output? What do you see instead?
Expected behavior would be the presence of the snappy jar in the classpath and 
the snappy native code in java.library.path.
Neither is true on Mac OS with the binary hadoop distribution.

What version of the product are you using? On what operating system?
snappy 1.0.5 on Mac OS X 10.7.4, hadoop-bin 0.20.205.0

Please provide any additional information below.
1. Hadoop 0.20.* has two distribution variants with different filesystem 
layouts: 
 * _binary_, e.g. hadoop-0.20.205.0-bin.tar.gz
 * _tarball_ with sources, e.g. hadoop-0.20.205.0.tar.gz
The binary distribution DOES NOT include <HADOOP_HOME>/lib in the classpath, 
so the snappy jar file is not available at runtime there; it uses 
<HADOOP_PREFIX>/share/hadoop/lib/*.jar instead.

It looks like the hadoop-snappy installation instructions need to be updated 
for this case.
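The classpath symptom described above can be checked programmatically. The sketch below is a hypothetical diagnostic (the class and method names are ours, not part of hadoop-snappy or Hadoop); it mirrors what "hadoop classpath | grep snappy" does:

```java
// Hypothetical diagnostic helper (not part of hadoop-snappy): report whether
// any classpath entry mentions snappy. On the binary 0.20.x distribution a
// jar placed in <HADOOP_HOME>/lib never appears on the classpath, so the
// check comes back false even after following the install instructions.
public class SnappyClasspathCheck {
    static boolean mentionsSnappy(String classpath) {
        for (String entry : classpath.split(java.io.File.pathSeparator)) {
            if (entry.toLowerCase().contains("snappy")) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // A classpath built only from share/hadoop/lib, without the snappy jar:
        System.out.println(mentionsSnappy("/hadoop/share/hadoop/lib/guava.jar")); // false
        // The same classpath once the jar is copied where it is actually scanned:
        System.out.println(mentionsSnappy("/hadoop/share/hadoop/lib/snappy-java.jar")); // true
    }
}
```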

2. The native code is NOT added to java.library.path. The Hadoop naming 
convention for the path is "lib/native/<os.name>-<os.arch>-<sun.arch.data.model>" 
(org.apache.hadoop.util.PlatformName.platformName, L30), which gives 
"Mac_OS_X-x86_64-64", but the current maven build script for hadoop-snappy 
overrides it on OS X to "Mac_OS_X-${sun.arch.data.model}" (pom.xml L278), i.e. 
"Mac_OS_X-64", so the hadoop run script is unable to find it.
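The mismatch can be made concrete with a small sketch of the naming rule the reporter cites. The class and method names below are ours (the real logic lives in org.apache.hadoop.util.PlatformName); the inputs are the standard Java system property values on a 64-bit Mac:

```java
// Sketch of Hadoop 0.20.x's native-library directory naming convention:
// os.name + "-" + os.arch + "-" + sun.arch.data.model, spaces replaced
// with underscores. Class/method names here are illustrative only.
public class PlatformNameSketch {
    static String platformName(String osName, String osArch, String dataModel) {
        return (osName + "-" + osArch + "-" + dataModel).replace(' ', '_');
    }

    public static void main(String[] args) {
        // Directory the hadoop run script looks for on a 64-bit Mac:
        System.out.println(platformName("Mac OS X", "x86_64", "64")); // Mac_OS_X-x86_64-64
        // Directory the hadoop-snappy pom produces (os.arch omitted):
        System.out.println("Mac_OS_X-64");
    }
}
```

Because the pom drops the os.arch component, the two names never match and the native library is silently skipped.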

Original issue reported on code.google.com by [email protected] on 4 Jul 2012 at 6:53

/usr/bin/ld: cannot find -ljvm

I'm trying to build hadoop-snappy, but the build fails with the errors below.

[exec] make  all-am
     [exec] make[1]: Entering directory `/home/ywkim/workspace/hadoop-snappy/target/native'
     [exec] /bin/bash ./libtool --tag=CXX   --mode=link g++  -g -O2 -version-info 0:1:0 -L/usr/local/lib -o libhadoopsnappy.la -rpath /usr/local/lib src/SnappyCompressor.lo src/SnappyDecompressor.lo  -ljvm 
     [exec] libtool: link: g++ -shared -nostdlib /usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2/../../../crti.o /usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2/crtbeginS.o  src/.libs/SnappyCompressor.o src/.libs/SnappyDecompressor.o   -L/usr/local/lib -ljvm -L/usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2 -L/usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2/../../.. -L/usr/lib/x86_64-linux-gnu -lstdc++ -lm -lc -lgcc_s /usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2/crtendS.o /usr/lib/x86_64-linux-gnu/gcc/x86_64-linux-gnu/4.5.2/../../../crtn.o    -Wl,-soname -Wl,libhadoopsnappy.so.0 -o .libs/libhadoopsnappy.so.0.0.1
     [exec] make[1]: Leaving directory `/home/ywkim/workspace/hadoop-snappy/target/native'
     [exec] /usr/bin/ld: cannot find -ljvm
     [exec] collect2: ld returned 1 exit status
     [exec] make[1]: *** [libhadoopsnappy.la] Error 1
     [exec] make: *** [all] Error 2


Ubuntu 11.04 (x86_64, on VirtualBox)

Original issue reported on code.google.com by [email protected] on 24 May 2011 at 7:14

libsnappy.la is missing.

What steps will reproduce the problem?
1. sudo mvn clean package 
-Dsnappy.prefix=~/hadoop-snappy-read-only/snappy-1.1.1/
2.
3.

What is the expected output? What do you see instead?
Fully built hadoop-snappy

What version of the product are you using? On what operating system?
Check out from source, and use on Mac OS 10.8.4

Please provide any additional information below.

After mvn building the package, I have:

Kevin-TBLT:Mac_OS_X-64 Kevin$ pwd
/Users/Kevin/code/apache/hadoop-snappy-read-only/target/hadoop-snappy-0.0.1-SNAPSHOT/lib/native/Mac_OS_X-64
Kevin-TBLT:Mac_OS_X-64 Kevin$ ls -la
total 632
drwxrwxr-x  13 root  staff     442 Dec 29 10:47 .
drwxrwxr-x   3 root  staff     102 Dec 29 10:33 ..
-rwxrwxr-x   1 root  staff   15996 Dec 29 10:33 libhadoopsnappy.0.dylib
lrwxr-xr-x   1 root  staff      23 Dec 29 10:33 libhadoopsnappy.dylib -> libhadoopsnappy.0.dylib
-rwxrwxr-x   1 root  staff   34320 Dec 29 10:33 libsnappy.1.dylib
lrwxr-xr-x   1 root  staff      17 Dec 29 10:33 libsnappy.dylib -> libsnappy.1.dylib
lrwxr-xr-x   1 root  staff      15 Dec 29 10:33 libsnappy.la -> ../libsnappy.la
-rwxrwxr-x   1 root  staff     934 Dec 29 10:33 libsnappy.lai
-rwxrwxr-x   1 root  staff    6288 Dec 29 10:33 snappy-c.o
-rwxrwxr-x   1 root  staff   14420 Dec 29 10:33 snappy-sinksource.o
-rwxrwxr-x   1 root  staff   19020 Dec 29 10:33 snappy-stubs-internal.o
-rwxrwxr-x   1 root  staff  115648 Dec 29 10:33 snappy.o
-rwxrwxr-x   1 root  staff   90008 Dec 29 10:33 snappy_unittest

libsnappy.la is just a symlink to itself, so it effectively does not exist. 
Could this be the problem? Hadoop is still throwing this exception:

 java.lang.RuntimeException: native snappy library not available
    at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)

All files have been copied to hadoop/lib/ except libsnappy.la:

/Users/Kevin/code/apache/hadoop/lib/native/Mac_OS_X-64
Kevin-TBLT:Mac_OS_X-64 Kevin$ ls -la
total 712
drwxr-xr-x  12 Kevin  staff     408 Dec 29 10:48 .
drwxr-xr-x@  5 Kevin  staff     170 Jul 22 15:26 ..
-rwxr-xr-x   1 Kevin  staff   15996 Dec 29 10:52 libhadoopsnappy.0.dylib
-rwxr-xr-x   1 Kevin  staff   15996 Dec 29 10:52 libhadoopsnappy.dylib
-rwxr-xr-x   1 Kevin  staff   34320 Dec 29 10:52 libsnappy.1.dylib
-rwxr-xr-x   1 Kevin  staff   34320 Dec 29 10:52 libsnappy.dylib
-rw-r--r--   1 Kevin  staff     934 Dec 29 10:52 libsnappy.lai
-rw-r--r--   1 Kevin  staff    6288 Dec 29 10:52 snappy-c.o
-rw-r--r--   1 Kevin  staff   14420 Dec 29 10:52 snappy-sinksource.o
-rw-r--r--   1 Kevin  staff   19020 Dec 29 10:52 snappy-stubs-internal.o
-rw-r--r--   1 Kevin  staff  115648 Dec 29 10:52 snappy.o
-rwxr-xr-x   1 Kevin  staff   90008 Dec 29 10:52 snappy_unittest

Would you please advise? Thanks.

Original issue reported on code.google.com by [email protected] on 29 Dec 2013 at 8:18

how to use snappy with hadoop-0.20.205.0

This project is integrated into Hadoop Common (JUN 2011).

Hadoop-Snappy can be used as an add-on for recent (released) versions of Hadoop 
that do not provide Snappy Codec support yet.

Hadoop-Snappy is being kept in synch with Hadoop Common. 

What does this mean?

And how can snappy be used with hadoop-0.20.205.0?

Original issue reported on code.google.com by [email protected] on 12 Dec 2011 at 3:33

Snappy installation

Hi,

How to install snappy?

What I have done so far:
  1) downloaded the snappy-1.0.3 tarball
  2) extracted it with tar xvzf
  3) ran ./configure

Does this complete the installation? If so, how can I verify the 
installation and run sample tests?

I didn't find any libsnappy.so in /usr/lib or /usr/local.

Original issue reported on code.google.com by [email protected] on 13 Nov 2014 at 8:40

Compression overhead is too small

Snappy has a potential maximum overhead of size/6+32 according to 
http://code.google.com/p/snappy/source/browse/trunk/snappy.cc#55, while the 
current code assumes size/8+128+3, which could be too small in some 
circumstances.
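The gap between the two bounds is easy to quantify. A minimal sketch (class and method names are ours; the formulas come from the snappy sources and the hadoop-snappy code as quoted above):

```java
// Compare snappy's documented worst-case output size with the bound the
// hadoop-snappy code assumed. From snappy.cc: MaxCompressedLength(n) is
// 32 + n + n/6. The hadoop-snappy code sized buffers as n + n/8 + 128 + 3.
public class SnappyBufferSize {
    static long maxCompressedLength(long n) {
        return 32 + n + n / 6;      // correct upper bound per snappy.cc
    }

    static long assumedBound(long n) {
        return n + n / 8 + 128 + 3; // bound assumed by the hadoop-snappy code
    }

    public static void main(String[] args) {
        long n = 1 << 20; // a 1 MiB input block
        System.out.println(maxCompressedLength(n)); // 1223370
        System.out.println(assumedBound(n));        // 1179779
        // The assumed buffer is ~43 KB too small for a worst-case 1 MiB block.
    }
}
```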

Original issue reported on code.google.com by [email protected] on 6 Jun 2011 at 6:10

SnappyCompressor.c:64: error

What steps will reproduce the problem?
1. svn checkout 
2. Install all pre-requisites
3. Run 'mvn package -Dsnappy.prefix=/home/ngc/Char/snap/snappy_build' 
(snappy_build is my Snappy installation folder, which contains the following 
folders: include; lib; share).


What version of the product are you using? On what operating system?
- Maven 3.03 on Ubuntu 10.04


--

$ mvn package -Dsnappy.prefix=/home/ngc/Char/snap/snappy_build
[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hadoop Snappy 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-resources-plugin:2.4.3:resources (default-resources) @ 
hadoop-snappy ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 
/home/ngc/Char/snap/hadoop-snappy/hadoop-snappy-read-only/src/main/resources
[INFO] 
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ 
hadoop-snappy ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (compile) @ hadoop-snappy ---
[INFO] Executing tasks

main:

checkpreconditions:

compilenative:
     [exec] libtoolize: putting auxiliary files in AC_CONFIG_AUX_DIR, `config'.
     [exec] libtoolize: copying file `config/config.guess'
     [exec] libtoolize: copying file `config/config.sub'
     [exec] libtoolize: copying file `config/install-sh'
     [exec] libtoolize: copying file `config/ltmain.sh'
     [exec] libtoolize: putting macros in AC_CONFIG_MACRO_DIR, `m4'.
     [exec] libtoolize: copying file `m4/libtool.m4'
     [exec] libtoolize: copying file `m4/ltoptions.m4'
     [exec] libtoolize: copying file `m4/ltsugar.m4'
     [exec] libtoolize: copying file `m4/ltversion.m4'
     [exec] libtoolize: copying file `m4/lt~obsolete.m4'
     [exec] checking for gcc... gcc
     [exec] checking whether the C compiler works... yes

...
...

[exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating config.h
     [exec] config.status: config.h is unchanged
     [exec] config.status: executing depfiles commands
     [exec] config.status: executing libtool commands
     [exec] depbase=`echo src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo | sed 's|[^/]*$|.deps/&|;s|\.lo$||'`;\
     [exec]     /bin/bash ./libtool --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I.  -I/home/ngc/jdk1.6.0_25/include -I/home/ngc/jdk1.6.0_25/include/linux -I/home/ngc/Char/snap/hadoop-snappy/hadoop-snappy-read-only/src/main/native/src -Isrc/org/apache/hadoop/io/compress/snappy -I/home/ngc/Char/snap/snappy_build/include -g -Wall -fPIC -O2 -m64 -g -O2 -MT src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo -MD -MP -MF $depbase.Tpo -c -o src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c &&\
     [exec]     mv -f $depbase.Tpo $depbase.Plo
     [exec] libtool: compile:  gcc -DHAVE_CONFIG_H -I. -I/home/ngc/jdk1.6.0_25/include -I/home/ngc/jdk1.6.0_25/include/linux -I/home/ngc/Char/snap/hadoop-snappy/hadoop-snappy-read-only/src/main/native/src -Isrc/org/apache/hadoop/io/compress/snappy -I/home/ngc/Char/snap/snappy_build/include -g -Wall -fPIC -O2 -m64 -g -O2 -MT src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo -MD -MP -MF src/org/apache/hadoop/io/compress/snappy/.deps/SnappyCompressor.Tpo -c src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c  -fPIC -DPIC -o src/org/apache/hadoop/io/compress/snappy/.libs/SnappyCompressor.o
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c: In function 'Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs':
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:64: error: 'libnotfound' undeclared (first use in this function)
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:64: error: (Each undeclared identifier is reported only once
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:64: error: for each function it appears in.)
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:64: error: expected ')' before 'libnotfound'
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:64: warning: too few arguments for format
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:64: warning: too few arguments for format
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c: In function 'Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_compressBytesDirect':
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:117: warning: passing argument 4 of 'dlsym_snappy_compress' from incompatible pointer type
     [exec] src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.c:117: note: expected 'size_t *' but argument is of type 'jint *'
     [exec] make: *** [src/org/apache/hadoop/io/compress/snappy/SnappyCompressor.lo] Error 1
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14.666s
[INFO] Finished at: Tue Aug 09 12:39:44 EDT 2011
[INFO] Final Memory: 6M/361M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile) on project 
hadoop-snappy: An Ant BuildException has occured: The following error occurred 
while executing this line:
[ERROR] /home/ngc/Char/snap/hadoop-snappy/hadoop-snappy-read-only/maven/build-compilenative.xml:75: exec returned: 2
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException


--

It looks like the error comes from Line 64 of SnappyCompressor.c. I've tried to 
debug it, but I couldn't really determine why the code wasn't working.

Does anybody have any idea on why the error is generated?

- CK

Original issue reported on code.google.com by [email protected] on 9 Aug 2011 at 4:52

where is libsnappy.so.1.0.1 ?

The second command in step 4 of the Usage instructions is as follows:
ln -s <SNAPPY_LIB_DIR>/libsnappy.so.1.0.1 <HADOOP_HOME>/lib/native/<ARCH>/

However, I can't find libsnappy.so.1.0.1 in the build directory. Which 
directory does SNAPPY_LIB_DIR refer to?

Tks

Liyin Liangly 

Original issue reported on code.google.com by [email protected] on 25 Apr 2011 at 8:10

JVM crashes if snappy is not in system library path

Loading the snappy library via System.loadLibrary() succeeds if the library is 
in the Hadoop lib/native directory (and thus in the java.library.path).  
However, the dlopen() call in the C code fails if the library isn't in the 
system library path.  The Java code ignores this error and goes on to use the 
library, which then crashes the JVM.

This patch does three things:

1) Use RTLD_DEFAULT instead of dlopen(), as the snappy library is already 
loaded via System.loadLibrary().
2) Initialize the native code in LoadSnappy, allowing the entire loading 
process to be guarded by isLoaded().
3) Add isLoaded() guards to the SnappyCompressor and SnappyDecompressor 
constructors, preventing the JVM from crashing if they are used directly.

Original issue reported on code.google.com by electrum on 24 Aug 2011 at 7:11

Attachments:

maven built pom.xml not found

What steps will reproduce the problem?
1. mvn package [-Dsnappy.prefix=SNAPPY_INSTALLATION_DIR]
2.
3.

What is the expected output? What do you see instead?
The build should create the tarball at 
target/hadoop-snappy-0.0.1-SNAPSHOT.tar.gz

What version of the product are you using? On what operating system?
Linux panda 2.6.32-220.4.2.el6.x86_64
Hadoop = Hadoop 1.0.2
snappy = snappy-1.0.5
maven 3

Please provide any additional information below.


Original issue reported on code.google.com by [email protected] on 22 May 2012 at 8:25

Build Error: SnappyDecompressor is not abstract and does not override abstract method getRemaining()

What steps will reproduce the problem?
1. sudo mvn clean package 
-Dsnappy.prefix=~/hadoop-snappy-read-only/snappy-1.1.1/
2.
3.

What is the expected output? What do you see instead?
Generate ~/target/hadoop-snappy-0.0.1-SNAPSHOT.tar.gz

What version of the product are you using? On what operating system?
Check out from source, and use on Mac OS 10.8.4

Please provide any additional information below.
It fails to build with the error below:

[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] /Users/Kevin/code/apache/hadoop-snappy-read-only/src/main/java/org/apache/hadoop/io/compress/snappy/SnappyDecompressor.java:[33,7] error: SnappyDecompressor is not abstract and does not override abstract method getRemaining() in Decompressor
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
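The error says the compiler's Decompressor interface is newer than hadoop-snappy's code: Hadoop 1.x added getRemaining() to the interface, and the unmaintained SnappyDecompressor predates it. A minimal sketch of the missing override follows; the Decompressor interface below is a stand-in for Hadoop's (only the relevant method is shown), and the class name is ours:

```java
// Stand-in for org.apache.hadoop.io.compress.Decompressor, reduced to the
// method whose absence causes the build failure. In a real fix you would
// add the override to hadoop-snappy's SnappyDecompressor instead.
interface Decompressor {
    int getRemaining();
}

class SnappyDecompressorSketch implements Decompressor {
    // Snappy is a block codec: once a block is consumed there is no
    // trailing compressed data left in the input buffer, so returning 0
    // is the conventional implementation for such codecs.
    @Override
    public int getRemaining() {
        return 0;
    }
}
```

Whether 0 is the right value for every call site should be verified against the Hadoop version being built against; this sketch only shows the shape of the missing method.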

Original issue reported on code.google.com by [email protected] on 29 Dec 2013 at 6:24

Add support for JSnappy to provide a pure Java implementation of SnappyCodec

SequenceFiles are sometimes created in environments where native libraries are 
not available or difficult to set up (think AS/400 or z/OS). In those 
environments, the only available compression codec is DefaultCodec, which has 
a pure Java implementation.

With the availability of JSnappy, a pure Java implementation of Snappy, it 
would be nice to have a pure Java implementation of SnappyCodec so the power 
of Snappy compression becomes available on platforms that lack native Snappy 
libraries.

The included patch assumes that JSnappy is available as jsnappy:jsnappy. As far 
as I know jsnappy must be added manually as I am not aware of any official 
maven artifact being available.

Original issue reported on code.google.com by [email protected] on 2 Jan 2012 at 1:13

Attachments:
