
euler_kokkos's Introduction


Euler_kokkos

What is it?

Provides a performance-portable (multi-architecture) Kokkos implementation of compressible hydrodynamics (second-order Godunov, MUSCL-Hancock) on Cartesian grids.

Dependencies

  • The Kokkos library will be built by euler_kokkos using the same flags (architecture, optimization, ...).
  • cmake with version >= 3.X (3.X is chosen to meet Kokkos' own CMake requirement, i.e. it might increase in the future)

The application is configured with the Kokkos library as a git submodule, so you'll need to run the following git commands right after cloning euler_kokkos:

git submodule init
git submodule update


Build

A few example builds, with minimal configuration options.

If you already have Kokkos installed

Just make sure that your environment variable CMAKE_PREFIX_PATH points to the location where Kokkos was installed. More precisely, if Kokkos is installed in KOKKOS_ROOT, add $KOKKOS_ROOT/lib/cmake to your CMAKE_PREFIX_PATH; this way Kokkos will be found automatically by cmake, and the right Kokkos backend will be selected.
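For example (the install prefix below is a placeholder; substitute your own):

```shell
# hypothetical Kokkos install prefix -- replace with your actual location
export KOKKOS_ROOT=$HOME/local/kokkos
# prepend the Kokkos package-config directory, keeping any existing entries
export CMAKE_PREFIX_PATH="$KOKKOS_ROOT/lib/cmake${CMAKE_PREFIX_PATH:+:$CMAKE_PREFIX_PATH}"
echo "$CMAKE_PREFIX_PATH"
```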

mkdir -p build; cd build
cmake -DEULER_KOKKOS_BUILD_KOKKOS=OFF ..
make

Build without MPI / with the Kokkos OpenMP backend

  • Create a build directory, configure and make
mkdir build; cd build
cmake -DEULER_KOKKOS_USE_MPI=OFF -DEULER_KOKKOS_BUILD_KOKKOS=ON -DEULER_KOKKOS_BACKEND=OpenMP ..
make -j 4

Set the variable CXX on the cmake command line to change the compiler (clang++, icpc, pgcc, ...).
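For example, to configure with clang++ (a sketch; any installed C++ compiler can be substituted):

```shell
# CXX is honored by cmake on the first configure of a fresh build directory
export CXX=clang++
echo "cmake will use: $CXX"
# then reconfigure from scratch, e.g.:
#   cmake -DEULER_KOKKOS_USE_MPI=OFF -DEULER_KOKKOS_BUILD_KOKKOS=ON -DEULER_KOKKOS_BACKEND=OpenMP ..
```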

Build without MPI / with the Kokkos CUDA backend

  • Create a build directory, configure and make
mkdir build; cd build
# If you are compiling and running on the same host, you can omit architecture flags;
# Kokkos will detect the GPU architecture available on your platform
cmake -DEULER_KOKKOS_USE_MPI=OFF -DEULER_KOKKOS_BUILD_KOKKOS=ON -DEULER_KOKKOS_BACKEND=Cuda -DKokkos_ARCH_MAXWELL50=ON ..
make -j 4

nvcc_wrapper is a compiler wrapper around NVIDIA nvcc. It is available from the Kokkos sources: external/kokkos/bin/nvcc_wrapper. Any Kokkos application targeting NVIDIA GPUs must be built with nvcc_wrapper.
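For example, assuming you are in the top-level source directory with the Kokkos submodule checked out:

```shell
# point CXX at the wrapper shipped with the Kokkos submodule
export CXX=$PWD/external/kokkos/bin/nvcc_wrapper
echo "using compiler wrapper: $CXX"
```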

Build with MPI / with the Kokkos CUDA backend

Please make sure to use a CUDA-aware MPI implementation (OpenMPI or MVAPICH2) built with the proper flags to activate CUDA support.

It may happen that even though your MPI implementation is actually CUDA-aware, CMake's find_package macro for MPI does not detect it as such. In that case, you can enforce CUDA awareness by turning the option EULER_KOKKOS_USE_MPI_CUDA_AWARE_ENFORCED to ON.
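With OpenMPI, a quick way to check whether your build is CUDA-aware is ompi_info (this MCA parameter is OpenMPI-specific; MVAPICH2 reports CUDA support differently):

```shell
# prints mca:mpi:base:param:mpi_built_with_cuda_support:value:true for a
# CUDA-aware OpenMPI build; falls back to a message if ompi_info is missing
ompi_info --parsable --all 2>/dev/null | grep mpi_built_with_cuda_support:value \
  || echo "ompi_info not available"
```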

You don't need to use the MPI compiler wrapper mpicxx; cmake should be able to correctly populate MPI_CXX_INCLUDE_PATH and MPI_CXX_LIBRARIES, which are passed to all final targets.

  • Create a build directory, configure and make
mkdir build; cd build
cmake -DEULER_KOKKOS_USE_MPI=ON -DEULER_KOKKOS_BUILD_KOKKOS=ON -DEULER_KOKKOS_BACKEND=Cuda -DKokkos_ARCH_MAXWELL50=ON ..
make -j 4

Example command line to run the application (1 GPU used per MPI task):

mpirun -np 4 ./euler_kokkos ./test_implode_2D_mpi.ini
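The .ini file encodes the run setup (mesh sizes, MPI topology, etc.). Its actual schema is defined by the application, so the fragment below is purely illustrative; the section and key names are hypothetical:

```ini
; hypothetical parameter-file sketch -- NOT the actual schema
[mesh]
nx=128   ; local resolution per MPI task, x direction
ny=128   ; local resolution per MPI task, y direction
mx=2     ; MPI tasks along x
my=2     ; MPI tasks along y
```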

Build for AMD GPUs with the Kokkos HIP backend

Make sure you have ROCm/HIP tools with version at least 5.2 if building against Kokkos 4.0.

For example:

mkdir -p build/hip; cd build/hip
export CXX=hipcc
cmake -DEULER_KOKKOS_USE_MPI=ON -DEULER_KOKKOS_BUILD_KOKKOS=ON -DEULER_KOKKOS_BACKEND=HIP -DKokkos_ARCH_VEGA90A=ON ..
make -j 4

Developing with vim or emacs and semantic completion/navigation from ccls

Make sure the CMake variable CMAKE_EXPORT_COMPILE_COMMANDS is set to ON; it will generate a file named compile_commands.json. Then you can symlink the generated file into the top-level source directory.
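A minimal sketch of that setup, run from the top-level source directory (assuming the build tree lives in build/):

```shell
# 1. configure with the compilation database enabled (run inside build/):
#      cmake -DCMAKE_EXPORT_COMPILE_COMMANDS=ON ..
# 2. expose the database at the top level for ccls:
ln -sf build/compile_commands.json compile_commands.json
```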


Build Documentation

A Sphinx/html documentation will (hopefully) soon be populated.

To build it:

mkdir -p build/doc
cd build/doc
# build doxygen documentation
cmake .. -DEULER_KOKKOS_BUILD_DOC:BOOL=ON -DEULER_KOKKOS_DOC_TYPE:STRING=doxygen
# build sphinx/html documentation
cmake .. -DEULER_KOKKOS_BUILD_DOC:BOOL=ON -DEULER_KOKKOS_DOC_TYPE:STRING=html

Building the Sphinx documentation requires python3 with an up-to-date breathe extension.

euler_kokkos's People

Contributors

pkestene


euler_kokkos's Issues

segmentation error for using the cuda backend

Hi, I am trying to learn and use your MPI+Kokkos code. It went well for MPI+OpenMP but failed for MPI+CUDA. Here is what I got. Do you have a sense of what's wrong with the CUDA backend? Thanks

I'm MPI task #3 (out of 4) pinned to GPU #0
(out of 4) pinned to GPU #0
We are about to start simulation with the following characteristics
Global resolution : 256 x 256 x 1
Local resolution : 128 x 128 x 1
MPI Cartesian topology : 2x2x1
[c196-012:10061:0:10061] Caught signal 11 (Segmentation fault: invalid permissions for mapped object at address 0x2ab54d4c3e80)
[c196-012:10062:0:10062] Caught signal 11 (Segmentation fault: invalid permissions for mapped object at address 0x2ac7db4c3e80)
[c196-012:10063:0:10063] Caught signal 11 (Segmentation fault: invalid permissions for mapped object at address 0x2b64db4c3e80)
[c196-012:10064:0:10064] Caught signal 11 (Segmentation fault: invalid permissions for mapped object at address 0x2b131d4c3e80)
==== backtrace (tid: 10064) ====
0 0x000000000004cb95 ucs_debug_print_backtrace() ???:0
1 0x000000000089f648 I_MPI_memcpy_movsb() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/shm/posix/eager/include/i_mpi_memcpy_sse.h:11
2 0x000000000089f648 bdw_memcpy_write() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/shm/posix/eager/include/intel_transport_memcpy.h:146
3 0x000000000089bce9 write_to_cell() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/shm/posix/eager/include/intel_transport_memcpy.h:326
4 0x000000000089bce9 send_cell() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/shm/posix/eager/include/intel_transport_send.h:890
5 0x00000000008959a4 MPIDI_POSIX_eager_send() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/shm/posix/eager/include/intel_transport_send.h:1540
6 0x0000000000755399 MPIDI_POSIX_eager_send() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/shm/posix/eager/include/posix_eager_impl.h:37
7 0x0000000000755399 MPIDI_POSIX_am_isend() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/shm/src/../src/../posix/posix_am.h:220
8 0x0000000000755399 MPIDI_SHM_am_isend() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/shm/src/../src/shm_am.h:49
9 0x0000000000755399 MPIDIG_isend_impl() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/generic/mpidig_send.h:116
10 0x000000000075870e MPIDIG_am_isend() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/generic/mpidig_send.h:172
11 0x000000000075870e MPIDIG_mpi_isend() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/generic/mpidig_send.h:233
12 0x000000000075870e MPIDI_POSIX_mpi_isend() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/shm/src/../src/../posix/posix_send.h:59
13 0x000000000075870e MPIDI_SHM_mpi_isend() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/shm/src/../src/shm_p2p.h:187
14 0x000000000075870e MPIDI_isend_unsafe() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/src/ch4_send.h:314
15 0x000000000075870e MPIDI_isend_safe() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/src/ch4_send.h:609
16 0x000000000075870e MPID_Isend() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpid/ch4/src/ch4_send.h:828
17 0x000000000075870e PMPI_Sendrecv() /localdisk/jenkins/workspace/workspace/ch4-build-linux-2019/impi-ch4-build-linux_build/CONF/impi-ch4-build-linux-release/label/impi-ch4-build-linux-intel64/_buildspace/release/../../src/mpi/pt2pt/sendrecv.c:181
18 0x00000000004c453a hydroSimu::MpiComm::sendrecv() ???:0
19 0x0000000000491e9b euler_kokkos::SolverBase::transfert_boundaries_2d() ???:0
20 0x00000000004a1877 euler_kokkos::SolverBase::make_boundaries_mpi() ???:0
21 0x000000000044d786 euler_kokkos::muscl::SolverHydroMuscl<2>::make_boundaries() ???:0
22 0x0000000000445223 euler_kokkos::muscl::SolverHydroMuscl<2>::SolverHydroMuscl() ???:0
23 0x0000000000445d05 euler_kokkos::muscl::SolverHydroMuscl<2>::create() ???:0
24 0x00000000004152e8 euler_kokkos::SolverFactory::create() ???:0
25 0x00000000004116e6 main() ???:0
26 0x0000000000022555 __libc_start_main() ???:0
27 0x0000000000414fec _start() ???:0

=================================

Question about the loop method

Hi,

Thank you for contributing to such a great library.

I have a question regarding the loop method used in this repository. Based on the experiments you carried out in the repository, it appears that the 3D data-array flat loop, test_stencil_3d_flat, performs relatively poorly on both the Cuda and OpenMP backends. I'm curious why it is still being used in this repository.

Have a nice day!

Build issue with MPI and no OpenMP

Hello,

I'm trying to build this miniapp and get some neat visualizations. I'm not entirely sure, but from my searching, this app seems to be what was used in this paper.

Here's my CMake:

$ cmake -DUSE_MPI=ON -DKokkos_ENABLE_HWLOC=ON -DKokkos_ENABLE_OPENMP=OFF ..
-- MPI support found
-- MPI compile flags: -pthread
-- MPI include path: /home/users/spollard/local/spack/opt/spack/linux-ubuntu18.04-skylake_avx512/gcc-7.5.0/openmpi-4.0.3-eyqdsz6oqd6zosjrgrndggsqi5xrfm6b/include
-- MPI LINK flags path: -Wl,-rpath -Wl,/home/users/spollard/local/spack/opt/spack/linux-ubuntu18.04-skylake_avx512/gcc-7.5.0/hwloc-1.11.11-lrtc6lv6yhve3m7jir435xc3k2pa34n5/lib -Wl,-rpath -Wl,/home/users/spollard/local/spack/opt/spack/linux-ubuntu18.04-skylake_avx512/gcc-7.5.0/zlib-1.2.11-vk2i2rkkp2rz74b2g3gw3m27iupfeqt5/lib -Wl,-rpath -Wl,/home/users/spollard/local/spack/opt/spack/linux-ubuntu18.04-skylake_avx512/gcc-7.5.0/openmpi-4.0.3-eyqdsz6oqd6zosjrgrndggsqi5xrfm6b/lib -pthread
-- MPI libraries: /home/users/spollard/local/spack/opt/spack/linux-ubuntu18.04-skylake_avx512/gcc-7.5.0/openmpi-4.0.3-eyqdsz6oqd6zosjrgrndggsqi5xrfm6b/lib/libmpi.so
CMake Warning at CMakeLists.txt:96 (message):
  OpenMPI found, but it is not built with CUDA support.


-- Setting default Kokkos CXX standard to 11
-- The project name is: Kokkos
-- Using -std=c++11 for C++11 standard as feature
-- Execution Spaces:
--     Device Parallel: NONE
--     Host Parallel: NONE
--       Host Serial: SERIAL
--
-- Architectures:
//===================================================
  euler_kokkos build configuration:
//===================================================
  C++ Compiler : GNU 7.5.0
    /usr/bin/c++
  MPI enabled
  Kokkos OpenMP enabled : OFF
  Kokkos CUDA   enabled : OFF
  Kokkos HWLOC  enabled : ON

-- Configuring done
-- Generating done
-- Build files have been written to: /home/users/spollard/mpi-error/euler_kokkos/build

But I get an error like the following (many lines omitted):

[ 54%] Building CXX object src/shared/CMakeFiles/shared.dir/SolverBase.cpp.o
/home/users/spollard/mpi-error/euler_kokkos/src/shared/SolverBase.cpp: In member function ‘void euler_kokkos::SolverBase::transfert_boundaries_2d(Direction)’:
/home/users/spollard/mpi-error/euler_kokkos/src/shared/SolverBase.cpp:756:57: error: ‘DataArray2d {aka class Kokkos::View<double***, Kokkos::Serial>}’ has no member named ‘ptr_on_device’
     params.communicator->sendrecv(borderBufSend_xmin_2d.ptr_on_device(),
                                                         ^~~~~~~~~~~~~
.
.
.
.
/home/users/spollard/mpi-error/euler_kokkos/src/shared/mpiBorderUtils.h:122:18: error: ‘const DataArray {aka const class Kokkos::View<double****, Kokkos::Serial>}’ has no member named ‘dimension_2’; did you mean ‘dimension’?
       offset = U.dimension_2()-ghostWidth;
                ~~^~~~~~~~~~~
                dimension
src/shared/CMakeFiles/shared.dir/build.make:86: recipe for target 'src/shared/CMakeFiles/shared.dir/SolverBase.cpp.o' failed

I haven't used kokkos before, but I suspect it has to do with an incompatible kokkos version. Do you know how I could fix this?
