
waymo-open-dataset's People

Contributors

carranza96, dkandrew, henrikkretzschmar, ltskv, patzm, peisun1115, rezama, vatavua, xmyqsh, yurongyou


waymo-open-dataset's Issues

Loading data in non-eager mode?

I'm trying to write a tf.data pipeline, but I cannot figure out how to load data when TensorFlow is not in eager mode.

FILENAME = "/home/skaae/Downloads/waymo/training_0000/segment-13519445614718437933_4060_000_4080_000_with_camera_labels.tfrecord"
dataset = tf.data.TFRecordDataset(FILENAME, compression_type='')

def parse_data(data):
    frame = open_dataset.Frame()
    frame.ParseFromString(bytearray(data.numpy()))  # <--- data is a symbolic Tensor in graph mode, so .numpy() fails?
    range_images, camera_projections,range_image_top_pose = parse_range_image_and_camera_projection(frame)
    
    range_image_top = range_images[open_dataset.LaserName.TOP][0]
    range_image_top_tensor = tf.reshape(
            tf.convert_to_tensor(range_image_top.data), range_image_top.shape.dims)

    return range_image_top_tensor

parsed_dataset = dataset.map(parse_data)
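
A common workaround, sketched below under the assumption that TensorFlow >= 1.14 is available (for tf.py_function) and that frame_utils exposes parse_range_image_and_camera_projection as in the current tutorial, is to wrap the eager-only parsing so that .numpy() works inside the map; a fuller py_func example appears in the non-eager issue further down this page.

import tensorflow as tf
from waymo_open_dataset import dataset_pb2 as open_dataset
from waymo_open_dataset.utils import frame_utils

FILENAME = "/path/to/segment.tfrecord"  # hypothetical path

def parse_data_eager(data):
    # Runs eagerly inside tf.py_function, so .numpy() is available here.
    frame = open_dataset.Frame()
    frame.ParseFromString(bytearray(data.numpy()))
    range_images, camera_projections, range_image_top_pose = (
        frame_utils.parse_range_image_and_camera_projection(frame))
    range_image_top = range_images[open_dataset.LaserName.TOP][0]
    return tf.reshape(
        tf.convert_to_tensor(range_image_top.data), range_image_top.shape.dims)

dataset = tf.data.TFRecordDataset(FILENAME, compression_type='')
parsed_dataset = dataset.map(
    lambda data: tf.py_function(parse_data_eager, [data], Tout=tf.float32))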

Can't follow the ''Build and test'' step

Hello,
When I follow the "Build and test" step described in the tutorial (https://colab.research.google.com/github/waymo-research/waymo-open-dataset/blob/r1.0/tutorial/tutorial.ipynb#scrollTo=-pVhOfzLx9us), I can't build it successfully.

This is the log:

In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.h:22:0,
from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:23,
from ./waymo_open_dataset/metrics/ops/utils.h:23,
from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/types.pb.h:17:2: error: #error This file was generated by an older version of protoc which is
#error This file was generated by an older version of protoc which is
^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/types.pb.h:18:2: error: #error incompatible with your Protocol Buffer headers. Please
#error incompatible with your Protocol Buffer headers. Please
^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/types.pb.h:19:2: error: #error regenerate this file with a newer version of protoc.
#error regenerate this file with a newer version of protoc.

I found types.pb.h and it contains "#if 3006001 < GOOGLE_PROTOBUF_MIN_PROTOC_VERSION".
But I'm pretty sure that my Bazel version is 0.28.0 and the protoc version is 3.6.1.

What should I do? Please help me, thank you!

Occluded Objects

Hi,
After exploring the data, I found that you also have bounding box annotations for objects which are occluded by other objects. Do you have any information about how much an object is occluded?
Can we infer something from the field:
optional DifficultyLevel detection_difficulty_level = 5;

Thanks,
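
For reference, detection_difficulty_level is a coarse difficulty enum rather than an occlusion percentage, as far as I can tell. A minimal sketch for inspecting it on the 3D laser labels, assuming the standard label_pb2 proto:

from collections import Counter

from waymo_open_dataset import label_pb2

def difficulty_histogram(frame):
    # Count laser labels per DifficultyLevel value (e.g. UNKNOWN, LEVEL_2).
    return Counter(
        label_pb2.Label.DifficultyLevel.Name(label.detection_difficulty_level)
        for label in frame.laser_labels)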

What is the format of the point cloud from convert_range_image_to_point_cloud?

Hi,

This is my first time using this dataset; I am learning from the Waymo Open Dataset Tutorial.

But when I try to get the point cloud data as [x, y, z] coordinates or as depth data (the distance of each pixel), I can't figure out the format.

Here is some of my code:

import tensorflow as tf
from waymo_open_dataset import dataset_pb2 as open_dataset
from waymo_open_dataset.utils import frame_utils

def load_dataset(filenames):
    dataset = tf.data.TFRecordDataset(filenames=filenames,
                                      compression_type='')
    frames = []
    for data in dataset:
        frame = open_dataset.Frame()
        frame.ParseFromString(bytearray(data.numpy()))
        frames.append(frame)
    
    return frames

frames = load_dataset(TF_RECORD_FILE)

# Point Cloud
range_images, camera_projections, range_image_top_pose = (
    frame_utils.parse_range_image_and_camera_projection(frames[0]))

points, cp_points = frame_utils.convert_range_image_to_point_cloud(
    frames[0],
    range_images,
    camera_projections,
    range_image_top_pose)

When I print the point cloud data:

print(points[0][0])

# Return
[-36.440834    -0.03664176   3.5856297 ]

print(cp_points[0][0])

# Return
[0 0 0 0 0 0]

So can you tell me the data format of every channel? And if I want to use it as (x, y, z) or as (distance,), how do I do that?
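
For what it's worth, the returned points are Cartesian (x, y, z) coordinates in the vehicle frame (the tutorial's own comment says "3d points in vehicle frame"), and each cp_points row holds the camera-projection info for that point; all zeros just means the point does not project into any camera. A minimal NumPy sketch for turning the points into per-point distances:

import numpy as np

def points_to_distances(points):
    # `points` is the per-lidar list of (N, 3) arrays returned by
    # convert_range_image_to_point_cloud; values are (x, y, z) in the vehicle frame.
    points_all = np.concatenate(points, axis=0)
    # Distance of each point from the vehicle origin ("depth").
    return np.linalg.norm(points_all, axis=-1)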

Bazel Build and test execution error

OS: Ubuntu 16.04
Tensorflow Version: 1.8.0 (Python-package CPU)
Anaconda Version: 4.6.2
Protobuf Version (libprotoc): 3.7.1

I am having issues trying to install the repo, following the tutorials/tutorial.ipynb script in the r1.0 branch. I get the following error when I try to run bazel build ... --show_progress_rate_limit=10.0:

WARNING: Ignoring JAVA_HOME, because it must point to a JDK, not a JRE.
WARNING: /home/kenny/.cache/bazel/_bazel_kenny/e25a28ddc954c6434533a57c92d5fe6c/external/local_config_tf/BUILD:1716:1: target 'libtensorflow_framework.so' is both a rule and a file; please choose another name for the rule
INFO: Analyzed 62 targets (57 packages loaded, 8197 targets configured).
INFO: Found 62 targets...
INFO: From Compiling waymo_open_dataset/protos/breakdown.pb.cc:
bazel-out/k8-opt/bin/waymo_open_dataset/protos/breakdown.pb.cc:103:13: warning: 'dynamic_init_dummy_waymo_5fopen_5fdataset_2fprotos_2fbreakdown_2eproto' defined but not used [-Wunused-variable]
 static bool dynamic_init_dummy_waymo_5fopen_5fdataset_2fprotos_2fbreakdown_2eproto = []() { AddDescriptors_waymo_5fopen_5fdataset_2fprotos_2fbreakdown_2eproto(); return true; }();
             ^
INFO: From Compiling waymo_open_dataset/protos/metrics.pb.cc:
bazel-out/k8-opt/bin/waymo_open_dataset/protos/metrics.pb.cc:557:13: warning: 'dynamic_init_dummy_waymo_5fopen_5fdataset_2fprotos_2fmetrics_2eproto' defined but not used [-Wunused-variable]
 static bool dynamic_init_dummy_waymo_5fopen_5fdataset_2fprotos_2fmetrics_2eproto = []() { AddDescriptors_waymo_5fopen_5fdataset_2fprotos_2fmetrics_2eproto(); return true; }();
             ^
INFO: From Compiling waymo_open_dataset/label.pb.cc:
bazel-out/k8-opt/bin/waymo_open_dataset/label.pb.cc:230:13: warning: 'dynamic_init_dummy_waymo_5fopen_5fdataset_2flabel_2eproto' defined but not used [-Wunused-variable]
 static bool dynamic_init_dummy_waymo_5fopen_5fdataset_2flabel_2eproto = []() { AddDescriptors_waymo_5fopen_5fdataset_2flabel_2eproto(); return true; }();
             ^
ERROR: /mnt/linuxsec/datasets/waymo-open-dataset/waymo-od/waymo_open_dataset/metrics/ops/BUILD:10:1: C++ compilation of rule '//waymo_open_dataset/metrics/ops:utils' failed (Exit 1) gcc failed: error executing command /usr/bin/gcc -U_FORTIFY_SOURCE -fstack-protector -Wall -Wunused-but-set-parameter -Wno-free-nonheap-object -fno-omit-frame-pointer -g0 -O2 '-D_FORTIFY_SOURCE=1' -DNDEBUG -ffunction-sections ... (remaining 53 argument(s) skipped)

Use --sandbox_debug to see verbose messages from the sandbox
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:26:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:17:2: error: #error This file was generated by an older version of protoc which is
 #error This file was generated by an older version of protoc which is
  ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:18:2: error: #error incompatible with your Protocol Buffer headers. Please
 #error incompatible with your Protocol Buffer headers.  Please
  ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:19:2: error: #error regenerate this file with a newer version of protoc.
 #error regenerate this file with a newer version of protoc.
  ^
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:32:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/resource_handle.pb.h:17:2: error: #error This file was generated by an older version of protoc which is
 #error This file was generated by an older version of protoc which is
  ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/resource_handle.pb.h:18:2: error: #error incompatible with your Protocol Buffer headers. Please
 #error incompatible with your Protocol Buffer headers.  Please
  ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/resource_handle.pb.h:19:2: error: #error regenerate this file with a newer version of protoc.
 #error regenerate this file with a newer version of protoc.
  ^
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:33:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.pb.h:17:2: error: #error This file was generated by an older version of protoc which is
 #error This file was generated by an older version of protoc which is
  ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.pb.h:18:2: error: #error incompatible with your Protocol Buffer headers. Please
 #error incompatible with your Protocol Buffer headers.  Please
  ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.pb.h:19:2: error: #error regenerate this file with a newer version of protoc.
 #error regenerate this file with a newer version of protoc.
  ^
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:34:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/types.pb.h:17:2: error: #error This file was generated by an older version of protoc which is
 #error This file was generated by an older version of protoc which is
  ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/types.pb.h:18:2: error: #error incompatible with your Protocol Buffer headers. Please
 #error incompatible with your Protocol Buffer headers.  Please
  ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/types.pb.h:19:2: error: #error regenerate this file with a newer version of protoc.
 #error regenerate this file with a newer version of protoc.
  ^
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/lib/core/status.h:23:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:29,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/lib/core/error_codes.pb.h:17:2: error: #error This file was generated by an older version of protoc which is
 #error This file was generated by an older version of protoc which is
  ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/lib/core/error_codes.pb.h:18:2: error: #error incompatible with your Protocol Buffer headers. Please
 #error incompatible with your Protocol Buffer headers.  Please
  ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/lib/core/error_codes.pb.h:19:2: error: #error regenerate this file with a newer version of protoc.
 #error regenerate this file with a newer version of protoc.
  ^
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:32:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/resource_handle.pb.h:105:10: error: 'PROTOBUF_CONSTEXPR' does not name a type
   static PROTOBUF_CONSTEXPR int const kIndexInFileMessages =
          ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/resource_handle.pb.h:135:30: error: 'google::protobuf::uint8* tensorflow::ResourceHandleProto::InternalSerializeWithCachedSizesToArray(bool, google::protobuf::uint8*) const' marked 'final', but is not virtual
   ::google::protobuf::uint8* InternalSerializeWithCachedSizesToArray(
                              ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/resource_handle.pb.h: In member function 'void tensorflow::ResourceHandleProto::clear_hash_code()':
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/resource_handle.pb.h:515:34: error: 'GOOGLE_ULONGLONG' was not declared in this scope
   hash_code_ = GOOGLE_ULONGLONG(0);
                                  ^
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:33:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.pb.h: At global scope:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.pb.h:112:10: error: 'PROTOBUF_CONSTEXPR' does not name a type
   static PROTOBUF_CONSTEXPR int const kIndexInFileMessages =
          ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.pb.h:142:30: error: 'google::protobuf::uint8* tensorflow::TensorShapeProto_Dim::InternalSerializeWithCachedSizesToArray(bool, google::protobuf::uint8*) const' marked 'final', but is not virtual
   ::google::protobuf::uint8* InternalSerializeWithCachedSizesToArray(
                              ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.pb.h:254:10: error: 'PROTOBUF_CONSTEXPR' does not name a type
   static PROTOBUF_CONSTEXPR int const kIndexInFileMessages =
          ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.pb.h:284:30: error: 'google::protobuf::uint8* tensorflow::TensorShapeProto::InternalSerializeWithCachedSizesToArray(bool, google::protobuf::uint8*) const' marked 'final', but is not virtual
   ::google::protobuf::uint8* InternalSerializeWithCachedSizesToArray(
                              ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.pb.h: In member function 'void tensorflow::TensorShapeProto_Dim::clear_size()':
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor_shape.pb.h:358:28: error: 'GOOGLE_LONGLONG' was not declared in this scope
   size_ = GOOGLE_LONGLONG(0);
                            ^
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:34:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/types.pb.h: At global scope:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/types.pb.h:140:101: error: expected class-name before '{' token
 template <> struct is_proto_enum< ::tensorflow::DataType> : ::google::protobuf::internal::true_type {};
                                                                                                     ^
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:26:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:112:10: error: 'PROTOBUF_CONSTEXPR' does not name a type
   static PROTOBUF_CONSTEXPR int const kIndexInFileMessages =
          ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:142:30: error: 'google::protobuf::uint8* tensorflow::TensorProto::InternalSerializeWithCachedSizesToArray(bool, google::protobuf::uint8*) const' marked 'final', but is not virtual
   ::google::protobuf::uint8* InternalSerializeWithCachedSizesToArray(
                              ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:463:10: error: 'PROTOBUF_CONSTEXPR' does not name a type
   static PROTOBUF_CONSTEXPR int const kIndexInFileMessages =
          ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:493:30: error: 'google::protobuf::uint8* tensorflow::VariantTensorDataProto::InternalSerializeWithCachedSizesToArray(bool, google::protobuf::uint8*) const' marked 'final', but is not virtual
   ::google::protobuf::uint8* InternalSerializeWithCachedSizesToArray(
                              ^
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h: In member function 'tensorflow::TensorShapeProto* tensorflow::TensorProto::release_tensor_shape()':
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:633:71: error: no matching function for call to 'DuplicateIfNonNull(tensorflow::TensorShapeProto*&, NULL)'
     temp = ::google::protobuf::internal::DuplicateIfNonNull(temp, NULL);
                                                                       ^
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:26:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
external/com_google_protobuf/src/google/protobuf/generated_message_util.h:139:4: note: candidate: template<class T> T* google::protobuf::internal::DuplicateIfNonNull(T*)
 T* DuplicateIfNonNull(T* message) {
    ^
external/com_google_protobuf/src/google/protobuf/generated_message_util.h:139:4: note:   template argument deduction/substitution failed:
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:26:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.pb.h:633:71: note:   candidate expects 1 argument, 2 provided
     temp = ::google::protobuf::internal::DuplicateIfNonNull(temp, NULL);
                                                                       ^
In file included from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/lib/core/status.h:23:0,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/variant.h:29,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/allocator.h:26,
                 from bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/framework/tensor.h:20,
                 from ./waymo_open_dataset/metrics/ops/utils.h:23,
                 from waymo_open_dataset/metrics/ops/utils.cc:16:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/lib/core/error_codes.pb.h: At global scope:
bazel-out/k8-opt/bin/external/local_config_tf/include/tensorflow/core/lib/core/error_codes.pb.h:115:104: error: expected class-name before '{' token
 template <> struct is_proto_enum< ::tensorflow::error::Code> : ::google::protobuf::internal::true_type {};
                                                                                                        ^
waymo_open_dataset/metrics/ops/utils.cc: In function 'int waymo::open_dataset::GetDesiredBoxDOF(waymo::open_dataset::Label_Box::Type)':
waymo_open_dataset/metrics/ops/utils.cc:91:1: warning: control reaches end of non-void function [-Wreturn-type]
 }
 ^

Any idea on what this build script is trying to depend on, or if I need to install further dependencies?

Do compute_range_image_polar and compute_range_image_cartesian return the vehicle frame?

Do compute_range_image_polar and compute_range_image_cartesian return range images in the vehicle frame?

I assume they do, because in the tutorial extract_point_cloud_from_range_image is called with an extrinsic transform that goes from the lidar frame to the vehicle frame:

extrinsic = np.reshape(np.array(c.extrinsic.transform), [4, 4])
range_image_tensor = tf.reshape(
    tf.convert_to_tensor(range_image.data), range_image.shape.dims)
pixel_pose_local = None
frame_pose_local = None
if c.name == dataset_pb2.LaserName.TOP:
  pixel_pose_local = range_image_top_pose_tensor
  pixel_pose_local = tf.expand_dims(pixel_pose_local, axis=0)
  frame_pose_local = tf.expand_dims(frame_pose, axis=0)
range_image_mask = range_image_tensor[..., 0] > 0
range_image_cartesian = range_image_utils.extract_point_cloud_from_range_image(
    tf.expand_dims(range_image_tensor[..., 0], axis=0),
    tf.expand_dims(extrinsic, axis=0),
    tf.expand_dims(tf.convert_to_tensor(beam_inclinations), axis=0),
    pixel_pose=pixel_pose_local,
    frame_pose=frame_pose_local)

The docs for compute_range_image_cartesian say that it expects the range image in polar coordinates to be in the sensor frame, though?

Drawing bounding boxes

Hi,
I am trying to draw bounding boxes on the images. Is this the right way to convert the provided data (center_x, center_y, width, length) to a rectangular bounding box (left, top, right, bottom)?

left = center_x - (length/2)
top = center_y - (width/2)
right = center_x + (length/2)
bottom = center_y + (width/2)
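
For camera labels this matches the convention the tutorial's drawing code uses (length along the image x axis, width along y), as far as I can tell. A minimal matplotlib sketch, assuming a Frame parsed as in the tutorial; the helper name is mine:

import matplotlib.pyplot as plt
import matplotlib.patches as patches
import tensorflow as tf

def draw_camera_labels(camera_image, camera_labels):
    # camera_image comes from frame.images, camera_labels from frame.camera_labels.
    fig, ax = plt.subplots(figsize=(12, 8))
    ax.imshow(tf.image.decode_jpeg(camera_image.image))
    for label in camera_labels.labels:
        box = label.box
        ax.add_patch(patches.Rectangle(
            xy=(box.center_x - 0.5 * box.length,   # left
                box.center_y - 0.5 * box.width),   # top
            width=box.length,
            height=box.width,
            linewidth=1, edgecolor='red', facecolor='none'))
    plt.show()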

Visualizing Camera Projection

Hi,
I am trying to visualize the camera projection on other images by changing the image index as follows:

mask = tf.equal(cp_points_all_tensor[..., 0], images[3].name)

Here's a screenshot of the output (image omitted).

Is this the correct way to do it? Does the screenshot look correct?

pip install waymo-open-dataset

Hi.

Now I am setting up my local Linux environment to match the Colab Waymo Open Dataset Tutorial.
When I type the command 'pip install waymo-dataset', error messages occur (screenshot from 2019-09-02 omitted).

Could you help me please?

pedestrian id

I have obtained the frames and the 2D coordinates of pedestrians in each frame, but how do I know whether the pedestrians appearing in different video frames are the same person?
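
Each label carries an id field that, as far as I understand, stays constant for the same object across frames within a segment, so objects can be matched across frames by that id. A minimal sketch for grouping 3D laser labels by track id (camera labels also have an id field, but double-check its cross-frame behaviour); frames is assumed to be the parsed Frame protos of one segment, in order:

from collections import defaultdict

from waymo_open_dataset import label_pb2

def pedestrian_tracks(frames):
    tracks = defaultdict(list)
    for frame_idx, frame in enumerate(frames):
        for label in frame.laser_labels:
            if label.type == label_pb2.Label.TYPE_PEDESTRIAN:
                tracks[label.id].append((frame_idx, label))
    return tracks  # track id -> list of (frame index, label)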

resolved - metric: 101 point mAP

Hello,

Do you use 11 score_cutoffs to calculate the mAP?
Is it consistent with how KITTI evaluates 3D object detection or are there differences?

Regards

Cannot pip install waymo-open-dataset --user

When I try to install waymo-open-dataset on my system, it fails as below:

Collecting waymo-open-dataset
  Could not find a version that satisfies the requirement waymo-open-dataset (from versions: )
No matching distribution found for waymo-open-dataset

How do I get the lidar data?

Hi!
In the tutorial, the lidar data does not contain intensity. How do I get lidar data with intensity, and how do I get a visualization like the vehicle 3D labeling example? Thanks.
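
For what it's worth, the range image itself has several channels per pixel (range, intensity, elongation, and a no-label-zone flag, per the dataset documentation), so intensity can be pulled from channel 1 of the decoded range image. A minimal sketch, assuming range_images comes from parse_range_image_and_camera_projection as in the tutorial:

import tensorflow as tf
from waymo_open_dataset import dataset_pb2 as open_dataset

def top_lidar_intensity(range_images):
    ri = range_images[open_dataset.LaserName.TOP][0]   # first return
    ri_tensor = tf.reshape(tf.convert_to_tensor(ri.data), ri.shape.dims)
    ranges = ri_tensor[..., 0]      # channel 0: range in meters
    intensity = ri_tensor[..., 1]   # channel 1: return intensity
    return tf.boolean_mask(intensity, ranges > 0)  # keep valid returns only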

About compute detection metric

Hi.
I want to evaluate with your code "compute_detection_metrics_main.cc", but I have a problem.

I tried the "fake" examples from the Quick Start guide (command below) and they worked well.

bazel-bin/waymo_open_dataset/metrics/tools/compute_detection_metrics_main waymo_open_dataset/metrics/tools/fake_predictions.bin waymo_open_dataset/metrics/tools/fake_ground_truths.bin

Now I want to compute detection metrics on my own results from the Waymo Open Dataset validation set (TYPE_2D).
But I am confused about the results file format.
I found that the format is the "waymo::open_dataset::Objects" proto, but I am having difficulty creating the pd_file and gt_file.

So, may I ask for an example of a pd_file or gt_file?
For example, something like:
"image_name center_x center_y length height score"

Thank you.
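
A rough sketch of how a prediction (or ground-truth) .bin file can be built in Python, assuming the metrics_pb2 protos shipped in this repo; the field names reflect my reading of waymo_open_dataset/protos/metrics.proto, so double-check them against the code that generates fake_predictions.bin:

from waymo_open_dataset import label_pb2
from waymo_open_dataset.protos import metrics_pb2

def make_objects_file(detections, path):
    # `detections` is assumed to be a list of dicts with the fields used below;
    # ground-truth files use the same Objects proto.
    objects = metrics_pb2.Objects()
    for det in detections:
        o = objects.objects.add()
        o.context_name = det['context_name']                # segment name
        o.frame_timestamp_micros = det['timestamp_micros']  # identifies the frame
        o.score = det['score']
        o.object.type = label_pb2.Label.TYPE_VEHICLE
        box = o.object.box   # for TYPE_2D: image-space center_x/center_y/length/width
        box.center_x = det['cx']
        box.center_y = det['cy']
        box.length = det['length']
        box.width = det['width']
    with open(path, 'wb') as f:
        f.write(objects.SerializeToString())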

Failing the waymo_open_dataset utils test

$ bazel test waymo_open_dataset/utils/...
INFO: Analyzed 7 targets (1 packages loaded, 14 targets configured).
INFO: Found 4 targets and 3 test targets...
FAIL: //waymo_open_dataset/utils:range_image_utils_test (see /home/xxxxxxx/.cache/bazel/xxxxx/32a12db1b695731af2916d563fbc4343/execroot/__main__/bazel-out/k8-opt/testlogs/waymo_open_dataset/utils/range_image_utils_test/test.log)
FAIL: //waymo_open_dataset/utils:box_utils_test (see /home/xxxxxx/.cache/bazel/xxxxxx/32a12db1b695731af2916d563fbc4343/execroot/__main__/bazel-out/k8-opt/testlogs/waymo_open_dataset/utils/box_utils_test/test.log)
FAIL: //waymo_open_dataset/utils:transform_utils_test (see /home/xxxxxxx/.cache/bazel/xxxxx/32a12db1b695731af2916d563fbc4343/execroot/__main__/bazel-out/k8-opt/testlogs/waymo_open_dataset/utils/transform_utils_test/test.log)
INFO: Elapsed time: 13.571s, Critical Path: 13.11s
INFO: 3 processes: 3 linux-sandbox.
INFO: Build completed, 3 tests FAILED, 13 total actions
//waymo_open_dataset/utils:box_utils_test                                FAILED in 12.6s
  /home/xxxxxx/.cache/bazel/_bazel_xxxxx/32a12db1b695731af2916d563fbc4343/execroot/__main__/bazel-out/k8-opt/testlogs/waymo_open_dataset/utils/box_utils_test/test.log
//waymo_open_dataset/utils:range_image_utils_test                        FAILED in 12.3s
  /home/xxxxxx/.cache/bazel/_bazel_xxxxxx/32a12db1b695731af2916d563fbc4343/execroot/__main__/bazel-out/k8-opt/testlogs/waymo_open_dataset/utils/range_image_utils_test/test.log
//waymo_open_dataset/utils:transform_utils_test                          FAILED in 13.1s
  /home/xxxxxx/.cache/bazel/_bazel_xxxx/32a12db1b695731af2916d563fbc4343/execroot/__main__/bazel-out/k8-opt/testlogs/waymo_open_dataset/utils/transform_utils_test/test.log

I followed the quick start tutorial. My TensorFlow version is the same (1.14.0).

Extract object labels?

How do you extract object labels from the 2D/3D data? Is there demo code for it? Thanks.
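
A minimal sketch for iterating over the labels in a parsed Frame (assuming frame was read from a tfrecord as in the tutorial): laser_labels holds the 3D boxes, camera_labels the 2D boxes per camera.

def print_labels(frame):
    # 3D boxes in the vehicle frame.
    for label in frame.laser_labels:
        box = label.box
        print(label.id, label.type, box.center_x, box.center_y, box.center_z,
              box.length, box.width, box.height, box.heading)
    # 2D boxes per camera, in image coordinates.
    for camera_labels in frame.camera_labels:
        for label in camera_labels.labels:
            box = label.box
            print(camera_labels.name, label.type,
                  box.center_x, box.center_y, box.length, box.width)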

Support for non-eager execution (with noted workaround)

Issue: waymo-open-dataset requires Eager execution

The official tutorial uses eager mode: https://colab.research.google.com/github/waymo-research/waymo-open-dataset/blob/master/tutorial/tutorial.ipynb

But it doesn't actually make clear that eager mode is unfortunately required. Attempting to use the code in this Python package without eager mode can result in crashes, e.g.:

    def parse_range_image_and_camera_projection(frame):
      ...
      range_image_top_pose = None
      for laser in frame.lasers:
        if len(laser.ri_return1.range_image_compressed) > 0:  # pylint: disable=g-explicit-length-test
          range_image_str_tensor = tf.decode_compressed(
              laser.ri_return1.range_image_compressed, 'ZLIB')
          ri = dataset_pb2.MatrixFloat()
>         ri.ParseFromString(bytearray(range_image_str_tensor.numpy()))
E         AttributeError: 'Tensor' object has no attribute 'numpy'

This is very sad, because a ton of Tensorflow code (e.g. that in tensorflow/models, tensorflow/tpu, other 3rd party repos, etc etc) would need major changes in order to work with TF Eager execution. While Eager is the default for TF 2.0, public support for TPUs in TF 2.0 is not yet complete ( tensorflow/tensorflow#24412 ).

In the interest of having a library that is as composable as those written for other datasets (e.g. NuScenes, Lyft Level 5, and Argoverse), this is a request for waymo-open-dataset to follow Tensorflow's own recommendation that code support both eager and graph execution. Moreover, there is a large set of functionality in this repo that does not require Tensorflow at all; this is a further request to isolate away the bits that actually require Tensorflow (e.g. the C++ metrics).

Workaround

In the interim, users can use a py_func workaround (see below).

This is a demo of getting all lidar points, fused, using waymo-open-dataset code confined to its own tf.Session:

def get_fused_cloud_in_ego(waymo_frame):

  import numpy as np
  import tensorflow as tf
  import tensorflow.contrib.eager as tfe

  def get_all_points(wf_str):
    assert tf.executing_eagerly()
    
    from waymo_open_dataset import dataset_pb2 as open_dataset
    wf = open_dataset.Frame()
    wf.ParseFromString(wf_str.numpy())
    
    from waymo_open_dataset.utils import frame_utils
    range_images, camera_projections, range_image_top_pose = (
      frame_utils.parse_range_image_and_camera_projection(wf))

    # Waymo provides two returns for every lidar beam; we have to
    # fuse them manually
    points, cp_points = frame_utils.convert_range_image_to_point_cloud(
        wf,
        range_images,
        camera_projections,
        range_image_top_pose)
    points_ri2, cp_points_ri2 = frame_utils.convert_range_image_to_point_cloud(
        wf,
        range_images,
        camera_projections,
        range_image_top_pose,
        ri_index=1)

    # 3d points in vehicle frame.
    points_all = np.concatenate(points, axis=0)
    points_all_ri2 = np.concatenate(points_ri2, axis=0)
    all_points = np.concatenate((points_all, points_all_ri2), axis=0)
    return all_points

  
  # Run the `get_all_points()` helper in its own session to confine the Eager mode scope
  assert not tf.executing_eagerly()
  with tf.Session() as sess:
    wf = tf.placeholder(dtype=tf.string)
    pf = tfe.py_func(get_all_points, [wf], tf.float32)
    all_points = sess.run(pf, feed_dict={wf: waymo_frame.SerializeToString()})
    return all_points

How to generate bin files for evaluation?

Hi! I'm using the Waymo Open Dataset for my own academic research. I'm still unclear about how to use the provided tools to evaluate results. Specifically, how do I generate the bin files for my own results and for the ground truths (for both detection and tracking)? Also, is the tracking evaluation tool ready for use now?

I really appreciate the effort you've put in producing this high-quality dataset. Thank you!

Building Python 3.7 Pip Package

It is not possible to create a Python 3.7 Linux pip package by editing the Dockerfile on this page, right? https://github.com/waymo-research/waymo-open-dataset/tree/master/pip_pkg_scripts

I tried adding these commands to the Dockerfile, but it did not work:

RUN apt install python-pip
RUN add-apt-repository ppa:deadsnakes/ppa
RUN apt-get update
RUN apt-get install python3.7
RUN apt-get install python3.7-dev

I get the error "No module named 'apt_pkg'" while adding the ppa:deadsnakes/ppa repository.

I can install Python 3.7 with the above commands using the base Ubuntu 16.04 image in Docker, but not with your custom-op-ubuntu image.

Cannot decode images

Hello,

I can't show images following the tutorial. What should I do?

Thanks!

This is the log:

Traceback (most recent call last):
File "/home/user/.env/dl3/lib/python3.7/site-packages/tensorflow/python/ops/gen_image_ops.py", line 1169, in decode_jpeg
"acceptable_fraction", acceptable_fraction, "dct_method", dct_method)
tensorflow.python.eager.core._FallbackException: This function does not handle the case of the path where all inputs are not already EagerTensors.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "Waymo.py", line 77, in
[3, 3, index+1])
File "Waymo.py", line 34, in image_show
plt.imshow(tf.image.decode_jpeg(data), cmap=cmap)
File "/home/user/.env/dl3/lib/python3.7/site-packages/tensorflow/python/ops/gen_image_ops.py", line 1178, in decode_jpeg
name=name, ctx=_ctx)
File "/home/user/.env/dl3/lib/python3.7/site-packages/tensorflow/python/ops/gen_image_ops.py", line 1255, in decode_jpeg_eager_fallback
contents = _ops.convert_to_tensor(contents, _dtypes.string)
File "/home/user/.env/dl3/lib/python3.7/site-packages/tensorflow/python/framework/ops.py", line 1087, in convert_to_tensor
return convert_to_tensor_v2(value, dtype, preferred_dtype, name)
File "/home/user/.env/dl3/lib/python3.7/site-packages/tensorflow/python/framework/ops.py", line 1145, in convert_to_tensor_v2
as_ref=False)
File "/home/user/.env/dl3/lib/python3.7/site-packages/tensorflow/python/framework/ops.py", line 1224, in internal_convert_to_tensor
ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
File "/home/user/.env/dl3/lib/python3.7/site-packages/tensorflow/python/framework/constant_op.py", line 305, in _constant_tensor_conversion_function
return constant(v, dtype=dtype, name=name)
File "/home/user/.env/dl3/lib/python3.7/site-packages/tensorflow/python/framework/constant_op.py", line 246, in constant
allow_broadcast=True)
File "/home/user/.env/dl3/lib/python3.7/site-packages/tensorflow/python/framework/constant_op.py", line 254, in _constant_impl
t = convert_to_eager_tensor(value, ctx, dtype)
File "/home/user/.env/dl3/lib/python3.7/site-packages/tensorflow/python/framework/constant_op.py", line 115, in convert_to_eager_tensor
return ops.EagerTensor(value, handle, device, dtype)
TypeError: Cannot convert provided value to EagerTensor. Provided value: bytearray(b'

module 'tensorflow' has no attribute 'unsorted_segment_max'

This error comes from "from waymo_open_dataset.utils import range_image_utils".
The TensorFlow version is 2.0.0.
Full error message:
Traceback (most recent call last):
  File "waymo_data_preprocess.py", line 12, in <module>
    from waymo_open_dataset.utils import range_image_utils
  File "/home/shawn/.local/lib/python3.6/site-packages/waymo_open_dataset/utils/range_image_utils.py", line 59, in <module>
    pool_method=tf.unsorted_segment_max):
AttributeError: module 'tensorflow' has no attribute 'unsorted_segment_max'

Hope anyone can help. Thank you
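
This looks like TF 1.x API usage running under TF 2.0: tf.unsorted_segment_max was moved under tf.math in TF 2.x, and the tutorial targets TF 1.14. The clean fix is probably to install a supported 1.x version; as a stop-gap, a hacky aliasing sketch (not an endorsed fix) is:

import tensorflow as tf

# Workaround sketch for TF 2.x: restore the old top-level alias that
# range_image_utils expects, before importing it.
if not hasattr(tf, 'unsorted_segment_max'):
    tf.unsorted_segment_max = tf.math.unsorted_segment_max

from waymo_open_dataset.utils import range_image_utils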

2D bounding box

Where can I find the information about the 2D bounding boxes in the scene?

What does the 'metrics' directory do?

I am a newbie in deep learning.

Before my question, I apologize for my poor English.

I want to know what the 'metrics' directory in this repository does.
As I understand it, the 'metrics' directory provides functions to measure the accuracy of object detection results.
But I have no idea what the example metrics computation command (with bazel) in the Quick Start document does, nor what the output console messages mean.

Could anyone explain what the metrics directory does, how it works, its relationship to the open dataset, why it exists in this repository (I thought there was no relationship between the open dataset and metrics evaluation), and when and how it should be used with the open dataset?

There is no deep learning model in this repository, is there?
Is the 'metrics' code used with a ground truth file (from the Waymo Open Dataset) and an output file (from a custom deep learning model)?

Could you give any information about this directory?

Thanks.

Is there any elegant way to load the data from the tfrecords for tracking

Reading all the tfrecords together makes it hard to pick consecutive frames for tracking, so currently I can only decode the tfrecords and save them in a customized format. Is there any suggestion on how to extract object trajectories (from consecutive frames in the same segment) directly from the tfrecord files provided?
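
One approach, sketched below under the assumption that each tfrecord file holds the frames of a single segment in temporal order (which matches how the dataset is distributed), is to read one file at a time, order frames by timestamp, and group labels by their id field:

import tensorflow as tf
from collections import defaultdict
from waymo_open_dataset import dataset_pb2 as open_dataset

def extract_trajectories(segment_path):
    # Eager execution assumed, as in the tutorial.
    frames = []
    for data in tf.data.TFRecordDataset(segment_path, compression_type=''):
        frame = open_dataset.Frame()
        frame.ParseFromString(bytearray(data.numpy()))
        frames.append(frame)
    frames.sort(key=lambda f: f.timestamp_micros)  # should already be in order

    # track id -> list of (timestamp, box) across the segment
    trajectories = defaultdict(list)
    for frame in frames:
        for label in frame.laser_labels:
            trajectories[label.id].append((frame.timestamp_micros, label.box))
    return trajectories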

Range Image Returns

On the website, it says that two range images are provided for each lidar, one for each of the two strongest returns. Why are there multiple range images in one frame? Is there any relationship between them?

message Laser {
  optional LaserName.Name name = 1;
  optional RangeImage ri_return1 = 2;
  optional RangeImage ri_return2 = 3;
}

Ego motion of the dataset

Hi,

Do you provide the ego-motion of the driving car with respect to some reference point in the world coordinate system?

This can be similar to other datasets like nuScenes and Argoverse.

Thanks in advance
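
For reference, each Frame carries a pose field that, as I read the proto, is a 4x4 row-major transform from the vehicle frame into a global frame, which is effectively the ego pose. A minimal sketch:

import numpy as np

def ego_pose(frame):
    # frame.pose.transform is a flattened 4x4 matrix (vehicle -> global frame).
    return np.reshape(np.array(frame.pose.transform), [4, 4])

# Relative motion between two frames can then be composed from their poses, e.g.
# T_0_to_1 = np.linalg.inv(ego_pose(frame1)) @ ego_pose(frame0)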

ChauffeurNet Dataset

Hello,

Is the ChauffeurNet dataset available? Is it different from this dataset? It has to have map data I think.

Thank you

Bounding boxes question

Hi,
There's a bounding box with a vehicle label but it is behind a garage. How are these bounding boxes determined?


bazel build error with gcc-5 and gcc-7

First I compiled with gcc-5 and got this error log:
WARNING: /home/chris/.cache/bazel/_bazel_chris/b2fd8959c85968b8f97fc8dbb85c5c52/external/local_config_tf/BUILD:3823:1: target 'libtensorflow_framework.so' is both a rule and a file; please choose another name for the rule
INFO: Analyzed 62 targets (0 packages loaded, 0 targets configured).
INFO: Found 62 targets...
ERROR: /home/chris/work/waymo-od/waymo_open_dataset/math/BUILD:58:1: C++ compilation of rule '//waymo_open_dataset/math:polygon2d' failed (Exit 1) gcc failed: error executing command /usr/bin/gcc -U_FORTIFY_SOURCE -fstack-protector -Wall -Wunused-but-set-parameter -Wno-free-nonheap-object -fno-omit-frame-pointer -g0 -O2 '-D_FORTIFY_SOURCE=1' -DNDEBUG -ffunction-sections ... (remaining 46 argument(s) skipped)

Use --sandbox_debug to see verbose messages from the sandbox
waymo_open_dataset/math/polygon2d.cc: In static member function 'static std::string waymo::open_dataset::Polygon2d::PrintPointsToString(const std::vector<waymo::open_dataset::Vec2d>&, bool)':
waymo_open_dataset/math/polygon2d.cc:484:59: error: 'DECIMAL_DIG' was not declared in this scope
   ::absl::StrAppendFormat(&result, "{%.*e, %.*e},\n", DECIMAL_DIG,
                                                       ^
INFO: Elapsed time: 1.334s, Critical Path: 1.25s
INFO: 3 processes: 3 linux-sandbox.
FAILED: Build did NOT complete successfully

Then I upgraded gcc to v7, but the compile still does not pass:
WARNING: /home/chris/.cache/bazel/_bazel_chris/b2fd8959c85968b8f97fc8dbb85c5c52/external/local_config_tf/BUILD:3823:1: target 'libtensorflow_framework.so' is both a rule and a file; please choose another name for the rule
INFO: Analyzed 62 targets (0 packages loaded, 0 targets configured).
INFO: Found 62 targets...
ERROR: /home/chris/work/waymo-od/waymo_open_dataset/math/BUILD:76:1: undeclared inclusion(s) in rule '//waymo_open_dataset/math:segment2d':
this rule is missing dependency declarations for the following files included by 'waymo_open_dataset/math/segment2d.cc':
'/usr/lib/gcc/x86_64-linux-gnu/7/include/stdint.h'
'/usr/lib/gcc/x86_64-linux-gnu/7/include/stddef.h'
'/usr/lib/gcc/x86_64-linux-gnu/7/include/stdarg.h'
'/usr/lib/gcc/x86_64-linux-gnu/7/include-fixed/limits.h'
'/usr/lib/gcc/x86_64-linux-gnu/7/include-fixed/syslimits.h'
INFO: Elapsed time: 1.341s, Critical Path: 1.25s
INFO: 0 processes.
FAILED: Build did NOT complete successfully

Windows installation guide?

Hi,

Do you guys have any tutorials or guides available for windows installation? I cannot seem to find anything and all pip packages seem to be for linux.

Cheers,
Hamid

ImportError: cannot import name 'dataset_pb2'

When I run "from waymo_open_dataset import dataset_pb2 as open_dataset", I get the error "ImportError: cannot import name 'dataset_pb2'".
What is the reason for this? What is 'dataset_pb2'?
Looking forward to your reply!

The label.box.heading is not correct when visualizing in the Open3D GUI.

I extracted only the LiDAR label bounding boxes and visualized them in the Open3D GUI. It looks as described below (screenshot from 2019-09-09 omitted), but all the bounding boxes' heading angles are perpendicular to their correct direction.
Thus, I added pi/2 to each heading, and it seems to be correct (second screenshot omitted), but I still don't know why.

Question about label

Is the heading also in the vehicle frame, or is it in the sensor frame?

message Label {
  // Upright box, zero pitch and roll.
  message Box {
    // Box coordinates in vehicle frame.
    optional double center_x = 1;
    optional double center_y = 2;
    optional double center_z = 3;

    // Dimensions of the box. length: dim x. width: dim y. height: dim z.
    optional double length = 5;
    optional double width = 4;
    optional double height = 6;

    // The heading of the bounding box (in radians).  The heading is the angle
    // required to rotate +x to the surface normal of the SDC front face.
    optional double heading = 7;

Coordinate of speed

I noticed that speed_x, speed_y, acc_x and acc_y are provided in the laser labels for objects. Which coordinate frame are these values in? Out of curiosity, I also found that some of the speed values are exactly 0, while some are numbers like 1e-15 (which is effectively 0). How do you get these values, and why are they not consistent?

Projecting 3d cuboids to camera images

Hi,

I'm trying to draw 3D cuboids on the 2D camera images, so that all corners appear in the camera images. I can see there is projected_lidar_labels, which contains 2D bounding boxes for the cuboids, but that is not what I want to draw. For example, https://www.nuscenes.org/public/images/road.jpg is the kind of projected image I would like to make.

I tried to use CameraCalibration and the laser_labels to draw the cuboids, but I still couldn't figure it out. The cuboids don't seem to align well with the objects in the camera images.

Thanks.
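
A sketch of computing the eight cuboid corners in the vehicle frame from a laser label box; the actual image projection additionally needs the camera extrinsic, intrinsic, and rolling-shutter model from CameraCalibration, which this sketch does not attempt:

import numpy as np

def box_corners_vehicle_frame(box):
    """Return the 8 corners (8, 3) of a laser label box in the vehicle frame."""
    l, w, h = box.length, box.width, box.height
    # Axis-aligned corners around the origin (x forward, y left, z up).
    x = np.array([ 1,  1,  1,  1, -1, -1, -1, -1]) * l / 2.0
    y = np.array([ 1,  1, -1, -1,  1,  1, -1, -1]) * w / 2.0
    z = np.array([ 1, -1,  1, -1,  1, -1,  1, -1]) * h / 2.0
    corners = np.stack([x, y, z], axis=-1)

    # Rotate by heading (yaw about z) and translate to the box center.
    c, s = np.cos(box.heading), np.sin(box.heading)
    rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return corners @ rot.T + np.array([box.center_x, box.center_y, box.center_z])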

What does range_image_top_pose do?

Hi, I looked at the source code of convert_range_image_to_point_cloud. For the TOP lidar, if I want to convert the point cloud to the vehicle frame, the source code first converts the points to the vehicle frame using the TOP lidar's extrinsic, then uses range_image_top_pose to convert them to another coordinate frame, and then uses world_to_vehicle to convert the points back into the vehicle frame. Why is this done? The TOP lidar's extrinsic seems like it should not be enough on its own, yet for the other lidars, just the lidar's extrinsic is used. Why?

Two range images are provided for each lidar?

Why can a lidar frame be converted into two range images?
What do you mean by "the strongest two intensity returns are provided for all five lidars"?
A point in a lidar point cloud only has one intensity; how do you get the strongest two?

Panoramic Image

Hi,
A short question: how can I obtain a panoramic image (combining the images from all cameras) from the dataset?
By the way, I am curious about the specific types of the lidars; regarding their properties, what are the major differences between the mid-range and short-range lidars?
Thanks.

dataset_pb2 cannot be imported

After successfully downloading all the dependencies, I tried running the test program from your Colab on my local machine, but it cannot import dataset_pb2 because it does not exist in waymo-open-dataset.

Also, running pip3 install waymo-open-dataset returns "no matching distribution found".
