
volumetriccapture's People

Contributors

ankarako, leosarog, papachra, tofis, vladsterz, zokin, zuru


volumetriccapture's Issues

NUC SSD size?

Hi, how much hard drive space is required on the Intel mini-PCs? Is 128 GB sufficient?

Intel RealSense D435 compatibility?

Your documentation lists the Intel RealSense D415; are the D435 cameras also compatible?

Or are the results with the Azure Kinect cameras significantly better and worth waiting for that update to be pushed, if I haven't yet ordered cameras?

Calibration Fails

I am following the Wiki and turning both iteration options in Calibration>Configure up to the max and then going to Calibration>Capture. It seems to successfully capture the images. I then go to Calibration>Calibrate and it runs for a little bit before giving me the following error:

Device: JUP01, correspondences: 1, error: 1.5940710465918552e-32
Traceback (most recent call last):
  File "Data\Executables\Calibration\multisensor_calibration\calibration.py", line 99, in <module>
    main(sys.argv)
  File "Data\Executables\Calibration\multisensor_calibration\calibration.py", line 47, in main
    "--save_pointclouds", "True"
  File "C:\VCLVolumetric\Release_4_0_2\volumetric_capture\Data\Executables\Calibration\multisensor_calibration\src\structurenet_calibration.py", line 138, in main
    predicted_centers = np.stack([pcloud[np.transpose(label_map) == id + 1,:].mean(axis = 0) for id in valid_ids])
  File "C:\Users\jesse\anaconda3\lib\site-packages\numpy\core\shape_base.py", line 412, in stack
    raise ValueError('need at least one array to stack')
ValueError: need at least one array to stack
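For context on this traceback, a minimal reproduction (not the project's actual code): np.stack raises exactly this ValueError when it is handed an empty sequence, which here means the list comprehension found no valid calibration-box labels in the depth data.

```python
import numpy as np

# Minimal reproduction of the failing line in structurenet_calibration.py:
# average the 3D points under each detected calibration-box label.
def predicted_centers(pcloud, label_map, valid_ids):
    return np.stack([pcloud[np.transpose(label_map) == i + 1, :].mean(axis=0)
                     for i in valid_ids])

# When valid_ids is empty (no boxes detected), np.stack receives an
# empty list and raises: ValueError: need at least one array to stack.
```

So the error is a symptom; the real failure is upstream, in the detection step finding zero boxes, which is consistent with the very shallow point clouds described in this report.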

I've made sure to update the calibration tool as explained in the Wiki by running 'python.exe install.py'.

I tried running calibration last night as well, and it seemed to work (no error messages) but when I clicked Load Last Config nothing happened and the point clouds didn't move.

One thing to note (this may be a separate bug so I'll open as a new issue as well) is that the pointclouds displayed are very shallow. I've increased the boundaries to the maximum but it still only captures up to about six feet away, which is much less than the Kinect is capable of. I've tested in the k4aviewer app and it can capture the entire distance of my room. Due to this issue, it seems like it's not getting much info from the calibration boxes.

Any help would be greatly appreciated. Thank you!

UPDATE:
I tried setting up two Kinects to test this process again. Since my depth units cannot be changed without crashing, I set the calibration boxes up as close to the Kinects as possible. Then I captured frames and calibrated. This time it did not give me the errors listed above and it seemed to complete successfully, but when I selected 'Load Latest' the pointclouds did not move or align at all. In the console, this was the output:

Changed calibration arguments to:
        --local_iters 50 --global_iters 50
Calib capture started /w curent laser power (nan) set to 300
Calib capture stopped and laser power is reset to nan
Loading extrinsic calibration info from C:\VCLVolumetric\Release_4_0_2\volumetric_capture\Data\Calibrations\20-08-05-21-09-56
Calibration file not found !
Loading external calibration results:
Method ID: 0
Loading extrinsic calibration info from C:\VCLVolumetric\Release_4_0_2\volumetric_capture\Data\Calibrations\20-08-05-21-09-56
Calibration file not found !
Loading external calibration results:
Method ID: 0
Changed calibration arguments to:
        --local_iters 50 --global_iters 50
Calib capture started /w curent laser power (nan) set to 300
Calib capture stopped and laser power is reset to nan
Exited with code { 0 } : The operation completed successfully.
Loading extrinsic calibration info from C:\VCLVolumetric\Release_4_0_2\volumetric_capture\Data\Calibrations\20-08-05-21-14-59
Calibration file not found !
Loading external calibration results:
Method ID: 0
Exited with code { 0 } : The operation completed successfully.
Exited with code { 0 } : The operation completed successfully.
Loading extrinsic calibration info from C:\VCLVolumetric\Release_4_0_2\volumetric_capture\Data\Calibrations\20-08-05-21-14-59
Calibration file not found !
Loading external calibration results:
Method ID: 0

It looks like it's not finding the calibration file for some reason. I posted a video of my calibration attempt here.

Another thing to note is that my pointclouds are displayed sideways in the viewport. Is it possible this is causing a calibration issue? Is it possible to rotate them to the correct orientation?

Workstation PC:
Windows 10 Home x64 v. 1909
Intel i9-9900K @ 3.60 GHz
32 GB RAM
NVIDIA GTX 2060

Capture PCs:
Windows 10 Pro
(3) Intel NUC i7 (NUC7i7BNH)
8 GB RAM
225 GB SSD

(3) Azure Kinect cameras
Azure Kinect SDK 1.4.1
Firmware v1.6.110079014

Network Switch:
Netgear unmanaged 8-port Gigabit switch

Connecting devices/sensors to Volcap

Hi team,

Greetings of the day. As mentioned at https://vcl3d.github.io/VolumetricCapture/docs/configure/, I have created the device_repository.json, but I have an issue: the site says the device names should be listed automatically in VolCap.exe according to device_repository, but it is not showing the list of devices.
[screenshot: Volcap.exe_Snapshot]

While checking intel_nuc_led_utils (Eye-NUC), I am getting this exception:
[screenshot: intel_nuc_led_utils]

I was able to find the log from the remote_eye_service; it is pasted below.
"WARNINGS
* The manifest for this application does not have a signature. Signature validation will be ignored.

OPERATION PROGRESS STATUS
* [12-04-2021 02:05:23 PM] : Activation of C:\Capturer\remote_eye_service\remote_eye_service.application has started.
* [12-04-2021 02:05:23 PM] : Processing of deployment manifest has successfully completed.
* [12-04-2021 02:05:23 PM] : Installation of the application has started.

ERROR DETAILS
Following errors were detected during this operation.
* [12-04-2021 02:05:23 PM] System.Deployment.Application.InvalidDeploymentException (Zone)
- Deployment and application do not have matching security zones.
- Source: System.Deployment
- Stack trace:
at System.Deployment.Application.DownloadManager.DownloadApplicationManifest(AssemblyManifest deploymentManifest, String targetDir, Uri deploymentUri, IDownloadNotification notification, DownloadOptions options, Uri& appSourceUri, String& appManifestPath)
at System.Deployment.Application.ApplicationActivator.DownloadApplication(SubscriptionState subState, ActivationDescription actDesc, Int64 transactionId, TempDirectory& downloadTemp)
at System.Deployment.Application.ApplicationActivator.InstallApplication(SubscriptionState& subState, ActivationDescription actDesc)
at System.Deployment.Application.ApplicationActivator.PerformDeploymentActivation(Uri activationUri, Boolean isShortcut, String textualSubId, String deploymentProviderUrlFromExtension, BrowserSettings browserSettings, String& errorPageUrl, Uri& deploymentUri)
at System.Deployment.Application.ApplicationActivator.PerformDeploymentActivationWithRetry(Uri activationUri, Boolean isShortcut, String textualSubId, String deploymentProviderUrlFromExtension, BrowserSettings browserSettings, String& errorPageUrl)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Deployment.Application.ApplicationActivator.PerformDeploymentActivationWithRetry(Uri activationUri, Boolean isShortcut, String textualSubId, String deploymentProviderUrlFromExtension, BrowserSettings browserSettings, String& errorPageUrl)
at System.Deployment.Application.ApplicationActivator.ActivateDeploymentWorker(Object state)"

Thank you
Ganesh

Crash When Changing Depth Units Value

The default depth units setting is 1000, which is too shallow for my capture space. When I try to change the value on this slider, the program crashes. I've tried restarting everything and attempting multiple times. Just to be sure, changing this value increases the depth of what is being captured, correct? If not, please let me know the correct way to achieve this.

I believe the capture depth being too shallow is also causing the issue I'm seeing with calibration (#29). I'm not able to capture enough information from the boxes due to the capture space being too shallow.

Please let me know if I can help by providing more info or logs of any kind. Thank you!
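For what it's worth, under the RealSense convention (the in-app semantics of this slider are my assumption), the depth-units value is the scale, in micrometers, applied to the raw 16-bit depth samples, so a larger value extends the maximum representable distance at the cost of depth precision:

```python
# Sketch of the RealSense-style depth-units convention (assumed here):
# distance = raw 16-bit depth sample * units (micrometers).
def max_range_m(depth_units_um):
    """Largest distance a 16-bit depth value can encode, in meters."""
    return 0xFFFF * depth_units_um * 1e-6

def raw_to_meters(raw_depth, depth_units_um):
    """Convert one raw depth sample to meters."""
    return raw_depth * depth_units_um * 1e-6
```

For example, 100 µm units (as in the accuracy-100um preset) cap out at about 6.5 m, while 1000 µm units reach about 65 m with coarser depth steps.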

Workstation PC:
Windows 10 Home x64 v. 1909
Intel i9-9900K @ 3.60 GHz
32 GB RAM
NVIDIA GTX 2060

Capture PCs:
Windows 10 Pro
(3) Intel NUC i7 (NUC7i7BNH)
8 GB RAM
225 GB SSD

(3) Azure Kinect cameras
Azure Kinect SDK 1.4.1
Firmware v1.6.110079014

Network Switch:
Netgear unmanaged 8-port Gigabit switch

Using intel PC instead of mini PCs

Hi,

I am starting to explore volumetric capture using the D415 and VCL3D. Right now I do not have mini PCs. Can I try with a regular PC instead?

System configuration check

I have the following system configuration. Will it work?
Intel Xeon processor
(Processor Intel(R) Xeon(R) CPU E5-1607 0 @ 3.00GHz, 3000 Mhz, 4 Core(s), 4 Logical Processor(s))
16 GB RAM
250 GB SSD
320 GB HDD
NVIDIA Quadro 2000
Windows 10 Pro

Fail to stream with Kinect Azure

Hi,

I am able to successfully connect an Azure Kinect. Once I click "Connect" in the "Available Devices" widget, a snapshot of the scene appears both in the 3D space and in the color preview. But it seems to work only for a couple of frames (the Kinect light turns on and then off again). The console shows no errors.

I've tried with 2 different network switches. Any clues on what could be wrong?

This is my remote_eye.log file:

2020-05-21 12:37:57.664 INFO  [6676] [main@66] Logging to: C:\Capturer\bin\remote_eye.log
2020-05-21 12:37:58.103 INFO  [6676] [main@73] Connected K4A device /w serial no: 000135500112
2020-05-21 12:37:58.103 INFO  [6676] [main@78] Connecting to RabbitMQ broker: amqp://volumetric:[email protected]:5672 # Handshake
2020-05-21 12:37:59.109 INFO  [6676] [remote::handshake::say_hello@50] Sent hello from 000135500112
2020-05-21 12:37:59.109 INFO  [6676] [main@221] Waiting for welcome (press 'q' to quit) ...
2020-05-21 12:38:25.187 DEBUG [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@104] Received Welcome:
{ msg = Welcome , serial = 000135500112 /w stream channel = Kinect01_c1280_x_720_d320_x_288_stream /w params channel = Kinect01_params /w workflow channel = Kinect01_workflow /w info channel = Kinect01_info /w preset = accuracy-100um /w sync type = Default
}
2020-05-21 12:38:25.187 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@116] Creating networking components...
2020-05-21 12:38:25.188 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@124] Streaming channel opened.
2020-05-21 12:38:25.193 ERROR [4276] [local::led_controller::led_controller@59] NUC Led controller app is not available @ C:\Capturer\bin\intel_nuc_led_utils.exe.
2020-05-21 12:38:25.193 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@134] Command channel opened.
2020-05-21 12:38:25.193 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@154] Initializing syncer on C:\Capturer\bin\PTPSync_slave.exe ...
2020-05-21 12:38:25.193 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@160] Syncer initialized.
2020-05-21 12:38:25.193 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@163] Initializing device w Color-Depth streams.
2020-05-21 12:38:25.193 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@170] Loading preset <accuracy-100um> from C:\Capturer\bin\Resources\accuracy-100um.json
2020-05-21 12:38:25.234 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@175] Acquiring device parameters...
2020-05-21 12:38:25.284 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@185] Device parameters initialized from configuration file: C:\Capturer\bin\Resources\accuracy-100um.json
2020-05-21 12:38:25.355 DEBUG [18880] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::()::<lambda_ee3d8ba1313f35c7f4e3ff6a4d2d9a7e>::operator ()@145] Received Workflow Update Command:
{
id = 0
time = 66575301200362
action = 2
dev param = 1
value = 000001FA3254EBD0
}
2020-05-21 12:38:25.409 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@187] Device started.
2020-05-21 12:38:25.410 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@208] Device parameters session created.
2020-05-21 12:38:25.410 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@213] Encoding /w compression params:

ENTROPY: { Type = 3 , Level = 4 , Shuffle = 1 }
2020-05-21 12:38:25.410 INFO  [4276] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::operator ()@217] Streaming pipeline ready.
2020-05-21 12:38:25.504 DEBUG [18880] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::()::<lambda_ee3d8ba1313f35c7f4e3ff6a4d2d9a7e>::operator ()@145] Received Workflow Update Command:
{
id = 1
time = 66575407463722
action = 2
dev param = 6
value = 000001FA32E58F90
}
2020-05-21 12:38:25.507 DEBUG [18880] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::()::<lambda_ee3d8ba1313f35c7f4e3ff6a4d2d9a7e>::operator ()@145] Received Workflow Update Command:
{
id = 1
time = 66575407463722
action = 3
dev param = 6
value = 000001FA32E590B0
}
2020-05-21 12:38:25.656 DEBUG [18880] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::()::<lambda_ee3d8ba1313f35c7f4e3ff6a4d2d9a7e>::operator ()@145] Received Workflow Update Command:
{
id = 2
time = 66575513731315
action = 2
dev param = 7
value = 000001FA32E59140
}
2020-05-21 12:38:25.806 DEBUG [18880] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::()::<lambda_ee3d8ba1313f35c7f4e3ff6a4d2d9a7e>::operator ()@145] Received Workflow Update Command:
{
id = 3
time = 66575619986279
action = 2
dev param = 8
value = 000001FA32E58FC0
}
2020-05-21 12:38:25.958 DEBUG [18880] [main::<lambda_19922d6ce101d1688e5e6b4773c2ac7d>::()::<lambda_ee3d8ba1313f35c7f4e3ff6a4d2d9a7e>::operator ()@145] Received Workflow Update Command:
{
id = 4
time = 66575726235668
action = 4
dev param = 7
value = 000001FA32E59550
}

Device_repository needs more info on wiki

The wiki page for device_repository needs a step-by-step explanation of how to create the final json file and where to put it (all the folders).
I noticed that when you put it only in the root folder, the calibration step doesn't find the intrinsics, but when you put it in ~/Resources and ~/Data/Executables/Calibration it works perfectly. Maybe you only need to put it in the root plus one of those folders, but I noticed you already have the same json inside both of them.

An attempt was made to access a socket in a way forbidden by its access permissions

  • At this point we have set up RabbitMQ and configured the firewall with ports TCP 5672 (in/out), TCP 15672 (in/out), UDP 11234 (in/out), and UDP 320/321 on the master and subordinate PCs.
  • We have downloaded the Volcap 5.0.0 files.
  • We have configured all services on the subordinate PCs (Windows Update, remote desktop, etc.).
  • We have installed REMOTE_EYE_SERVICE on all PCs.
  • In RabbitMQ we have set up the volumetric/capture user/password.
  • Device repository files have also been created for the Azure Kinect.

Now, while starting volcap.exe, we get this error and are not able to view our sensors.

Connecting to broker: { amqp://volumetric:[email protected]:5672}
Exception: send_to: An attempt was made to access a socket in a way forbidden by its access permissions

We have tried opening the Volcap application with the custom IP found in remote_eye_k4a.exe: volcap.exe -b 195.251.117.104 -u Volumetric -p capture. Still no luck; we get the same error.

Any help would be highly appreciated.
Thank You!
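For reference, "An attempt was made to access a socket in a way forbidden by its access permissions" is Windows' WSAEACCES (10013) wording, which usually points at a firewall rule or another process holding the port rather than at RabbitMQ itself. A quick way to test plain TCP reachability of the broker port from each machine (a generic sketch, not part of the toolchain):

```python
import socket

def broker_reachable(host, port=5672, timeout=2.0):
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. broker_reachable("195.251.117.104") run from a capture node
```

If this returns False from a capture node but True on the broker machine itself, the firewall or switch configuration is the first place to look.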

3 kinects 1 workstation

I have a powerful workstation with enough USB bandwidth to stream/record from 3 Kinects. Can I set up 3 capturers on one station? Will the recorder find them locally with RabbitMQ? Thanks!

Remote eye service is not working

I followed the instructions: installed RabbitMQ and opened ports 5672 and 11234 on the workstation and mini-PCs.
I downloaded remote_eye.zip, extracted it to C:/Capturer, then installed the service as administrator.
But the devices won't show up.
I am using Volumetric Capture and Tools Release 3.0 and remote_eye.zip 2.0.
Thank you.

Add additional step to installing remote eye to documentation

In my experience, you have to run the Remote Eye app as an administrator once and OK the security dialog that appears before running the install script; otherwise the install script will fail. It might be nice to note this in the documentation.

Intel D415's stream fails to start

In volumetric_capture.exe, after hitting "Connect" to a device, the stream fails to start, resulting in blank camera previews (see attached image of the UI). The streams work occasionally, but this bug happens most of the time.

Edit: I am using volumetric_capture.exe from v3.0, and remote_eye_service from v2.0

[screenshot: stream_fail_bug]

Upon inspecting the remote_eye.log on the mini-PCs, the error "No frames received, resetting device hardware ..." is given. Here is the full log of the incident:

2020-04-09 11:26:26.892 INFO [2688] [main@76] Logging to: C:\Capturer\bin\remote_eye.log
2020-04-09 11:26:27.435 INFO [2688] [main@87] Connected RS2 device /w serial no: 928222063567
2020-04-09 11:26:27.435 INFO [2688] [main@92] Connecting to RabbitMQ broker: amqp://volumetric:[email protected]:5672 # Handshake
2020-04-09 11:26:28.454 INFO [2688] [remote::handshake::say_hello@49] Sent hello from 928222063567
2020-04-09 11:26:28.454 INFO [2688] [main@230] Waiting for welcome (press 'q' to quit) ...
2020-04-09 11:26:32.109 DEBUG [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@119] Received Welcome:
{ msg = Welcome , serial = 928222063567 /w stream channel = d415_0_c1280_x_720_d320_x_180_stream /w params channel = d415_0_params /w workflow channel = d415_0_workflow /w info channel = d415_0_info /w preset = accuracy-100um /w sync type = Default
}
2020-04-09 11:26:32.110 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@131] Creating networking components...
2020-04-09 11:26:32.111 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@139] Streaming channel opened.
2020-04-09 11:26:32.117 ERROR [9208] [local::led_controller::led_controller@59] NUC Led controller app is not available @ C:\Capturer\bin\intel_nuc_led_utils.exe.
2020-04-09 11:26:32.117 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@149] Command channel opened.
2020-04-09 11:26:32.118 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@161] Initializing syncer on C:\Capturer\bin\PTPSync_slave.exe ...
2020-04-09 11:26:32.118 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@163] Syncer initialized.
2020-04-09 11:26:32.119 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@167] Initializing device...
2020-04-09 11:26:32.119 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@174] Loading preset from C:\Capturer\bin\Resources\accuracy-100um.json
2020-04-09 11:26:32.426 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@178] Acquiring device parameters...
2020-04-09 11:26:34.268 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@187] Device parameters initialized from configuration file: C:\Capturer\bin\Resources\accuracy-100um.json
2020-04-09 11:26:34.555 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@190] Device started.
2020-04-09 11:26:34.555 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@215] Device parameters session created.
2020-04-09 11:26:34.556 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@221] Encoding /w compression params:
JPEG: { Quality = 70 , Subsampling = 1 , Format = 1 , Fast DCT = on }
ENTROPY: { Type = 3 , Level = 4 , Shuffle = 1 }
2020-04-09 11:26:34.556 INFO [9208] [main::<lambda_241a8e4d42eec98c02aa0213c5be8deb>::operator ()@226] Streaming pipeline ready.
2020-04-09 11:26:35.062 ERROR [3452] [aqzn::realsense2::rs2rgbd_pipeline_autorecov_producer::reset@288] No frames received, resetting device hardware ...

Bad/Inconsistent Framerate with Three Kinects

Everything works well when two Kinects are activated and framerates are at 30 fps. When I activate the third Kinect, two of the streams are fine but the third lags badly and its framerate jumps around quite a bit. Here is a video of the issue. Let me know if I can provide anything else to help diagnose this.

Workstation PC:
Windows 10 Home x64 v. 1909
Intel i9-9900K @ 3.60 GHz
32 GB RAM
NVIDIA GTX 2060

Capture PCs:
Windows 10 Pro
(3) Intel NUC i7 (NUC7i7BNH)
8 GB RAM
225 GB SSD

(3) Azure Kinect cameras
Azure Kinect SDK 1.4.1
Firmware v1.6.110079014

Network Switch:
Netgear unmanaged 8-port Gigabit switch

No Devices Found

I've followed all the instructions and when I open volumetric_capture.exe, no devices show up on the Connect UI. When it opens it displays:

Connecting to broker: { amqp://volumetric:capture@IPADDRESS:5672}

Does something need to be started on the NUCs? Remote Eye is installed on them and all firewall settings are correct on all devices.

If I plug a Kinect directly into the workstation PC, the device shows up in the Connect UI, but if I select it and click 'Connect' the software crashes.

Another thing to note is when I use the device_repository.exe to add devices, it seems to add them successfully but then I receive the following errors:

2020-07-30 19:09:42.703 DEBUG [19672] [main@52] Press any key to exit.

[2020-07-30 19:09:45.254] [error] [t=19672] D:\C++_Libraries\Azure-Kinect-Sensor-SDK-develop\studio_17\Azure-Kinect-Sensor-SDK\src\dynlib\dynlib_windows.c (115): dynlib_create(). Failed to load DLL depthengine_2_0 with error code: 126
[2020-07-30 19:09:45.254] [error] [t=19672] D:\C++_Libraries\Azure-Kinect-Sensor-SDK-develop\studio_17\Azure-Kinect-Sensor-SDK\src\deloader\deloader.cpp (75): deloader_init_once(). Failed to Load Depth Engine Plugin (depthengine). Depth functionality will not work
[2020-07-30 19:09:45.254] [error] [t=19672] D:\C++_Libraries\Azure-Kinect-Sensor-SDK-develop\studio_17\Azure-Kinect-Sensor-SDK\src\deloader\deloader.cpp (76): deloader_init_once(). Make sure the depth engine plugin is in your loaders path

Any help would be greatly appreciated. Is there someone I can contact to get some more information about this software? Thank you!
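On the depthengine errors specifically: error code 126 from LoadLibrary means "module not found", i.e. either depthengine_2_0.dll itself or one of its own dependencies is not on the loader's search path; the DLL normally needs to sit next to the executable (or next to k4a.dll) or on PATH. A small generic sketch for checking where the DLL actually is:

```python
import os

def find_library(name, search_dirs):
    """Return the first directory in search_dirs that contains `name`,
    or None if it is in none of them."""
    for d in search_dirs:
        if d and os.path.isfile(os.path.join(d, name)):
            return d
    return None

# e.g., with exe_dir being the folder holding device_repository.exe:
# find_library("depthengine_2_0.dll",
#              [exe_dir] + os.environ["PATH"].split(os.pathsep))
```

If the call returns None, copying depthengine_2_0.dll from the Azure Kinect SDK into the executable's folder is the usual fix for this class of error.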

synch circuit

hi there,

I want to realize a synchronized multicamera system as described in this repo.
The question might seem naive, but I am trying to understand the synchronization circuit as it is described in the wiki:

https://github.com/VCL3D/VolumetricCapture/wiki/Synchronization-Cables

but I can't figure out how it can be equivalent to what is shown in the Intel white paper:

https://realsense.intel.com/wp-content/uploads/sites/63/Intel_RealSense_Multi-camera_Webinar.pdf

Would you please clarify why, in the wiki instructions, every camera has 2 resistors and 1 capacitor, and why the connections are different?

thanks a lot

Are Intel NUCs a hard requirement?

I do not have Intel NUCs at the moment, but I can configure some PCs that were previously used in a render farm in order to test VCL3D.
Are Intel NUCs a hard requirement? (e.g. I tried to install CUDA 8.0 on my desktop with GeForce RTX 2060 and apparently CUDA 8.0 is too old for it).

Thanks!

Extend with other camera

I would like to build this system with other common USB cameras. Can they be synchronized so that the point clouds can be aligned?

Volcap not listing the devices from the device_repository.json

Hello,
I have installed everything on two computers: one as the main machine for the volcap app and another for the remote eye.

I first configured the devices with dev_repo.exe, and it creates the device_repository.json file inside Resources successfully.

I have tested this with the latest release (5.0.0) and I am not getting any errors executing volcap.exe, but not a single device is listed in the app. I have tried this with the remote machine connected to RabbitMQ, but I understand that should not be necessary.

I attach here the device_repository.json file that is generated.
Does anyone know what could be the issue?
Thanks in advance.

device_repository.json.txt

Need CUDA 9.2

The requirement for CUDA 9.2 is missing from the wiki; Volumetric Capture needs it to calibrate the cameras.

volumetric_capture.exe 'calibrate' fails to locate directories

I'm following the directions from the wiki on how to calibrate. Upon clicking Calibration > Calibrate, I see the following errors saying the directories / files could not be found:

[screenshot: calibration_dir_errors]

I am using the folder structure from release v3.0. My device_repository.json file is in the same folder as my volumetric_capture.exe. The relative path to my calibration images is .\data\Calibrations\20-04-09-16-33-26\. Are these the expected directory locations?

2 Nodes Camera Handshake Error

If you have 2 nodes with 4 cameras each and you run one of them, it "handshakes" with its own node's cameras and with the other node as well. This is a problem because if you start one of them, the other blocks the first when "handshaking" with the first node's cameras. Of course, the device_repository.json of each node only has the 4 cameras of the corresponding node.

RealSense D435 support

Hi Team,
Just wanted to know: is the RealSense D435 supported in your latest release?
Regards,
Meenal

Cannot Dump Colored Pointcloud from Multiview Player GUI

After opening the recorded sequences in the multiview player GUI and selecting the 'Dump' options, I don't see a way to dump the colored pointclouds - only the color and depth image sequences. Is this possible through the GUI?

I've also attempted to dump the colored pointcloud from the command line tool, but I am unable to figure out the correct syntax. Can you please provide an example syntax for how to dump colored, calibrated pointclouds from three or more recorded .cdv files?

Is there a way to view the recordings as colored pointclouds in the software?

Thank you!
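I can't confirm the player's command-line syntax, but the underlying operation is standard: back-project each registered color+depth pixel through the pinhole intrinsics to get an XYZRGB point. A generic sketch (the intrinsics fx, fy, cx, cy and a meter-scaled depth map are assumptions here, not the .cdv layout):

```python
import numpy as np

def depth_to_colored_pointcloud(depth_m, color, fx, fy, cx, cy):
    """Back-project a registered depth map (meters) into an (N, 6)
    array of XYZRGB points using pinhole intrinsics."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    rgb = color.reshape(-1, 3).astype(np.float32)
    valid = pts[:, 2] > 0                            # drop pixels with no depth
    return np.concatenate([pts[valid], rgb[valid]], axis=1)
```

Applying each camera's calibrated extrinsics to its back-projected points and concatenating the results would yield the merged colored cloud the question asks about.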

Issue with launching volcap.exe

Hi all,
I am having an issue launching volcap.exe and I am unsure what I am doing wrong.
Dependent program versions:

  • python 3.9.1
  • erlang 23.2
  • rabbitmq 3.8.11
  • 2015-2019 C++ redistributable 14.23.27820.0 (on both machines)

I have installed RabbitMQ and set up a volumetric admin user, added a local Azure Kinect to my device_repo.json, and have one set up remotely. All ports are open inbound and outbound: TCP 5672, 15672 and UDP 11234, 320 on the local machine hosting RabbitMQ, and 321 on the remote machine.

However, when I open volcap.exe it immediately closes. I opened it in VS19's debugger mode and this is what I was able to see in the command prompt:

[screenshot]

The exception that shows as unhandled in VS19 is:

Unhandled exception at 0x00007FFCC7383B29 in volcap.exe: Microsoft C++ exception: boost::exception_detail::clone_impl<boost::exception_detail::error_info_injector<boost::system::system_error> > at memory location 0x0000001B430FD550.

Please get back to me with any recommendations, or let me know if more details would be beneficial.

dev_repo.exe app Assertion failed

While creating a device repository for a RealSense camera, I get the following error. Can anyone help?

C:\Users\subsc\Downloads\release_5.0.0\volcap>dev_repo.exe --add device_name --cam_type 1
2021-03-25 17:03:40.929 DEBUG [224] [main@149] Many devices check
Assertion failed: IsArray(), file D:_dev_Projects\urealsenz\immerzion\include\deps\rapidjson\document.h, line 1725

Also, I could not see any code at the path "D:_dev_Projects\urealsenz\immerzion\include\deps\rapidjson\document.h".

Please share a device_repository.json file for a RealSense camera (D415) and the exact location where I should place it.

Incorrect path for device_repository.json in wiki

Hi,

I followed the instructions in the wiki. According to https://github.com/VCL3D/VolumetricCapture/wiki/Calibration-&-Setup-for-Kinect-Azure,

The device_repository.json file should be placed under "path_to_volumetric_capture"/Data/Executables/Calibration/multisensor_calibration/Resources/data

However, when placed there, I get the following error:

device_repository.json does not exist. Throwing. Assertion triggered.
Assertion failed: false, file D:\Projects\vs\RealSenz\immerzion\include\io\device_repository_loader.h, line 101

I found that if I place the file beside volumetric_capture.exe, the application does run.
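Pulling the reports in these issues together, the file has been found to work in several locations depending on release. A small helper (a sketch; the candidate list is just what these issues mention) to check which of them actually contain it:

```python
import os

def locate_device_repository(base_dir):
    """Report which of the commonly mentioned locations under base_dir
    (the volumetric_capture install folder) hold device_repository.json."""
    candidates = [
        base_dir,  # next to volumetric_capture.exe
        os.path.join(base_dir, "Resources"),
        os.path.join(base_dir, "Data", "Executables", "Calibration",
                     "multisensor_calibration", "Resources", "data"),
    ]
    return [d for d in candidates
            if os.path.isfile(os.path.join(d, "device_repository.json"))]
```

Running this against the install folder makes it easy to see whether the file is missing from the location a given release actually reads.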

Throwing. Assertion triggered.

Hi Team,

Greetings of the day. We are using Azure Kinect cameras and are trying to connect 3 Azure kits (one i7 computer as master and two i5 computers as eyes). I have followed everything as instructed at https://vcl3d.github.io/VolumetricCapture/docs/software/#volcap up to volcap and copied the depth engine DLL too, but while executing VolCap.exe, the application closes with the error "Resources\device_repository.json does not exist. Throwing. Assertion triggered.

Assertion failed: false, file D:_dev_Projects\urealsenz\immerzion\include\io\device_repository_loader.h, line 107". Please guide us further.
Thank you,
Ganesh

Device Repository - executable not found

Hi,
I am unable to find the device_repository.exe file in the folder extracted from VolumetricReleasex64.rar. I can only see the device_repository.json file.

Azure Kinect mis-identified as Intel camera

The camera appears in volumetric_capture as a camera:
[screenshot]
but when selecting "Connect", the application crashes, and remote_eye.exe crashes as well with the following:
[screenshot]

It appears I need to manually specify that the camera is an Azure Kinect, but I cannot do this due to #21

Point clouds are not showing in volcap viewer

When cameras are connected, the point cloud streams are not displayed, and the problem remains after calibration. I've put depthengine_2_0.dll from Azure Kinect SDK v1.4.1 next to k4a.dll on the workstation as well as on the NUCs. I've also installed the Microsoft Visual C++ Redistributable for Visual Studio 2019 (x64) on all PCs. What am I missing?

Trying the system without NUCs

Hi,

I have followed pretty much everything in https://vcl3d.github.io/VolumetricCapture/, including adding the device repo, installing RabbitMQ and adding a user, etc., except for 2 things:

  1. No NUCs; all devices (2 Azure Kinects) are connected directly to the host machine (localhost) via USB 3.
  2. No hardware sync cable between the 2 Kinects.

But when I start volcap, there are no devices listed under "device".

So I was wondering if there is anything I should do differently when devices are connected directly to localhost.
Let me know.
Thanks.

MultiView Player doesn't work

Hi, I recorded a .cdv file but can't play it with the multi-view player.
When I drop the file onto it, it shows "Color dt:0 Depth dt:8067672801865 Frame ID:-1".
I don't know how to inspect the .cdv files.
Thank you.

Pointclouds in Visualizer Grid Are Upside-Down

The point clouds in my 3D visualizer grid-space have strange orientations: before calibration they are sideways, and after calibration they are upside-down. Any ideas why this is happening? Is there a way to shift the orientation? Here's a screencap:

upsidedown

Parsing/using files in Unity

Hi,

I am mostly interested in using VolumetricCapture to record performers for experimental storytelling-based VR projects. I am currently in the process of setting up the capture system, but I would also like to test importing the captures into a 3D environment (I am currently using Unity). I have not found any indication in your wiki, though, of how your .cdv files are structured or how to parse them. Any pointers? Would you also be able to share some of your captures for testing?

Best,

Sergio

Assistance with Azure Kinect support

Hi all, you've noted in your readme that Azure Kinect support is in progress. If desired, I would love to collaborate and help in any form. I have a 5-camera K4A setup and have been working on my own solution; your repo already seems to have the vast majority of the plumbing I have been working on.

Is Copper Wire OK?

Firstly, thanks for this very detailed tutorial, the most complete yet.

I accidentally bought copper wire for the HW sync cable; is it OK to use copper wire?
20190524_073107

I am a pure software person who doesn't know much about hardware, but I am eager to learn.

Thanks for your advice.

Getting camera extrinsics

I'm trying to load the camera extrinsics produced by the volumetric_capture.exe calibration (using v3.0). I found the .extrinsics files located in volumetric_capture\Data\Calibrations\20-04-14-15-08-05\, but I noticed the extrinsics in the file are slightly different from what is output in the volumetric_capture.exe console.

Example console extrinsics:
extrinsics_terminal

Example d4150.extrinsics file contents:

0.0124787 -0.999791 0.016157
-0.23122 0.0128353 0.972817
-0.972821 -0.0158753 -0.231011
-71.0407 -1803.81 494.83

Which is correct, and what is the intended way to load the camera extrinsics?
I'm using multiview-player.exe to generate calibrated clouds with multiview-player.exe -f d4150.cdv --pcloud --is_pcloud_calibrated; how does it know where to find the extrinsics? Is it stored in the .cdv?
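For reference, the four-row layout of the .extrinsics file shown above can be parsed with a short sketch. It assumes (and this is an assumption worth verifying against the console output) that the first three rows form the row-major 3x3 rotation matrix and the fourth row is the translation vector, whose magnitudes suggest millimeters; `parse_extrinsics` is a hypothetical helper, not part of the toolset:

```python
def parse_extrinsics(text):
    """Parse a 4-row .extrinsics file into (rotation, translation).

    Assumed layout: rows 1-3 are the 3x3 rotation matrix, row 4 is the
    translation vector (units appear to be millimeters).
    """
    rows = [[float(v) for v in line.split()]
            for line in text.strip().splitlines() if line.strip()]
    if len(rows) != 4 or any(len(r) != 3 for r in rows):
        raise ValueError("expected 4 rows of 3 values")
    rotation = rows[:3]    # row-major 3x3 rotation matrix R
    translation = rows[3]  # translation vector t
    return rotation, translation
```

A point X would then transform as R·X + t, though whether this maps camera-to-world or world-to-camera is another convention to confirm against the toolset's output.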

Non-Blocking Network Switch part#

Hello, I recently purchased 4 Azure Kinects and have been buying the required hardware to begin testing your volumetric capture system (workstation, NUCs, mounts, etc.). I am having trouble identifying a non-blocking network switch, since "non-blocking" is not listed in the tech specs and is not easily found through Google or marketplace searches. Can you provide the part number of the Cisco switch you are using?

Also, if you collected links for products you recommend you could probably make some research money from Amazon's affiliate program https://affiliate-program.amazon.com/

Thanks for the great documentation!
Ryan
