
camera-streamer's People

Contributors

ayufan, dong-zeyu, foosel, kapji, meteyou, noahwilliamsson, robeeejay

camera-streamer's Issues

Feature Request - new capture snapshot endpoint to get the next frame

First, very nice project! Thank you for your time and effort.

Background

I have been testing compatibility with an OctoPrint plugin I wrote, in preparation for people using camera-streamer within OctoPrint. My plugin creates a timelapse for 3D printers by 'posing' the 3D printer (usually by parking the printhead in a specific X/Y location), grabbing a snapshot, and then resuming the print. Here is a sample video from a long while back. This video was made using snapshots taken from an mjpg-streamer fork, which returns the next available frame when requesting a snapshot from the camera. And here is a comparison (not my video), with and without Octolapse, so you have a better idea of what I'm trying to achieve.

The Problem

In testing camera-streamer, I noticed that the frame returned from the /snapshot endpoint can be up to one second in the past with my settings (15 frames behind in my case). This means the image I am capturing is from before the printer made it to the specified X/Y position. My plugin has a 'snapshot delay' that can compensate for this, but a 1000 ms delay is quite large and causes the following issues: filament oozes from the nozzle during this time, which negatively affects print quality; since each timelapse can contain well over 1000 snapshots, the additional delay adds to the print time; and Octolapse also has a feature called 'snap to print' which requires extremely low snapshot-acquisition delays in order to function properly without seriously impacting print quality.

Possible Solution

First, are there any settings tweaks that exist already for dealing with my issue? If so, this feature request may be unnecessary.

If not, I believe adding a special endpoint, like /next-snapshot or /new-snapshot or something (I'm bad at naming endpoints, lol!), that would wait until the next frame is available before returning an image would solve this problem 100%. The average delay in receiving a frame in this case would be 0.5 × (1/FPS) seconds, which would minimize the print-time and oozing issues. Alternatively, perhaps a query string parameter could be added to an existing endpoint (like /snapshot?next-frame=true or something). Thoughts?
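For illustration, a hypothetical call against such an endpoint might look like the following (neither the endpoint nor the query parameter exists yet, and the host and port are placeholders):

# Hypothetical invocation of the proposed endpoint; 'next-frame' and the host/port
# are placeholders, not existing camera-streamer features.
curl -o frame.jpg "http://octopi.local:8080/snapshot?next-frame=true"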

Assistance

I am willing to attempt to add this feature myself if you don't have the time/desire to tackle it. I looked through the code, but it will take me quite a while to become familiar enough with how it works to make any progress. I would be extremely grateful for any ideas you might have for how to accomplish this, and any direction you can give me that will save time.

I appreciate the time you spent reading through this issue, as I'm sure you are a busy person. Thank you!

Feature request: Show a JPG or PNG if the camera signal is lost

If the connection to the camera device is lost, it would be good to show a JPEG or PNG, like ustreamer does:

-k path, --blank path
    Path to JPEG file that will be shown when the device is disconnected
    during the streaming. Default: black screen 640x480 with 'NO SIGNAL'.

/webrtc TypeError: Failed to fetch

[screenshot of the error]

Hi, I have a CSI camera (OV5647). When I try to use WebRTC I get this error; how can I solve it?
I tried using both the legacy camera stack and libcamera.

Fake Camera Sensor make Error

Hello again,

I am running this on a different pi, but when I run the make command I get this error:

device/libcamera/fake_camera.c: In function ‘fake_camera_sensor’:
device/libcamera/fake_camera.c:18:34: error: cast to pointer from integer of different size [-Werror=int-to-pointer-cast]
18 | struct media_v2_entity *ents = (struct media_v2_entity *)topology->ptr_entities;
| ^
cc1: all warnings being treated as errors
make: *** [Makefile:53: device/libcamera/fake_camera.o] Error 1
rm device/buffer_lock.o device/camera/camera_output.o device/buffer_queue.o http/http_h264.o device/buffer.o http/http_jpeg.o device/links.o opts/fourcc.o device/camera/camera_input.o device/camera/camera.o http/http.o http/http_ffmpeg.o ffmpeg/remuxer.o device/buffer_list.o device/dummy/buffer_list.o opts/log.o device/dummy/device.o device/dummy/dummy.o opts/control.o device/dummy/buffer.o http/http_methods.o device/camera/camera_isp.o opts/opts.o device/device.o

Can I still use the software with an imx219 sensor without this fake camera sensor? Or is there another way around this?
Thanks for the help.

Low latency with libcamera

I've been looking into this with the Raspberry Pi guys, to see what we can do to fix the latency issues you report so you don't need the manual configuration of the ISP.

It may be that we could add a control or such to 'request' low-latency configurations from a camera.

I can think of a couple of ways to measure the latency, but to be able to compare against your numbers, can you confirm how you are measuring the latency?

Are you just capturing an image with the camera of a stopwatch and the stream view in the background?

Request: Snapshot delay

I am trying to create snapshots for a time-lapse project. I do this by curling the snapshot endpoint. It appears that with my Pi Camera (V3), the API returns a picture before the camera has completed its adjustments, so the pictures appear over- or underexposed. It makes sense that this only happens when there is no client connected to watch a stream.

When I stop the service and use libcamera-jpeg (with sudo, regrettably), I can see output in the console which looks like it takes over a hundred samples before a file gets saved.

Is there a better way to solve this other than faking a streaming client?
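One hedged workaround sketch, assuming the default /stream and /snapshot endpoints and a placeholder host/port: keep a throwaway client attached to the stream so the exposure/white-balance loops keep running, then grab the snapshot.

# Hedged workaround sketch; host and port are assumptions.
curl -s -o /dev/null "http://raspberrypi.local:8080/stream" &   # fake streaming client
STREAM_PID=$!
sleep 2                                                         # let the camera settle its adjustments
curl -s -o frame.jpg "http://raspberrypi.local:8080/snapshot"   # now take the snapshot
kill "$STREAM_PID"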

USB Camera Service File Help

Hello,
I want to continue to use this software, but with a USB camera instead. However, when I try to start the USB camera service file, I end up with this message:

pi@raspberrypi:~ $ sudo systemctl start /camera-streamer/service/camera-streamer-usb-cam
Failed to start camera\x2dstreamer-service-camera\x2dstreamer\x2dusb\x2dcam.mount: Unit camera\x2dstreamer-service-camera\x2dstreamer\x2dusb\x2dcam.mount not found.

Am I doing something wrong here? Or is the way to set up a USB camera with this a bit different? Any help is appreciated, as well as any general tips on how to use the streamer with a USB camera.
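For reference, the usual systemd workflow is to install the unit file into the systemd search path and start it by unit name rather than by path; a hedged sketch (the exact file name and repository location are assumptions based on the message above):

# Hedged sketch; adjust the path and unit file name to the actual repository checkout.
sudo cp /camera-streamer/service/camera-streamer-usb-cam.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable --now camera-streamer-usb-cam.service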

No h264 Bitrate adjustment

I have an application where I want to minimize the network bandwidth as much as possible.

When using libcamera_camera.sh with the option:
--camera-video.options=video_bitrate=1000000 (and using all sorts of different bitrates)
None of the bitrates actually affect the stream rate.
I'm checking the network stream with the command:
iftop -i wlan0
I'm wondering if the video_bitrate option isn't being passed on to libcamera?
Also,
-camera-h264.options=bitrate=1000000
throws an error too.
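For what it's worth, the H264 encoder option dump posted in another issue in this list shows the control name as videobitrate (no underscore), so one hedged guess is that the underscore variant simply doesn't match any control:

# Hedged guess only, based on the control names listed elsewhere in these issues.
./libcamera_camera.sh --camera-video.options=videobitrate=1000000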

Kernel or camera-streamer bug?

Hi,

camera-streamer doesn't work for me anymore; I don't know if it's a kernel bug. Can you help with these logs?

v4l2-ctl --list-devices
bcm2835-codec-decode (platform:bcm2835-codec):
/dev/video10
/dev/video11
/dev/video12
/dev/video18
/dev/video31
/dev/media1

bcm2835-isp (platform:bcm2835-isp):
/dev/video13
/dev/video14
/dev/video15
/dev/video16
/dev/video20
/dev/video21
/dev/video22
/dev/video23
/dev/media2
/dev/media3

unicam (platform:fe801000.csi):
/dev/video0
/dev/video1
/dev/media4

rpivid (platform:rpivid):
/dev/video19
/dev/media0

DMA free:27500kB min:712kB low:888kB high:1064kB reserved_highatomic:0KB active_anon:2968kB inactive_anon:3648kB active_file:224kB inactive_file:0kB unevictable:0kB writepending:4kB present:131072kB>
janv. 25 14:48:57 raspberrypi kernel: Node 0 active_anon:102408kB inactive_anon:212632kB active_file:207828kB inactive_file:270016kB unevictable:28kB isolated(anon):0kB isolated(file):0kB mapped:92140kB dirty:376kB writeback:0kB shmem:1>
janv. 25 14:48:57 raspberrypi kernel: active_anon:25602 inactive_anon:53158 isolated_anon:0
active_file:51957 inactive_file:67504 isolated_file:0
unevictable:7 dirty:94 writeback:0
slab_reclaimable:9346 slab_unreclaimable:10568
mapped:23035 shmem:350 pagetables:2697 bounce:0
kernel_misc_reclaimable:0
free:37392 free_pcp:0 free_cma:113
janv. 25 14:48:57 raspberrypi kernel: Mem-Info:
janv. 25 14:48:57 raspberrypi kernel: el0t_64_sync+0x1a0/0x1a4
janv. 25 14:48:57 raspberrypi kernel: el0t_64_sync_handler+0x90/0xb8
janv. 25 14:48:57 raspberrypi kernel: el0_svc+0x24/0x60
janv. 25 14:48:57 raspberrypi kernel: do_el0_svc+0x2c/0x90
janv. 25 14:48:57 raspberrypi kernel: el0_svc_common.constprop.3+0x98/0x120
janv. 25 14:48:57 raspberrypi kernel: invoke_syscall+0x4c/0x110
janv. 25 14:48:57 raspberrypi kernel: __arm64_sys_ioctl+0xb0/0xf0
janv. 25 14:48:57 raspberrypi kernel: v4l2_ioctl+0x48/0x68 [videodev]
janv. 25 14:48:57 raspberrypi kernel: video_ioctl2+0x20/0x38 [videodev]
janv. 25 14:48:57 raspberrypi kernel: video_usercopy+0x310/0x7d0 [videodev]
janv. 25 14:48:57 raspberrypi kernel: __video_do_ioctl+0x188/0x410 [videodev]
janv. 25 14:48:57 raspberrypi kernel: v4l_reqbufs+0x54/0x68 [videodev]
janv. 25 14:48:57 raspberrypi kernel: vb2_ioctl_reqbufs+0x8c/0xc8 [videobuf2_v4l2]
janv. 25 14:48:57 raspberrypi kernel: vb2_core_reqbufs+0x200/0x480 [videobuf2_common]
janv. 25 14:48:57 raspberrypi kernel: __vb2_queue_alloc+0x220/0x488 [videobuf2_common]
janv. 25 14:48:57 raspberrypi kernel: vb2_dc_alloc+0x70/0x130 [videobuf2_dma_contig]
janv. 25 14:48:57 raspberrypi kernel: dma_alloc_attrs+0xac/0xc0
janv. 25 14:48:57 raspberrypi kernel: dma_direct_alloc+0x7c/0x328
janv. 25 14:48:57 raspberrypi kernel: __dma_direct_alloc_pages.isra.22+0x168/0x1b0
janv. 25 14:48:57 raspberrypi kernel: __alloc_pages+0x2b0/0x330
janv. 25 14:48:57 raspberrypi kernel: __alloc_pages_slowpath.constprop.155+0xb78/0xba0
janv. 25 14:48:57 raspberrypi kernel: warn_alloc+0x11c/0x1a0
janv. 25 14:48:57 raspberrypi kernel: dump_stack+0x18/0x34
janv. 25 14:48:57 raspberrypi kernel: dump_stack_lvl+0x8c/0xb8
janv. 25 14:48:57 raspberrypi kernel: show_stack+0x20/0x30
janv. 25 14:48:57 raspberrypi kernel: dump_backtrace+0x0/0x1b8
janv. 25 14:48:57 raspberrypi kernel: Call trace:
janv. 25 14:48:57 raspberrypi kernel: Hardware name: Raspberry Pi 4 Model B Rev 1.5 (DT)
janv. 25 14:48:57 raspberrypi kernel: CPU: 2 PID: 10811 Comm: camera-streamer Tainted: G C 5.15.84-v8+ #1613
janv. 25 14:48:57 raspberrypi kernel: camera-streamer: page allocation failure: order:10, mode:0xcc1(GFP_KERNEL|GFP_DMA), nodemask=(null),cpuset=/,mems_allowed=0
janv. 25 14:48:57 raspberrypi camera-streamer[10799]: device/libcamera/buffer_list.cc: CAMERA:capture: Can't allocate buffers
janv. 25 14:48:57 raspberrypi camera-streamer[10799]: [0:49:14.983092414] [10811] ERROR V4L2 v4l2_videodevice.cpp:1241 /dev/video14[26:cap]: Unable to request 2 buffers: Cannot allocate memory
janv. 25 14:48:57 raspberrypi kernel: cma: cma_alloc: reserved: alloc failed, req-size: 1001 pages, ret: -12
janv. 25 14:48:57 raspberrypi camera-streamer[10799]: [0:49:14.976747464] [10811] INFO RPI raspberrypi.cpp:805 Sensor: /base/soc/i2c0mux/i2c@1/imx219@10 - Selected sensor format: 1640x1232-SBGGR10_1X10 - Selected unicam format: 1640x12>
janv. 25 14:48:57 raspberrypi camera-streamer[10799]: [0:49:14.976424132] [10799] INFO Camera camera.cpp:1026 configuring streams: (0) 1640x1232-YUYV
janv. 25 14:48:57 raspberrypi camera-streamer[10799]: device/libcamera/device.cc: CAMERA: Device path=/base/soc/i2c0mux/i2c@1/imx219@10 opened
janv. 25 14:48:57 raspberrypi camera-streamer[10799]: [0:49:14.975561803] [10811] INFO RPI raspberrypi.cpp:1425 Registered camera /base/soc/i2c0mux/i2c@1/imx219@10 to Unicam device /dev/media4 and ISP device /dev/media2
janv. 25 14:48:57 raspberrypi camera-streamer[10799]: [0:49:14.974317420] [10811] WARN RPI raspberrypi.cpp:1308 Mismatch between Unicam and CamHelper for embedded data usage!
janv. 25 14:48:57 raspberrypi camera-streamer[10799]: [0:49:14.938322414] [10799] INFO Camera camera_manager.cpp:299 libcamera v0.0.2+55-5df5b72c
janv. 25 14:48:57 raspberrypi camera-streamer[10799]: output/rtsp/rtsp.cc: ?: Running RTSP server on '8554'
janv. 25 14:48:57 raspberrypi systemd[1]: Started camera-streamer web camera
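The cma_alloc failure above points at an exhausted CMA (contiguous memory) pool. A hedged mitigation sketch, assuming a Raspberry Pi OS setup where the KMS overlay controls the pool size (the overlay name and size are assumptions; match whatever dtoverlay line the system already uses):

# Hedged sketch: enlarge the CMA pool and reboot.
echo 'dtoverlay=vc4-kms-v3d,cma-512' | sudo tee -a /boot/config.txt
sudo reboot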

No signal... Address already in use

Hi, I can't get the ArduCam MPx streaming to work.

I found this error in the logs:

Feb 06 19:01:10 mainsailos camera-streamer[8373]: bind: Address already in use
Feb 06 19:01:10 mainsailos systemd[1]: camera-streamer-arducam-16MP.service: Main process exited, code=exited, status=255/EXCEPTION

I searched the dark corners of the internet but couldn't find anything...

I use MainsailOS, 32-bit.
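A hedged troubleshooting step: check which process already owns the HTTP or RTSP port before restarting the service (ports 8080 and 8554 are assumptions based on the defaults seen elsewhere in these issues):

# Hedged check for a conflicting listener on the default ports.
sudo ss -tlnp | grep -E ':(8080|8554)'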

Make error

I'm a noob with C/make, but I'm just trying to get my imx219 camera (Arducam B0390) to work with my Fluidd setup.

I'm getting this error when I try to make:

$ make
cc -std=gnu17 -MMD -Werror -Wall -g -I/home/pi/camera-streamer -DUSE_FFMPEG -DUSE_LIBCAMERA -I/usr/include/libcamera -c -o device/buffer_queue.o device/buffer_queue.c
In file included from device/buffer_queue.c:4:
device/buffer_queue.c: In function ‘buffer_list_enqueue’:
/home/pi/camera-streamer/opts/log.h:45:110: error: format ‘%lu’ expects argument of type ‘long unsigned int’, but argument 10 has type ‘uint64_t’ {aka ‘long long unsigned int’} [-Werror=format=]
   45 | #define LOG_DEBUG(dev, _msg, ...)  do { if (log_options.debug || filter_log(__FILENAME__)) { fprintf(stderr, "%s: %s: " _msg "\n", __FILENAME__, dev_name(dev), ##__VA_ARGS__); } } while(0)
      |                                                                                                              ^~~~~~~~~~
device/buffer_queue.c:130:5: note: in expansion of macro ‘LOG_DEBUG’
  130 |     LOG_DEBUG(buf, "mmap copy: dest=%p, src=%p (%s), size=%zu, space=%zu, time=%luus",
      |     ^~~~~~~~~
cc1: all warnings being treated as errors
make: *** [Makefile:53: device/buffer_queue.o] Error 1

this is with

$ uname -a
Linux nickse3pro 5.15.32-v7+ #1538 SMP Thu Mar 31 19:38:48 BST 2022 armv7l GNU/Linux

Thanks.

ISP Can't queue buffer.

Hey, when I try to run the RasPi v2 cam with the script below, I get this error: device/v4l2/buffer.c: CAMERA:capture:buf0: ioctl(ret=-1): Can't queue buffer.

script:

#!/bin/bash

SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
cd "$SCRIPT_DIR"

CAMERA_PATH=/dev/v4l/by-path/platform-fe800000.csi-video-index0

if [[ "$1" == "dump" ]]; then
  shift
  libcamera-still "$@"
  echo
  v4l2-ctl -d /dev/v4l-subdev0 \
    -C exposure -C vertical_blanking -C analogue_gain -C digital_gain | \
    sed -e 's/ //g' -e 's/:/=/g' -e 's/^/-camera-options=/g' -e 's/$/ \\/g'
  v4l2-ctl -d /dev/video13 -C red_balance -C colour_correction_matrix \
    -C black_level -C green_equalisation -C gamma -C denoise -C sharpen \
    -C defective_pixel_correction -C colour_denoise | \
    sed -e 's/ //g' -e 's/:/=/g' -e 's/^/-camera-isp.options=/g' -e 's/$/ \\/g'
  exit 0
fi

set -xeo pipefail
make -j$(nproc)
$GDB ./camera-streamer -camera-path=/dev/v4l/by-path/platform-fe800000.csi-video-index0 \
   -camera-format=YUYV \
#  -camera-options=vertical_blanking=728 \
#  -camera-options=exposure=2444 \
#  -camera-options=analogue_gain=600 \
#  -camera-options=digital_gain=256 \
#  -camera-isp.options=digital_gain=2015 \
#  -camera-isp.options=red_balance=1852 \
#  -camera-isp.options=blue_balance=2146 \
#  -camera-isp.options=colour_correction_matrix=1,0,0,0,146,4,0,0,232,3,0,0,211,255,255,255,232,3,0,0,132,255,255,255,232,3,0,0,91,255,255,255,232,3,0,0,34>
#  -camera-isp.options=black_level=1,0,0,0,0,16,0,16,0,16,38,134 \
#  -camera-isp.options=green_equalisation=1,0,0,0,89,2,0,0,47,0,0,0,232,3,0,0 \
#  -camera-isp.options=gamma=1,0,0,0,0,0,0,4,0,8,0,12,0,16,0,20,0,24,0,28,0,32,0,36,0,40,0,44,0,48,0,52,0,56,0,60,0,64,0,72,0,80,0,88,0,96,0,104,0,112,0,12>
#  -camera-isp.options=denoise=1,0,0,0,0,0,0,0,240,30,0,0,232,3,0,0,238,2,0,0,232,3,0,0 \
#  -camera-isp.options=sharpen=1,0,0,0,208,7,0,0,232,3,0,0,244,1,0,0,232,3,0,0,244,1,0,0,232,3,0,0 \
#  -camera-isp.options=defective_pixel_correction=1,0,0,0,1,0,0,0 \
#  -camera-isp.options=colour_denoise=0,0,0,0,127,0,0,0 \
  "$@"

Feature Request: High Resolution Snapshots

I see that the current snapshot takes the image in the video buffer and saves it. This is efficient and doesn't need to adjust capture configuration while camera-streamer is running. This is great if you want a consistently smooth video stream, and you don't really care that the snapshot is of the same low resolution as the stream. However, I'm specifically looking at the use case of making time-lapse with camera-streamer, which cares about the output image (snapshot) resolution.

Usually we don't need the stream itself to be high quality as long as it's smooth, but we do want time-lapse images that are as high quality as possible. This means hiccups (frame drops) in the stream while the time-lapse image is taken are acceptable. So I think a dual-buffer mode could be helpful: one buffer would handle the lower-resolution video streaming while the other buffer would provide higher-resolution image capture.

I'm proposing the flow could be:

  1. "/high-res-snapshot" endpoint triggered
  2. low-res buffer drops
  3. load high-res buffer
  4. Take image and return to endpoint
  5. high-res buffer drops
  6. continue with low-res buffer

A few potential drawbacks to this approach would be:

  1. Hiccups in the low-res buffer when the high-res buffer is active
  2. Snapshots would have a short delay
  3. Added code complexity of juggling between 2 buffers

I'm fairly new to camera-streamer, so I'm still trying to figure things out. I'm not sure how feasible this idea is, or whether something already exists to achieve high-resolution snapshots while maintaining a lower-resolution video feed.

fps Performance with camera-streamer over libcamera

Hi
That's a really cool project; I've been playing around with it for a couple of evenings now...

But I do have a question about running the Arducam AR0234 with your camera-streamer through libcamera on a 3B+ and a Zero 2W.

I am using this for normal UDP tests:
libcamera-vid -t 0 --width 1920 --height 1080 --framerate 50 --level 4.2 --bitrate 8000000 --denoise off -n -o udp://10.0.0.20:9998

It is running out of spec for a Raspberry Pi, but for the 2-3 minutes I need it, and overclocked, it runs better than expected. When I record the stream and put it in an Adobe Premiere timeline at 50 fps, I can visually confirm that it achieves 50 fps over UDP.

Now, when I try to mimic the same with camera-streamer using libcamera and WebRTC, I never get the same fps out of it. It basically runs at 25 fps (sometimes it goes even a little below that). Same with RTSP and MP4.

Here is an example of my libcamera_camera.sh I am using for testing at the moment:

#!/bin/bash

SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
cd "$SCRIPT_DIR/.."

set -xeo pipefail
make -j$(nproc)
$GDB ./camera-streamer \
  -camera-type=libcamera \
  -camera-option=noise_reduction_mode=4 \
  -camera-width=1920 \
  -camera-height=1080 \
  -camera-fps=50 \
  -camera-video.height=1080 \
  -camera-video.options=h264_level=13 \
  -camera-video.options=videobitrate=8000000 \
  "$@"

I also tested all of my available noise reduction modes [0..4] in the options to switch it off, but it doesn't seem to do much (except that 0 is cdn_hq, I think).

Should it be possible to get roughly the same performance out of camera-streamer over WebRTC as with a libcamera UDP stream?
If so, my best guess is that I cannot switch off denoise completely (spatial and colour noise) like I can with --denoise off?
Any hint would be greatly appreciated.

That is the camera I am testing:
https://docs.arducam.com/Raspberry-Pi-Camera/Pivariety-Camera/AR0234/
And here is the camera dump from tools:
Dump_camera.txt

Goam

ps.
do you have any "buymeacoffee" link? ;-)

RaspberryPi v1.3 camera

Great streamer!
I managed to make it work with the widespread older RPi v1.3 camera. Adjustments to your example:

camera-streamer-pi5647-5MP.service

; Official Raspberry Pi v1.3 5MP camera based on the OmniVision OV5647 chip
; https://www.raspberrypi.com/documentation/accessories/camera.html#hardware-specification
;
[Unit]
Description=camera-streamer web camera
After=network.target
ConditionPathExists=/sys/bus/i2c/drivers/ov5647/10-0036/video4linux

[Service]
ExecStart=/usr/local/bin/camera-streamer \\
  -camera-path=/base/soc/i2c0mux/i2c@1/ov5647@36 \\
  -camera-type=libcamera \\
  -camera-format=YUYV \\
  -camera-width=2592 -camera-height=1944 \\
  -camera-fps=30 \\
  ; use two memory buffers to optimise usage
  -camera-nbufs=2 \\
  ; the high-res is 1296x972
  -camera-high_res_factor=2 \\
  ; the low-res is 648x486
  -camera-low_res_factor=4 \\
...

How to change port

Hi,

I would like to change the port used for HTTP streaming; is that possible?
Thanks for your work!
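Other issues in this list pass the HTTP listen port explicitly (for example -http=8081 or --http-port=8080), so a hedged example would be (the camera flags omitted here are still required as usual):

# Hedged example; the camera path/type flags still need to match your setup.
./camera-streamer --http-port=8081 --camera-type=libcamera --camera-format=YUYV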

Flip option

Hi again ^^,

Do you know if it's possible to flip the image/video horizontally, like "--flip-horizontal 1" in ustreamer?
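Other issues in this list invoke --camera-vflip and --camera-hflip, so a hedged example of a horizontal flip would be (remaining camera flags must still match your setup):

# Hedged example based on the flip flags used elsewhere in these issues.
./camera-streamer --camera-type=libcamera --camera-hflip=1 --camera-vflip=0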

Input/Output error

Yesterday the service was working with this command, but today I get this error and I don't understand why.

/usr/local/bin/camera-streamer   -camera-path=/base/soc/i2c0mux/i2c@0/imx219@10   -camera-type=libcamera   -camera-format=YUYV   -camera-nbufs=2   -http=8081   -camera-width=1640   -camera-height=1232   -camera-options=rotation=90   -camera-fps=15 -log-verbose
device/v4l2/device_list.c: bcm2835-isp: Device (/dev/video23) does not support capture (skipping)
device/v4l2/device_list.c: bcm2835-isp: Device (/dev/video20) does not support capture (skipping)
device/v4l2/device_list.c: bcm2835-isp: Device (/dev/video16) does not support capture (skipping)
device/v4l2/device_list.c: bcm2835-isp: Device (/dev/video13) does not support capture (skipping)
[0:04:42.543821854] [2083]  INFO Camera camera_manager.cpp:293 libcamera v0.0.1+21-7c855784
[0:04:42.581297734] [2099]  WARN RPI raspberrypi.cpp:1297 Mismatch between Unicam and CamHelper for embedded data usage!
[0:04:42.582301378] [2099]  INFO RPI raspberrypi.cpp:1414 Registered camera /base/soc/i2c0mux/i2c@0/imx219@10 to Unicam device /dev/media2 and ISP device /dev/media4
[0:04:42.591641765] [2099]  WARN RPI raspberrypi.cpp:1297 Mismatch between Unicam and CamHelper for embedded data usage!
[0:04:42.592775333] [2099]  INFO RPI raspberrypi.cpp:1414 Registered camera /base/soc/i2c0mux/i2c@1/imx219@10 to Unicam device /dev/media3 and ISP device /dev/media5
device/libcamera/device.cc: CAMERA: Device path=/base/soc/i2c0mux/i2c@0/imx219@10 opened
[0:04:42.593671999] [2083]  INFO Camera camera.cpp:1026 configuring streams: (0) 1640x1232-YUYV
[0:04:42.594084341] [2099]  INFO RPI raspberrypi.cpp:800 Sensor: /base/soc/i2c0mux/i2c@0/imx219@10 - Selected sensor format: 1640x1232-SBGGR10_1X10 - Selected unicam format: 1640x1232-pBAA
device/buffer_list.c: CAMERA:capture: Using: 1640x1232/YUYV, bytesperline=3328
device/v4l2/device.c: H264: Device path=/dev/video11 fd=42 opened
device/v4l2/device_media.c: H264: Opened '/dev/media1' (fd=43)
device/v4l2/device_media.c: H264: Link '../../devices/platform/soc/fe00b840.mailbox/bcm2835-codec/video4linux/video10' does not contain '/v4l-subdev'
device/v4l2/device_options.c: H264: The 'codeccontrols' is read-only
device/v4l2/device_options.c: H264: Available control: videobitratemode (009909ce, type=3)
device/v4l2/device_options.c: H264: Available control: videobitrate (009909cf, type=1)
device/v4l2/device_options.c: H264: Available control: sequenceheadermode (009909d8, type=3)
device/v4l2/device_options.c: H264: Available control: repeatsequenceheader (009909e2, type=2)
device/v4l2/device_options.c: H264: Available control: forcekeyframe (009909e5, type=4)
device/v4l2/device_options.c: H264: Available control: h264minimumqpvalue (00990a61, type=1)
device/v4l2/device_options.c: H264: Available control: h264maximumqpvalue (00990a62, type=1)
device/v4l2/device_options.c: H264: Available control: h264iframeperiod (00990a66, type=1)
device/v4l2/device_options.c: H264: Available control: h264level (00990a67, type=3)
device/v4l2/device_options.c: H264: Available control: h264profile (00990a6b, type=3)
device/v4l2/buffer_list.c: H264:output:mplane: Adapting size to 32x32 block: 1640x1232 vs 1632x1216
device/buffer_list.c: H264:output:mplane: Using: 1632x1216/YUYV, bytesperline=3328
device/v4l2/buffer_list.c: H264:capture:mplane: Adapting size to 32x32 block: 1632x1216 vs 1632x1216
device/buffer_list.c: H264:capture:mplane: Using: 1632x1216/H264, bytesperline=0
device/v4l2/device.c: JPEG: Device path=/dev/video31 fd=45 opened
device/v4l2/device_media.c: JPEG: Opened '/dev/media1' (fd=46)
device/v4l2/device_media.c: JPEG: Link '../../devices/platform/soc/fe00b840.mailbox/bcm2835-codec/video4linux/video10' does not contain '/v4l-subdev'
device/v4l2/device_options.c: JPEG: The 'jpegcompressioncontrols' is read-only
device/v4l2/device_options.c: JPEG: Available control: compressionquality (009d0903, type=1)
device/v4l2/buffer_list.c: JPEG:output:mplane: Adapting size to 32x32 block: 1640x1232 vs 1632x1216
device/buffer_list.c: JPEG:output:mplane: Using: 1632x1216/YUYV, bytesperline=3328
device/v4l2/buffer_list.c: JPEG:capture:mplane: Adapting size to 32x32 block: 1632x1216 vs 1632x1216
device/buffer_list.c: JPEG:capture:mplane: Using: 1632x1216/JPEG, bytesperline=0
device/v4l2/device_options.c: JPEG: Configuring option compressionquality (009d0903) = 80
device/v4l2/device_options.c: H264: Configuring option repeatsequenceheader (009909e2) = 1
device/v4l2/device_options.c: H264: Configuring option videobitratemode (009909ce) = 0
device/v4l2/device_options.c: H264: Configuring option videobitrate (009909cf) = 2000000
device/v4l2/device_options.c: H264: Configuring option repeatsequenceheader (009909e2) = 5000000
device/v4l2/device_options.c: H264: Configuring option h264iframeperiod (00990a66) = 30
device/v4l2/device_options.c: H264: Configuring option h264level (00990a67) = 11
device/v4l2/device_options.c: H264: Configuring option h264profile (00990a6b) = 4
device/v4l2/device_options.c: H264: Configuring option h264minimumqpvalue (00990a61) = 16
device/v4l2/device_options.c: H264: Configuring option h264maximumqpvalue (00990a62) = 32
device/links.c: ?: Link 0: CAMERA:capture[1640x1232/YUYV/2] => [H264:output:mplane[1632x1216/YUYV/2], JPEG:output:mplane[1632x1216/YUYV/2]]
device/links.c: ?: Link 1: H264:capture:mplane[1632x1216/H264/2] => [H264-CAPTURE]
device/links.c: ?: Link 2: JPEG:capture:mplane[1632x1216/JPEG/2] => [JPEG-CAPTURE]
[0:04:43.646075325] [2099] ERROR V4L2 v4l2_videodevice.cpp:1903 /dev/video0[23:cap]: Failed to start streaming: Input/output error
device/libcamera/buffer_list.cc: CAMERA:capture: Failed to start camera.
device/links.c: CAMERA:capture: Failed to start streaming

ScalerCrop option

Hello!

I've been trying to use the ScalerCrop option to zoom in on the image, but I'm not seeing any changes to the stream. Any reason why this would be?

I'm trying e.g. option?ScalerCrop=(0,0)/1000x1000, and nothing changes. Anything specific I'm doing wrong with the format, or is there something else I'm not understanding regarding this option?

Using other options works fine, and I'm using an Arducam 64MP.

Feature Request

Hello, great work, I really like using the streamer.

Would it be possible to add a display for the number of active clients?
Like ustreamer for example at http://127.0.0.1:8080/state

... "clients": 1 .....

IMX462 Low Light

How can I make it work with an IMX462 sensor, bullseye and kernel 5.15?

Edit: I changed imx519 to imx290 in the tools/libcamera sh script (that's how my sensor is identified in libcamera-hello) and got it to work!

One more thing: how do I apply the options? I'm trying to change the bitrate and rotation, but I can't see which file to edit (see the sketch after the dump below).

No options passed.

Set: /option?name=value

CAMERA Properties:
- property: SensorSensitivity (00000009, type=5): 1.000000
- property: ScalerCropMaximum (00000008, type=7): (8, 8)/1920x1080
- property: ColorFilterArrangement (0000000a, type=3): 0
- property: PixelArrayActiveAreas (00000007, type=7): [ (4, 12)/1937x1097 ]
- property: PixelArraySize (00000005, type=8): 1937x1097
- property: Rotation (00000002, type=3): 0
- property: Location (00000001, type=3): 2
- property: UnitCellSize (00000004, type=8): 2900x2900
- property: Model (00000003, type=6): imx290

CAMERA Options:
- available option: AeConstraintMode (00000004, type=3): [0..3]
- available option: Saturation (00000011, type=5): [0.000000..32.000000]
- available option: AeExposureMode (00000005, type=3): [0..3]
- available option: ColourCorrectionMatrix (00000015, type=5): [-16.000000..16.000000]
- available option: AwbEnable (0000000c, type=1): [false..true]
- available option: AeEnable (00000001, type=1): [false..true]
- available option: AnalogueGain (00000008, type=5): [1.000000..31.622776]
- available option: ExposureTime (00000007, type=3): [14..7229147]
- available option: ScalerCrop (00000016, type=7): [(0, 0)/64x64..(0, 0)/1920x1080]
- available option: AeMeteringMode (00000003, type=3): [0..3]
- available option: NoiseReductionMode (00000027, type=3): [0..4]
- available option: ColourGains (0000000f, type=5): [0.000000..32.000000]
- available option: ExposureValue (00000006, type=5): [-8.000000..8.000000]
- available option: Sharpness (00000013, type=5): [0.000000..16.000000]
- available option: Contrast (0000000a, type=5): [0.000000..32.000000]
- available option: Brightness (00000009, type=5): [-1.000000..1.000000]
- available option: AwbMode (0000000d, type=3): [0..7]
- available option: FrameDurationLimits (00000019, type=4): [16666..7230033]

JPEG Options:
- available option: compressionquality (009d0903, type=1): [1..100]

H264 Options:
- available option: videobitratemode (009909ce, type=3): [0..1]
		0: Variable Bitrate
		1: Constant Bitrate
- available option: videobitrate (009909cf, type=1): [25000..25000000]
- available option: sequenceheadermode (009909d8, type=3): [0..1]
		0: Separate Buffer
		1: Joined With 1st Frame
- available option: repeatsequenceheader (009909e2, type=2): [0..1]
- available option: forcekeyframe (009909e5, type=4): button
- available option: h264minimumqpvalue (00990a61, type=1): [0..51]
- available option: h264maximumqpvalue (00990a62, type=1): [0..51]
- available option: h264iframeperiod (00990a66, type=1): [0..2147483647]
- available option: h264level (00990a67, type=3): [0..15]
		0: 1
		1: 1b
		2: 1.1
		3: 1.2
		4: 1.3
		5: 2
		6: 2.1
		7: 2.2
		8: 3
		9: 3.1
		10: 3.2
		11: 4
		12: 4.1
		13: 4.2
		14: 5
		15: 5.1
- available option: h264profile (00990a6b, type=3): [0..4]
		0: Baseline
		1: Constrained Baseline
		2: Main
		4: High
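As a hedged sketch, the two ways options appear to be set elsewhere in these issues are on the command line per device section, and at runtime through the /option endpoint shown in the dump above; the path, option names and values below are illustrative only:

# Hedged examples; -camera-path is a placeholder and must match your sensor.
./camera-streamer -camera-path=/base/soc/i2c0mux/i2c@1/imx290@1a -camera-type=libcamera \
  -camera-format=YUYV -camera-options=rotation=90 -camera-video.options=videobitrate=1000000
# At runtime, once the streamer is up:
curl "http://localhost:8080/option?AwbMode=1"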

Raspberry Pi HQ Camera

Hi Kamil.

Support for the Official Raspberry Pi HQ Camera, is that something you can add to the Streamer? Or is it already supported?

Regards

Feature Request: Provide prebuilt binaries

Hi,

It would be good to have prebuilt binaries, possibly statically linked, with as few dependencies as possible.

This should be done with the following in mind.

Having different branches for development and release.
As an example:

main/master - Release Branch.
develop - "nightly" branch.

The idea behind that is that you could provide stable packages from the main/master branch, while for testing it might be better to compile on the host instead of using prebuilt images.

Supported architectures should be:
x86(_64), armhf (RasPi), arm64 (RasPi and Armbian), and armel. This would cover most use cases.

`--camera-{v,h}flip` throw an error now when they used to work with the same camera

Hey there, as already mentioned in this comment on the OctoPrint project I'm seeing a regression with the current code with regards to flipping.

I just built a new binary of v0.1-16-gcdb62ef (cdb62ef) (current master) against the current RPi libcamera0 package, 0.0.4 (debian packaged version: 0~git20230302+923f5d70-1): https://github.com/OctoPrint/camera-streamer-archive/releases/tag/0.1-16-gcdb62ef-1

This runs great on my test image with RPiCam v3 and USB Camera, however I can no longer use --camera-{v,h}flip with the RPiCam like I could before (the USB camera says it doesn't support that in the first place). If I try, camera configuration fails:

octo@octopib:~ $ /usr/bin/camera-streamer --http-port=8080 --camera-type=libcamera --camera-path=/base/soc/i2c0mux/i2c@1/imx708@1a --camera-format=YUYV --camera-vflip=1 --camera-hflip=1
/usr/bin/camera-streamer Version: v0.1-16-gcdb62ef (cdb62ef)
[0:12:52.107127607] [1308]  INFO Camera camera_manager.cpp:299 libcamera v0.0.4+22-923f5d70
[0:12:52.260933243] [1319]  INFO RPI raspberrypi.cpp:1476 Registered camera /base/soc/i2c0mux/i2c@1/imx708@1a to Unicam device /dev/media2 and ISP device /dev/media0
device/libcamera/device.cc: CAMERA: Device path=/base/soc/i2c0mux/i2c@1/imx708@1a opened
[0:12:52.261933303] [1308] ERROR Camera camera.cpp:1016 Can't configure camera with invalid configuration
device/libcamera/buffer_list.cc: CAMERA:capture: Failed to configure camera

I am absolutely sure this used to work with an earlier version, because back then I tested this and documented it on the OctoPrint community forums on February 22nd. The first user report of this no longer working reached me on March 6th with a (develop branch) build from February 28th.

I've created an strace for you just in case this might help: flip-strace.txt. I'm happy to test anything or provide any further info you need.

From my understanding, based on the command output, this is failing inside libcamera, so either the interface changed, or there's a regression upstream?

Feature Request: Ability to set a custom static root path

It would be great to have the ability to set a custom path for html static files.

As described in #13 ,

--static path
Path to dir with static files instead of embedded root index
page. Symlinks are not supported for security reasons. Default:
disabled.

This would allow customizing the web page for camera-streamer so it can be used in other use cases (e.g., standalone as an IP cam).

Error running `sudo make install`

Hi,

Trying to test this out I ran into an issue with sudo make install. I got the following error on a RPi4 with the latest kernel:

pi@rpi4:~/camera-streamer $ sudo make install
cc -std=gnu17 -MMD -Werror -Wall -g -I -DUSE_FFMPEG -DUSE_LIBCAMERA -I/usr/include/libcamera -c -o device/buffer.o device/buffer.c
device/buffer.c:1:10: fatal error: device/buffer.h: No such file or directory
    1 | #include "device/buffer.h"
      |          ^~~~~~~~~~~~~~~~~
compilation terminated.
make: *** [Makefile:53: device/buffer.o] Error 1

Reference:

pi@rpi4:~ $ uname -a; v4l2-ctl --list-devices
Linux rpi4 5.15.32-v8+ #1538 SMP PREEMPT Thu Mar 31 19:40:39 BST 2022 aarch64 GNU/Linux
bcm2835-codec-decode (platform:bcm2835-codec):
	/dev/video10
	/dev/video11
	/dev/video12
	/dev/video18
	/dev/video31
	/dev/media2

bcm2835-isp (platform:bcm2835-isp):
	/dev/video13
	/dev/video14
	/dev/video15
	/dev/video16
	/dev/video20
	/dev/video21
	/dev/video22
	/dev/video23
	/dev/media0
	/dev/media1

Cannot open device /dev/video0, exiting.
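A hedged observation: the failing compile line above is missing the include path after -I (in other issues the same line reads -I/home/pi/camera-streamer), which suggests the path is lost when the whole build runs under sudo. Building as the normal user first and only installing with sudo may avoid it:

# Hedged workaround sketch.
make -j$(nproc)
sudo make install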

Ghost webrtc streams

@ayufan OK, this time a real bug. 😅

I noticed that when I watch a WebRTC stream and I relaunch the browser (or just go back to the index and click again on /webrtc), most of the time, I get a ghost UDP stream...

Here's a log view of the phenomenon :
util/http/http.c: HTTP8080/6: Client connected 192.168.1.36 (fd=10).
util/http/http.c: HTTP8080/6: Request 'GET' '/webrtc' ''
util/http/http.c: HTTP8080/6: Client disconnected 192.168.1.36.
util/http/http.c: HTTP8080/7: Client connected 192.168.1.36 (fd=11).
util/http/http.c: HTTP8080/7: Request 'POST' '/webrtc' ''
output/webrtc/webrtc.cc: rtc-tonwouvavrrzosxdjudo: Stream requested.
util/http/http.c: HTTP8080/7: Client disconnected 192.168.1.36.
util/http/http.c: HTTP8080/8: Client connected 192.168.1.36 (fd=12).
util/http/http.c: HTTP8080/8: Request 'POST' '/webrtc' ''
output/webrtc/webrtc.cc: rtc-tonwouvavrrzosxdjudo: Answer received.
util/http/http.c: HTTP8080/8: Client disconnected 192.168.1.36.

util/http/http.c: HTTP8080/9: Client connected 192.168.1.36 (fd=13).
util/http/http.c: HTTP8080/9: Request 'GET' '/webrtc' ''
util/http/http.c: HTTP8080/9: Client disconnected 192.168.1.36.
util/http/http.c: HTTP8080/0: Client connected 192.168.1.36 (fd=4).
util/http/http.c: HTTP8080/0: Request 'POST' '/webrtc' ''
output/webrtc/webrtc.cc: rtc-wpesdkknzakphjitiknw: Stream requested.
util/http/http.c: HTTP8080/0: Client disconnected 192.168.1.36.
util/http/http.c: HTTP8080/1: Client connected 192.168.1.36 (fd=5).
util/http/http.c: HTTP8080/1: Request 'POST' '/webrtc' ''
output/webrtc/webrtc.cc: rtc-wpesdkknzakphjitiknw: Answer received.
util/http/http.c: HTTP8080/1: Client disconnected 192.168.1.36.

When things go well, you get this line:
output/webrtc/webrtc.cc: rtc-vohzdpemelhsnkocaalk: Client removed: stream closed.

But most of the time you don't...
At some point I had like 10 ghosted streams in parallel and my WiFi didn't like it. 😅

Here's a graph with 3 streams; the non-ghosted one is in red, the ghosted ones in green and blue:
[graph]

So it looks like a kind of stream timeout handling or detection is missing on the C side (or the JS needs to send a signal to tell the server to stop streaming)...

Support for Raspicam v2 camera?

Does this support the v2 camera as well? I tried to get it working, but when I view a stream I get the error "Can't queue buffer" :(

camera-options doesn't have any effect

I'm using camera-streamer to stream video from an OV5647 sensor using libcamera. However, applying both horizontal and vertical flips doesn't work, even after I set the option. Please help. Thank you.

systemd unit configuration file for Pi Camera Module V3

Not a bug, just a hint for people trying to use camera-streamer with the Pi Camera Module V3. The following works for me as a systemd unit configuration file. I use it with the file/service name camera-streamer-pi3.service.

; Official Raspberry Pi v3 12MP camera based on the Sony IMX708 chip
; with autofocus
; https://www.raspberrypi.com/products/camera-module-3/
;
[Unit]
Description=camera-streamer web camera
After=network.target
ConditionPathExists=/sys/bus/i2c/drivers/imx708/10-001a/video4linux

[Service]
ExecStart=/usr/local/bin/camera-streamer \
  -camera-path=/base/soc/i2c0mux/i2c@1/imx708@1a \
  -camera-type=libcamera \
  -camera-format=YUYV \
  -camera-fps=25 \
  ; use two memory buffers to optimise usage
  -camera-nbufs=2 \
  ; the high-res is 1640x1232
  -camera-high_res_factor=2 \
  ; the low-res is 820x616
  -camera-low_res_factor=4 \
 -camera-options="AfMode=2" \
 -rtsp-port

DynamicUser=yes
SupplementaryGroups=video i2c
Restart=always
RestartSec=10
Nice=10
IOSchedulingClass=idle
IOSchedulingPriority=7
CPUWeight=20
AllowedCPUs=1-2
MemoryMax=250M

[Install]
WantedBy=multi-user.target

@ayufan you could add this to your examples.

Feature request - Add Audio stream

Hello there!

Really thanks for the camera-streamer, it works flawlessly!!

Can we add an audio stream somehow, in the current context of the code, let's say from a USB mic? Is it even possible?

Thanks in advance!

ArduCam 64MP issue

Hey first off thanks for the program. It has all needed features and is pretty neat.

I have compiled everything successfully and placed the service into the respective /service folder.

But it doesn't work. I have installed the official drivers, libcamera-apps and libcamera-dev, and libcamera works.

Now, when I try to start the server it runs, but as soon as I want to access, for example, /snapshot, I get a "Server Error". The respective service also doesn't run; I just get a timeout error when trying to enable it.

Here is a screenshot when trying to access /stream?res=low:
[screenshot]

Streamer locks up, restarting service puts it into zombie state

I've had this happen a couple of times now: if I leave it running for a day or two, it eventually stops responding. To try to resolve it I restart the service, but it never restarts successfully and ends up stuck as a zombie process.

65218 4036 26.4 0.0 0 0 ? ZNsl Jan18 221:49 [camera-streamer] <defunct>

The only way I can find to fix it is to reboot the Pi, which is not great if I am running a print.

Compatibility with Libcamera Apps

Hello,

After a fresh installation of Raspberry Pi OS and camera-streamer, I'm not able to use the libcamera apps anymore; I get:
libcamera-hello: symbol lookup error: /lib/arm-linux-gnueabihf/libpreview.so: undefined symbol: _ZN9libcamera10ColorSpace4JpegE

Work with Arducam Multicam?

Hi, this looks like an amazing setup.

I was wondering if you know if it would work with a multicamera from arducam? One of the older models, not the newer camarray. Particularly this one:

https://www.arducam.com/product/multi-camera-v2-1-adapter-raspberry-pi/

I am trying to use the libcamera or the CSI camera scripts, but neither is able to see the camera to show me the output. However, I can get the camera working via other means, so I know the board and camera work. Do you have any idea how I might modify the scripts to work with this?

Additionally, is the /video feed able to be seen by multiple clients at once or is it one client at a time?

Thanks,
Zak

video4linux was not met

Hi!
When I start the daemon I get this error:

sudo systemctl status camera-streamer-arducam-16MP
● camera-streamer-arducam-16MP.service - camera-streamer web camera
Loaded: loaded (/......../camera-streamer/service/camera-streamer-arducam-16MP.service; enabled; vendor preset: enabled)
Active: inactive (dead)
Condition: start condition failed at Sat 2023-01-14 21:56:41 CET; 2s ago
└─ ConditionPathExists=/sys/bus/i2c/drivers/imx519/10-001a/video4linux was not met

Do you know what is the problem?
I have a Raspberry Pi 4 (5.15.84-v8+ aarch64 GNU/Linux, Bullseye) and the HQ Raspberry Pi camera.

When I run ./libcamera_camera.sh:

++ nproc
make -j4
make: 'camera-streamer' is up to date.
./camera-streamer -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV
[1:31:43.022122477] [5575] INFO Camera camera_manager.cpp:299 libcamera v0.0.2+55-5df5b72c
[1:31:43.054707230] [5586] INFO RPI raspberrypi.cpp:1425 Registered camera /base/soc/i2c0mux/i2c@1/imx477@1a to Unicam device /dev/media1 and ISP device /dev/media3
device/libcamera/device.cc: CAMERA: Available cameras (1)
device/libcamera/device.cc: CAMERA: - /base/soc/i2c0mux/i2c@1/imx477@1a
device/libcamera/device.cc: CAMERA: Camera /base/soc/i2c0mux/i2c@1/imx519@1a was not found.
device/device.c: CAMERA: Can't open device: /base/soc/i2c0mux/i2c@1/imx519@1a

and when I run plain ./camera-streamer:
device/v4l2/device.c: CAMERA: Device path=/dev/video0 fd=14 opened
device/v4l2/device_options.c: CAMERA: Configuring option horizontalflip (00980914) = 0
device/v4l2/device_options.c: CAMERA: Configuring option verticalflip (00980915) = 0
device/buffer_list.c: CAMERA:capture: Using: 1920x1080/pBCC, bytesperline=2880
device/camera/camera_input.c: CAMERA: Unsupported camera format=pBCC
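A hedged sketch based on the trace above: libcamera registers imx477@1a while the script asks for imx519@1a, so pointing camera-streamer at the path it actually registered should at least let the device open:

# Hedged sketch; path taken from the 'Available cameras' line above.
./camera-streamer -camera-path=/base/soc/i2c0mux/i2c@1/imx477@1a -camera-type=libcamera -camera-format=YUYV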

webrtc not working anymore

Everything was working fine, and for whatever reason my 15-day-old release stopped working for WebRTC (the rest is fine)...
So I tried a newer version, just in case, but same problem.

Here is a strace:

[pid  1632] write(2, "util/http/http.c: HTTP8080/1: Client connected 192.168... (fd=5).\n", 69util/http/http.c: HTTP8080/1: Client connected 192.168... (fd=5).) = 69
...
[pid  1632] read(5, "POST /webrtc HTTP/1.1\r\nHost: 192.168...:8080\r\nConnection: keep-alive\r\nContent-Length: 18\r\nPragma: no-cache\r\nCache-Control: no-cache\r\nUser-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36\r\nContent-Type: application/json\r\nAccept: */*\r\nOrigin: http://192.168...:8080\r\nReferer: http://192.168...:8080/webrtc\r\nAccept-Encoding: gzip, deflate\r\nAccept-Language: fr,en-US;q=0.9,en;q=0.8\r\nCookie: motion_detected_1=false; monitor_info_1=\"\"; capture_fps_1=5.0\r\n\r\n{\"type\":\"request\"}", 4096) = 549
[pid  1632] write(2, "util/http/http.c: HTTP8080/1: Request 'POST' '/webrtc' ''\n", 58util/http/http.c: HTTP8080/1: Request 'POST' '/webrtc''') = 58
[pid  1632] lseek(5, -18, SEEK_CUR)     = -1 ESPIPE (Illegal seek)
[pid  1632] close(5)                    = 0
[pid  1632] write(2, "util/http/http.c: HTTP8080/1: Client disconnected 192.168...\n", 65util/http/http.c: HTTP8080/1: Client disconnected 192.168...) = 65

[screenshot]

I tried to debug a bit more... we go through http_404 and http_write_response successfully, so I don't get what's going on...
All other streams and snapshots are working just fine.

In fact, what I understand is that the server seeks something on the socket right after the POST, but fails... thus the code doesn't have time to write back to the socket. The only question is whether this illegal seek occurs while reading or just before writing...

Here is the payload, just in case:
[screenshot of the payload]

Camera Device not Found

Hello,
I have successfully compiled the software; however, when I try to use it I get this error log in the journal:

░░ Subject: A start job for unit camera-streamer-arducam-16MP.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://www.debian.org/support
░░
░░ A start job for unit camera-streamer-arducam-16MP.service has finished successfully.
░░
░░ The job identifier is 2581.
Jul 18 09:02:55 raspberrypi camera-streamer[2416]: [0:02:38.646971896] [2416] INFO Camera camera_manager.cpp:293 libcamera v0.0.0+3700-f30ad033
Jul 18 09:02:55 raspberrypi camera-streamer[2416]: [0:02:38.682950964] [2427] WARN RPI raspberrypi.cpp:1252 Mismatch between Unicam and CamHelper for embedded data usage!
Jul 18 09:02:55 raspberrypi camera-streamer[2416]: [0:02:38.683611493] [2427] INFO RPI raspberrypi.cpp:1368 Registered camera /base/soc/i2c0mux/i2c@1/imx219@10 to Unicam device /dev/media2 and ISP device /dev/media0
Jul 18 09:02:55 raspberrypi camera-streamer[2416]: device/libcamera/device.cc: CAMERA: Available cameras (1)
Jul 18 09:02:55 raspberrypi camera-streamer[2416]: device/libcamera/device.cc: CAMERA: - /base/soc/i2c0mux/i2c@1/imx219@10
Jul 18 09:02:55 raspberrypi camera-streamer[2416]: device/libcamera/device.cc: CAMERA: Camera /base/soc/i2c0mux/i2c@1/imx519@1a was not found.
Jul 18 09:02:55 raspberrypi camera-streamer[2416]: device/device.c: CAMERA: Can't open device: /base/soc/i2c0mux/i2c@1/imx519@1a
Jul 18 09:02:55 raspberrypi systemd[1]: camera-streamer-arducam-16MP.service: Main process exited, code=exited, status=255/EXCEPTION
░░ Subject: Unit process exited
░░ Defined-By: systemd
░░ Support: https://www.debian.org/support
░░
░░ An ExecStart= process belonging to unit camera-streamer-arducam-16MP.service has exited.
░░
░░ The process' exit code is 'exited' and its exit status is 255.
Jul 18 09:02:55 raspberrypi systemd[1]: camera-streamer-arducam-16MP.service: Failed with result 'exit-code'.
░░ Subject: Unit failed
░░ Defined-By: systemd
░░ Support: https://www.debian.org/support
░░
░░ The unit camera-streamer-arducam-16MP.service has entered the 'failed' state with result 'exit-code'.

It seems like it's looking for an imx519 sensor when an imx219 is plugged in instead. Is there a way to change this so that the software looks for an imx219 sensor? Thanks in advance for the help.
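A hedged sketch based on the journal above: the only registered camera is imx219@10, so the -camera-path in the arducam-16MP unit (or a copy of it) would need to point there instead, for example:

# Hedged sketch; path taken from the 'Available cameras' line in the journal above.
/usr/local/bin/camera-streamer -camera-path=/base/soc/i2c0mux/i2c@1/imx219@10 -camera-type=libcamera -camera-format=YUYV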

Unsupported Camera Format

Hi,
I get this when trying to run any of the scripts to start. I'm using the Arducam 16MP.
[screenshot]

This is the output of checking the formats available.

[screenshot]

What is the issue here?
Thanks!

Using AfWindows

I am attempting to use the AfWindows parameter to adjust where autofocus is triggered from on a Pi Camera 3. I was attempting to test using the URL method first, but I cannot seem to get it to take properly. Example: /option?AfWindows=(0,0,4608,250)
Also, is there any method to view what the parameters are currently set to for verification?

libcamera is not supported

Hello,

I tried this command on a RPi3A+ with the IMX477 camera (HQ):
./camera-streamer -camera-path=/base/soc/i2c0mux/i2c@1/imx477@1a -camera-type=libcamera -camera-format=YUYV -camera-width=4032 -camera-height=3040 -camera-fps=30 -camera-nbufs=2 -camera-high_res_factor=2 -camera-low_res_factor=4

But I get the following response:
device/libcamera/libcamera.cc: ?: libcamera is not supported

I'm using the latest Raspberry Pi OS version (32-bit), and the libcamera apps work well.
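A hedged guess: this message usually indicates the binary was built without libcamera support, so making sure the development headers are installed and rebuilding is one thing to try (package name as mentioned in other issues in this list):

# Hedged sketch; assumes a Raspberry Pi OS apt setup.
sudo apt install libcamera-dev
make clean && make -j$(nproc)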

Feature Request: streamline commandline options.

Hi,

To use camera-streamer as a drop-in replacement, it should support ustreamer-like command-line options.
It would also be great to match Unix-style command-line options, with both short-hand and long descriptive forms.

For ustreamer compatibility:

-d /dev/path, --device /dev/path
Path to V4L2 device. Default: /dev/video0.

-r WxH, --resolution WxH
Initial image resolution. Default: 640x480.

-m fmt, --format fmt
Image format. Available: YUYV, UYVY, RGB565, RGB24, JPEG; default: YUYV.
In this special case you should use it if there is an inbuilt HW decoder in the desired device.

-f N, --desired-fps N
Desired FPS. Default: maximum possible.

-c type, --encoder type
Use specified encoder. It may affect the number of workers.

          CPU ─ Software MJPEG encoding (default).

          HW ─ Use pre-encoded MJPEG frames directly from camera hardware.

          M2M-VIDEO ─ GPU-accelerated MJPEG encoding.

          M2M-IMAGE ─ GPU-accelerated JPEG encoding.

          NOOP ─ Don't compress MJPEG stream (do nothing).

This should also match your behaviour, and it is highly debatable whether there is a need for it.

This would be a general recommendation from my side.
These are flags that users might be used to.

In terms of reusability it would be great to have:

-s address, --host address
Listen on Hostname or IP. Default: 127.0.0.1.

-p N, --port N
Bind to this TCP port. Default: 8080.

--static path
Path to dir with static files instead of embedded root index
page. Symlinks are not supported for security reasons. Default:
disabled.

For the last option I will also open a Feature Request.

Regards Kwad

Issue with Manual Focus on Arducam 16MP

Hello,

Thanks for this fantastic streamer. I am not very much into code and development; I'm just a normal 3D printer guy.

I got this installed and working with Klipper, but the only issue I am facing is focus. I am not able to focus using the guide on the main page. Could you please provide a more detailed guide on how to get focus working?

Thank you

RTSP broken in certain resolutions, only 30 fps max

Hello, good job on this application. Finally I can stream a USB MJPEG camera with H264 and very low latency (WebRTC). There are still some bugs, though:

  • If the video is set to any common resolution like 720p or 480p, the RTSP stream breaks (see attached picture). Setting the camera to 1080p and the video to 720p works, but only because for some reason the scaled video resolution becomes 1312x736 instead of the requested 1280x720.

I tested with both software only and hardware decoders in my laptop, the same issue is seen with both.

  • When streaming at 60 fps, the RTSP stream seems to be set at 30. I lowered the resolution back to 240p60 and I can see that WebRTC is at 60 fps, but RTSP is always just 30 fps.

[screenshots attached]

SPS and PPS Header on RTSP

Good evening again,

I'd like to add a problem that I am also experiencing.

I normally burn RTSP streams to disk with the following command:

ffmpeg -hide_banner -y -loglevel error -rtsp_transport tcp -use_wallclock_as_timestamps 1 -i rtsp://localhost:8554/stream.h264 -vcodec copy -acodec copy -f segment -reset_timestamps 1 -segment_time 3600 -segment_format mkv -segment_atclocktime 1 -strftime 1 /home/pi/Videos/IMX462_%Y%m%d_%H%M%S.mkv

It works with RTSP from other streamers like v4l2rtspserver, or from manufacturers like Dahua, Hikvision, Avigilon, etc.

The following output is obtained:


[h264 @ 0xf92db0] non-existing PPS 0 referenced
    Last message repeated 1 times
[h264 @ 0xf92db0] decode_slice_header error
[h264 @ 0xf92db0] no frame!
(the four lines above repeat for every incoming frame)
[h264 @ 0xf92db0] error while decoding MB 1 19, bytestream -7
[segment @ 0xfa9d70] Failed to open segment '/home/pi/Videos/IMX462_20230308_180712.mkv'
Could not write header for output file #0 (incorrect codec parameters ?): No such file or directory

However, with this streamer it doesn't work, because apparently the SPS and PPS header data is missing; I think it would be added with the "--inline" option in libcamera-vid.

--inline, -ih #Insert PPS, SPS header

Is there any other way to record this video stream locally? Obviously I can't use libcamera-vid, since the camera is in use.

Many thanks for your support!

Pi Camera Module 3 Autofocus

I just found this project while trying to set up an efficient stream using the new Camera Module 3 for OctoPrint.

I have gotten this running successfully with the following command: sudo tools/libcamera_camera.sh -camera-format=YUYV -camera-path=/base/soc/i2c0mux/i2c@1/imx708@1a

However, I am trying to get it to enable autofocus. This works on the most recent version of libcamera like so: libcamera-hello -t 0 --autofocus-mode continuous

Is there a way to set the autofocus options that I am missing?
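A hedged example: the systemd unit posted in another issue in this list enables continuous autofocus on the Camera Module 3 by passing AfMode=2, so the same option should work with the test script:

# Hedged example; AfMode=2 taken from the Camera Module 3 unit file elsewhere in these issues.
sudo tools/libcamera_camera.sh -camera-format=YUYV \
  -camera-path=/base/soc/i2c0mux/i2c@1/imx708@1a \
  -camera-options=AfMode=2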
