
Comments (10)

Qengineering commented on May 25, 2024

@crudo0-arch,

Could you get the 1280 x 720 UDP stream working?
It's given as an example in the Readme.

from bananapi-m2-zero-ov5640.

crudo0-arch commented on May 25, 2024

The UDP from the example gives this output:

gst-launch-1.0 -v v4l2src device=/dev/video1 num-buffers=-1 ! video/x-raw, width=1280, height=720, framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.1.54 port=5200
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstJpegEnc:jpegenc0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)30/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstJpegEnc:jpegenc0.GstPad:src: caps = image/jpeg, sof-marker=(int)0, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, payload=(int)26, ssrc=(uint)1662937395, timestamp-offset=(uint)4055759921, seqnum-offset=(uint)13665
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, payload=(int)26, ssrc=(uint)1662937395, timestamp-offset=(uint)4055759921, seqnum-offset=(uint)13665
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:sink: caps = image/jpeg, sof-marker=(int)0, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: timestamp = 4055765965
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: seqnum = 13665
^Chandling interrupt.

but in VLC the stream did not start... The streaming-to-desktop command on the page gives this output:

gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! autovideosink
Setting pipeline to PAUSED ...
gbm: failed to open any driver (search paths /usr/lib/arm-linux-gnueabihf/dri:$${ORIGIN}/dri:/usr/lib/dri)
gbm: Last dlopen error: /usr/lib/dri/sun4i-drm_dri.so: cannot open shared object file: No such file or directory
failed to load driver: sun4i-drm
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"(GstGLDisplayGBM)\ gldisplaygbm0";
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Failed to allocate required memory.
Additional debug info:
gstv4l2src.c(658): gst_v4l2src_decide_allocation (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Buffer pool activation failed
Execution ended after 0:00:00.099238745
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...


crudo0-arch commented on May 25, 2024

When we tried the OpenCV script, it gave this:

[ WARN:0] global /home/pi/opencv/modules/videoio/src/cap_gstreamer.cpp (2076) handleMessage OpenCV | GStreamer warning: Embedded video playback halted; module v4l2src0 reported: Failed to allocate required memory.
[ WARN:0] global /home/pi/opencv/modules/videoio/src/cap_gstreamer.cpp (1053) open OpenCV | GStreamer warning: unable to start pipeline
[ WARN:0] global /home/pi/opencv/modules/videoio/src/cap_gstreamer.cpp (616) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created


Qengineering commented on May 25, 2024

@crudo0-arch,

Several things come into play here.

You only have a small amount of RAM on board.
512 MB gives you little space, all the more so because the Armbian OS also consumes a part of it.
We just managed to send 1280x720@30FPS over UDP; 1280x960 is no longer possible.

You probably don't have enough memory to record video. Remember that mp4, flv, and mkv are typically H.264-encoded. This compression is too demanding for the BananaPi Zero, especially if you run it via OpenCV.
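As a back-of-the-envelope check on that memory limit, the raw-frame arithmetic can be sketched as follows. The 2 bytes per pixel follows from the YUY2 format shown in the logs; the buffer count of 4 is an assumed typical value, not the exact number v4l2src requests:

```python
# Rough estimate of the raw-frame memory a v4l2 capture pool needs.
# YUY2 (YUYV 4:2:2) uses 2 bytes per pixel; the buffer count of 4
# is an assumption for illustration, not a measured driver value.
BYTES_PER_PIXEL = 2
BUFFERS = 4

def pool_mb(width, height, buffers=BUFFERS):
    """Memory (MiB) for a pool of raw YUY2 frames."""
    return width * height * BYTES_PER_PIXEL * buffers / (1024 ** 2)

for w, h in [(640, 480), (1280, 720), (1280, 960)]:
    print(f"{w}x{h}: {pool_mb(w, h):.1f} MiB for {BUFFERS} buffers")
```

Even a few MiB of contiguous capture buffers can fail to allocate on a 512 MB board once the OS and the rest of the pipeline have taken their share, which matches the "Buffer pool activation failed" errors above.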

We just managed to record a simple avi with this pipeline.

gst-launch-1.0 -v v4l2src device=/dev/video1 num-buffers=-1 ! video/x-raw, width=1280, height=720, framerate=15/1 ! videoconvert ! jpegenc ! avimux ! filesink location=video.avi

I think you'll have to adjust your expectations.

P.S. Your UDP test failed because you used the wrong receiver pipeline.
Transmitter (BananaPi):

sudo media-ctl --device /dev/media1 --set-v4l2 '"ov5640 2-003c":0[fmt:YUYV8_2X8/1280x720]'
gst-launch-1.0 -v v4l2src device=/dev/video1 num-buffers=-1 ! video/x-raw, width=1280, height=720, framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.178.29 port=5200

Receiver (connected to 192.168.178.29!):

gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, media=video, clock-rate=90000, payload=96 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink
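As a rough sanity check on the WiFi link, the MJPEG bitrate of that 1280x720@30 stream can be ballparked. The 1:15 JPEG compression ratio below is purely an assumption for illustration; real ratios depend on quality settings and scene content:

```python
# Ballpark MJPEG bitrate for the 1280x720@30 UDP stream.
# COMPRESSION_RATIO is a hypothetical value, not a measurement.
WIDTH, HEIGHT, FPS = 1280, 720, 30
BYTES_PER_PIXEL = 2          # raw YUY2 input
COMPRESSION_RATIO = 15       # assumed JPEG compression factor

raw_bps = WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * FPS
mjpeg_mbps = raw_bps / COMPRESSION_RATIO / 1e6
print(f"raw: {raw_bps/1e6:.0f} Mbit/s, MJPEG roughly {mjpeg_mbps:.0f} Mbit/s")
```

Tens of Mbit/s is feasible on a good WiFi link, but it leaves little headroom on a board this small.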


crudo0-arch commented on May 25, 2024

With the mentioned pipeline I get:

gst-launch-1.0 -v v4l2src device=/dev/video1 num-buffers=-1 ! video/x-raw, width=1280, height=720, framerate=15/1 ! videoconvert ! jpegenc ! avimux ! filesink location=video.avi
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)15/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)15/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)15/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstJpegEnc:jpegenc0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)15/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)15/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, framerate=(fraction)15/1, format=(string)YUY2, colorimetry=(string)2:0:0:0, interlace-mode=(string)progressive
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Failed to allocate required memory.
Additional debug info:
gstv4l2src.c(658): gst_v4l2src_decide_allocation (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Buffer pool activation failed
Execution ended after 0:00:00.046684556
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Is there a way to boot into the CLI and increase the swap memory?


Qengineering commented on May 25, 2024

Unfortunately, you can't boot into the CLI. I'm not sure more swap space will solve anything here; 4 GB is a lot.
BTW, how much memory do you have available after a reboot, not running any app? $ free -m
(screenshot of free -m output)
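For reading that output, the "available" column of free -m is the number that matters. A minimal parsing sketch, with invented sample numbers (not taken from the board):

```python
# Parse the "available" column from `free -m` output.
# The sample text below is made up for illustration only.
sample = """\
              total        used        free      shared  buff/cache   available
Mem:            493         210          60          12         222         240
Swap:           999           0         999
"""

for line in sample.splitlines():
    if line.startswith("Mem:"):
        fields = line.split()
        total_mb, available_mb = int(fields[1]), int(fields[-1])
        print(f"{available_mb} MiB available of {total_mb} MiB total")
```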


crudo0-arch commented on May 25, 2024

(screenshot of the board's free -m output)

So there's no way to turn the BananaPi into a WiFi HD camera?


Qengineering commented on May 25, 2024

I don't think so.


crudo0-arch commented on May 25, 2024

The Raspberry Pi Zero has 512 MB of RAM, but it can work as a video camera...


Qengineering commented on May 25, 2024

Indeed.

Your problem is the lack of proper video software tailored to the hardware.
The Raspberry Pi Foundation has many programmers, some working solely on the Broadcom video stack.

The BananaPi only has a forum, where several enthusiastic Linux programmers are trying to build extensions.
The problem is that there is not one generic operating system but many. Every operating system has its fans, and there is no coordination between the efforts.

You end up with several operating systems, all more or less doing something, but (usually) without any solid foundation or documentation. This is the case with many boards, such as the Orange Pi, Rock Pi, and VIM3, not just the Banana Pi.
The vendors make money by selling hardware, while developing proper software is more or less left to a forum.

I'm sure the Banana Pi will eventually be able to handle HD video over WiFi, but you would have to develop your own GPU driver software at a very low level.

