Comments (5)
You're in luck - the docker-compose file should take care of all of this for you. I should probably update the documentation a bit, since I've gotten better with Docker myself over the years.
Once you've made your local config.toml file for srtrelay, you should be able to just run `docker-compose up`.
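For reference, a rough sketch of what that config.toml might look like. The key names below are illustrative assumptions based on this thread - copy the real example config from voc/srtrelay and adjust it rather than using this verbatim:

```toml
# Illustrative sketch only - grab the real config.toml from voc/srtrelay.
[app]
address = ":1935"   # was :1337 in the upstream example; change it to match the compose port mapping

[api]
address = ":8080"   # fragconsole polls this inside the docker network; it is not exposed to the host
```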
Here's a quick walkthrough (comments added with `#`):

```yaml
services:
  # This is your voc/srtrelay image - I made a quick container and pushed it to Docker Hub.
  srtrelay:
    image: "ravenium/srtrelay:latest"
    ports:
      # This maps 1935:1935 UDP to the host, since that's what we'll be using for the SRT protocol.
      # Note that SRT doesn't technically have a "traditional" port like RTMP does; it just needs an assigned UDP port.
      - "1935:1935/udp"
    # This is a docker "private" network that the containers share for communication.
    networks:
      - fragforce
    # I'm mapping config.toml so the srtrelay container can use it. Make sure it exists in the same dir as docker-compose.yaml.
    # Grab a copy from voc/srtrelay and follow his instructions. Change the port to match - it's 1337 in his example.
    volumes:
      - $PWD/config.toml:/conf/config.toml

  # This is the "stream preview" container.
  frag_restream_record:
    image: "ravenium/fragconsole:latest"
    # Open port 3000 so you can browse to it from your host.
    ports:
      - "3000:3000"
    # This command starts the app to do the following:
    #   - send all streams it sees to the web console (-s)
    #   - poll the srtrelay API at our other container's address (notice 8080 isn't exposed outside of our docker formation?)
    #   - grab the streams from the other container's stream URL
    #   - listen on port 3000 (matches the ports above) so you can browse to it from your host
    command: -s -serverurl http://srtrelay:8080/streams -streamurl srt://srtrelay:1935 -listen 0.0.0.0:3000
    networks:
      - fragforce
    # Maps a local folder called record on the host to /record in the container.
    # If you're not using -r you can comment this out.
    volumes:
      - $PWD/record:/record

networks:
  fragforce:
```
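fragconsole discovers streams by polling srtrelay's `/streams` endpoint over that shared network. If you want to poke at the API yourself, here's a minimal sketch; the JSON shape (objects with a `name` field) is an assumption for illustration, so inspect the real API output first:

```python
import json
from urllib.request import urlopen


def parse_streams(payload: str) -> list[str]:
    """Extract stream names from a /streams JSON response.

    The object shape ({"name": ..., "clients": ...}) is an assumption
    for illustration - check what your srtrelay version actually returns.
    """
    return [entry["name"] for entry in json.loads(payload)]


# From the host you can't reach srtrelay:8080 (it isn't published), but a
# container on the fragforce network could fetch it like this:
#   payload = urlopen("http://srtrelay:8080/streams").read().decode()
# Hardcoded sample response for illustration:
sample = '[{"name": "test", "clients": 1}, {"name": "music", "clients": 0}]'
print(parse_streams(sample))  # ['test', 'music']
```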
from fragconsole.
So, thanks - much appreciated. Am I right in saying that I grab the config.toml file, edit it for my own purposes, make sure the port is set to 1935 (as my server is at home), and then carry on?
Cheers
I still cannot get this to work. I know I'm doing something really silly... I just get errors running the docker commands.
I'd definitely try first with a few sample compose files just to get the basic syntax down. If you've got a more specific error I can try to spot what's going wrong, too.
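For example, a bare-bones compose file like this is enough to confirm docker-compose itself is happy before layering in srtrelay (nginx here is just an arbitrary small public image):

```yaml
# minimal-compose.yaml - sanity check only
services:
  hello:
    image: "nginx:alpine"
    ports:
      - "8088:80"
```

Run `docker-compose -f minimal-compose.yaml config` to validate the syntax without starting anything, then `docker-compose -f minimal-compose.yaml up` to make sure containers actually start on your machine.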
Yeah, it's not an issue - I'd just like to get my head around it. I have a Docker Hub account, and I think what I shall do is get a GitHub account, make the edits I need for my particular setup, and then push the resulting images to Docker Hub and use them like that.
For me a 200ms latency is far too low; I have edited it to 2000ms and this gives much better results. I have a livestream coming up on October 30th, and by then I want the srtrelay and fragconsole combo up and running so I can record the SRT feeds clean before I add overlays etc. In my quick tests, a 2000ms latency with a 50000 buffersize offers a really good mix of latency versus quality and recovery if frames are dropped (it's a music livestream, and dropped frames resulting in audio dropouts aren't good).
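For reference, that tradeoff expressed as a config.toml fragment - the key names are assumed from srtrelay's example config, so verify them against your copy:

```toml
[app]
latency = 2000      # ms; higher latency gives SRT more time to retransmit dropped packets
buffersize = 50000  # larger receive buffer smooths recovery at the cost of memory
```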