sdr-enthusiasts / docker-tar1090
Multi-arch tar1090 container for visualising ADSB data
Is it necessary for readsb to start in the container if tar1090 is getting its data from an external source? If not, would it be possible to add an env var, for example READSB_ENABLE?
Thanks!
First of all thank you for providing this great docker solution!
Unfortunately I'm periodically getting these errors, and, for example, replay does not work:
tar1090 | [2023-11-30 16:16:05.667][collectd] [error] table plugin: Failed to open file "/sys/class/thermal/thermal_zone0/temp": No such file or directory.
tar1090 | [2023-11-30 16:16:05.667][collectd] [notice] read-function of plugin `table' failed. Will suspend it for 240.000 seconds.
I don't see an issue in my docker-compose:
version: "3.8"
services:
tar1090:
image: ghcr.io/sdr-enthusiasts/docker-tar1090:latest
tty: true
container_name: tar1090
restart: always
environment:
- TZ=Europe/Berlin
- BEASTHOST=10.0.1.90
- MLATHOST=10.0.1.90
- LAT=[redacted]
- LONG=[redacted]
volumes:
- /opt/adsb/tar1090/globe_history:/var/globe_history
- /opt/adsb/tar1090/timelapse1090:/var/timelapse1090
- /opt/adsb/tar1090/graphs1090:/var/lib/collectd
- /proc/diskstats:/proc/diskstats:ro
# - /run/airspy_adsb:/run/airspy_adsb
ports:
- 8078:80
tmpfs:
- /run:exec,size=64M
- /var/log
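The collectd error above usually just means the host kernel exposes no thermal_zone0 (common on hosts other than a Raspberry Pi), so only the CPU-temperature graph is affected. A quick host-side check, sketched here with the path taken from the error message:

```shell
# Check whether the host exposes the CPU thermal zone that collectd's
# table plugin reads; if it is absent, the log message is cosmetic.
if [ -r /sys/class/thermal/thermal_zone0/temp ]; then
    # Value is reported in millidegrees Celsius
    cat /sys/class/thermal/thermal_zone0/temp
else
    echo "no thermal_zone0 on this host; the CPU temperature graph will stay empty"
fi
```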
Hi, First of all thanks for your great work!
I've set up readsb connected to my RTLSDR receiver and the following containers are pulling from it: tar1090, graphs1090, piaware, rbfeeder, fr24feed, adsbx, opensky
Everything works fine except that when accessing http://dockerhost:8078/?heatmap
or http://dockerhost:8078/?heatmap&realHeat
like mentioned in the readme, I get an empty map...
The folder /var/globe_history does contain data for each day (ttf files).
Am I missing something?
Enabling 978 and pointing the URL to skyaware978 makes the script grab JSON using wget, which is not installed in the container. Installing wget fixes the issue.
[tar1090] /usr/local/share/tar1090/tar1090.sh: line 209: wget: command not found
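Until wget ships in the image, one possible interim workaround is a small fallback that uses curl when wget is absent. This is a hedged sketch: fetch_json is a hypothetical helper, not part of tar1090.sh, and the URL/output paths are whatever your setup uses.

```shell
# Hypothetical helper: download a URL to a file with whichever of
# wget or curl is installed in the container.
fetch_json() {
    url="$1"; out="$2"
    if command -v wget >/dev/null 2>&1; then
        wget -q -O "$out" "$url"
    elif command -v curl >/dev/null 2>&1; then
        curl -sf -o "$out" "$url"
    else
        echo "neither wget nor curl is installed" >&2
        return 1
    fi
}
```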
Hello,
The type code and picture are missing on Safari on OS X, a Chrome browser on the same machine shows the type code and picture. Chrome, Safari, and OS X have all been updated to the latest versions.
The mikenye/docker-tar1090 location does not correctly map to the lat/lon set in mikenye/docker-readsb
The default location is over Manhattan, NY.
Pressing the H ((H)ome / Reset Map) key in the browser resets the map over NY.
@wiedehopf suspects that the tar1090 container has its own readsb; if so, it will need its own lat/lon set as well.
I have set "READSB_EXTRA_ARGS=--json-trace-interval=1" but it doesn't seem to be honored. When I exec into the container I see the variable set to the correct value but ps shows me
/usr/local/bin/readsb --net-only --quiet --lat 61.255098 --lon 24.056373 --net-connector=adsbx_feeder,30105,beast_in --net-connector=adsbx_readsb,30005,beast_in -
Hi Team,
I currently use this container with an RTL-SDR dongle but I'd like to migrate to RTL_TCP so the SDR can sit in a more convenient location. Is it possible to configure/add support for RTL_TCP as a data source?
Cheers!
Add support for PTRACKS variable, requested by user joem on ADSBx discord.
References:
I'm trying to feed the Avare EFB with the JSON output from tar1090/ultrafeeder. Avare takes a base URL as input and appends data.json to that base URL. tar1090/ultrafeeder uses aircraft.json as the JSON feed for aircraft data.
I only need the Dockerfile for tar1090 to add a symbolic link for aircraft.json to data.json in /run/readsb. I couldn't find the exact place in Dockerfile to add that step, so could you please add it in the appropriate place after /run/readsb is populated?
ln -s /run/readsb/aircraft.json /run/readsb/data.json
(that's lower case L at the beginning, not upper case i)
With that symbolic link in place, I can provide http://:8080/data as the base URL, and Avare appends data.json to tap into the JSON feed. It works and I see the JSON flowing. Unfortunately, Avare currently doesn't like something in the JSON output, but that's a different issue for the makers of Avare. I need to present the JSON to the app first, and this fix makes that happen.
Thanks!
Edit: For cross-reference: https://groups.google.com/g/apps4av-forum/c/svuHklVh8FE
Would it be possible to include the Timelapse functionality from wiedehopf/timelapse1090 either in the docker-piaware container (uses dump1090-fa) or, better, in the docker-readsb or docker-tar1090 containers?
This would be a nice addition.
Thanks again for the great work!
At the moment it is not possible to configure the range for graphs1090 as seen here: https://github.com/wiedehopf/graphs1090/blob/master/default#L13
Running the latest version of the docker-tar1090 container. timelapse1090 data does not persist if the container needs to be stopped, but heatmap data does. Docker must stop for 10 minutes each night so the server can complete a backup and update cycle. When docker restarts the container, the timelapse1090 data is wiped from the volume. I have tried multiple workarounds and am coming up empty, including mapping the timelapse1090 volume to an external docker volume by adding the following to docker-compose.yml:
volumes:
timelapse:
external:
name: timelapse
and adding the following to the tar1090 service:
volumes:
- timelapse:/var/timelapse1090
The container will use the external volume but will wipe it on startup. I have a cron script that copies the chunk_XXX.gz files every hour and synchronizes the data back to the docker volume path after the server reboots, so I am not losing data, but it would be much more convenient if there were a TIMELAPSE1090_PERSIST flag which, when set to true, would stop the container from wiping the storage volume on restart.
Here is the tar1090 config
tar1090:
image: mikenye/tar1090:latest
tty: true
container_name: tar1090
restart: always
depends_on:
- readsb
environment:
- UPDATE_TAR1090=false
- TZ=${FEEDER_TZ}
- BEASTHOST=readsb
- LAT=${FEEDER_LAT}
- LONG=${FEEDER_LONG}
- TAR1090_DEFAULTCENTERLAT=${FEEDER_LAT}
- TAR1090_DEFAULTCENTERLON=${FEEDER_LONG}
- MLATHOST=mlathub
- ENABLE_TIMELAPSE1090=true
- TIMELAPSE1090_INTERVAL=1
- TIMELAPSE1090_HISTORY=168
- GZIP_LVL=1
- TAR1090_FLIGHTAWARELINKS=true
- TAR1090_PLANECOUNTINTITLE=true
volumes:
- timelapse:/var/timelapse1090
- tar1090_heatmap:/var/globe_history
ports:
- 80:80
tmpfs:
- /run:exec,size=64M
- /var/log
Please add an environment variable to enable FlightAware links.
(The first sed expression worked for me.)
ENABLE:
sudo sed -i -e 's?.flightawareLinks.?flightawareLinks = true;?' /usr/local/share/tar1090/html/config.js
ENABLE if the above doesn't work (updated from previous version)
echo 'flightawareLinks = true;' | sudo tee -a /usr/local/share/tar1090/html/config.js
DISABLE:
sudo sed -i -e 's?.flightawareLinks.?flightawareLinks = false;?' /usr/local/share/tar1090/html/config.js
Just highlighting issue sdr-enthusiasts/airspy_adsb#4 here as well; I don't know which repo this issue best sits in.
Quick summary - graphs do not show airspy data when running the containerised version of airspy_adsb
I'm having trouble pinpointing what's going on here, but the basics are:
I have two RTL USB sticks for 1090 and 978 plugged into a Raspberry Pi 4. When the Pi boots, both RTLs (which were serialized) show up just fine. As soon as I start either ultrafeeder or dump978, the respective RTL blanks out, which causes both docker images to report that they cannot find an RTL with the supplied serial number.
For instance, I can stop the docker service and see RTLs:
$ sudo rtl_sdr -d 0
Found 2 device(s):
0: AIRNAV, ADSB_1090, SN: 1090
1: Realtek, RTL2838UHIDIR, SN: 978
Using device 0: Generic RTL2832U OEM
rtl_sdr, an I/Q recorder for RTL2832 based DVB-T receivers
However, as soon as I restart the docker service:
$ sudo service docker restart
$ sudo rtl_sdr -d 0
Found 2 device(s):
0: , , SN:
1: , , SN:
Using device 0: Generic RTL2832U OEM
rtl_sdr, an I/Q recorder for RTL2832 based DVB-T receivers
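One hedged way to narrow this down from the host: blank vendor/product/serial strings from rtl_sdr usually mean another process (here, likely readsb inside a container) already has the device claimed, rather than the EEPROM serials being lost. A sketch, assuming the common RTL2832U USB vendor ID 0bda:28xx; adjust if your sticks report differently:

```shell
# List RTL2832U-family devices as the kernel currently enumerates them.
if command -v lsusb >/dev/null 2>&1; then
    lsusb | grep -i '0bda:28' || echo "no RTL2832U devices enumerated"
else
    echo "lsusb not installed on this host"
fi
```

If the devices still enumerate here while rtl_sdr shows blanks, the sticks are simply in use, which is expected while the containers are running.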
I thought this might be a power thing, but I've tried both USB-C from an adequate power source, the POE+ hat, and even the RTLs on a powered USB hub that's connected to the pi - all have the same issue.
Any ideas as to what's going on?
Possibly related to wiedehopf/tar1090#77 , the aircraft database js files are gzipped and not being parsed correctly in the client, leaving the aircraft type blank and resulting in browser dev console errors:
7c1c6f: Database load error: parsererror at URL: db2/7.js planeObject.js:2023
76bd49: Database load error: parsererror at URL: db2/7.js planeObject.js:2023
7c7181: Database load error: parsererror at URL: db2/7.js planeObject.js:2023
I am able to manually decompress the js files and rename them, which solves the issue; it looks like nginx is not returning the Content-Encoding: "gzip" header.
$ curl -I http://192.168.0.216:8078/db2/7.js
HTTP/1.1 200 OK
Server: nginx
Date: Mon, 16 Nov 2020 10:31:01 GMT
Content-Type: application/javascript
Content-Length: 147661
Last-Modified: Mon, 16 Nov 2020 10:29:22 GMT
Connection: keep-alive
ETag: "5fb25482-240cd"
Cache-Control: public, max-age=7776000
Accept-Ranges: bytes
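Before touching nginx, it is easy to confirm that the files on disk really are gzip streams. This local sketch builds a synthetic stand-in file rather than touching the real db2/7.js:

```shell
# A gzip stream always starts with the magic bytes 0x1f 0x8b.
f=/tmp/sample-db.js
printf 'var x = 1;\n' | gzip -c > "$f"        # synthetic stand-in for db2/7.js
magic=$(head -c 2 "$f" | od -An -tx1 | tr -d ' \n')
if [ "$magic" = "1f8b" ]; then
    echo "gzip-compressed: must be served with Content-Encoding: gzip"
else
    echo "plain file: no Content-Encoding header needed"
fi
```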
I changed the location stanza in /usr/local/share/tar1090/nginx.conf from
location ~ db-.*\.js$ {
gzip off;
add_header Cache-Control "public, max-age=7776000";
add_header Content-Encoding "gzip";
}
to
location ~ db2/.*\.js$ {
gzip off;
add_header Cache-Control "public, max-age=7776000";
add_header Content-Encoding "gzip";
}
and it seems to be working correctly:
$ curl -I http://192.168.0.216:8078/db2/7.js
HTTP/1.1 200 OK
Server: nginx
Date: Mon, 16 Nov 2020 10:33:08 GMT
Content-Type: application/javascript
Content-Length: 147661
Last-Modified: Mon, 16 Nov 2020 10:29:22 GMT
Connection: keep-alive
ETag: "5fb25482-240cd"
Cache-Control: public, max-age=7776000
Content-Encoding: gzip
Accept-Ranges: bytes
Hi, great work!
When adding ENABLE_978=yes to the environment options, I got this in the docker logs:
tar1090 | [tar1090] /usr/local/share/tar1090/tar1090.sh: line 104: 978.json: No such file or directory
Then I tried the commands below and the error goes away, but the default URL_978 (at http://127.0.0.1/skyaware978) keeps returning 404 Not Found:
docker exec -it tar1090 bash
touch /run/readsb/978.json
Am I missing anything? Thanks!
Reported via Discord:
I am using mikenye/docker-tar1090 and believe that I have timelapse configured properly. However, I only appear to be getting around 35 minutes of timelapse data on playback - and that is from previous day.
e.g. I am in the UK, so it is currently 13:57 BST - but the timelapse data is only showing 22:06 and 22:41 as start and end times from yesterday.
My config includes the lines:
- ENABLE_TIMELAPSE1090=true
- TIMELAPSE1090_INTERVAL=10
- TIMELAPSE1090_HISTORY=24
volumes:
- tar1090_timelapse:/var/timelapse1090
EDIT: I have noticed that I have the following entries repeating in the log for this container:
[timelapse1090] /opt/timelapse1090/timelapse1090.sh: line 91: 7za: command not found
[timelapse1090] sed: couldn't write 397 items to stdout: Broken pipe
[timelapse1090] mv: cannot stat 'temp.gz': No such file or directory
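The first log line looks like the root cause: the 7za binary that timelapse1090.sh shells out to isn't present, so temp.gz is never produced and the follow-on errors cascade. A generic presence check, shown standalone here; in practice you would run it via docker exec inside the container:

```shell
# timelapse1090.sh compresses chunks with 7za; confirm it exists before
# blaming the volume handling.
if command -v 7za >/dev/null 2>&1; then
    echo "7za available at $(command -v 7za)"
else
    echo "7za missing; chunk compression (and temp.gz) will never be produced"
fi
```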
Add a Telegraf instance that provides data for Influx and Prometheus to ingest. Ideally, the fields and values should be compatible with what docker-readsb-protobuf
already provides so the existing Grafana dashboards can be reused.
For Prometheus, I already mapped http://xxx/metrics -> /run/readsb/stats.prom
but the data available there is different and much less than what’s generated by docker-readsb-protobuf. Ideally, the additional data should just be added to /run/readsb/stats.prom
For Influx -- I have no idea how that works
Thanks!
The base tool allows you to include a range outline - https://github.com/wiedehopf/tar1090#heywhatsthatcom-range-outline
It would be good to be able to map a local file into the container to enable this functionality.
Currently, the title of the page is "tar1090". It would be awesome if that would be configurable through environment variables.
Hi, is there a file or directory I can map to a volume to persist the received range outline between container restarts?
After updating all containers today, I noticed a new (thick) outline on the map showing the max range at which aircraft have been seen. I'm missing options to configure it (similar to range_outline_*).
It does a LOT of disk writes, and even if you keep the files around, they will not persist after a restart of readsb, according to @wiedehopf. He advises writing to /run/timelapse1090 instead of /var/timelapse1090.
[[tracking issue so we don't forget about it]]
Originally logged here: sdr-enthusiasts/docker-adsb-ultrafeeder#5
I've tried to enable the InfluxDB functionality, but I'm getting errors saying I need to provide an Org or OrgId.
There does not appear to be an environment variable to set this detail:
[telegraf] 2023-04-01T03:40:03Z E! [outputs.influxdb_v2] Failed to write metric to adsb_tbsh (will be dropped: 400 Bad Request): invalid: Please provide either orgID or org
tar1090 allows configuring the color of each range ring defined by SiteCirclesDistances by way of a similar array named SiteCirclesColors. If fewer colors are given in the array than range rings are defined, the last color is used for the remaining rings.
E.g. by appending
SiteCirclesColors = new Array('#ffffff');
to /usr/local/share/tar1090/html/config.js inside your running tar1090 container, all range rings become white.
The change seems trivial: use another environment variable, e.g. TAR1090_RANGERINGSCOLORS, in rootfs/etc/cont-init.d/04-tar1090-configure, similar to the already existing TAR1090_RANGERINGSDISTANCES. The only pitfall may be the need for single quotes around the colors; this needs mentioning in the documentation or handling while processing $TAR1090_RANGERINGSCOLORS.
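In the same spirit as the FlightAware-links snippets earlier, the override can be appended with plain echo/tee. A local sketch against a throwaway copy of config.js; inside the container the real path would be /usr/local/share/tar1090/html/config.js, and the array values here are just examples:

```shell
# Demo config.js; in the container this would be the real tar1090 config.
CFG=/tmp/tar1090-config.js
echo "SiteCirclesDistances = new Array(100,150,200);" >  "$CFG"
# Note the single quotes around each color inside the JS array literal:
echo "SiteCirclesColors = new Array('#ffffff','#00ff00');" >> "$CFG"
cat "$CFG"
```

With three distances and two colors, the third ring would reuse '#00ff00', matching the fallback behavior described above.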