
spiderweb's Introduction

SPIDERWEB

Ham radio cluster web viewer for DxSpider


  • Release: v2.5.4
  • Author: Corrado Gerbaldo - IU1BOW
  • Mail: [email protected]
  • License: GPL v3.0, see the LICENSE file.
  • Languages: this application is written in Python 3.12/Flask, JavaScript, and HTML

DXSpider is a great DX cluster software with a useful telnet interface. I wrote this application to add a web user interface to DXSpider and show the spots collected. Users can see 50 spots at a time and filter them by band, spotter continent, and spotted continent.

For this application, I've used:

  • Bootstrap for the CSS stylesheet
  • Apache ECharts for managing charts
  • qrz.com: for each callsign found, you can click on the lens icon to view it on qrz.com
  • flag-icon-css https://github.com/lipis/flag-icon-css used to show the country flags
  • ng3k.com used to get information about "Announced Dx Operations". Thanks to Bill/NG3K !!!
  • silso sidc.be/silso used to show the propagation trend in the "Chart & stats" section
  • hamqsl www.hamqsl.com/solar.html used to show solar and band conditions
  • kc2g prop.kc2g.com used to show the MUF map

You can find my web site at https://www.iu1bow.it

Changelog

see file "CHANGELOG.md"

Install

1) DXSpider First of all, you have to install DXSpider and connect it to some other cluster nodes.

2) MariaDB / MySQL Then you have to install MariaDB on your web server, on the same server where DXSpider is running, and configure DXSpider to use it: in your spider folder, edit local/DXVars.pm, adding these lines:

# the SQL database DBI dsn
$dsn = "dbi:mysql:dxcluster:localhost:3306";
$dbuser = "your-user";
$dbpass = "your-password"; 

If you want to change some MariaDB parameters, you can find them in /etc/mysql/my.cnf or /etc/my.cnf, depending on your distro. If the database is not created automatically, please see "DB_ISSUES.md".
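The $dsn value packs the driver, database, host, and port into one colon-separated string. A small sketch (my own helper, not part of DXSpider or spiderweb) showing how the pieces line up:

```python
def parse_dbi_dsn(dsn: str) -> dict:
    """Split a Perl DBI dsn like 'dbi:mysql:dxcluster:localhost:3306'."""
    prefix, driver, database, host, port = dsn.split(":")
    assert prefix == "dbi"
    return {"driver": driver, "database": database, "host": host, "port": int(port)}

print(parse_dbi_dsn("dbi:mysql:dxcluster:localhost:3306"))
```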

You may also have to install the MariaDB client libraries. For Ubuntu:

 foo@bar:~$ sudo apt-get install -y libmariadb-dev             

3) Python modules
You can install the Python modules automatically or manually.

3.1) Automatic modules install
After downloading spiderweb, move into the main folder and install the modules using the requirements.txt file:

 foo@bar:~$ cd spiderweb                                
 foo@bar:~$ pip install -r requirements.txt

3.2) Manual modules install
First of all, you have to install the Python 3 pip installer:

foo@bar:~$ sudo apt install python3-pip

This application is based on Flask. To install Flask and the other required modules:

foo@bar:~$ pip install flask 
foo@bar:~$ pip install Flask-minify
foo@bar:~$ pip install flask_wtf
foo@bar:~$ pip install pandas
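Whichever route you take, a quick way to confirm the modules are importable before launching the app (check_missing is a hypothetical helper, not part of spiderweb):

```python
import importlib.util

def check_missing(modules):
    """Return the subset of module names that cannot be imported."""
    return [m for m in modules if importlib.util.find_spec(m) is None]

# module names as imported by Python, matching the pip packages above
print(check_missing(["flask", "flask_minify", "flask_wtf", "pandas"]))
```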

Configuration

In the path spiderweb/cfg/, rename config.json.template to config.json:

foo@bar:~$ mv config.json.template config.json

then edit it and set the user and password of your database, the menu items, and other settings (callsign, mail address...). There is also a specific parameter, named "enable_cq_filter", used to enable CQ zone filtering.
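For illustration, this is roughly how such a flag can be read with a safe default when the key is absent (a sketch of my own, assuming the flag holds "Y"/"N"; the real loading code lives in the application's libraries):

```python
import json

def load_cq_filter_flag(path="cfg/config.json"):
    """Read config.json and report whether CQ zone filtering is enabled."""
    with open(path) as f:
        cfg = json.load(f)
    # default to disabled if the key is missing
    return str(cfg.get("enable_cq_filter", "N")).upper() == "Y"
```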

Otherwise, if you prefer, you can use a utility to edit your configuration and menu. Go into the "scripts" folder and run ./config.sh:

foo@bar:~$ cd scripts
foo@bar:~$ ./config.sh

*** DxSpider configuration ***
Configuration file loaded from: ../cfg/config.json

   h:  help
   vc: view config.
   ec: edit config.
   vm: view menu
   em: edit menu
   s:  save
   t:  load config. from template

   x:  exit

Make your choice: 

Crontab

Starting from version 2.4, since all activities are managed by the application, you don't need to schedule anything.

Run test

Now, you can run your web application with the following command:

foo@bar:~$ python3 webapp.py

The Flask default port is 5000, so you can see your web app by typing http://localhost:5000 in your web browser. Keep in mind that the built-in Flask web server is meant for testing, not for production.

Production

There are several ways to run it in production.

My configuration is: Cloudflare + Nginx + Bjoern

Bjoern is a lightweight WSGI server for Python.
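bjoern can serve any WSGI callable; in spiderweb's case that is the Flask app exposed by wsgi.py. For reference, a minimal stand-in WSGI app looks like this (illustrative only, not spiderweb's code):

```python
def application(environ, start_response):
    # A WSGI app receives the request environ, reports status and headers
    # through start_response, and returns an iterable of byte strings.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"spiderweb up"]
```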

To install it:

foo@bar:~$ sudo apt install libev-dev libevdev2
foo@bar:~$ pip3 install bjoern

If you want, you can run it as a daemon service. Create and edit a file named, for example, spiderweb.service in the systemd folder:

foo@bar:~$ sudo nano /etc/systemd/system/spiderweb.service

Below is an example configuration; keep in mind that it is just an example (you have to put in the data for your own setup):

[Unit]
Description=bjoern instance spiderweb
After=network.target
After=multi-user.target

[Service]
User=web
Group=www-data
Type=simple
WorkingDirectory=/home/web/spiderweb
Environment="PATH=/home/web/spiderweb"
ExecStart=/usr/bin/python3 /home/web/spiderweb/wsgi.py

[Install]
WantedBy=multi-user.target

Then you can enable and start the daemon:

foo@bar:~$ sudo systemctl enable spiderweb.service
foo@bar:~$ sudo systemctl start spiderweb.service
foo@bar:~$ sudo systemctl status spiderweb.service

● spiderweb.service - bjoern instance spiderweb
   Loaded: loaded (/etc/systemd/system/spiderweb.service; enabled; vendor preset: enabled)
   Active: active (running) since Sun 2020-10-25 09:56:35 UTC; 8h ago
 Main PID: 6518 (python3)
    Tasks: 1 (limit: 420)
   CGroup: /system.slice/spiderweb.service
           └─6518 /usr/bin/python3 /home/web/spiderweb/wsgi.py

Oct 25 09:56:35 dxcluster01 systemd[1]: Started bjoern instance spiderweb.

Now you can install and configure NGINX

Install with:

foo@bar:~$ sudo apt install nginx

Configure:

sudo nano /etc/nginx/sites-available/myapp
server {
    listen 80;
    server_name iu1bow.it www.iu1bow.it;
    location ^~ /.well-known/ {
      alias /home/web/verify/.well-known/;
    }

    location / {
        ssi off;
        include proxy_params;
        proxy_pass http://localhost:8080/;
        proxy_set_header Host $host;
    }
}

For SSL, I'm using Cloudflare. This is a free service that allows you to use HTTPS and a proxy cache.

Search engine indexing: when you are online, if you want your website indexed by search engines, you have to generate a file named sitemap.xml and put it in the /static/ folder. There are many tools to generate sitemap.xml, for example www.xml-sitemaps.com.

Indexes on MySQL: if you want to speed up callsign searches, you can define some indexes on the 'spot' table. See "create_mysql_index.sql" for more details.

Mobile

This application is designed for both desktop and mobile phones. It is a PWA, so it can be installed and used like an app on mobile.

API

Spot list

You can retrieve the latest spots by calling "/spotlist"; for example: www.iu1bow.it/spotlist

Country of a callsign

You can retrieve some information about a callsign with "/callsign"; for example: www.iu1bow.it/callsign?c=IU1BOW
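For illustration, a tiny client-side helper (my own sketch, not part of spiderweb) that builds the lookup URL:

```python
from urllib.parse import urlencode

def callsign_url(base, call):
    """Build the /callsign lookup URL for a given callsign."""
    return base.rstrip("/") + "/callsign?" + urlencode({"c": call})

print(callsign_url("https://www.iu1bow.it", "IU1BOW"))
```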

Development

Directory structure

/                 . main application files
├── cfg           . configuration files (put your config.json with your settings here)
├── docs          . documentation
├── lib           . python libs used by the application
├── log           . application log
├── scripts       . utility scripts for testing, build etc.
├── static        . static files: css, js, data, html, images etc.
│   ├── css       .
│   │   ├── dev   . development css, not minified/uglified
│   │   └── rel   . release css, minified/uglified (do not change these files)
│   ├── data      . application data (world.json)
│   ├── html      .
│   │   ├── dev   . html templates used to build the release static html (for offline)
│   │   └── rel   . release static html (for offline)
│   ├── images    . static images
│   │   └── icons . static icons
│   └── js        .
│       ├── dev   . development js, not minified/uglified
│       └── rel   . release js, minified/uglified (do not change these files)
└── templates     . html templates used by the Python Flask app to serve the web pages

Application description

The main server application, webapp.py, is in the root folder. It routes requests to dynamic HTML templates and also serves the back-end API. It is wrapped by wsgi.py for use with the bjoern server.

Static files (css, js...) are in the static directory, which has two kinds of subdirectories:

  • dev: where you can edit and modify the sources
  • rel: the release files, created by the build process and used in production

Lint

For linting JavaScript I use ESLint; you can install it with npm init @eslint/config. For Python I use pylint: pip install pylint.

Building process

Prerequisites:

Component   | Description                                                    | Install command
npm         | a package manager for JavaScript                               | depends on your OS; see the official page
uglify-js   | npm component used to minify and uglify JavaScript             | npm install uglify-js -g
css-minify  | npm component used to minify CSS                               | npm install css-minify -g
staticjinja | Python module used to create static pages from an HTML template | pip install staticjinja

You can build the software for the test (dev) or production (release) environment. In the scripts directory, launch:

  • ./build.sh -d for dev environment

  • ./build.sh -r for release environment

Screenshots


desktop

mobile

spiderweb's People

Contributors

coulisse

spiderweb's Issues

Wrong flag for Cocos (Keeling) Islands

Describe the bug
The flag of Cocos (Keeling) Islands is not correct

To Reproduce
Search for VK9CM and see that the flag is horizontal blue/white/red (Costa Rica) - ISO code: CR

Expected behavior
The right flag is green and yellow - ISO code: CC

Screenshots
If applicable, add screenshots to help explain your problem.
Error:
image

Expected:
image

user + password telnet access

To be able to use telnet access with a user who has a password set,
add/edit the following lines in dxtelnet.py:

def who(host, port, user):
    WAIT_FOR = b"dxspider >"
    WAIT_PASS = b"password:"
    TIMEOUT = 1
    res = 0
    user = "username"
    password = "your_password"
    try:
        tn = telnetlib.Telnet(host, port, TIMEOUT)
        try:
            tn.read_until(b"login: ", TIMEOUT)
            tn.write(user.encode("ascii") + b"\n")
            tn.read_until(WAIT_PASS, TIMEOUT)
            tn.write(password.encode("ascii") + b"\n")
            res = tn.read_until(WAIT_FOR, TIMEOUT)
            tn.write(b"who\n")
            res = tn.read_until(WAIT_FOR, TIMEOUT)
            tn.write(b"exit\n")
        finally:
            tn.close()  # reconstructed: the quoted snippet was truncated here
    except OSError:
        pass
    return res

73 de Roby IV3JDV

dxcluster.sql

I think this is the right table, ready to import?

CREATE TABLE spot (
rowid int(11) NOT NULL AUTO_INCREMENT,
freq double NOT NULL,
spotcall varchar(14) NOT NULL,
time int(11) NOT NULL,
comment varchar(255) DEFAULT NULL,
spotter varchar(14) NOT NULL,
spotdxcc smallint(6) DEFAULT NULL,
spotterdxcc smallint(6) DEFAULT NULL,
origin varchar(14) DEFAULT NULL,
spotitu tinyint(4) DEFAULT NULL,
spotcq tinyint(4) DEFAULT NULL,
spotteritu tinyint(4) DEFAULT NULL,
spottercq tinyint(4) DEFAULT NULL,
spotstate char(2) DEFAULT NULL,
spotterstate char(2) DEFAULT NULL,
ipaddr varchar(40) DEFAULT NULL,
PRIMARY KEY (rowid),
KEY spot_ix1 (time),
KEY spot_ix2 (spotcall),
KEY spiderweb_spotter (spotter)
) ENGINE=InnoDB AUTO_INCREMENT=2598318 DEFAULT CHARSET=utf8mb4;

Minor v2.5.2 corrections

  1. /templates/plots.html

Change line 52 from: <strong>Physically connected callsigns to {{ mycallsign }}</strong>
to: {{ mycallsign }} telnet nodes & users online: Nodes counts.get('NODE') Users counts.get('USER'),
reading the data from the "who" table, "Type" column.

  2. /templates/plots.html

Increase the number of characters in the "Callsign" field of the table to 12; currently eight characters
are allocated and anything longer is cut off.

  3. Footer

Add a simple visit counter service that counts a visitor as a unique visit only once per 12 or 24 hours.
© Copyleft: 2020-2024 IU1BOW - Spiderweb v2.5.2 | website hits {{visit_counter}} since 1st January 2024

  4. Browser cache

Could web browser caching be improved for the plots and propagation HTML pages, or is it just me at S50CLX? Ctrl-F5 works but it is somewhat annoying. FF 121.0.1

Ciao, Dan S50U

Nodes and users not shown

Hello, why does it always show me 0 nodes and 0 connected users?
I am connected to 3 nodes and have 1 user connected.

image

not working after update to 2.5.4

Hello, this is the error after updating to the latest version, 2.5.4.
Why?

 sudo systemctl restart spiderweb.service
root@dxspider:/home/sysop# sudo systemctl status spiderweb.service
× spiderweb.service - bjoern instance spiderweb
     Loaded: loaded (/etc/systemd/system/spiderweb.service; enabled; vendor preset: enabled)
     Active: failed (Result: exit-code) since Mon 2024-04-01 20:51:45 UTC; 5s ago
    Process: 1136 ExecStart=/usr/bin/python3 /home/sysop/spiderweb/wsgi.py (code=exited, status=1/FAILURE)
   Main PID: 1136 (code=exited, status=1/FAILURE)
        CPU: 228ms

Apr 01 20:51:45 dxspider python3[1136]: Traceback (most recent call last):
Apr 01 20:51:45 dxspider python3[1136]:   File "/home/sysop/spiderweb/wsgi.py", line 2, in <module>
Apr 01 20:51:45 dxspider python3[1136]:     from webapp import app
Apr 01 20:51:45 dxspider python3[1136]:   File "/home/sysop/spiderweb/webapp.py", line 12, in <module>
Apr 01 20:51:45 dxspider python3[1136]:     from lib.adxo import get_adxo_events
Apr 01 20:51:45 dxspider python3[1136]:   File "/home/sysop/spiderweb/lib/adxo.py", line 10, in <module>
Apr 01 20:51:45 dxspider python3[1136]:     import feedparser
Apr 01 20:51:45 dxspider python3[1136]: ModuleNotFoundError: No module named 'feedparser'
Apr 01 20:51:45 dxspider systemd[1]: spiderweb.service: Main process exited, code=exited, status=1/FAILURE
Apr 01 20:51:45 dxspider systemd[1]: spiderweb.service: Failed with result 'exit-code'.

Band plan

Creating a page with the band plan in graphical mode

Plot & Stats generates an exception

Hi!

The Plots & stats page throws an exception...

2021-11-05 18:32:45,656 [ERROR] (app.py) Exception on /plots.html [GET]
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/flask/app.py", line 2051, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.7/dist-packages/flask/app.py", line 1501, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.7/dist-packages/flask/app.py", line 1499, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.7/dist-packages/flask/app.py", line 1485, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/opt/spiderweb-master/webapp.py", line 206, in plots
    whoj=who_is_connected()
  File "/opt/spiderweb-master/webapp.py", line 170, in who_is_connected
    response=who(host_port[0],host_port[1],cfg['mycallsign'])
  File "/opt/spiderweb-master/lib/dxtelnet.py", line 61, in who
    return parse_who(res)
  File "/opt/spiderweb-master/lib/dxtelnet.py", line 14, in parse_who
    lines = lines.splitlines()
AttributeError: 'int' object has no attribute 'splitlines'

Can you advise on how to resolve the issue?
Thanks!
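A possible mitigation, sketched here as a suggestion rather than taken from the project's fix: guard parse_who against the integer sentinel that who() initializes res to when the telnet exchange fails:

```python
def parse_who(lines):
    # dxtelnet's who() initializes res = 0, so a failed telnet exchange
    # can hand an int to parse_who; guard before calling splitlines()
    if isinstance(lines, bytes):
        lines = lines.decode("utf-8", errors="replace")
    if not isinstance(lines, str):
        return []
    return lines.splitlines()
```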

World clock

Hi

It would be very useful to see real utc time clock with current date. Maybe it would fit just before "callsign" search field.

73 Danilo S50U

Error on installation

I followed the guide, and when I issue python3 webapp.py I receive the following message:

sysop@DX-Spider:~/spider/spiderweb $ python3 webapp.py
Traceback (most recent call last):
  File "/home/sysop/spider/spiderweb/webapp.py", line 13, in <module>
    from lib.qry import query_manager
  File "/home/sysop/spider/spiderweb/lib/qry.py", line 10, in <module>
    import pandas as pd
  File "/home/sysop/.local/lib/python3.9/site-packages/pandas/__init__.py", line 16, in <module>
    raise ImportError(
ImportError: Unable to import required dependencies:
numpy: Error importing numpy: you should not try to import numpy from
its source directory; please exit the numpy source tree, and relaunch
your python interpreter from there.
sysop@DX-Spider:~/spider/spiderweb $

Thanks
Mike N3BSQ

Please add a filter - reset button to the filter pane

On the spot page, when one selects a filter, e.g. VHF, and hits the filter button, this works fine.

There is no easy way to remove the filter: one has to Ctrl-click in the list to select all bands again to reset it.
Please add a "clear filter" or "show all" button to restore the default view.

Thanks a lot.
Joe, OE5JFE

Improving the spots filter

Ciao

Would you please consider improving the DXDe and DX spots filtering? The idea is to download Jim AD1C's free special version of CTY.DAT once a week from http://www.country-files.com/dx-cluster/dx-spider/ and use it for better spot filtering. The cty.dat file structure is Column | Length | Description. Using the 2-letter continent abbreviation seems a good starting point.

1 | 26 | Country Name
27 | 5 | CQ Zone
32 | 5 | ITU Zone
37 | 5 | 2-letter continent abbreviation
42 | 9 | Latitude in degrees, + for North
51 | 10 | Longitude in degrees, + for West
61 | 9 | Local time offset from GMT
70 | 6 | Primary DXCC Prefix (A “*” preceding this prefix indicates that the country is on the DARC WAEDC list, and counts in CQ-sponsored contests, but not ARRL-sponsored contests).

More on file format at http://www.country-files.com/cty-dat-format/
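Given the column layout above, a fixed-width parse is straightforward. A sketch (the slice positions come from the table above; the sample line in the test is fabricated to match the widths, not real cty.dat content):

```python
# (1-based start column, width) pairs taken from the table above
FIELDS = {
    "name": (1, 26), "cq": (27, 5), "itu": (32, 5), "continent": (37, 5),
    "lat": (42, 9), "lon": (51, 10), "gmt_offset": (61, 9), "prefix": (70, 6),
}

def parse_cty_line(line):
    """Cut one fixed-width cty.dat record into named, stripped fields."""
    return {k: line[s - 1:s - 1 + w].strip() for k, (s, w) in FIELDS.items()}
```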

Grazie, Danilo S50U

new installation

hello, I am trying to install spiderweb on an Ubuntu server.

After a test, there is this error list:

image

How can I insert the table manually into my DB from the command line?

I created the spot table manually with the following command:

CREATE TABLE `spot` ( `rowid` int(11) NOT NULL AUTO_INCREMENT, `freq` double NOT NULL, `spotcall` varchar(14) NOT NULL, `time` int(11) NOT NULL, `comment` varchar(255) DEFAULT NULL, `spotter` varchar(14) NOT NULL, `spotdxcc` smallint(6) DEFAULT NULL, `spotterdxcc` smallint(6) DEFAULT NULL, `origin` varchar(14) DEFAULT NULL, `spotitu` tinyint(4) DEFAULT NULL, `spotcq` tinyint(4) DEFAULT NULL, `spotteritu` tinyint(4) DEFAULT NULL, `spottercq` tinyint(4) DEFAULT NULL, `spotstate` char(2) DEFAULT NULL, `spotterstate` char(2) DEFAULT NULL, `ipaddr` varchar(40) DEFAULT NULL, PRIMARY KEY (`rowid`), KEY `spot_ix1` (`time`), KEY `spot_ix2` (`spotcall`), KEY `spiderweb_spotter` (`spotter`) ) ENGINE=InnoDB AUTO_INCREMENT=2598318 DEFAULT CHARSET=utf8mb4


Filter on individual callsign(s)

Please add a feature to enter individual callsign(s) to filter on. Handy if you want to see only Spots from a DXPedition

Thanks

Proposal for improvements

After upgrading to the latest spiderweb version 2.3.4, I have some suggestions for improvements.

What is a QSO and what is a DX SPOT?

- A QSO is a contact between two amateur radio operators.
- A DX SPOT is a piece of information sent from one station to every other one logged in on the DX cluster in real time.

Under this definition, I propose changing QSO to DX SPOTS in the Graphs & stats plots page, as follows:

1. /lib/qso_hour_band.py

plt.suptitle("DX SPOTS per hour in last month")
plt.ylabel("DX SPOTS")

2. /lib/qso_monts.py

plt.suptitle("DX SPOTS per month")
plt.ylabel("DX SPOTS")

3. /lib/qso_trend.py

plt.suptitle("DX SPOTS trend")
plt.ylabel("DX SPOTS")

4. /lib/qso_world_map.py

plt.suptitle("World DX SPOTS in last month")

The Connected nodes title may be replaced with "Physically connected callsigns to ?node_callsign?", because we show both nodes and users in the list

5. /templates/plots.html

<h3 class="text-center">Physically connected callsigns to <_node_callsign_></h3>

The 'Callsign' field seems limited to 8 characters; anything longer is cut off. An increase to 12 characters might be sufficient.

6. DIGI filter

DIGI filtering is not working as expected: some FT4 spots are left to pass through. I would suggest exactly the same filter as for FT8 spots: filtering by the text "FT4" in the spot comment field and by the FT4 frequencies.

160m |
80m | 3.575
60m |  
40m | 7.0475
30m | 10.140
20m | 14.080
17m | 18.104
15m | 21.140
12m | 24.919
10m | 28.180
6m | 50.318
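The suggested filter could be sketched as a predicate over a spot's frequency and comment (my own sketch, not spiderweb's code; frequencies in MHz from the table above, tolerance chosen arbitrarily):

```python
# FT4 dial frequencies in MHz, from the band table above
FT4_FREQS_MHZ = (3.575, 7.0475, 10.140, 14.080, 18.104, 21.140, 24.919, 28.180, 50.318)

def looks_like_ft4(freq_mhz, comment, tol_mhz=0.003):
    """True if the comment mentions FT4 or the frequency sits in an FT4 slot."""
    if "FT4" in (comment or "").upper():
        return True
    return any(abs(freq_mhz - f) <= tol_mhz for f in FT4_FREQS_MHZ)
```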

Thanks, Dan S50U

requirements constraint

I had to make this modification to requirements.txt, otherwise pip couldn't resolve the conflicts:

root@cluster:/opt/spiderweb# git diff
diff --git a/requirements.txt b/requirements.txt
index 4cd9eee..9e82c0b 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -17,7 +17,7 @@ lazy-object-proxy==1.9.0
 lesscpy==0.15.1
 MarkupSafe==2.1.1
 mccabe==0.7.0
-mysql-connector-python==8.0.31
+mysql-connector-python
 numpy==1.24.1
 pandas==1.5.2
 platformdirs==2.6.2

Is it me?

propagation_heatmaps.sh fails with 'Passing a Normalize instance simultaneously with vmin/vmax is not supported.'

Description
When trying to generate heatmaps with the script, it fails after the plotting message with the error below:

Traceback (most recent call last):
  File "/spiderweb/scripts/../lib/propagation_heatmaps.py", line 142, in <module>
    im = plt.imshow(np.array(number_ar), cmap='YlOrRd', interpolation='none', norm=LogNorm(vmin=10, vmax=35), vmin=max(np.array(number_ar).min(), LOGMIN))
  File "/usr/local/lib/python3.9/dist-packages/matplotlib/_api/deprecation.py", line 454, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/matplotlib/pyplot.py", line 2611, in imshow
    __ret = gca().imshow(
  File "/usr/local/lib/python3.9/dist-packages/matplotlib/_api/deprecation.py", line 454, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/matplotlib/__init__.py", line 1423, in inner
    return func(ax, *map(sanitize_sequence, args), **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/matplotlib/axes/_axes.py", line 5577, in imshow
    im._scale_norm(norm, vmin, vmax)
  File "/usr/local/lib/python3.9/dist-packages/matplotlib/cm.py", line 405, in _scale_norm
    raise ValueError(
ValueError: Passing a Normalize instance simultaneously with vmin/vmax is not supported. Please pass vmin/vmax directly to the norm when creating it.

To Reproduce
Steps to reproduce the behavior:

  1. Go to scripts/
  2. Run propagation_heatmaps.sh
  3. After a few seconds, script fails

Expected behavior
Script should generate images with propagation heat maps.
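The error message itself points at the fix: newer matplotlib versions want vmin/vmax passed only to the norm. A sketch of the corrected call, not tested against the actual script (number_ar and LOGMIN come from propagation_heatmaps.py and are not defined here):

```python
from matplotlib.colors import LogNorm

# pass the limits to the norm itself and drop imshow's separate vmin argument
norm = LogNorm(vmin=10, vmax=35)
# im = plt.imshow(np.array(number_ar), cmap='YlOrRd', interpolation='none', norm=norm)
```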

Screenshots
image

Additional info:

  • OS: Debian 5.10.140-1 (2022-09-02)
  • Version v2.3.3

Protect plots.html

Due to the dxtelnet module, it's mandatory to protect plots.html by blocking IP addresses or HTTP clients that make an unusually high number of concurrent requests, or a large number of requests over a short period of time. 🤔

Thanks, Dan S50U

mysql version not compatible?

hello, I have installed spiderweb on Ubuntu 22.04 with MySQL 8.0.36, see below:
image

After starting a test, this is the error from webapp.log:

2024-02-07 07:07:16,578 [INFO] (webapp.py) Starting SPIDERWEB
2024-02-07 07:07:16,579 [INFO] (webapp.py) Version:v2.5.2
2024-02-07 07:07:16,580 [INFO] (cty.py) CTY: start initialization
2024-02-07 07:07:16,580 [INFO] (cty.py) /home/sysop/spiderweb/lib/../static/data/cty_wt_mod.dat updated (1.0 days), is not necessary to download it
2024-02-07 07:07:16,583 [INFO] (cty.py) number of lines reads: 8536
2024-02-07 07:07:16,583 [INFO] (cty.py) number of valid lines: 7561
2024-02-07 07:07:16,660 [INFO] (cty.py) number of entities: 346
2024-02-07 07:07:16,661 [INFO] (cty.py) number of single alias: 28410
2024-02-07 07:07:17,222 [INFO] (cty.py) memory used for prefix: 1310792 bytes
2024-02-07 07:07:17,222 [INFO] (cty.py) CTY: initialization complete
2024-02-07 07:07:17,222 [INFO] (qry.py) [Errno 2] No such file or directory: '../cfg/config.json'
2024-02-07 07:07:17,222 [INFO] (qry.py) trying with other path...
2024-02-07 07:07:17,223 [INFO] (qry.py) config file loaded
2024-02-07 07:07:17,237 [INFO] (qry.py) db connection pool created
2024-02-07 07:07:17,237 [INFO] (adxo.py) connection to: http://dxcal.kj4z.com/dxcal
2024-02-07 07:07:17,551 [INFO] (adxo.py) number ADXO events: 0
2024-02-07 07:07:17,552 [INFO] (plot_data_provider.py) Class: ContinentsBandsProvider init start
2024-02-07 07:07:17,552 [INFO] (plot_data_provider.py) Class: ContinentsBandsProvider refresh data
2024-02-07 07:07:17,552 [INFO] (plot_data_provider.py) Start
2024-02-07 07:07:17,552 [INFO] (plot_data_provider.py) doing query...
2024-02-07 07:07:17,555 [INFO] (plot_data_provider.py) query done
2024-02-07 07:07:17,556 [INFO] (plot_data_provider.py) Class: ContinentsBandsProvider init end
2024-02-07 07:07:17,556 [INFO] (plot_data_provider.py) Class: SpotsPerMounthProvider init start
2024-02-07 07:07:17,556 [INFO] (plot_data_provider.py) Class: SpotsPerMounthProvider refresh data
2024-02-07 07:07:17,556 [INFO] (plot_data_provider.py) Start
2024-02-07 07:07:17,556 [INFO] (plot_data_provider.py) doing query...
2024-02-07 07:07:17,556 [ERROR] (qry.py) 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'int) as current_year,
			cast(sum(
				case 
					when YEAR(s1.ym)=YEAR(now())-1' at line 8
2024-02-07 07:07:17,557 [INFO] (plot_data_provider.py) query done
2024-02-07 07:07:17,557 [INFO] (plot_data_provider.py) Class: SpotsPerMounthProvider init end
2024-02-07 07:07:17,557 [INFO] (plot_data_provider.py) Class: SpotsTrend init start
2024-02-07 07:07:17,558 [INFO] (plot_data_provider.py) Class: SpotsTrend refresh data
2024-02-07 07:07:17,558 [INFO] (plot_data_provider.py) Start
2024-02-07 07:07:17,558 [INFO] (plot_data_provider.py) doing query...
2024-02-07 07:07:17,563 [INFO] (plot_data_provider.py) query done
2024-02-07 07:07:17,567 [INFO] (plot_data_provider.py) Class: SpotsTrend init end
2024-02-07 07:07:17,567 [INFO] (plot_data_provider.py) Class: HourBand init start
2024-02-07 07:07:17,567 [INFO] (plot_data_provider.py) Class: HourBand refresh data
2024-02-07 07:07:17,567 [INFO] (plot_data_provider.py) Start
2024-02-07 07:07:17,567 [INFO] (plot_data_provider.py) doing query...
2024-02-07 07:07:17,568 [ERROR] (qry.py) 1055 (42000): Expression #3 of SELECT list is not in GROUP BY clause and contains nonaggregated column 'dxcluster.spot.freq' which is not functionally dependent on columns in GROUP BY clause; this is incompatible with sql_mode=only_full_group_by
2024-02-07 07:07:17,569 [INFO] (plot_data_provider.py) query done

Spots fail to refresh automatically

Describe the bug
The table with spots does not refresh automatically. They can still be refreshed manually by entering the page again.

To Reproduce
Steps to reproduce the behavior:

  1. Go to the page
  2. Wait
  3. Spots do not refresh

Additional info:

Uncaught TypeError: can't access property "selectedOptions", document.getElementById(...) is null
    getFilter http://127.0.0.1:8080/static/js/table.min.js:5
    myTimer http://127.0.0.1:8080/static/js/table.min.js:6

DB name in config ignored: always uses 'dxcluster'

Even if the config file contains:

{
    "mysql": {
        "host": "localhost",
        "user": "myuser",
        "passwd": "mypassword",
        "db": "cluster"
    },
[...]

the webapp uses dxcluster as the DB name.

In fact, from the logs:

[ERROR] (qry.py) 1142 (42000): SELECT command denied to user 'spider'@'localhost' for table dxcluster.spot

I had to modify some lines in webapp.py to hardcode my DB name there.
Can you use the DB name from the config file?

Here's the diff:

diff --git a/webapp.py b/webapp.py
index 9f99a2a..26a3b5a 100644
--- a/webapp.py
+++ b/webapp.py
@@ -77,14 +77,14 @@ def query_build_callsign(callsign):
     query_string = ""
     if len(callsign) <= 14:
         query_string = (
-            "(SELECT rowid, spotter AS de, freq, spotcall AS dx, comment AS comm, time, spotdxcc from dxcluster.spot WHERE spotter='"
+            "(SELECT rowid, spotter AS de, freq, spotcall AS dx, comment AS comm, time, spotdxcc from cluster.spot WHERE spotter='"
             + callsign
             + "'"
         )
         query_string += " ORDER BY rowid desc limit 10)"
         query_string += " UNION "
         query_string += (
-            "(SELECT rowid, spotter AS de, freq, spotcall AS dx, comment AS comm, time, spotdxcc from dxcluster.spot WHERE spotcall='"
+            "(SELECT rowid, spotter AS de, freq, spotcall AS dx, comment AS comm, time, spotdxcc from cluster.spot WHERE spotcall='"
             + callsign
             + "'"
         )
@@ -175,7 +175,7 @@ def query_build():
             last_rowid = 0
 
         query_string = (
-            "SELECT rowid, spotter AS de, freq, spotcall AS dx, comment AS comm, time, spotdxcc from dxcluster.spot WHERE rowid > "
+            "SELECT rowid, spotter AS de, freq, spotcall AS dx, comment AS comm, time, spotdxcc from cluster.spot WHERE rowid > "
             + last_rowid
         )

visitor counter

Hi, it would be interesting to have a good visit counter at the bottom of the page, with various statistics by country.
There are several scripts in PHP.
Have you thought about adding one?

Allow (moderated) user registration via Web interface

Would be helpful to have user registration as part of the website in order to submit Spots via Telnet or via a possible future web interface. #StopClusterSpam

Background

Too much BS on Clusters from anonymous users. Many sysops no longer accept anonymous submission of Spots and require Registration and individual Password prior to Spot submissions.

Registration Process:

  1. User enters Callsign in Field1
  2. User enters an email address in Field2 and hits submit
  3. Email sent to the sysop for user validation (Qrz.com etc)

Sysop validates the account and adds the user <set/register CALL> and sets password <set/password CALL password> then sends the User an email with username and password. User can then start to submit Spots via Telnet or via a possible future web interface.

Once the user is validated and has an Account, there are many new feature possibilities like a per-user web login where the user can set preferences like filters, change password, change email, a user chat, submit Spots via WebUI, etc.

73
Roland
HB9VQQ

Error running

```
(myenv) piju@kelubi:~/spiderweb-2.5.4$ python3 webapp.py
2024-04-26 23:41:16,610 [INFO] (webapp.py) Starting SPIDERWEB
2024-04-26 23:41:16,612 [INFO] (webapp.py) Version:v2.5.4
2024-04-26 23:41:16,612 [INFO] (webapp.py) visit saved on: data/visits.json
2024-04-26 23:41:16,613 [INFO] (cty.py) CTY: start initialization
2024-04-26 23:41:16,613 [INFO] (cty.py) /home/piju/spiderweb-2.5.4/lib/../data/cty_wt_mod.dat updated (0.0 days), is not necessary to download it
2024-04-26 23:41:16,616 [INFO] (cty.py) number of lines reads: 8613
2024-04-26 23:41:16,617 [INFO] (cty.py) number of valid lines: 7641
2024-04-26 23:41:16,700 [INFO] (cty.py) number of entities: 346
2024-04-26 23:41:16,701 [INFO] (cty.py) number of single alias: 28663
2024-04-26 23:41:16,701 [INFO] (cty.py) loading:/home/piju/spiderweb-2.5.4/lib/../cfg/country.json
2024-04-26 23:41:17,158 [INFO] (cty.py) memory used for prefix: 961264 bytes
2024-04-26 23:41:17,158 [INFO] (cty.py) CTY: initialization complete
2024-04-26 23:41:17,159 [INFO] (qry.py) [Errno 2] No such file or directory: '../cfg/config.json'
2024-04-26 23:41:17,159 [INFO] (qry.py) trying with other path...
2024-04-26 23:41:17,159 [INFO] (qry.py) config file loaded
2024-04-26 23:41:17,161 [INFO] (qry.py) db connection pool created
2024-04-26 23:41:17,854 [INFO] (adxo.py) number ADXO events: 16
2024-04-26 23:41:17,856 [INFO] (plot_data_provider.py) Class: ContinentsBandsProvider init start
2024-04-26 23:41:17,856 [INFO] (plot_data_provider.py) Class: ContinentsBandsProvider refresh data
2024-04-26 23:41:17,856 [INFO] (plot_data_provider.py) Start
2024-04-26 23:41:17,856 [INFO] (plot_data_provider.py) doing query...
2024-04-26 23:41:17,857 [WARNING] (plot_data_provider.py) no data found
2024-04-26 23:41:17,857 [INFO] (plot_data_provider.py) query done
2024-04-26 23:41:17,858 [INFO] (plot_data_provider.py) Class: ContinentsBandsProvider init end
2024-04-26 23:41:17,858 [INFO] (plot_data_provider.py) Class: SpotsPerMounthProvider init start
2024-04-26 23:41:17,858 [INFO] (plot_data_provider.py) Class: SpotsPerMounthProvider refresh data
2024-04-26 23:41:17,858 [INFO] (plot_data_provider.py) Start
2024-04-26 23:41:17,858 [INFO] (plot_data_provider.py) doing query...
2024-04-26 23:41:17,859 [INFO] (plot_data_provider.py) query done
2024-04-26 23:41:17,859 [INFO] (plot_data_provider.py) Class: SpotsPerMounthProvider init end
2024-04-26 23:41:17,859 [INFO] (plot_data_provider.py) Class: SpotsTrend init start
2024-04-26 23:41:17,859 [INFO] (plot_data_provider.py) Class: SpotsTrend refresh data
2024-04-26 23:41:17,859 [INFO] (plot_data_provider.py) Start
2024-04-26 23:41:17,860 [INFO] (plot_data_provider.py) doing query...
/home/piju/spiderweb-2.5.4/lib/qry.py:77: UserWarning: pandas only supports SQLAlchemy connectable (engine/connection) or database string URI or sqlite3 DBAPI2 connection. Other DBAPI2 objects are not tested. Please consider using SQLAlchemy.
  self.__data = pd.read_sql(qs, con=cnx)
2024-04-26 23:41:17,863 [INFO] (plot_data_provider.py) query done
2024-04-26 23:41:17,863 [WARNING] (plot_data_provider.py) no data found
Traceback (most recent call last):
  File "/home/piju/spiderweb-2.5.4/webapp.py", line 162, in <module>
    line_graph_st = SpotsTrend(logger, qm)
                    ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/piju/spiderweb-2.5.4/lib/plot_data_provider.py", line 297, in __init__
    super().__init__(logger, qm, [], [])
  File "/home/piju/spiderweb-2.5.4/lib/plot_data_provider.py", line 41, in __init__
    self.refresh()
  File "/home/piju/spiderweb-2.5.4/lib/plot_data_provider.py", line 337, in refresh
    qry_data = self.__load_data()
               ^^^^^^^^^^^^^^^^^^
  File "/home/piju/spiderweb-2.5.4/lib/plot_data_provider.py", line 330, in __load_data
    df = df.rolling("30D").mean()
         ^^^^^^^^^^^^^^^^^
  File "/home/piju/spiderweb-2.5.4/myenv/lib/python3.11/site-packages/pandas/core/generic.py", line 12577, in rolling
    return Rolling(
           ^^^^^^^^
  File "/home/piju/spiderweb-2.5.4/myenv/lib/python3.11/site-packages/pandas/core/window/rolling.py", line 170, in __init__
    self._validate()
  File "/home/piju/spiderweb-2.5.4/myenv/lib/python3.11/site-packages/pandas/core/window/rolling.py", line 1877, in _validate
    self._validate_datetimelike_monotonic()
  File "/home/piju/spiderweb-2.5.4/myenv/lib/python3.11/site-packages/pandas/core/window/rolling.py", line 1921, in _validate_datetimelike_monotonic
    if self._on.hasnans:
       ^^^^^^^^^^^^^^^^
  File "properties.pyx", line 36, in pandas._libs.properties.CachedProperty.__get__
  File "/home/piju/spiderweb-2.5.4/myenv/lib/python3.11/site-packages/pandas/core/indexes/base.py", line 2840, in hasnans
    return bool(self._isnan.any())
                ^^^^^^^^^^^
  File "properties.pyx", line 36, in pandas._libs.properties.CachedProperty.__get__
  File "/home/piju/spiderweb-2.5.4/myenv/lib/python3.11/site-packages/pandas/core/indexes/base.py", line 2810, in _isnan
    return isna(self)
           ^^^^^^^^^^
  File "/home/piju/spiderweb-2.5.4/myenv/lib/python3.11/site-packages/pandas/core/dtypes/missing.py", line 178, in isna
    return _isna(obj)
           ^^^^^^^^^^
  File "/home/piju/spiderweb-2.5.4/myenv/lib/python3.11/site-packages/pandas/core/dtypes/missing.py", line 203, in _isna
    raise NotImplementedError("isna is not defined for MultiIndex")
NotImplementedError: isna is not defined for MultiIndex
```
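For reference, `rolling("30D")` only works on a non-empty frame with a monotonically increasing `DatetimeIndex`; the empty query result here evidently leaves a `MultiIndex` behind, hence `isna is not defined for MultiIndex`. A minimal guarded sketch of the failing step (the `date` column name is a placeholder, not spiderweb's actual schema):

```python
import pandas as pd

def rolling_mean_30d(df: pd.DataFrame, date_col: str = "date") -> pd.DataFrame:
    """Guarded version of the failing step: time-based rolling windows need a
    non-empty frame with a sorted DatetimeIndex, never a MultiIndex."""
    if df.empty:
        # Nothing to smooth; returning early avoids the MultiIndex/NaN checks.
        return df
    df = df.set_index(pd.to_datetime(df[date_col])).sort_index()
    return df.drop(columns=[date_col]).rolling("30D").mean()
```

With two rows one day apart, the second value becomes the mean of both rows, since both fall inside the 30-day window.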

Regex or wildcard search for callsigns/spotters

On the main page, the search box (top right) matches callsigns exactly: only exact matches for the spotter (DE) or the spotted callsign (DX) are returned as results.
I think a regex- or wildcard-based search would be great in that case.

So searching for all "OE5" callsigns could be, as a regex: `^OE5`, or with wildcards: `OE5*`.

The same goes for the filter view. I am not sure whether a SQL `LIKE` search would help here.
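If it helps, the wildcard form could be translated into a SQL `LIKE` pattern server-side, with a regex fallback for client-side matching. A rough sketch (function names are mine, not part of spiderweb):

```python
import re

def wildcard_to_like(pattern: str) -> str:
    """Translate a simple wildcard pattern (* and ?) into a SQL LIKE pattern,
    escaping any literal % or _ characters first."""
    escaped = pattern.replace("%", r"\%").replace("_", r"\_")
    return escaped.replace("*", "%").replace("?", "_")

def matches(callsign: str, pattern: str) -> bool:
    """Client-side equivalent: anchor the wildcard pattern as a regex."""
    rx = "^" + re.escape(pattern).replace(r"\*", ".*").replace(r"\?", ".") + "$"
    return re.match(rx, callsign) is not None
```

Anchoring the regex keeps `OE5*` from matching `DL1OE5`, which mirrors how a prefix search is normally expected to behave.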

DXSpider cluster statistic

Corrado

According to TODO.md:

> add page with connected nodes

I can provide scripts, with explanations, for some DXSpider cluster statistics, if you would like to include them on a new page of the spiderweb app.

73 Danilo S50U

Installation issues

Using CentOS 7.
dxspider is up and running well under the node name pa0esh-3. I have followed the instructions to set up the web interface; however, I get stuck here:

```
[sysop@pa0esh spiderweb]$ pip install -r requirements.txt
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support pip 21.0 will remove support for this functionality.
Defaulting to user installation because normal site-packages is not writeable
ERROR: Could not find a version that satisfies the requirement attrs==22.1.0 (from -r requirements.txt (line 1)) (from versions: 15.0.0a1, 15.0.0, 15.1.0, 15.2.0, 16.0.0, 16.1.0, 16.2.0, 16.3.0, 17.1.0, 17.2.0, 17.3.0, 17.4.0, 18.1.0, 18.2.0, 19.1.0, 19.2.0, 19.3.0, 20.1.0, 20.2.0, 20.3.0, 21.1.0, 21.2.0, 21.3.0, 21.4.0)
ERROR: No matching distribution found for attrs==22.1.0 (from -r requirements.txt (line 1))
[sysop@pa0esh spiderweb]$
```

I have both Python 2.7 and Python 3.6.8 installed.
CentOS 7 seems to revert to Python 2.7. Using alternatives I managed to set the default to 3.6.8 (`sudo alternatives --config python`, then choose 3.6.8):

```
[sysop@pa0esh spiderweb]$ python --version
Python 3.6.8
[sysop@pa0esh spiderweb]$
```

It still gives me the same error.
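One likely explanation (a guess, not confirmed in the thread): pip installs for the interpreter it belongs to, not for whatever the `python` symlink currently points to, and the candidate list in the error stops at 21.4.0, the last attrs release that supports Python 2.7. Running pip as a module pins it to the intended interpreter. A trivial sketch of the idea (function name is mine):

```python
import sys

# Invoking pip as a module guarantees it installs for that interpreter:
#
#   python3 -m pip install -r requirements.txt
#
# A quick self-check from inside the interpreter itself:
def runs_on_python3() -> bool:
    """True when the running interpreter is Python 3, i.e. one that
    attrs 22.x can be installed into at all."""
    return sys.version_info.major >= 3
```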

Any help is really appreciated,

Erik, PA0ESH

Conversion of band.pl (mojo branch) causes JSONException

Dear OM,

I have installed your great web frontend for our newly created DXspider cluster setup.
https://dxcluster.oevsv.at/

To do so, we have packed it into a docker compose setup (I will publish it later this week) that includes dxspider, a mariadb and your web interface, so every service has its own container.
Anyhow, I have shared the spider/data folder to allow this:

```
python /app/lib/get_dxcluster_modes.py /spider/data/bands.pl
```

Which results in this

```
08/06/2023 03:25:08 [INFO]: RDxSpider band file conversion starting...
08/06/2023 03:25:08 [ERROR]: An exception of type JSONDecodeError occurred. Arguments:
("Expecting ',' delimiter: line 1 column 2368 (char 2367)",)
08/06/2023 03:25:08 [ERROR]: /spider/data/bands.pl
08/06/2023 03:25:08 [ERROR]: error on parsing input file
```

I have looked into the new file version of bands.pl (mojo branch):
[http://www.dxcluster.org/gitweb/gitweb.cgi?p=spider.git;a=blob;f=data/bands.pl;h=1bd60f671b6d0bc70268ad20ebcba7874df3038e;hb=f43ac25669a7b9368d9537eed2043a864a906303]

I think the issue is that "hf" and "vhf" don't have single quotes. I am not sure why it is done like that...

```
'pmruhf' => bless ( { band => [425000, 430000, 440000, 471000], }, 'Bands'), hf => bless ( { band => [1800, 29999], }, 'Bands'), vhf => bless ( { band => [30000, 299999], }, 'Bands'), );
```

I guess that is something that needs fixing on the DXspider side. Just to let you know.
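If a workaround on the spiderweb side were wanted while waiting for a DXSpider fix, the bare keys could be quoted before the bands.pl text is parsed. A rough sketch, untested against the real converter (regex and function name are my own):

```python
import re

def quote_bare_keys(text: str) -> str:
    """Wrap unquoted Perl hash keys (e.g. `hf =>`) in single quotes so that
    a JSON-style parse of the converted bands.pl does not choke on them.
    Keys that are already quoted are left untouched."""
    return re.sub(r"(?<![\w'])([A-Za-z_]\w*)(\s*=>)", r"'\1'\2", text)
```

The negative lookbehind skips keys that already sit inside quotes, so the pass is safe to run over the whole file.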

73 Joe

no data found

Until this morning all was working fine; then I noticed that no new spots were coming in. In the log I found:

```
2023-02-23 09:46:23,076 [INFO] (_internal.py) 84.73.221.190 - - [23/Feb/2023 09:46:23] "GET /static/images/icons/spider_ico_master.svg HTTP/1.1" 200 -
2023-02-23 09:46:23,764 [INFO] (_internal.py) 84.73.221.190 - - [23/Feb/2023 09:46:23] "GET /static/images/icons/favicon.ico HTTP/1.1" 200 -
2023-02-23 09:46:32,010 [WARNING] (webapp.py) no data found
```

The startup looks good:

```
2023-02-23 09:50:13,290 [INFO] (webapp.py) Start
2023-02-23 09:50:13,292 [INFO] (cty.py) CTY: start initialization
2023-02-23 09:50:13,292 [INFO] (cty.py) /spiderweb/lib/../static/data/cty_wt_mod.dat updated (6.0 days), is not necessary to download it
2023-02-23 09:50:13,296 [INFO] (cty.py) number of lines reads: 8368
2023-02-23 09:50:13,296 [INFO] (cty.py) number of valid lines: 7398
2023-02-23 09:50:13,436 [INFO] (cty.py) number of entities: 346
2023-02-23 09:50:13,436 [INFO] (cty.py) number of single alias: 27834
2023-02-23 09:50:14,311 [INFO] (cty.py) memory used for prefix: 1310792 bytes
2023-02-23 09:50:14,311 [INFO] (cty.py) CTY: initialization complete
2023-02-23 09:50:14,312 [INFO] (qry.py) [Errno 2] No such file or directory: '../cfg/config.json'
2023-02-23 09:50:14,312 [INFO] (qry.py) trying with other path...
2023-02-23 09:50:14,312 [INFO] (qry.py) config file loaded
2023-02-23 09:50:14,315 [INFO] (qry.py) db connection pool created
2023-02-23 09:50:14,315 [INFO] (adxo.py) connection to: http://dxcal.kj4z.com/dxcal
2023-02-23 09:50:14,374 [INFO] (adxo.py) number ADXO events: 62
2023-02-23 09:50:14,375 [INFO] (plot_data_provider.py) Class: ContinentsBandsProvider init start
2023-02-23 09:50:14,375 [INFO] (plot_data_provider.py) Class: ContinentsBandsProvider refresh data
2023-02-23 09:50:14,375 [INFO] (plot_data_provider.py) Start
2023-02-23 09:50:14,376 [INFO] (plot_data_provider.py) doing query...
2023-02-23 09:50:14,378 [WARNING] (plot_data_provider.py) no data found
2023-02-23 09:50:14,378 [INFO] (plot_data_provider.py) query done
2023-02-23 09:50:14,379 [INFO] (plot_data_provider.py) Class: ContinentsBandsProvider init end
2023-02-23 09:50:14,379 [INFO] (plot_data_provider.py) Class: SpotsPerMounthProvider init start
2023-02-23 09:50:14,379 [INFO] (plot_data_provider.py) Class: SpotsPerMounthProvider refresh data
2023-02-23 09:50:14,379 [INFO] (plot_data_provider.py) Start
2023-02-23 09:50:14,379 [INFO] (plot_data_provider.py) doing query...
2023-02-23 09:50:14,465 [INFO] (plot_data_provider.py) query done
2023-02-23 09:50:14,466 [INFO] (plot_data_provider.py) Class: SpotsPerMounthProvider init end
2023-02-23 09:50:14,466 [INFO] (plot_data_provider.py) Class: SpotsTrend init start
2023-02-23 09:50:14,466 [INFO] (plot_data_provider.py) Class: SpotsTrend refresh data
2023-02-23 09:50:14,466 [INFO] (plot_data_provider.py) Start
2023-02-23 09:50:14,466 [INFO] (plot_data_provider.py) doing query...
/spiderweb/lib/qry.py:77: UserWarning: pandas only supports SQLAlchemy connectable (engine/connection) or database string URI or sqlite3 DBAPI2 connection. Other DBAPI2 objects are not tested. Please consider using SQLAlchemy.
  self.__data = pd.read_sql(qs, con=cnx)
2023-02-23 09:50:14,529 [INFO] (plot_data_provider.py) query done
2023-02-23 09:50:14,537 [INFO] (plot_data_provider.py) Class: SpotsTrend init end
2023-02-23 09:50:14,537 [INFO] (plot_data_provider.py) Class: HourBand init start
2023-02-23 09:50:14,538 [INFO] (plot_data_provider.py) Class: HourBand refresh data
2023-02-23 09:50:14,538 [INFO] (plot_data_provider.py) Start
2023-02-23 09:50:14,538 [INFO] (plot_data_provider.py) doing query...
2023-02-23 09:50:14,644 [INFO] (plot_data_provider.py) query done
2023-02-23 09:50:14,644 [INFO] (plot_data_provider.py) Class: HourBand init end
2023-02-23 09:50:14,644 [INFO] (plot_data_provider.py) Class: WorldDxSpotsLive init start
2023-02-23 09:50:14,644 [INFO] (plot_data_provider.py) Class: WorldDxSpotsLive refresh data
2023-02-23 09:50:14,645 [INFO] (plot_data_provider.py) Start
2023-02-23 09:50:14,645 [INFO] (plot_data_provider.py) doing query...
2023-02-23 09:50:14,651 [WARNING] (plot_data_provider.py) no data found
2023-02-23 09:50:14,651 [INFO] (plot_data_provider.py) query done
2023-02-23 09:50:14,657 [INFO] (plot_data_provider.py) Class: WorldDxSpotsLive init end
Serving Flask app 'webapp'
```

Any idea?

No results for callsign search

Hello,

in the new version something is going wrong with the callsign search: the right results do not appear.

73 de Yiannis, SV5FRI
