opennms-forge / ansible-opennms
Deployment of OpenNMS components in your infrastructure using Ansible
License: GNU General Public License v3.0
Increase Ansible code quality by running ansible-lint as a quality gate for PRs.
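A minimal GitHub Actions sketch of such a gate (the workflow file name and checkout action version are examples, not part of this repository):

```yaml
# .github/workflows/ansible-lint.yml (sketch)
name: ansible-lint
on:
  pull_request:
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install ansible-lint
        run: pip install ansible-lint
      - name: Lint roles and playbooks
        run: ansible-lint
```

Running ansible-lint with no arguments lints the whole project using its default auto-detection, which keeps the workflow short.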
As an infrastructure engineer, I want to deploy a Minion in a remote location that connects to a core system, so that I can start monitoring my devices from this Minion.
Acceptance
Install and configure the Sentinel with a minimal configuration to:
When components are deployed on different servers, time synchronization is important. Add Chrony tasks to ensure the user has synced time on all servers in the OpenNMS stack.
When using a dictionary with host_vars and group_vars, they are not combined by default. If you set only a few parameters in the group_vars dictionary, the whole defaults dictionary is overwritten. We need to find a way to combine them. Here is an example with the Kafka defaults: one way would be renaming the dictionary from kafka_server_properties to kafka_server_properties_group, adding a new one, kafka_server_properties_host, with the defaults, and merging them with the combine() filter.
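A minimal sketch of that idea, using the variable names proposed above (the dictionary keys themselves are illustrative):

```yaml
# roles/opennms_kafka/defaults/main.yml (sketch) -- the full defaults
kafka_server_properties_host:
  log.dirs: /var/lib/kafka/combined-logs
  num.partitions: 1

# group_vars/kafka.yml (sketch) -- only the overrides
kafka_server_properties_group:
  num.partitions: 3

# In a template or task, merge both; group values win on conflicts:
# {{ kafka_server_properties_host | combine(kafka_server_properties_group) }}
```

combine() is Ansible's built-in filter for merging dictionaries, so the defaults survive even when the group only overrides a single key.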
Ansible roles can be installed and used via Ansible Galaxy, which makes it easy to share a collection of roles.
When you run the Sentinel role with defaults, you get the following error message with undefined values:
TASK [opennms_sentinel : Configure the distributed data sources] ******************************************************
fatal: [core-srv]: FAILED! => {"msg": "{'datasource.url': 'jdbc:postgresql://{{ opennms_datasource_db_host }}:{{ opennms_datasource_db_port }}/{{ opennms_datasource_db_name }}', 'datasource.username': '{{ opennms_datasource_db_user }}', 'datasource.password': '{{ opennms_datasource_db_password }}', 'datasource.databaseName': '{{ opennms_datasource_db_name }}'}: 'opennms_datasource_db_host' is undefined. 'opennms_datasource_db_host' is undefined. {'datasource.url': 'jdbc:postgresql://{{ opennms_datasource_db_host }}:{{ opennms_datasource_db_port }}/{{ opennms_datasource_db_name }}', 'datasource.username': '{{ opennms_datasource_db_user }}', 'datasource.password': '{{ opennms_datasource_db_password }}', 'datasource.databaseName': '{{ opennms_datasource_db_name }}'}: 'opennms_datasource_db_host' is undefined. 'opennms_datasource_db_host' is undefined"}
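The variable names in the error message suggest the role expects the datasource settings to come from the user's inventory. A sketch of variables that would satisfy the template (all values are examples):

```yaml
# group_vars/all.yml (sketch; values are placeholders)
opennms_datasource_db_host: postgres-srv
opennms_datasource_db_port: 5432
opennms_datasource_db_name: opennms
opennms_datasource_db_user: opennms
opennms_datasource_db_password: changeme
```

Alternatively, the role could ship these as defaults so it works out of the box.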
Install a single-node Kafka instance using KRaft.
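For a single-node KRaft setup, one process typically combines the broker and controller roles. A sketch of the relevant server.properties keys (node ID, ports, and the log directory are examples; the storage directory must also be formatted once with kafka-storage.sh before first start):

```properties
# server.properties (sketch for a single combined broker/controller node)
process.roles=broker,controller
node.id=1
controller.quorum.voters=1@localhost:9093
listeners=PLAINTEXT://:9092,CONTROLLER://:9093
advertised.listeners=PLAINTEXT://localhost:9092
controller.listener.names=CONTROLLER
log.dirs=/var/lib/kafka/combined-logs
```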
If you have a shared Kafka instance, the user has to configure an OpenNMS instance ID, which can be set in custom.system.properties. Allow users to set arbitrary key-value pairs in custom.system.properties.
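A sketch of what such a free-form dictionary could look like; the variable name and the instance-ID property key are assumptions and should be checked against the OpenNMS documentation for your version:

```yaml
# Hypothetical role variable: arbitrary key-value pairs rendered
# one per line into custom.system.properties
opennms_core_custom_system_properties:
  org.opennms.instance.id: onms-prod-01
```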
Allow users to configure the Java Virtual Machine for the Horizon Core using opennms.conf.
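opennms.conf is a shell-style file sourced at startup; a sketch of JVM settings a role could template there (the values are examples, not recommendations):

```shell
# /etc/opennms/opennms.conf (sketch; values are examples)
JAVA_HEAP_SIZE=4096
ADDITIONAL_MANAGER_OPTIONS="-XX:+UseG1GC"
```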
As a user, you want to install automatic system and security updates for the operating system, but OpenNMS Horizon requires manual steps when the system upgrades. To avoid unplanned maintenance windows dealing with upgrades, we should exclude OpenNMS from automatic updates with apt-mark hold and dnf config-manager --disable opennms-repo-stable-*.
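A sketch of Ansible tasks implementing this; the package name and repo glob are taken from the text above and may need adjusting to the installed artifacts:

```yaml
# Sketch: pin OpenNMS so unattended OS updates skip it.
- name: Hold the OpenNMS package on Debian/Ubuntu
  ansible.builtin.dpkg_selections:
    name: opennms
    selection: hold
  when: ansible_os_family == "Debian"

- name: Disable the OpenNMS stable repo on RHEL-based systems
  ansible.builtin.command: dnf config-manager --disable opennms-repo-stable-*
  when: ansible_os_family == "RedHat"
  changed_when: true
```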
Disable Pyroscope profiling in the default configuration.
Create stub roles for a single-node Elasticsearch instance with our Drift plugin, and for Grafana with the OpenNMS plugin for Grafana:
Acceptance:
When the database is installed on a dedicated node, the iplike stored procedure needs to be installed on the database server and can't run on the OpenNMS core server.
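A hedged sketch of running the installation on the database host instead of the core server; the script path is the one shipped by the iplike package on RPM-based systems and should be treated as an assumption:

```yaml
- name: Install the iplike stored procedure on the database server
  ansible.builtin.command: /usr/sbin/install_iplike.sh
  delegate_to: "{{ opennms_datasource_db_host }}"
  run_once: true
```

delegate_to lets the task run on the dedicated database node even when the play targets the OpenNMS core server.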
When using the role defaults the Kafka restart handler fails with the following error message:
RUNNING HANDLER [opennms_kafka : Restart kafka] ***********************************************************************
fatal: [core-srv]: FAILED! => {"changed": false, "msg": "Could not find the requested service kafka.service.j2: host"}
We want to give users a way to explore the whole stack with all the external dependencies. We can't provide a full stack of Ansible roles for Elasticsearch, Kafka, and Mimir suitable for a production environment, and depending on external roles is also tedious: users would have to learn all these bits and pieces at once just to get something going. The idea, as a compromise, is to provide stub roles for these dependencies, which deploy just tiny single-node non-production instances that get our users going quickly. They can later externalize these dependencies and remove the stub roles from the playbook.
Hello,
I tried to enable the Horizon Core to connect to the Kafka instance created by this Ansible playbook.
[kafka-consumer-0] o.a.k.c.NetworkClient: [Consumer clientId=consumer-OpenNMS-5, groupId=OpenNMS] Node 1 disconnected.
WARN [kafka-consumer-15]
o.a.k.c.NetworkClient: [Consumer clientId=consumer-OpenNMS-33, groupId=OpenNMS] Connection to node 1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available
This is my Horizon Core kafka.properties:
org.opennms.activemq.broker.disable=true
org.opennms.core.ipc.strategy=kafka
org.opennms.core.ipc.sink.initialSleepTime=60000
org.opennms.core.ipc.kafka.bootstrap.servers=a.b.c.d:9092
Kafka is on the same VM as the Minion, but I do not understand why it is trying to connect to localhost/127.0.0.1:9092 if I have the correct Kafka IP in the config.
Hi,
Running the playbook as-is, it fails because the dict is not defined:
FAILED! => {"msg": "'opennms_minion_customer_system_properties' is undefined. 'opennms_minion_customer_system_properties' is undefined"}
How can I skip this error if I don't have any custom parameters to add to the dict? Adding just the line below to the default vars does not work, as it expects a dict.
opennms_minion_customer_system_properties:
Thanks
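For reference, a bare key in YAML parses as null rather than as an empty mapping, which is why the line above still fails; declaring an explicit empty dict looks like this:

```yaml
# {} is YAML's empty mapping, so the variable is a dict with no entries
opennms_minion_customer_system_properties: {}
```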
The playbooks we have work great for setting OpenNMS up. We should also have a feature to run backups and restores based on the inventory.
You can easily deploy services like Elasticsearch, Kafka, Mimir, and Grafana with the OpenNMS plugin. But if users want to enable flows, they have to configure the Elasticsearch persistence and the Mimir persistence manually. It would be easier if we let the user set a few variables like flows: enabled or timeseries_integration: mimir|rrdtool and configured the stack automatically.
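A sketch of what such inventory toggles could look like; the variable names are hypothetical, taken from the idea above:

```yaml
# Hypothetical high-level switches in the inventory
flows: enabled
timeseries_integration: mimir   # or: rrdtool

# Roles could then gate their persistence tasks on them, e.g.:
# when: flows == "enabled" and timeseries_integration == "mimir"
```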
The Ansible role for the Minion does not define anything for the sink configuration, only for IPC.
When you install the PostgreSQL database and configure the listening interface, the role fails with the error message:
"Unsupported parameters for (ansible.builtin.lineinfile) module: search_string Supported parameters include: attributes, backrefs, backup, create, firstmatch, group, insertafter, insertbefore, line, mode, owner, path, regexp, selevel, serole, setype, seuser, state, unsafe_writes, validate"
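The search_string parameter was only added to ansible.builtin.lineinfile in ansible-core 2.11, so older control nodes fail with this message. A sketch of the same edit using regexp, which is supported everywhere (file path and line are examples):

```yaml
- name: Set the PostgreSQL listen address (regexp works on older ansible-core)
  ansible.builtin.lineinfile:
    path: /var/lib/pgsql/data/postgresql.conf
    regexp: '^#?listen_addresses\s*='
    line: "listen_addresses = '*'"
```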
Update the Horizon package from 32.0.2 to 32.0.3.
When executing the Kafka role, the Kafka service does not start due to this error:
[2024-05-08 14:48:54,082] ERROR Exiting Kafka due to fatal exception (kafka.Kafka$)
org.apache.kafka.common.KafkaException: Cannot start server since `meta.properties` could not be loaded from /var/log/kafka/combined-logs
at kafka.server.KafkaRaftServer$.initializeLogDirs(KafkaRaftServer.scala:145)
at kafka.server.KafkaRaftServer.<init>(KafkaRaftServer.scala:57)
at kafka.Kafka$.buildServer(Kafka.scala:83)
at kafka.Kafka$.main(Kafka.scala:91)
at kafka.Kafka.main(Kafka.scala)
The files below do not have the right permissions:
[root@stldnonmswt01 combined-logs]# ls -lrth
total 8.0K
-rw-r-----. 1 root root 86 May 8 14:48 meta.properties
-rw-r-----. 1 root root 249 May 8 14:48 bootstrap.checkpoint
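A sketch of a remediation task using the log directory from the error above; that the service runs as a kafka user and group is an assumption to verify against the role's service unit:

```yaml
- name: Ensure the Kafka log directory is owned by the service user
  ansible.builtin.file:
    path: /var/log/kafka/combined-logs
    state: directory
    owner: kafka    # assumption: service account used by the unit file
    group: kafka
    recurse: true
```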
To test the deployment, create a default inventory that installs Kafka, Minion, PostgreSQL, Core, and Sentinel on a single host.
We have changed our OpenNMS data sources XML and the configuration template needs to be upgraded to be fully compatible with Horizon 32.0.4+.