Elasticsearch Component for Home-Assistant

Publish Home Assistant events to your Elasticsearch cluster!

Documentation: https://legrego.github.io/homeassistant-elasticsearch/

Getting started

Visit our documentation site for instructions on installing, configuring, and using this component.

Features

  • Efficiently publishes Home-Assistant events to Elasticsearch using the Bulk API
  • Automatically sets up Datastreams using Time Series Data Streams ("TSDS"), Datastream Lifecycle Management ("DLM"), or Index Lifecycle Management ("ILM") depending on your cluster's capabilities
  • Supports Elastic's stack security features via optional username, password, and API keys
  • Selectively publish events based on domains or entities
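
As a rough sketch, selective publishing is configured in configuration.yaml. The shape below is based on the exclude block that appears in the issues further down; the `entities` key and exact option names are assumptions, so consult the documentation site for the authoritative schema:

```yaml
elastic:
  url: https://my-elastic-cluster:9200
  username: hass
  password: changeme
  exclude:
    # Skip entire domains...
    domains: ["automation", "group"]
    # ...or individual entities (key name assumed)
    entities: ["sensor.noisy_sensor"]
```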

Inspiration

HVAC Usage

Graph your home's climate and HVAC usage:


Weather Station

Visualize and alert on data from your weather station:


Additional examples

Some usage examples inspired by real users:

  • One user runs a Raspberry Pi in kiosk mode on a 15" display, cycling through fullscreen Elasticsearch Canvas dashboards that show metrics collected from various Home Assistant integrations.
  • Temperature sensors in each refrigerator and freezer report to Home Assistant, which publishes the readings to Elasticsearch. Kibana alerting rules notify the user when temperatures stay out of range for an extended period; the Elastic rule engine and aggregations make this easy to monitor.
  • A user monitors the humidity and temperature of their daughter's snake enclosure with Elastic's Alerting framework, which they found more intuitive than Home Assistant automations for this use case.
  • Publishing a small subset of stats of interest allows far longer retention than is practical with Home Assistant and databases like MariaDB/MySQL, enabling very long-term trend analysis, such as of weather data.

Create your own cluster health sensor

Versions prior to 0.6.0 included a cluster health sensor. This has been removed in favor of a more generic approach. You can create your own cluster health sensor by using Home Assistant's built-in REST sensor.

# Example configuration
sensor:
  - platform: rest
    name: "Cluster Health"
    unique_id: "cluster_health" # Replace with your own unique id. See https://www.home-assistant.io/integrations/sensor.rest#unique_id
    resource: "https://example.com/_cluster/health" # Replace with your Elasticsearch URL
    username: hass # Replace with your username
    password: changeme # Replace with your password
    value_template: "{{ value_json.status }}"
    json_attributes: # Optional attributes you may want to include from the /_cluster/health API response
      - "cluster_name"
      - "status"
      - "timed_out"
      - "number_of_nodes"
      - "number_of_data_nodes"
      - "active_primary_shards"
      - "active_shards"
      - "relocating_shards"
      - "initializing_shards"
      - "unassigned_shards"
      - "delayed_unassigned_shards"
      - "number_of_pending_tasks"
      - "number_of_in_flight_fetch"
      - "task_max_waiting_in_queue_millis"
      - "active_shards_percent_as_number"

Support

This project is not endorsed or supported by either Elastic or Home-Assistant - please open a GitHub issue for any questions, bugs, or feature requests.

Contributing

Contributions are welcome! Please see the Contributing Guide for more information.

Contributors

chrgro, dependabot[bot], geekpete, imotov, jakommo, jeanfabrice, ktibow, legrego, markwalkom, philippkahr, robin13, ronspawnson, strawgate, wrt54g


homeassistant-elasticsearch's Issues

exclude attribute change

I have an outlet that exposes an attribute "current_power_w". The value of this attribute changes constantly, which causes a log entry in Elastic on every change. Is there a way to ignore changes to this attribute without excluding the outlet's on/off state changes entirely?

Create service endpoints

This component should expose service endpoints to enable automations and scripts to write data to the Elasticsearch cluster.
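
If such a service existed, an automation might invoke it roughly like this. Note that the service name (`elastic.publish`) and its data fields are purely hypothetical — this feature request is not yet implemented, so this is only a sketch of the idea:

```yaml
automation:
  - alias: "Log door events to Elasticsearch"
    trigger:
      - platform: state
        entity_id: binary_sensor.front_door
    action:
      - service: elastic.publish   # hypothetical service name
        data:
          index: "hass-custom-events"   # hypothetical field
          document:
            event: "door_state_changed"
            state: "{{ trigger.to_state.state }}"
```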

strict_dynamic_mapping_exception

Environment
Home-Assistant version: 0.91.4
Elasticsearch version: 6.6.0

Relevant configuration.yml settings:

elastic:
  url: https://my-elastic-cluster:9200
  username: elastic
  password: <redacted>
  exclude:
    domains: [ "home", "group"]

Describe the bug
Logs full of errors like this:

Log Details (ERROR)
Tue Apr 23 2019 16:14:31 GMT+0200 (Central European Summer Time)
Error publishing documents to Elasticsearch: ('102 document(s) failed to index.', [{'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'g8OMSmoBMDSTfnrZVKK1', 'status': 400, 'error': {'type': 'strict_dynamic_mapping_exception', 'reason': 'mapping set to strict, dynamic introduction of [hass] within [doc] is not allowed'}, 'data': {'hass.domain': 'sensor', 'hass.object_id': 'es_publish_queue', 'hass.entity_id': 'sensor.es_publish_queue', 'hass.attributes': {'last_publish_time': datetime.datetime(2019, 4, 23, 16, 13, 31, 221399)}, 'hass.value': 0.0, '@timestamp': datetime.datetime(2019, 4, 23, 14, 13, 36, 24849, tzinfo=<UTC>), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.91.4', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 48.1042189, 'lon': 10.9854111}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass']}}}, 
...

It looks to me as if the mapping has not been applied correctly, or rather that it has not been recognised that there is already an index with incorrect mapping and a new index should be created.

GET /hass-events-v2-000001/_mapping

{
  "hass-events-v2-000001" : {
    "mappings" : {
      "doc" : {
        "dynamic" : "strict",
        "properties" : {
          "@timestamp" : {
            "type" : "date"
          },
          "attributes" : {
            "dynamic" : "true",
            "properties" : {
              "accepted" : {
                "type" : "date"
              },
              "active_primary_shards" : {
                "type" : "long"
              },
              "active_shards" : {
                "type" : "long"
              },
              "active_shards_percent_as_number" : {
                "type" : "float"
              },
              "attribution" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "azimuth" : {
                "type" : "float"
              },
              "brightness" : {
                "type" : "long"
              },
              "bytes_received" : {
                "type" : "long"
              },
              "bytes_sent" : {
                "type" : "long"
              },
              "can_cancel" : {
                "type" : "boolean"
              },
              "closed" : {
                "type" : "date"
              },
              "cluster_name" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "color_temp" : {
                "type" : "long"
              },
              "days_offset" : {
                "type" : "long"
              },
              "delayed_unassigned_shards" : {
                "type" : "long"
              },
              "device" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "device_class" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "duration" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "elevation" : {
                "type" : "float"
              },
              "entity_picture" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "es_location" : {
                "type" : "geo_point"
              },
              "excludes" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "external_ip" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "friendly_name" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "from" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "from_name" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "hidden" : {
                "type" : "boolean"
              },
              "hs_color" : {
                "type" : "float"
              },
              "icon" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "initializing_shards" : {
                "type" : "long"
              },
              "initiated" : {
                "type" : "date"
              },
              "is_connected" : {
                "type" : "boolean"
              },
              "is_linked" : {
                "type" : "boolean"
              },
              "is_volume_muted" : {
                "type" : "boolean"
              },
              "last_action" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "last_publish_time" : {
                "type" : "date"
              },
              "last_triggered" : {
                "type" : "date"
              },
              "latitude" : {
                "type" : "float"
              },
              "longitude" : {
                "type" : "float"
              },
              "max" : {
                "type" : "long"
              },
              "max_byte_rate_down" : {
                "type" : "float"
              },
              "max_byte_rate_up" : {
                "type" : "float"
              },
              "max_mireds" : {
                "type" : "long"
              },
              "media_album_name" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "media_artist" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "media_duration" : {
                "type" : "long"
              },
              "media_title" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "media_track" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "message" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "min" : {
                "type" : "long"
              },
              "min_mireds" : {
                "type" : "long"
              },
              "mode" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "next_dawn" : {
                "type" : "date"
              },
              "next_dusk" : {
                "type" : "date"
              },
              "next_midnight" : {
                "type" : "date"
              },
              "next_noon" : {
                "type" : "date"
              },
              "next_rising" : {
                "type" : "date"
              },
              "next_setting" : {
                "type" : "date"
              },
              "number_of_data_nodes" : {
                "type" : "long"
              },
              "number_of_in_flight_fetch" : {
                "type" : "long"
              },
              "number_of_nodes" : {
                "type" : "long"
              },
              "number_of_pending_tasks" : {
                "type" : "long"
              },
              "pattern" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "radius" : {
                "type" : "long"
              },
              "release_notes" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "relocating_shards" : {
                "type" : "long"
              },
              "rgb_color" : {
                "type" : "long"
              },
              "status" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "supported_features" : {
                "type" : "long"
              },
              "task_max_waiting_in_queue_millis" : {
                "type" : "long"
              },
              "timed_out" : {
                "type" : "boolean"
              },
              "title" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "to" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "to_name" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "transmission_rate_down" : {
                "type" : "long"
              },
              "transmission_rate_up" : {
                "type" : "long"
              },
              "type" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "unassigned_shards" : {
                "type" : "long"
              },
              "unit_of_measurement" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "uptime" : {
                "type" : "long"
              },
              "volume_level" : {
                "type" : "float"
              },
              "wan_access_type" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "white_value" : {
                "type" : "long"
              },
              "with" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "with_name" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "workdays" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "xy_color" : {
                "type" : "float"
              }
            }
          },
          "domain" : {
            "type" : "keyword"
          },
          "entity_id" : {
            "type" : "keyword"
          },
          "object_id" : {
            "type" : "keyword"
          },
          "time" : {
            "type" : "date"
          },
          "value" : {
            "type" : "text",
            "fields" : {
              "float" : {
                "type" : "float",
                "ignore_malformed" : true
              },
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 2048
              }
            }
          }
        }
      }
    }
  }
}

Discuss: Contribute component to home-assistant

Once we conform to ECS (#46) and cleanup a few other issues, I'm considering offering this component to home-assistant as a first-class component. That would give us broader exposure, and hopefully increased adoption.

If we do that though, I'm wondering how we'll manage support for different versions of the Elastic Stack. If we keep the component here, I can manage different branches for the different target versions of the stack. If we move the component to home-assistant, then we will either:

  1. Need a way to be backwards compatible for a certain number of versions. Unsure what this looks like between majors (6.x vs 7.x vs 8.x), or if that's even possible.
  2. Only support the latest major version - this creates a potential upgrade nightmare for users, as upgrading home-assistant could break this component for them until they upgrade their entire cluster.
  3. Forever only support 6.x. I don't see this as a viable option, as users will want to upgrade their clusters, and not be stuck on a specific major version for eternity.

only_publish_changed: true - not working?

To reduce the amount of data stored, I would like to store a value only when it changes. For this I can use the option only_publish_changed: true

This is turned on, but it seems like it is logging the same value several times:

[screenshot: two consecutive documents with the same value]

As you can see, both documents have the same value, and the entity is sorted by time.
I was not expecting to see two entries with value 17.

Yes, Home Assistant has been restarted after adding the config variable to the configuration. My config is as follows:

elastic:
  # URL should point to your Elasticsearch cluster
  url: !secret elkserver
  only_publish_changed: true

`value` property mapping should conditionally support floats

The value property mapping is currently a string & keyword field, but for certain visualizations/aggregations in ES, it is more useful to treat it as a float/int.

Not all hass values are numeric, so we should support this behavior conditionally.

Reorganize code

Push the code into a /src or /custom_components folder for easier navigation.

Add include_domains and include_entities

Is your feature request related to a problem? Please describe.
Currently, the configuration file allows for excluding domains and entities. If one needs to send only a small subset of data to ES, they need to add all other domains to the exclude_domains array.

Describe the solution you'd like
Add include_domains and include_entities, with a precedence such that included entities override excluded domains (and excluded entities override included domains?).
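
A sketch of what the proposed configuration might look like. These option names come from this feature request, not from a released version, so treat the whole block as hypothetical:

```yaml
elastic:
  url: https://my-elastic-cluster:9200
  include:
    domains: ["sensor", "climate"]
    entities: ["binary_sensor.front_door"]  # would override an excluded domain
  exclude:
    entities: ["sensor.noisy_sensor"]       # would override an included domain
```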

object_id is conflated with entity_id in index mapping

The index mapping includes the entity_id, but in reality, we are shipping the object_id. We should be sending both object_id and entity_id to ES.

This will be a "breaking" change, which will essentially invalidate existing indices/templates.

Support config flows

Home-Assistant (as of 0.96) supports configuration flows for custom components. We should take advantage of this to allow users to set up this component via the UI, not just via configuration.yaml

Support watcher jobs

We should support creating a watcher job to send alerts if HASS data hasn't been sent to Elasticsearch within a specific period of time. This is useful to detect internet outages or local network/HASS issues.

Enable only health monitoring

Is your feature request related to a problem? Please describe.
No

Describe the solution you'd like
I would like the ability to enable only health monitoring of the Elasticsearch stack, i.e. a way to disable publishing events.

Hass domain ZONE

Environment
Home-Assistant version: 0.90.2
Elasticsearch version: 7.0.0-rc1

Relevant configuration.yml settings:

elastic:
    url: http://192.168.1.187:9200

Describe the bug
Various errors in the HA log about zones.

mapper_parsing_exception, failed to parse field [hass.value] of type [long] in document with id 'gHZGyWkBVxYPvKEP6fdQ' caused_by illegal_argument_exception, For input string: "zoning".

Error publishing documents to Elasticsearch: ('5 document(s) failed to index.', [{'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'gHZGyWkBVxYPvKEP6fdQ', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.value] of type [long] in document with id 'gHZGyWkBVxYPvKEP6fdQ'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "zoning"'}}, 'data': {'hass.domain': 'zone', 'hass.object_id': 'home', 'hass.entity_id': 'zone.home', 'hass.attributes': {'hidden': True, 'latitude': 50.8823, 'longitude': 4.7028, 'radius': 100, 'friendly_name': 'Statik', 'icon': 'mdi:home'}, 'hass.value': 'zoning', '@timestamp': datetime.datetime(2019, 3, 29, 12, 47, 40, 729443), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.90.2', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 50.8823, 'lon': 4.7028}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass'], 'hass.geo.location': {'lat': 50.8823, 'lon': 4.7028}}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'gnZGyWkBVxYPvKEP6fdQ', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.value] of type [long] in document with id 'gnZGyWkBVxYPvKEP6fdQ'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "sunny"'}}, 'data': {'hass.domain': 'weather', 'hass.object_id': 'leuven', 'hass.entity_id': 'weather.leuven', 'hass.attributes': {'temperature': 15.4, 'humidity': 52, 'pressure': 1030.2, 'wind_bearing': 75, 'wind_speed': 1.8, 'visibility': 21100, 'attribution': 'Data provided by buienradar.nl', 'forecast': [{'datetime': datetime.datetime(2019, 3, 30, 12, 0, tzinfo=<DstTzInfo 'Europe/Amsterdam' CET+1:00:00 STD>), 'condition': 'partlycloudy', 'templow': 5.0, 'temperature': 18.0}, {'datetime': datetime.datetime(2019, 3, 31, 12, 0, tzinfo=<DstTzInfo 'Europe/Amsterdam' 
CET+1:00:00 STD>), 'condition': 'partlycloudy', 'templow': 5.0, 'temperature': 9.0}, {'datetime': datetime.datetime(2019, 4, 1, 12, 0, tzinfo=<DstTzInfo 'Europe/Amsterdam' CET+1:00:00 STD>), 'condition': 'partlycloudy', 'templow': -1.0, 'temperature': 10.0}, {'datetime': datetime.datetime(2019, 4, 2, 12, 0, tzinfo=<DstTzInfo 'Europe/Amsterdam' CET+1:00:00 STD>), 'condition': 'pouring', 'templow': 1.0, 'temperature': 9.0}, {'datetime': datetime.datetime(2019, 4, 3, 12, 0, tzinfo=<DstTzInfo 'Europe/Amsterdam' CET+1:00:00 STD>), 'condition': 'rainy', 'templow': 2.0, 'temperature': 7.0}], 'friendly_name': 'Leuven'}, 'hass.value': 'sunny', '@timestamp': datetime.datetime(2019, 3, 29, 12, 47, 40, 729443), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.90.2', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 50.8823, 'lon': 4.7028}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass']}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'hnZGyWkBVxYPvKEP6fdQ', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.value] of type [long] in document with id 'hnZGyWkBVxYPvKEP6fdQ'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "paused"'}}, 'data': {'hass.domain': 'media_player', 'hass.object_id': 'octopus', 'hass.entity_id': 'media_player.octopus', 'hass.attributes': {'volume_level': 0.14, 'is_volume_muted': False, 'media_content_type': 'music', 'media_duration': 0, 'media_position': 0, 'media_position_updated_at': datetime.datetime(2019, 3, 29, 11, 17, 34, 376826, tzinfo=<UTC>), 'media_title': '', 'media_artist': '', 'media_album_name': '', 'source_list': ['C-Dance', 'dorst', 'dorst.m4a', 'FM Brussel', 'Gozadera Fm', 'RGR FM 106.5 (Electronic and Dance)', 'RTBF Pure FM 88.8 (Top-40-Pop)', 'VRT Radio 1', 'VRT Studio Brussel', 'Where Does The Good Go'], 'shuffle': False, 'sonos_group': 
['media_player.octopus'], 'friendly_name': 'Octopus', 'supported_features': 64063}, 'hass.value': 'paused', '@timestamp': datetime.datetime(2019, 3, 29, 12, 47, 40, 729443), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.90.2', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 50.8823, 'lon': 4.7028}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass']}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'h3ZGyWkBVxYPvKEP6fdQ', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.value] of type [long] in document with id 'h3ZGyWkBVxYPvKEP6fdQ'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "paused"'}}, 'data': {'hass.domain': 'media_player', 'hass.object_id': 'rokkie_rollie', 'hass.entity_id': 'media_player.rokkie_rollie', 'hass.attributes': {'volume_level': 0.52, 'is_volume_muted': False, 'media_content_type': 'music', 'media_duration': 71, 'media_position': 0, 'media_position_updated_at': datetime.datetime(2019, 3, 29, 11, 17, 34, 502238, tzinfo=<UTC>), 'media_title': 'Happy Birthday to You - Piano Version', 'media_artist': 'Happy Birthday', 'media_album_name': 'Happy Birthday to You', 'source_list': ['C-Dance', 'dorst', 'dorst.m4a', 'FM Brussel', 'Gozadera Fm', 'RGR FM 106.5 (Electronic and Dance)', 'RTBF Pure FM 88.8 (Top-40-Pop)', 'VRT Radio 1', 'VRT Studio Brussel', 'Where Does The Good Go'], 'shuffle': False, 'sonos_group': ['media_player.rokkie_rollie'], 'friendly_name': 'Rokkie&Rollie', 'entity_picture': '/api/media_player_proxy/media_player.rokkie_rollie?token=8a7ae8a81b0889a2833cfeeca63dd98210dbcebae9f3a728b1e3d9dbeb20d40b&cache=355c957f9c61d46b', 'supported_features': 64063}, 'hass.value': 'paused', '@timestamp': datetime.datetime(2019, 3, 29, 12, 47, 40, 729443), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.90.2', 'ecs.version': 
'1.0.0', 'host.geo.location': {'lat': 50.8823, 'lon': 4.7028}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass']}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'iXZGyWkBVxYPvKEP6fdQ', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.value] of type [long] in document with id 'iXZGyWkBVxYPvKEP6fdQ'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "yellow"'}}, 'data': {'hass.domain': 'sensor', 'hass.object_id': 'es_cluster_health', 'hass.entity_id': 'sensor.es_cluster_health', 'hass.attributes': {'cluster_name': 'docker-cluster', 'status': 'yellow', 'timed_out': False, 'number_of_nodes': 1, 'number_of_data_nodes': 1, 'active_primary_shards': 4, 'active_shards': 4, 'relocating_shards': 0, 'initializing_shards': 0, 'unassigned_shards': 2, 'delayed_unassigned_shards': 0, 'number_of_pending_tasks': 0, 'number_of_in_flight_fetch': 0, 'task_max_waiting_in_queue_millis': 0, 'active_shards_percent_as_number': 66.66666666666666}, 'hass.value': 'yellow', '@timestamp': datetime.datetime(2019, 3, 29, 12, 47, 40, 729443), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.90.2', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 50.8823, 'lon': 4.7028}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass']}}}])
Traceback (most recent call last):
  File "/config/custom_components/elastic/__init__.py", line 309, in do_publish
    bulk_response = bulk(self._gateway.get_client(), actions)
  File "/config/deps/lib/python3.7/site-packages/elasticsearch/helpers/__init__.py", line 257, in bulk
    for ok, item in streaming_bulk(client, actions, *args, **kwargs):
  File "/config/deps/lib/python3.7/site-packages/elasticsearch/helpers/__init__.py", line 192, in streaming_bulk
    raise_on_error, *args, **kwargs)
  File "/config/deps/lib/python3.7/site-packages/elasticsearch/helpers/__init__.py", line 137, in _process_bulk_chunk
    raise BulkIndexError('%i document(s) failed to index.' % len(errors), errors)
elasticsearch.helpers.BulkIndexError: ('5 document(s) failed to index.', [{'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'gHZGyWkBVxYPvKEP6fdQ', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.value] of type [long] in document with id 'gHZGyWkBVxYPvKEP6fdQ'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "zoning"'}}, 'data': {'hass.domain': 'zone', 'hass.object_id': 'home', 'hass.entity_id': 'zone.home', 'hass.attributes': {'hidden': True, 'latitude': 50.8823, 'longitude': 4.7028, 'radius': 100, 'friendly_name': 'Statik', 'icon': 'mdi:home'}, 'hass.value': 'zoning', '@timestamp': datetime.datetime(2019, 3, 29, 12, 47, 40, 729443), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.90.2', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 50.8823, 'lon': 4.7028}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass'], 'hass.geo.location': {'lat': 50.8823, 'lon': 4.7028}}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'gnZGyWkBVxYPvKEP6fdQ', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.value] of type [long] in document with id 'gnZGyWkBVxYPvKEP6fdQ'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "sunny"'}}, 'data': {'hass.domain': 'weather', 'hass.object_id': 'leuven', 'hass.entity_id': 'weather.leuven', 'hass.attributes': {'temperature': 15.4, 'humidity': 52, 'pressure': 1030.2, 'wind_bearing': 75, 'wind_speed': 1.8, 'visibility': 21100, 'attribution': 'Data provided by buienradar.nl', 'forecast': [{'datetime': datetime.datetime(2019, 3, 30, 12, 0, tzinfo=<DstTzInfo 'Europe/Amsterdam' CET+1:00:00 STD>), 'condition': 'partlycloudy', 'templow': 5.0, 'temperature': 18.0}, {'datetime': datetime.datetime(2019, 3, 31, 12, 0, tzinfo=<DstTzInfo 'Europe/Amsterdam' CET+1:00:00 
STD>), 'condition': 'partlycloudy', 'templow': 5.0, 'temperature': 9.0}, {'datetime': datetime.datetime(2019, 4, 1, 12, 0, tzinfo=<DstTzInfo 'Europe/Amsterdam' CET+1:00:00 STD>), 'condition': 'partlycloudy', 'templow': -1.0, 'temperature': 10.0}, {'datetime': datetime.datetime(2019, 4, 2, 12, 0, tzinfo=<DstTzInfo 'Europe/Amsterdam' CET+1:00:00 STD>), 'condition': 'pouring', 'templow': 1.0, 'temperature': 9.0}, {'datetime': datetime.datetime(2019, 4, 3, 12, 0, tzinfo=<DstTzInfo 'Europe/Amsterdam' CET+1:00:00 STD>), 'condition': 'rainy', 'templow': 2.0, 'temperature': 7.0}], 'friendly_name': 'Leuven'}, 'hass.value': 'sunny', '@timestamp': datetime.datetime(2019, 3, 29, 12, 47, 40, 729443), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.90.2', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 50.8823, 'lon': 4.7028}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass']}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'hnZGyWkBVxYPvKEP6fdQ', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.value] of type [long] in document with id 'hnZGyWkBVxYPvKEP6fdQ'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "paused"'}}, 'data': {'hass.domain': 'media_player', 'hass.object_id': 'octopus', 'hass.entity_id': 'media_player.octopus', 'hass.attributes': {'volume_level': 0.14, 'is_volume_muted': False, 'media_content_type': 'music', 'media_duration': 0, 'media_position': 0, 'media_position_updated_at': datetime.datetime(2019, 3, 29, 11, 17, 34, 376826, tzinfo=<UTC>), 'media_title': '', 'media_artist': '', 'media_album_name': '', 'source_list': ['C-Dance', 'dorst', 'dorst.m4a', 'FM Brussel', 'Gozadera Fm', 'RGR FM 106.5 (Electronic and Dance)', 'RTBF Pure FM 88.8 (Top-40-Pop)', 'VRT Radio 1', 'VRT Studio Brussel', 'Where Does The Good Go'], 'shuffle': False, 'sonos_group': 
['media_player.octopus'], 'friendly_name': 'Octopus', 'supported_features': 64063}, 'hass.value': 'paused', '@timestamp': datetime.datetime(2019, 3, 29, 12, 47, 40, 729443), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.90.2', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 50.8823, 'lon': 4.7028}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass']}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'h3ZGyWkBVxYPvKEP6fdQ', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.value] of type [long] in document with id 'h3ZGyWkBVxYPvKEP6fdQ'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "paused"'}}, 'data': {'hass.domain': 'media_player', 'hass.object_id': 'rokkie_rollie', 'hass.entity_id': 'media_player.rokkie_rollie', 'hass.attributes': {'volume_level': 0.52, 'is_volume_muted': False, 'media_content_type': 'music', 'media_duration': 71, 'media_position': 0, 'media_position_updated_at': datetime.datetime(2019, 3, 29, 11, 17, 34, 502238, tzinfo=<UTC>), 'media_title': 'Happy Birthday to You - Piano Version', 'media_artist': 'Happy Birthday', 'media_album_name': 'Happy Birthday to You', 'source_list': ['C-Dance', 'dorst', 'dorst.m4a', 'FM Brussel', 'Gozadera Fm', 'RGR FM 106.5 (Electronic and Dance)', 'RTBF Pure FM 88.8 (Top-40-Pop)', 'VRT Radio 1', 'VRT Studio Brussel', 'Where Does The Good Go'], 'shuffle': False, 'sonos_group': ['media_player.rokkie_rollie'], 'friendly_name': 'Rokkie&Rollie', 'entity_picture': '/api/media_player_proxy/media_player.rokkie_rollie?token=8a7ae8a81b0889a2833cfeeca63dd98210dbcebae9f3a728b1e3d9dbeb20d40b&cache=355c957f9c61d46b', 'supported_features': 64063}, 'hass.value': 'paused', '@timestamp': datetime.datetime(2019, 3, 29, 12, 47, 40, 729443), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.90.2', 'ecs.version': 
'1.0.0', 'host.geo.location': {'lat': 50.8823, 'lon': 4.7028}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass']}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'iXZGyWkBVxYPvKEP6fdQ', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.value] of type [long] in document with id 'iXZGyWkBVxYPvKEP6fdQ'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "yellow"'}}, 'data': {'hass.domain': 'sensor', 'hass.object_id': 'es_cluster_health', 'hass.entity_id': 'sensor.es_cluster_health', 'hass.attributes': {'cluster_name': 'docker-cluster', 'status': 'yellow', 'timed_out': False, 'number_of_nodes': 1, 'number_of_data_nodes': 1, 'active_primary_shards': 4, 'active_shards': 4, 'relocating_shards': 0, 'initializing_shards': 0, 'unassigned_shards': 2, 'delayed_unassigned_shards': 0, 'number_of_pending_tasks': 0, 'number_of_in_flight_fetch': 0, 'task_max_waiting_in_queue_millis': 0, 'active_shards_percent_as_number': 66.66666666666666}, 'hass.value': 'yellow', '@timestamp': datetime.datetime(2019, 3, 29, 12, 47, 40, 729443), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.90.2', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 50.8823, 'lon': 4.7028}, 'host.architecture': 'armv7l', 'host.os.name': 'Linux', 'host.hostname': 'homeassistant', 'tags': ['hass']}}}])

To Reproduce
I just enabled the component and restarted HA.

Expected behavior
No errors and zones being logged.

Settings for index_format and alias are not applied

Environment
Home-Assistant version: hass.io 0.86.2
Elasticsearch version: 6.5.4

Relevant configuration.yml settings:

elastic:
  # URL should point to your Elasticsearch cluster
  url: http://elasticsearch.internal:9200
  index_format:     "hass"
  alias:            "hass-active" 
  publish_frequency: 10
  rollover_max_size: 1gb
  #rollover_max_age: 
  #rollover_max_docs: 

Describe the bug
When I restart my hass.io instance, the settings for index_format and alias are not used. Instead, Elasticsearch contains only one index, named "active-hass-index-v2", into which data is added.

Expected behavior
I expect the index "hass-00001" and the alias "hass-active" to be created.

To Reproduce
Load the quoted settings into hass.io and restart it.

Additional context
There is only one index template in elasticsearch which looks like this:

{
  "order": 0,
  "index_patterns": [
    "hass-events-v2*"
  ],
  "settings": {
    "index": {
      "number_of_shards": "1"
    }
  },
  "mappings": {
    "doc": {
      "dynamic": "strict",
      "properties": {
        "domain": {
          "type": "keyword"
        },
        "object_id": {
          "type": "keyword"
        },
        "entity_id": {
          "type": "keyword"
        },
        "attributes": {
          "type": "object",
          "dynamic": true,
          "properties": {
            "es_location": {
              "type": "geo_point"
            }
          }
        },
        "@timestamp": {
          "type": "date"
        },
        "value": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 2048
            },
            "float": {
              "type": "float",
              "ignore_malformed": true
            }
          }
        }
      }
    }
  },
  "aliases": {
    "all-hass-events": {}
  }
}

Question

What do I have to do to make homeassistant-elasticsearch apply the new settings?

Support for "always changed"

I am using only_publish_changed, and it is great, but I have problems with binary sensors, where publishing at regular time intervals would make the data easier to analyse (and graph).

I imagine that adding a config structure similar to "exclude" (domain and entity lists) would be the way to go, if this is a feature that is wanted. If I were to choose a name, maybe "always_changed".

In my five minutes of looking at the code, I would imagine adding a section to do_publish(self), right below the if not self._only_publish_changed check, with something like elif <always_changed has anything in it>:. However, I am not yet confident enough about the code to assume this is right.
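As a rough illustration of the proposal (the function and parameter names here are hypothetical, not part of the component's actual code), the publish decision could look like:

```python
def should_publish(entity_id, state_changed, only_publish_changed, always_publish):
    """Decide whether an entity's state gets published this cycle.

    always_publish is the proposed "always_changed" set of entity ids
    that should be published every interval even when unchanged.
    Hypothetical sketch; not the component's actual API.
    """
    if not only_publish_changed:
        return True  # everything is published every cycle anyway
    if entity_id in always_publish:
        return True  # forced periodic publishing for this entity
    return state_changed  # default: publish only on change
```

This keeps the existing only_publish_changed behavior intact and only adds an allow-list that bypasses it.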

remove homeassistant version from requirements.txt?

Currently the requirements.txt lists homeassistant==0.69.1. When I installed this plugin a few weeks ago, the first thing I did was pip install -r requirements.txt (without really checking the content) and this resulted in a downgrade of my homeassistant which was already at 0.77 or so.

I think it would make sense to drop homeassistant from the requirements.txt, as this is an add-on to homeassistant, and I feel it's safe to assume someone who installs this plugin is already running homeassistant.
An alternative would be to pin with something like >=, which would not downgrade HA if it's already installed.

Thoughts?

Allow forcing lower-casing of entity ids.

Some devices append a .Battery_State while some append .battery_state.

This could be reconciled by forcing entity ids to lower-case, and/or emitting a separate mapped field that is forced to lower-case.
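A minimal sketch of the normalization (the helper name is hypothetical, not part of the component):

```python
def normalize_entity_id(entity_id):
    # Lower-case the id so 'sensor.Battery_State' and 'sensor.battery_state'
    # map to the same entity id / field name in Elasticsearch.
    return entity_id.lower()
```

Applying this just before a document is built would make the two spellings indistinguishable downstream.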

ilm_max_size vs ilm_hot_max_size

Environment
Home-Assistant version: 0.106
Elasticsearch component version: 0.2.0

Relevant configuration.yml settings:

elastic:
    url: SOME_URL
    publish_frequency: 2
    only_publish_changed: true
    ilm_hot_max_size: 10gb
    ilm_delete_after: 30d

Describe the bug
The README documentation lists ilm_hot_max_size, which does not work. The real property name appears to be ilm_max_size.

To Reproduce
Use the above yaml - validation will fail.
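Until the README is fixed, a configuration using the option name that the component actually validates (per this report) would look like:

```yaml
elastic:
    url: SOME_URL
    publish_frequency: 2
    only_publish_changed: true
    # README documents this as ilm_hot_max_size, but validation expects ilm_max_size
    ilm_max_size: 10gb
    ilm_delete_after: 30d
```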

Cannot connect to ElasticSearch running on HTTPS

I'm trying the plugin from hass.io on a Pi but cannot connect to an Elasticsearch instance running on HTTPS. I have the right certificates (Let's Encrypt) configured on ES, and connecting with a browser is no issue, yet all attempts from the component result in SSL errors. The configuration seems to ignore the verify_ssl: false option.

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/urllib3/contrib/pyopenssl.py", line 485, in wrap_socket
    cnx.do_handshake()
  File "/usr/local/lib/python3.7/site-packages/OpenSSL/SSL.py", line 1934, in do_handshake
    self._raise_ssl_error(self._ssl, result)
  File "/usr/local/lib/python3.7/site-packages/OpenSSL/SSL.py", line 1671, in _raise_ssl_error
    _raise_current_error()
  File "/usr/local/lib/python3.7/site-packages/OpenSSL/_util.py", line 54, in exception_from_error_queue
    raise exception_type(errors)
OpenSSL.SSL.Error: [('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 672, in urlopen
    chunked=chunked,
  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 376, in _make_request
    self._validate_conn(conn)
  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 994, in _validate_conn
    conn.connect()
  File "/usr/local/lib/python3.7/site-packages/urllib3/connection.py", line 394, in connect
    ssl_context=context,
  File "/usr/local/lib/python3.7/site-packages/urllib3/util/ssl_.py", line 370, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
  File "/usr/local/lib/python3.7/site-packages/urllib3/contrib/pyopenssl.py", line 491, in wrap_socket
    raise ssl.SSLError("bad handshake: %r" % e)
ssl.SSLError: ("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')])",)

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/elasticsearch/connection/http_urllib3.py", line 172, in perform_request
    response = self.pool.urlopen(method, url, body, retries=Retry(False), headers=request_headers, **kw)
  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 720, in urlopen
    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
  File "/usr/local/lib/python3.7/site-packages/urllib3/util/retry.py", line 376, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/usr/local/lib/python3.7/site-packages/urllib3/packages/six.py", line 734, in reraise
    raise value.with_traceback(tb)
  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 672, in urlopen
    chunked=chunked,
  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 376, in _make_request
    self._validate_conn(conn)
  File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 994, in _validate_conn
    conn.connect()
  File "/usr/local/lib/python3.7/site-packages/urllib3/connection.py", line 394, in connect
    ssl_context=context,
  File "/usr/local/lib/python3.7/site-packages/urllib3/util/ssl_.py", line 370, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
  File "/usr/local/lib/python3.7/site-packages/urllib3/contrib/pyopenssl.py", line 491, in wrap_socket
    raise ssl.SSLError("bad handshake: %r" % e)
urllib3.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')])",)

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/setup.py", line 170, in _async_setup_component
    hass, processed_config
  File "/config/custom_components/elastic/__init__.py", line 93, in async_setup
    publisher = DocumentPublisher(conf, gateway, hass, system_info)
  File "/config/custom_components/elastic/__init__.py", line 268, in __init__
    self._create_index_template()
  File "/config/custom_components/elastic/__init__.py", line 456, in _create_index_template
    es_version = self._gateway.get_es_version()
  File "/config/custom_components/elastic/__init__.py", line 160, in get_es_version
    version = self.client.info()["version"]
  File "/usr/local/lib/python3.7/site-packages/elasticsearch/client/utils.py", line 76, in _wrapped
    return func(*args, params=params, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/elasticsearch/client/__init__.py", line 241, in info
    return self.transport.perform_request('GET', '/', params=params)
  File "/usr/local/lib/python3.7/site-packages/elasticsearch/transport.py", line 318, in perform_request
    status, headers_response, data = connection.perform_request(method, url, params, body, headers=headers, ignore=ignore, timeout=timeout)
  File "/usr/local/lib/python3.7/site-packages/elasticsearch/connection/http_urllib3.py", line 178, in perform_request
    raise SSLError('N/A', str(e), e)
elasticsearch.exceptions.SSLError: ConnectionError(("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')])",)) caused by: SSLError(("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')])",))

Is this a problem on my end, or something with the elastic component in HA?
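For reference, disabling certificate verification in Python boils down to building an SSL context like the one below. This is a stdlib sketch of what a working verify_ssl: false option would have to request under the hood; the component's actual internals may differ.

```python
import ssl

def insecure_ssl_context():
    """Return an SSL context that skips certificate and hostname checks.

    This mirrors what verify_ssl: false should mean; only use it when
    you explicitly accept the risk of unverified TLS connections.
    """
    ctx = ssl.create_default_context()
    ctx.check_hostname = False      # must be disabled before verify_mode
    ctx.verify_mode = ssl.CERT_NONE # skip certificate chain validation
    return ctx
```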

NaN values are not supported

Log Details (ERROR)
Mon Mar 04 2019 13:44:13 GMT+0100 (Midden-Europese standaardtijd)
Error publishing documents to Elasticsearch: ('5 document(s) failed to index.', [{'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'wbW7SGkBaQlUzTS0rK9f', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': 'failed to parse', 'caused_by': {'type': 'json_parse_exception', 'reason': "Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@1ee844c6; line: 1, column: 257]"}}, 'data': {'domain': 'sensor', 'object_id': 'toon_p1_power_solar', 'entity_id': 'sensor.toon_p1_power_solar', 'attributes': {'unit_of_measurement': 'Watt', 'friendly_name': 'Toon P1 Power Solar', 'icon': 'mdi:weather-sunny'}, '@timestamp': datetime.datetime(2019, 3, 4, 13, 44, 12, 732634), 'value': nan}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'wrW7SGkBaQlUzTS0rK9f', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': 'failed to parse', 'caused_by': {'type': 'json_parse_exception', 'reason': "Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@24ea5bba; line: 1, column: 245]"}}, 'data': {'domain': 'sensor', 'object_id': 'toon_power_use_cnt', 'entity_id': 'sensor.toon_power_use_cnt', 'attributes': {'unit_of_measurement': 'kWh', 'friendly_name': 'Toon Power Use Cnt', 'icon': 'mdi:flash'}, '@timestamp': datetime.datetime(2019, 3, 4, 13, 44, 12, 732634), 'value': nan}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'xLW7SGkBaQlUzTS0rK9f', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': 'failed to parse', 'caused_by': {'type': 'json_parse_exception', 'reason': "Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at [Source: 
org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@292a954b; line: 1, column: 268]"}}, 'data': {'domain': 'sensor', 'object_id': 'toon_p1_power_solar_cnt', 'entity_id': 'sensor.toon_p1_power_solar_cnt', 'attributes': {'unit_of_measurement': 'kWh', 'friendly_name': 'Toon P1 Power Solar Cnt', 'icon': 'mdi:weather-sunny'}, '@timestamp': datetime.datetime(2019, 3, 4, 13, 44, 12, 732634), 'value': nan}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'ybW7SGkBaQlUzTS0rK9f', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': 'failed to parse', 'caused_by': {'type': 'json_parse_exception', 'reason': "Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@8ae6212; line: 1, column: 223]"}}, 'data': {'domain': 'sensor', 'object_id': 'toon_p1_heat', 'entity_id': 'sensor.toon_p1_heat', 'attributes': {'unit_of_measurement': '', 'friendly_name': 'Toon P1 Heat', 'icon': 'mdi:fire'}, '@timestamp': datetime.datetime(2019, 3, 4, 13, 44, 12, 732634), 'value': nan}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'yrW7SGkBaQlUzTS0rK9f', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': 'failed to parse', 'caused_by': {'type': 'json_parse_exception', 'reason': "Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@54bcc9b5; line: 1, column: 234]"}}, 'data': {'domain': 'sensor', 'object_id': 'toon_power_use', 'entity_id': 'sensor.toon_power_use', 'attributes': {'unit_of_measurement': 'Watt', 'friendly_name': 'Toon Power Use', 'icon': 'mdi:flash'}, '@timestamp': datetime.datetime(2019, 3, 4, 13, 44, 12, 732634), 'value': nan}}}])
Traceback (most recent call last):
  File "/config/custom_components/elastic/__init__.py", line 286, in do_publish
    bulk_response = bulk(self._gateway.get_client(), actions)
  File "/config/deps/lib/python3.7/site-packages/elasticsearch/helpers/__init__.py", line 257, in bulk
    for ok, item in streaming_bulk(client, actions, *args, **kwargs):
  File "/config/deps/lib/python3.7/site-packages/elasticsearch/helpers/__init__.py", line 192, in streaming_bulk
    raise_on_error, *args, **kwargs)
  File "/config/deps/lib/python3.7/site-packages/elasticsearch/helpers/__init__.py", line 137, in _process_bulk_chunk
    raise BulkIndexError('%i document(s) failed to index.' % len(errors), errors)
elasticsearch.helpers.BulkIndexError: ('5 document(s) failed to index.', [{'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'wbW7SGkBaQlUzTS0rK9f', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': 'failed to parse', 'caused_by': {'type': 'json_parse_exception', 'reason': "Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@1ee844c6; line: 1, column: 257]"}}, 'data': {'domain': 'sensor', 'object_id': 'toon_p1_power_solar', 'entity_id': 'sensor.toon_p1_power_solar', 'attributes': {'unit_of_measurement': 'Watt', 'friendly_name': 'Toon P1 Power Solar', 'icon': 'mdi:weather-sunny'}, '@timestamp': datetime.datetime(2019, 3, 4, 13, 44, 12, 732634), 'value': nan}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'wrW7SGkBaQlUzTS0rK9f', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': 'failed to parse', 'caused_by': {'type': 'json_parse_exception', 'reason': "Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@24ea5bba; line: 1, column: 245]"}}, 'data': {'domain': 'sensor', 'object_id': 'toon_power_use_cnt', 'entity_id': 'sensor.toon_power_use_cnt', 'attributes': {'unit_of_measurement': 'kWh', 'friendly_name': 'Toon Power Use Cnt', 'icon': 'mdi:flash'}, '@timestamp': datetime.datetime(2019, 3, 4, 13, 44, 12, 732634), 'value': nan}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'xLW7SGkBaQlUzTS0rK9f', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': 'failed to parse', 'caused_by': {'type': 'json_parse_exception', 'reason': "Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at [Source: 
org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@292a954b; line: 1, column: 268]"}}, 'data': {'domain': 'sensor', 'object_id': 'toon_p1_power_solar_cnt', 'entity_id': 'sensor.toon_p1_power_solar_cnt', 'attributes': {'unit_of_measurement': 'kWh', 'friendly_name': 'Toon P1 Power Solar Cnt', 'icon': 'mdi:weather-sunny'}, '@timestamp': datetime.datetime(2019, 3, 4, 13, 44, 12, 732634), 'value': nan}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'ybW7SGkBaQlUzTS0rK9f', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': 'failed to parse', 'caused_by': {'type': 'json_parse_exception', 'reason': "Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@8ae6212; line: 1, column: 223]"}}, 'data': {'domain': 'sensor', 'object_id': 'toon_p1_heat', 'entity_id': 'sensor.toon_p1_heat', 'attributes': {'unit_of_measurement': '', 'friendly_name': 'Toon P1 Heat', 'icon': 'mdi:fire'}, '@timestamp': datetime.datetime(2019, 3, 4, 13, 44, 12, 732634), 'value': nan}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'yrW7SGkBaQlUzTS0rK9f', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': 'failed to parse', 'caused_by': {'type': 'json_parse_exception', 'reason': "Non-standard token 'NaN': enable JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS to allow\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@54bcc9b5; line: 1, column: 234]"}}, 'data': {'domain': 'sensor', 'object_id': 'toon_power_use', 'entity_id': 'sensor.toon_power_use', 'attributes': {'unit_of_measurement': 'Watt', 'friendly_name': 'Toon Power Use', 'icon': 'mdi:flash'}, '@timestamp': datetime.datetime(2019, 3, 4, 13, 44, 12, 732634), 'value': nan}}}])

Do you have any ideas?
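One way to avoid this class of failure is to sanitize values before serialization, replacing non-finite floats with None so they serialize as JSON null instead of the non-standard token 'NaN'. A hypothetical helper, not the component's actual code:

```python
import math

def sanitize_value(value):
    """Map NaN/inf floats to None before indexing.

    Elasticsearch's JSON parser rejects the non-standard token 'NaN',
    so emitting null (None) keeps the bulk request well-formed.
    """
    if isinstance(value, float) and not math.isfinite(value):
        return None
    return value
```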

Insufficient privileges provided with the README example

Environment
Home-Assistant version: 0.93.2
Elasticsearch version: 7.1.0

Relevant configuration.yml settings:

elastic:
  url: https://d4eec97ef7554e85987327576a2af170.ece.REDACTED.ORG:9243
  ssl_ca_path: /path/to/letsencryptCA.pem
  username: !secret elastic_user
  password: !secret elastic_pass
  publish_frequency: 60
  request_rollover_frequency: 600
  rollover_max_size: 3gb

Describe the bug
The role suggested in the README does not grant all of the required privileges.

Error creating initial index/alias: AuthorizationException(403, 'security_exception', 'action [indices:admin/aliases] is unauthorized for user [hass_writer]')
…
Error performing rollover: AuthorizationException(403, 'security_exception', 'action [indices:admin/rollover] is unauthorized for user [hass_writer]')

And because of these, it can't write to the index:

Error publishing documents to Elasticsearch: ('500 document(s) failed to index.'

To Reproduce
Steps to reproduce the behavior:

  1. Create a hass_writer user with the role as defined in the README

Expected behavior
The README instructions should include all necessary roles and permissions needed for a user to successfully write to Elasticsearch the first try.
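Based on the 403 errors above, the role also needs alias and rollover management on the relevant indices. One possible shape for the role's index privileges (index name patterns inferred from the logs; verify against your cluster) is:

```json
{
  "indices": [
    {
      "names": ["hass-events*", "active-hass-index*", "all-hass-events"],
      "privileges": ["manage", "index", "create_index"]
    }
  ]
}
```

The "manage" privilege covers administrative index actions such as indices:admin/aliases and indices:admin/rollover, which the errors show were missing.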

Error loading elasticsearch

HA Version: 0.77
ELK version: 6.3.2

I just added this to my Home Assistant. I copied the files as described and pointed my configuration.yaml at my server running the ELK stack. After restarting HA I get the following error:

2018-09-01 08:11:53 ERROR (MainThread) [homeassistant.setup] Error during setup of component elastic
Traceback (most recent call last):
  File "/usr/src/app/homeassistant/setup.py", line 145, in _async_setup_component
    hass, processed_config)
  File "/usr/local/lib/python3.6/asyncio/coroutines.py", line 212, in coro
    res = func(*args, **kw)
  File "/config/custom_components/elastic.py", line 62, in async_setup
    gateway = ElasticsearchGateway(hass, conf)
  File "/config/custom_components/elastic.py", line 126, in __init__
    self.client = self._create_es_client()
  File "/config/custom_components/elastic.py", line 134, in _create_es_client
    import elasticsearch
ModuleNotFoundError: No module named 'elasticsearch'

My config in HA is as follows:
elastic:
  # URL should point to your Elasticsearch cluster
  url: http://192.168.1.10:9200

And if I browse to the URL I get the default Elasticsearch response:

{
  "name" : "oTddzKZ",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "mmt9dUJxQFunPQUve-kdcg",
  "version" : {
    "number" : "6.3.2",
    "build_flavor" : "default",
    "build_type" : "deb",
    "build_hash" : "053779d",
    "build_date" : "2018-07-20T05:20:23.451332Z",
    "build_snapshot" : false,
    "lucene_version" : "7.3.1",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}

Any ideas as to what is wrong here? I guess it tries to import a Python library named elasticsearch on my HA machine, but that machine has no ELK installation.

Rollover defaults are too aggressive

The current defaults for index rollovers are too aggressive, which results in indices that are too small. The default rollover_max_size should be set to 30gb, and both rollover_max_age and rollover_max_docs should be unset by default.
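In configuration terms, the proposed defaults would amount to:

```yaml
elastic:
  rollover_max_size: 30gb
  # rollover_max_age: unset by default
  # rollover_max_docs: unset by default
```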

Document Security role requirements

Is your feature request related to a problem? Please describe.
Thanks for adding auth support to this. It's not clear what (X-Pack) Security role requirements are needed when interacting with a secured cluster.

Describe the solution you'd like
Some extra docs would be great; an example of a custom role, or a note on which built-in roles could be granted, would be even better.

Rename time field to @timestamp?

Is your feature request related to a problem? Please describe.
Currently the default time field is just called time. Logstash and Beats use @timestamp by default, and while there is no hard requirement for this name, it is the de facto standard name for the default time field.

I want to start a discussion and hear others' opinions on this.

Error publishing documents to Elasticsearch

Environment
Home-Assistant version: 0.92.4
Elasticsearch version: 6.7

Relevant configuration.yml settings:

# Do not include your Elasticsearch URL, credentials, or any other sensitive information
elastic:
  url: http://localhost:9200
  ...

Describe the bug

I'm getting the following in my logs (truncated):

2019-04-19 07:49:56 INFO (MainThread) [elasticsearch] POST http://localhost:9200/_bulk [status:200 request:0.127s]
2019-04-19 07:49:56 ERROR (MainThread) [custom_components.elastic] Error publishing documents to Elasticsearch: ('2 document(s) failed to index.', [{'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'lQ9uNWoBlgpy3TEwgkab', 'status': 400, 'error': {'type': 'illegal_argument_exception', 'reason': 'mapper [hass.attributes.py_version] of different type, current_type [text], merged_type [long]'}, 'data': {'hass.domain': 'alarm_control_panel', 'hass.object_id': 'house', 'hass.entity_id': 'alarm_control_panel.house', 'hass.attributes': {'code_format': '.+', 'changed_by': '', 'immediate': set(), 'delayed': set(), 'ignored': {'binary_sensor.home_sensor', 'binary_sensor.away_delayed_sensor', 'binary_sensor.home_delayed_sensor', 'binary_sensor.perimeter_delayed_sensor', 'binary_sensor.perimeter_sensor', 'switch.skylight', 'binary_sensor.away_sensor'}, 'allsensors': {'binary_sensor.home_sensor', 'binary_sensor.away_delayed_sensor', 'binary_sensor.home_delayed_sensor', 'binary_sensor.perimeter_delayed_sensor', 'binary_sensor.perimeter_sensor', 'switch.skylight', 'binary_sensor.away_sensor'}, 'code_to_arm': False, 'panel_locked': False, 'passcode_attempts': 2, 'passcode_attempts_timeout': 10, 'changedbyuser': None, 'panic_mode': 'deactivated', 'arm_state': 'disarmed', 'enable_perimeter_mode': True, 'enable_persistence': False, 'enable_log': True, 'log_size': 10, 'supported_statuses_on': ['on', 'true', 'unlocked', 'open', 'detected', 'motion', 'motion_detected', 'motion detected'], 'supported_statuses_off': ['off', 'false', 'locked', 'closed', 'undetected', 'no_motion', 'standby'], 'updateUI': False, 'admin_password': 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxx', 'bwalarm_version': '1.1.3', 'py_version': sys.version_info(major=3, minor=6, micro=7, releaselevel='final', serial=0), 'users': [OrderedDict([('id', 'xxxxxxxx'), ('name', 'test'), ('enabled', False), ('code', '****'), ('picture', '/local/images/ha.png'), ('disable_animations', False)]), OrderedDict([('id', 
'xxxxxxxx'), ('name', 'bart'), ('enabled', True), ('code', '****'), ('picture', '/local/images/hal.png'), ('disable_animations', False)]), OrderedDict([('id', 'xxxxxxxxxxxx'), ('name', 'Legacy API password user'), ('enabled', False), ('code', '****'), ('picture', '/local/images/ha.png'), ('disable_animations', False)]), OrderedDict([('id', 'xxxxxxxxxxx'), ('name', 'Legacy API password user'), ('enabled', False), ('code', '****'), ('picture', '/local/images/ha.png'), ('disable_animations', False)])], 'panel': OrderedDict([('camera_update_interval', ''), ('cameras', []), ('enable_camera_panel', 'False'), ('enable_clock', 'True'), ('enable_clock_12hr', 'True'), ('enable_custom_panel', 'False'), ('enable_floorplan_panel', 'False'), ('enable_sensors_panel', 'True'), ('enable_serif_font', 'True'), ('enable_weather', 'True'), ('hide_passcode', 'True'), ('panel_title', 'Surname Residence'), ('shadow_effect', 'True'), ('enable_fahrenheit', 'False')]), 'themes': [OrderedDict([('name', 'aaa'), ('warning_color', '#995BFF'), ('pending_color', '#FF2943'), ('disarmed_color', '#FF22E6'), ('triggered_color', '#FF0000'), ('armed_home_color', '#C1B1FF'), ('armed_away_color', '#FF8686'), ('armed_perimeter_color', '#DAFF9E'), ('active', 'False'), ('action_button_border_color', '#3ED5FF')])], 'logs': [], 'mqtt': {'enable_mqtt': False, 'payload_arm_night': 'ARM_NIGHT', 'state_topic': 'home/alarm', 'payload_disarm': 'DISARM', 'payload_arm_home': 'ARM_HOME', 'qos': 0, 'override_code': False, 'command_topic': 'home/alarm/set', 'pending_on_warning': False, 'payload_arm_away': 'ARM_AWAY'}, 'states': OrderedDict([('armed_away', OrderedDict([('immediate', ['switch.skylight', 'binary_sensor.away_sensor']), ('delayed', ['binary_sensor.away_delayed_sensor']), ('override', []), ('pending_time', 2), ('warning_time', 2), ('trigger_time', 5)])), ('armed_home', OrderedDict([('immediate', ['binary_sensor.home_sensor']), ('delayed', ['binary_sensor.home_delayed_sensor']), ('override', []), 
('pending_time', 2), ('warning_time', 2), ('trigger_time', 5)])), ('armed_perimeter', OrderedDict([('immediate', ['binary_sensor.perimeter_sensor']), ('delayed', ['binary_sensor.perimeter_delayed_sensor']), ('override', []), ('pending_time', 0), ('warning_time', 2), ('trigger_time', 600)]))]), 'friendly_name': 'House'}, 'hass.value': 'disarmed', '@timestamp': datetime.datetime(2019, 4, 19, 7, 49, 55, 950180), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.91.4', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 34.xxxxx, 'lon': -84.xxxxxx}, 'host.architecture': 'x86_64', 'host.os.name': 'Linux', 'host.hostname': 'ubuntu', 'tags': ['hass']}}}, {'index': {'_index': 'hass-events-v2-000001', '_type': 'doc', '_id': 'pQ9uNWoBlgpy3TEwgkab', 'status': 400, 'error': {'type': 'mapper_parsing_exception', 'reason': "failed to parse field [hass.attributes.media_content_id] of type [long] in document with id 'pQ9uNWoBlgpy3TEwgkab'", 'caused_by': {'type': 'illegal_argument_exception', 'reason': 'For input string: "spotify:track:xxxxxxx"'}}, 'data': {'hass.domain': 'media_player', 'hass.object_id': 'spotify', 'hass.entity_id': 'media_player.spotify', 'hass.attributes': {'media_content_id': 'spotify:track:xxxxxxxxxxx', 'media_content_type': 'music', 'media_title': 'I Was Jack (You Were Diane)', 'media_artist': 'Jake Owen', 'media_album_name': 'Greetings From...Jake', 'source': 'Office', 'source_list': ['JOHNDYE-M-D3KG', '50" TCL Roku TV', 'Roku TV'], 'shuffle': False, 'friendly_name': 'Spotify', 'icon': 'mdi:spotify', 'entity_picture': '/api/media_player_proxy/media_player.spotify?token=xxxxxxx121ee&cache=xxxxx', 'supported_features': 51765}, 'hass.value': 'paused', '@timestamp': datetime.datetime(2019, 4, 19, 7, 49, 55, 950180), 'agent.name': 'My Home Assistant', 'agent.type': 'hass', 'agent.version': '0.91.4', 'ecs.version': '1.0.0', 'host.geo.location': {'lat': 34.xxxxxx, 'lon': -84.xxxx}, 'host.architecture': 'x86_64', 'host.os.name': 'Linux', 
'host.hostname': 'ubuntu', 'tags': ['hass']}}}])
Traceback (most recent call last):
  File "/home/homeassistant/.homeassistant/custom_components/elastic/__init__.py", line 324, in do_publish
    bulk_response = bulk(self._gateway.get_client(), actions)
  File "/srv/homeassistant/lib/python3.6/site-packages/elasticsearch/helpers/__init__.py", line 257, in bulk
    for ok, item in streaming_bulk(client, actions, *args, **kwargs):
  File "/srv/homeassistant/lib/python3.6/site-packages/elasticsearch/helpers/__init__.py", line 192, in streaming_bulk
    raise_on_error, *args, **kwargs)
  File "/srv/homeassistant/lib/python3.6/site-packages/elasticsearch/helpers/__init__.py", line 137, in _process_bulk_chunk
    raise BulkIndexError('%i document(s) failed to index.' % len(errors), errors)

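The failures above are dynamic-mapping conflicts: the first event containing a given attribute fixes the field's type in Elasticsearch, and later events whose Python value maps to a different type are rejected (here `hass.attributes.py_version` and `hass.attributes.media_content_id` colliding with previously inferred types). One way to avoid this — an illustrative sketch, not the component's actual code — is to coerce non-primitive attribute values to stable JSON types before indexing:

```python
import datetime

# Illustrative helper (not part of the component): coerce attribute
# values to JSON-safe, consistently typed values before indexing so
# Elasticsearch dynamic mapping cannot infer conflicting field types.
def sanitize_attributes(attributes):
    sanitized = {}
    for key, value in attributes.items():
        if isinstance(value, (str, int, float, bool)) or value is None:
            sanitized[key] = value
        elif isinstance(value, dict):
            sanitized[key] = sanitize_attributes(value)
        elif isinstance(value, (list, tuple, set)):
            # Sets are not JSON-serializable; stringify container items.
            sanitized[key] = [str(item) for item in value]
        elif isinstance(value, datetime.datetime):
            sanitized[key] = value.isoformat()
        else:
            # Fall back to a string representation for anything else.
            sanitized[key] = str(value)
    return sanitized
```

With a pass like this, a `sys.version_info` tuple or a set of entity IDs always arrives as strings, so the mapping stays consistent across events.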
Add queries as sensors

Is your feature request related to a problem? Please describe.
I'd like to be able to run queries and return the results to Home Assistant. This would be useful for geofencing, for example: I send all my data to Elastic, but not all of it flows through Home Assistant.

Describe the solution you'd like
A way to expose Elasticsearch queries as sensors in Home Assistant.
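A sketch of the idea — reduce an Elasticsearch query response to a single sensor state. The helper name and the dotted-path convention are hypothetical, not part of the component:

```python
# Illustrative sketch of the requested feature: pick one value out of an
# Elasticsearch response dict to use as a sensor state.
def extract_sensor_state(response, value_path="hits.total"):
    """Walk a dotted path through an Elasticsearch response dict."""
    value = response
    for key in value_path.split("."):
        value = value[key]
    return value

# Example: a geofencing-style query result reduced to a count.
response = {
    "hits": {"total": 3},
    "aggregations": {"devices_home": {"value": 2}},
}
```

A sensor configured with the path `aggregations.devices_home.value` would then report `2` as its state.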


Add config for verify_certs and ca_certs

Is your feature request related to a problem? Please describe.
When using a self-signed cert, the connection fails with:

2018-09-30 12:34:20 INFO (MainThread) [homeassistant.setup] Setting up elastic
2018-09-30 12:34:20 WARNING (MainThread) [elasticsearch] HEAD https://xxx.ece.home.lan:9243/_template/hass-index-template [status:N/A request:0.076s]
Traceback (most recent call last):
  File "/home/homeassistant/homeassistant/lib/python3.5/site-packages/urllib3/connectionpool.py", line 601, in urlopen
    chunked=chunked)
  File "/home/homeassistant/homeassistant/lib/python3.5/site-packages/urllib3/connectionpool.py", line 346, in _make_request
    self._validate_conn(conn)
  File "/home/homeassistant/homeassistant/lib/python3.5/site-packages/urllib3/connectionpool.py", line 850, in _validate_conn
    conn.connect()
  File "/home/homeassistant/homeassistant/lib/python3.5/site-packages/urllib3/connection.py", line 326, in connect
    ssl_context=context)
  File "/home/homeassistant/homeassistant/lib/python3.5/site-packages/urllib3/util/ssl_.py", line 329, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
  File "/usr/lib/python3.5/ssl.py", line 385, in wrap_socket
    _context=self)
  File "/usr/lib/python3.5/ssl.py", line 760, in __init__
    self.do_handshake()
  File "/usr/lib/python3.5/ssl.py", line 996, in do_handshake
    self._sslobj.do_handshake()
  File "/usr/lib/python3.5/ssl.py", line 641, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:720)

The CA is installed and trusted on the OS level:

$ openssl s_client -connect xxx.ece.home.lan:9243
...
    Verify return code: 0 (ok)
...
---

So it looks like Python is not using the OS-level CA trust store.

Describe the solution you'd like
elasticsearch-py supports verify_certs and ca_certs; it would be nice if they could be exposed in homeassistant-elasticsearch via config options.
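`verify_certs` and `ca_certs` are real elasticsearch-py client parameters; a sketch of how the integration might pass config options through to them (the option names and helper function are illustrative, not the component's schema):

```python
# Sketch only: map hypothetical integration options onto the
# elasticsearch-py client keyword arguments of the same names.
def build_client_kwargs(config):
    kwargs = {"hosts": [config["url"]]}
    if "verify_certs" in config:
        kwargs["verify_certs"] = config["verify_certs"]
    if "ca_certs" in config:
        # Path to a CA bundle, e.g. /etc/ssl/certs/my-ca.pem
        kwargs["ca_certs"] = config["ca_certs"]
    return kwargs

# These kwargs would then be passed as Elasticsearch(**kwargs).
```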

How to Get Events Published?

I installed this as I've been looking for a good solution for getting HASS logs into Elasticsearch.

After installing the component, it created the index and alias, but I'm not getting any events published. No errors are showing up; the only things I see in the HASS logs are messages that the rollover succeeded whenever I restart HASS.

Is there some config that needs to be set up to have the events published? Currently I'm just setting the url and leaving the rest at defaults.
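Based on that description, the setup in question would be the minimal configuration along these lines (only the `url` option set; shown as a sketch of the reporter's setup, not copied from the docs):

```yaml
# configuration.yaml — URL only, everything else left at defaults
elastic:
  url: http://localhost:9200
```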

The folder structure has to be changed to comply with hass v 0.88.0

Environment
Home-Assistant version: 0.88.0
Elasticsearch version: 6.6

Describe the bug
With version 0.88.0 hass.io throws the following two errors:

  • Error loading custom_components.elastic.sensor. Make sure all dependencies are installed
  • Integrations need to be in their own folder. Change sensor/elastic.py to elastic/sensor.py. This will stop working soon.

The plugin still seems to work fine, but according to this article https://developers.home-assistant.io/blog/2019/02/19/the-great-migration.html the folder structure has to be changed.
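The required move can be sketched as follows (demonstrated in a throwaway directory; in practice the paths live under your Home Assistant config directory, e.g. ~/.homeassistant):

```shell
# Demonstrate the pre-0.88 -> post-0.88 custom_components migration
# in a scratch directory.
base=$(mktemp -d)/custom_components
mkdir -p "$base/sensor"
touch "$base/sensor/elastic.py"      # old layout: one folder per platform

mkdir -p "$base/elastic"             # new layout: one folder per integration
mv "$base/sensor/elastic.py" "$base/elastic/sensor.py"

ls "$base/elastic"                   # prints: sensor.py
```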

Tune rollover settings

The current rollover settings are overly aggressive; the index should not roll over until it reaches a sufficient size.
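For reference, Elasticsearch's rollover API accepts conditions in the request body; thresholds like the following (values illustrative, not the component's defaults) would hold rollover until the index is meaningfully large or old:

```json
{
  "conditions": {
    "max_age": "30d",
    "max_size": "30gb"
  }
}
```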

Deprecate cloud_id configuration option

Trying to pare down the config options a bit in order to support HASS config flows (UI-driven config). The cloud_id connection option is something we don't really need, so I think it can/should be removed.
