
elkstack's Introduction

elkstack

Elasticsearch, Logstash, and Kibana stack. Based on community recommendations, we are not using the embedded elasticsearch functionality of logstash at this point. This cookbook provides recipes for all three components, along with wrapper recipes such as `single` and `cluster` to facilitate different use cases.

This stack is designed for one or many standalone nodes, each running the full stack of elasticsearch, logstash, and kibana. The only difference between one node and many is that elasticsearch is clustered together. Data dispatched to Logstash on a particular node will use the local elasticsearch transport interface to index those logs on the node (and thus, the cluster). HTTP traffic dispatched to Kibana on port 80 on any node will also use the local elasticsearch HTTP interface to fetch and manipulate data.

Please read the individual recipe summaries to understand what each recipe does, as well as what each wrapper recipe is actually wrapping. As much as possible, upstream attributes have been exposed/overridden for our needs.

Things you should know

  • This cookbook requires Java. Because not everyone has the same desires for Java versions, concurrently installed versions, or particular vendor versions, this cookbook simply assumes you have already satisfied this requirement. It does ship with default attributes that make the community cookbook use Java 7 instead of its default of Java 6.

  • You must update your Berksfile to use this cookbook. Because upstream changes are constantly occurring, you should consult the Berksfile in this cookbook and use its sources for the kibana, logstash, and elasticsearch cookbooks. Eventually, as PRs get merged, this may no longer be a hard requirement. The biggest caveat is that the kibana cookbook on Supermarket is currently a different cookbook entirely.

  • You should probably disable the nginx virtualhost that comes with the kibana cookbook and create your own configuration, securing it as appropriate for your own requirements. See the kibana_web LWRP documentation for more on what attributes should be set to accomplish this.

  • If you'd like to disable backups using Cloud Files, set node['elkstack']['config']['backups']['enabled'] = false (it defaults to true). If you'd like to override the backup schedule or behavior for ES, simply disable the backup crontab entry by setting node['elkstack']['config']['backups']['cron'] = false. This cookbook will still configure everything except the cron job, and you may then create another one with your own schedule using the cron_d LWRP (see the attribute sketch after this list).

  • Please note that this cookbook does not restart elasticsearch automatically, in order to avoid causing an outage of the cluster. It does restart nginx and logstash, however. You will have to restart elasticsearch after the initial bootstrap. You may also need to bounce logstash if it seems confused about losing its connection to elasticsearch (unusual, but it happens).

  • You may want to consider adjusting node['elasticsearch']['discovery']['search_query'] if you are sharing one cluster among multiple environments. Put a Chef search query in that attribute and it will be used instead of the default search, which is scoped to the chef environment.

  • You may want to consider adjusting node['elasticsearch']['allocated_memory'] if you are seeing an initial convergence failure (see #50). The chef client has been known to use 500mb or more during initial convergence. Combined with an initial allocation of 40% of memory for ES and 20% for logstash, that only leaves about 40% for the OS and chef. On a 2gb server, that ends up being 800mb for ES and about 400mb for logstash, leaving 800mb for the OS and the initial chef client run. After the initial run, the chef-client memory footprint tends to be much, much lower, and ES is able to start.

  • The agent and logstash recipes require a pre-generated SSL key and certificate because the lumberjack protocol requires them. This cookbook will consult node['elkstack']['config']['lumberjack_data_bag'] in order to locate and load a data bag that stores this key. It will first try an encrypted data bag, and if that doesn't work, will try an unencrypted data bag of the same name. If no data bag is found, it will generate one and save it as an encrypted data bag. This means you must already have a 'secret file' on the node for an encryption key, as this is required to use any encrypted data bags. To generate a key of your own, use something like:

openssl req -x509 -newkey rsa:2048 -keyout lumberjack.key -out lumberjack.crt -nodes -days 1000

This key and certificate data should be placed in a data bag named by node['elkstack']['config']['lumberjack_data_bag'], under key and certificate keys, base64-encoded into single-line strings. You may also supply these secrets some other way and populate the appropriate node.run_state values (see _secrets.rb for more details). Note that this is not a PKI trust model, but an explicit trust model. You may also set the data bag attribute to false to disable lumberjack entirely.

A make-lumberjack-key.sh script is included to help you generate these. For Go 1.3+, the standard libraries may require you to create a SAN certificate, as described here.
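
If you are setting several of these attributes from a wrapper cookbook, the sketch below shows what an attributes file might look like. This is only an illustration: the attribute names are the ones documented in this README, but the specific values (heap size, search query, data bag name) are assumptions you should replace with your own.

# attributes/default.rb of a hypothetical wrapper cookbook

# Disable Cloud Files backups, or keep them and manage the cron entry yourself
override['elkstack']['config']['backups']['enabled'] = false
override['elkstack']['config']['backups']['cron'] = false

# Supply your own discovery search (illustrative query; the default is scoped
# to the current chef environment)
override['elasticsearch']['discovery']['search_query'] = 'tags:elkstack AND chef_environment:*'

# Shrink the ES heap if the initial converge runs out of memory (see #50)
override['elasticsearch']['allocated_memory'] = '512m'

# Use a differently named lumberjack data bag, or set this to false to
# disable lumberjack entirely
override['elkstack']['config']['lumberjack_data_bag'] = 'my_lumberjack_bag'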

See CHANGELOG.md for additional information about changes to this stack over time.

Supported Platforms

  • Ubuntu 12.04
  • Ubuntu 14.04
  • CentOS 6.5

Attributes

| Key | Type | Description | Default |
|-----|------|-------------|---------|
| `['elkstack']['config']['logstash']['instance_name']` | String | Default logstash instance name | `server` |
| `['elasticsearch']['discovery']['search_query']` | String | A query to search for and connect Elasticsearch cluster nodes | see `attributes/elasticsearch.rb` |
| `['logstash_forwarder']['config']['files']` | Hash | See the customizing the stack section below | Most logs in `/var/log` |
| `['elkstack']['config']['data_disk']['disk_config_type']` | Boolean or String | See the customizing the stack section below | `false` |
| `['elkstack']['config']['agent']['enabled']` | Boolean | Enable/disable agent functionality | `true` |
| `['elkstack']['config']['cloud_monitoring']['enabled']` | Boolean | Enable/disable cloud_monitoring functionality | `true` |
| `['elkstack']['config']['iptables']['enabled']` | Boolean | Enable/disable iptables functionality | `true` |
| `['elkstack']['config']['site_name']` | String | Name of the self-signed SSL key and certificate in `/etc/nginx/ssl` | `kibana` |
| `['elkstack']['config']['kibana']['prepare_ssl']` | Boolean | Enable/disable automatic creation of an SSL certificate, private key, and htpasswd file for Kibana's nginx reverse proxy. If disabled, you are responsible for placing these items in the correct location or supplying your own nginx vhost configuration for Kibana. See the `kibana_ssl` recipe for details. | `true` |
| `['elkstack']['config']['kibana']['redirect']` | Boolean | Enable/disable the nginx redirect for Kibana from port 80 to port 443 | `true` |
| `node.run_state['elkstack_kibana_username']` and `['elkstack']['config']['kibana']['username']` | String | Username for basic auth for Kibana; the `run_state` value is used first | `kibana` |
| `node.run_state['elkstack_kibana_password']` | String | Password for basic auth for Kibana | random, from `Opscode::OpenSSL::Password` |
| `['elkstack']['config']['lumberjack_data_bag']` | String | Data bag name for the lumberjack key and certificate | `lumberjack` |
| `['elkstack']['config']['custom_logstash']['name']` | Array of strings | See `attributes/logstash.rb` for how to use this attribute to populate additional logstash configuration file templates | `[]` |
| `['elkstack']['config']['restart_logstash_service']` | Boolean | Restart logstash when a custom config file is deployed | `true` |

Customizing the stack

To override local storage for elasticsearch nodes (the stack will format and mount, as well as configure elasticsearch), set ['elkstack']['config']['data_disk']['disk_config_type'] to custom and provide each storage device and mount point in the following way:

disk_config = {
  'file_system' => 'ext4',
  'mount_options' => 'rw,user',
  'mount_path' => '/usr/local/var/data/elasticsearch/disk1',
  'format_command' => 'mkfs -t ext4 ',
  'fs_check_command' => 'dumpe2fs'
}

node.override['elasticsearch']['data']['devices']['/dev/xvde1'] = disk_config
node.override['elasticsearch']['path']['data'] = disk_config['mount_path']

To add additional logstash configuration to this stack, simply add additional templates in your wrapper cookbook. They should be placed in "#{@basedir}/#{@instance}/etc/conf.d" (see the config provider in the logstash cookbook). If you choose to use logstash-forwarder instead of the regular agent, please see the hash structure in attributes/forwarder.rb for adding additional files for the forwarder to watch and forward, node['logstash_forwarder']['config']['files'].
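As a rough sketch (not something this cookbook ships), a wrapper recipe could render an extra configuration file into that directory as shown below. The template name is a placeholder you would provide in your wrapper cookbook, and the path assumes the default logstash basedir and instance name; alternatively, the ['elkstack']['config']['custom_logstash']['name'] attribute (see the table above and attributes/logstash.rb) can drive the same result from attribute data.

# Hypothetical wrapper recipe snippet; assumes the default basedir and instance name
basedir  = node['logstash']['instance_default']['basedir']
instance = node['elkstack']['config']['logstash']['instance_name']

template "#{basedir}/#{instance}/etc/conf.d/25-my-filters.conf" do
  source 'my-filters.conf.erb' # placeholder template shipped by your wrapper cookbook
  owner  node['logstash']['instance_default']['user']
  group  node['logstash']['instance_default']['group']
  mode   '0640'
end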

To override the nginx configuration, simply supply a new template and specify your cookbook using ['kibana']['nginx']['template_cookbook'] and ['kibana']['nginx']['template']. You can also override just the password for the reverse proxy using node.run_state['elkstack_kibana_password'].
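A minimal sketch of those overrides from a wrapper recipe might look like this; the cookbook and template names are placeholders for ones you supply yourself:

# Hypothetical wrapper recipe, run before including the elkstack recipes
node.override['kibana']['nginx']['template_cookbook'] = 'my_wrapper'      # your wrapper cookbook
node.override['kibana']['nginx']['template']          = 'kibana-site.erb' # your own nginx vhost template
node.run_state['elkstack_kibana_password']            = 'choose-a-strong-password'

include_recipe 'elkstack::default'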

To override anything else, set the appropriate node hash (logstash, kibana, or elasticsearch).

Usage

elkstack::default

A simple wrapper recipe that sets up Elasticsearch, Logstash, and Kibana. Also configures an rsyslog sink into logstash on the local box. Everything except Logstash and Kibana is locked down to listen only on localhost.
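
A minimal wrapper recipe around this might look like the sketch below, assuming (per the note above) that the java recipe or the community java cookbook satisfies the Java requirement:

# Hypothetical wrapper recipe: satisfy the Java requirement, then build the full stack
include_recipe 'java'
include_recipe 'elkstack::default'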

elkstack::agent

A simple wrapper recipe that sets up a logstash agent on the local box. It also configures an rsyslog sink into logstash on the local box. node['elkstack']['config']['agent']['enabled'] must be set to true to use this recipe (it defaults to true).

elkstack::forwarder

A Go-based alternative to the normal agent, configured simply to watch logs and forward them directly on to the cluster. The project is under heavy development and does not publish releases very often, so the packaged versions may be quite old or buggy. When this recipe was added, the package was almost a year behind current development, simply because there had been no releases in that time.

elkstack::elasticsearch

Leans on the upstream elasticsearch/cookbook-elasticsearch cookbook for much of its work. We do override the default set of plugins to be installed, as well as the amount of JVM heap. See attributes/default.rb for those settings.

This recipe also tags the node so that other nodes that run this recipe can discover it, and configure Elasticsearch appropriately to join their cluster. It uses a tag, the current chef environment, and the cluster name as the default search criteria.

Most of this is configurable using the upstream Elasticsearch cookbook's attributes, including the chef search itself, although there is no easy toggle to turn off the search. This recipe also enables iptables rules if node['elkstack']['config']['iptables']['enabled'] is not nil.

elkstack::logstash

Leans on the upstream lusis/chef-logstash cookbook for much of its work. We do override the default set of plugins to be installed, as well as the amount of JVM heap. See attributes/default.rb for those settings.

elkstack::kibana

Leans on the upstream lusis/chef-kibana cookbook for most of its work. Sets up an nginx site for kibana by default. By default, it does not pass most HTTP paths directly through to elasticsearch; only a whitelisted set is proxied.

elkstack::newrelic

Checks whether a New Relic license is set and, if so, whether the node is tagged 'elkstack', and creates a file with elasticsearch details. Installs the python, pip, and setuptools packages in order to support newrelic_meetme_plugin.

elkstack::acl

Adds cluster node basic iptables rules and cluster iptables rules if appropriate attributes are set.

elkstack::agent_acl

Adds agent node basic iptables rules.

elkstack::disk_setup

Looks for node['elkstack']['config']['data_disk']['disk_config_type'] to be truthy, and configures the upstream elasticsearch cookbook to format, mount, and use devices appropriately.

elkstack::*_monitoring

These correspond with the recipes above, and just provide a way to pull out the monitoring work to make the original recipes cleaner.

Miscellaneous

The wrapper recipes are single and cluster. These change attributes and then invoke elasticsearch, logstash, kibana, and rsyslog. Finally, there are utility recipes like java and newrelic (not invoked otherwise), as well as acl, which is called by _base if node['elkstack']['config']['iptables']['enabled'] is set.

Contributing

See CONTRIBUTING.

Authors

Author:: Rackspace ([email protected])

License

# Copyright 2014, Rackspace Hosting
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

elkstack's People

Contributors

bobross419, chewi, edhurtig, fernandohonig, gondoi, jarosser06, jimmycuadra, joerg, jujugrrr, lmunro, marcoamorales, martinb3, mattjbarlow, nicka, patcon, phoolish, prometheanfire, schwing, stephan202, theborch, zdeptawa


elkstack's Issues

==> default: undefined method `[]' for nil:NilClass (node['logstash']['instance_default']['basedir'])

Wrapping the cookbook as a library (using it as a recipe from a wrapper cookbook), I get this error (using Berkshelf and Vagrant; it fails on Vagrant with CentOS 6.5):

==> default:
==> default:
==> default: NoMethodError
==> default: -------------
==> default: undefined method `[]' for nil:NilClass
==> default:
==> default: Cookbook Trace:
==> default: ---------------
==> default: /tmp/vagrant-chef-3/chef-solo-1/cookbooks/elkstack/recipes/logstash.rb:22:in `from_file'
==> default: /tmp/vagrant-chef-3/chef-solo-1/cookbooks/elkstack/recipes/single.rb:14:in `from_file'
==> default: /tmp/vagrant-chef-3/chef-solo-1/cookbooks/walla_dashboards/recipes/default.rb:12:in `from_file'

And it looks like the error originates from this code (inside logstash.rb)
directory node['logstash']['instance_default']['basedir'] do
  owner node['logstash']['instance_default']['user']
  # ...
end

I understand this issue is probably due to my lack of understanding of Chef internals and the proper way to use the recipe, but it would be very helpful to be able to make it work.

The issue can be found here and in many other places around the web:
http://stackoverflow.com/questions/26368264/having-an-error-regarding-a-node-basedir-while-installing-a-elkstack-cookbook

Error when elkstack called from platformstack 1.4.1

@martinb3 Here is the trace you requested:


       Recipe Compile Error in /tmp/kitchen/cache/cookbooks/860257-dealereprocess-app/recipes/default.rb
         ================================================================================


         NoMethodError
         -------------
         undefined method `[]' for nil:NilClass

         Cookbook Trace:
         ---------------
           /tmp/kitchen/cache/cookbooks/elkstack/recipes/_secrets.rb:89:in `from_file'
           /tmp/kitchen/cache/cookbooks/elkstack/recipes/agent.rb:49:in `from_file'
           /tmp/kitchen/cache/cookbooks/platformstack/recipes/logging.rb:21:in `from_file'
           /tmp/kitchen/cache/cookbooks/platformstack/recipes/default.rb:66:in `from_file'
           /tmp/kitchen/cache/cookbooks/rackops_rolebook/recipes/default.rb:19:in `block in from_file'
           /tmp/kitchen/cache/cookbooks/rackops_rolebook/recipes/default.rb:18:in `each'
           /tmp/kitchen/cache/cookbooks/rackops_rolebook/recipes/default.rb:18:in `from_file'
           /tmp/kitchen/cache/cookbooks/860257-dealereprocess-app/recipes/default.rb:10:in `from_file'

         Relevant File Content:
         ----------------------
         /tmp/kitchen/cache/cookbooks/elkstack/recipes/_secrets.rb:

          82:  elsif lumberjack_secrets.nil?
          83:    fail 'Could not find an encrypted or unencrypted data bag to use as a lumberjack keypair, and could not generate a keypair either'
          84:  else
          85:    fail 'Unable to complete lumberjack keypair configuration'
          86:  end
          87:
          88:  # if we had overrode basedir value, we'd need to use the new value here too
          89>> file "#{node['logstash']['instance_default']['basedir']}/lumberjack.key" do
          90:    content node.run_state['lumberjack_decoded_key']
          91:    owner node['logstash']['instance_default']['user']
          92:    group node['logstash']['instance_default']['group']
          93:    mode '0600'
          94:    not_if { node.run_state['lumberjack_decoded_key'].nil? }
          95:  end
          96:
          97:  # if we had overrode basedir value, we'd need to use the new value here too
          98:  file "#{node['logstash']['instance_default']['basedir']}/lumberjack.crt" do



         Running handlers:
       [2014-10-15T16:19:00+00:00] ERROR: Running exception handlers
         Running handlers complete
       [2014-10-15T16:19:00+00:00] ERROR: Exception handlers complete
       [2014-10-15T16:19:00+00:00] FATAL: Stacktrace dumped to /tmp/kitchen/cache/chef-stacktrace.out
         Chef Client failed. 10 resources updated in 290.358062985 seconds
       [2014-10-15T16:19:00+00:00] ERROR: undefined method `[]' for nil:NilClass
       [2014-10-15T16:19:00+00:00] FATAL: Chef::Exceptions::ChildConvergeError: Chef run process exited unsuccessfully (exit code 1)

ELK stack can't read lumberjack unencrypted data bag that it created

Reported by @chri7765. If elkstack has no lumberjack keypair, it will generate one and store it in an unencrypted data bag. On the next agent converge, if no other encrypted data bags exist, it will pick up the key it generated with no problem, but it will fail if any other encrypted data bags exist.

Error:

  Recipe Compile Error in /var/chef/cache/cookbooks/wrapper/recipes/default.rb
  ================================================================================

  Chef::EncryptedDataBagItem::DecryptionFailure
  ---------------------------------------------
  Error decrypting data bag value: 'wrong final block length'. Most likely the provided key is incorrect

  Cookbook Trace:
  ---------------
    /var/chef/cache/cookbooks/elkstack/recipes/_secrets.rb:77:in `from_file'
    /var/chef/cache/cookbooks/elkstack/recipes/agent.rb:49:in `from_file'
    /var/chef/cache/cookbooks/wrapper/recipes/elk_agent.rb:11:in `from_file'
    /var/chef/cache/cookbooks/wrapper/recipes/default.rb:14:in `block in from_file'
    /var/chef/cache/cookbooks/wrapper/recipes/default.rb:10:in `each'
    /var/chef/cache/cookbooks/wrapper/recipes/default.rb:10:in `from_file'

  Relevant File Content:
  ----------------------
  /var/chef/cache/cookbooks/elkstack/recipes/_secrets.rb:

   70:    lumberjack_secrets = Chef::DataBagItem.new
   71:    lumberjack_secrets.data_bag(lumberjack_data_bag)
   72:    lumberjack_secrets.raw_data = secrets
   73:    lumberjack_secrets.save
   74:  end
   75:  
   76:  # now try to use the data bag
   77>> if !lumberjack_secrets.nil? && lumberjack_secrets['key'] && lumberjack_secrets['certificate']
   78:    node.run_state['lumberjack_decoded_key'] = Base64.decode64(lumberjack_secrets['key'])
   79:    node.run_state['lumberjack_decoded_certificate'] = Base64.decode64(lumberjack_secrets['certificate'])
   80:  elsif !lumberjack_secrets.nil?
   81:    fail 'Found a data bag for lumberjack secrets, but it was missing \'key\' and \'certificate\' data bag items'
   82:  elsif lumberjack_secrets.nil?
   83:    fail 'Could not find an encrypted or unencrypted data bag to use as a lumberjack keypair, and could not generate a keypair either'
   84:  else
   85:    fail 'Unable to complete lumberjack keypair configuration'
   86:  end


  Running handlers:
[2014-11-13T08:35:59-07:00] ERROR: Running exception handlers
  Running handlers complete
[2014-11-13T08:35:59-07:00] ERROR: Exception handlers complete
[2014-11-13T08:35:59-07:00] FATAL: Stacktrace dumped to /var/chef/cache/chef-stacktrace.out
  Chef Client failed. 1 resources updated in 16.827280415 seconds
[2014-11-13T08:36:00-07:00] ERROR: Error decrypting data bag value: 'wrong final block length'. Most likely the provided key is incorrect
[2014-11-13T08:36:00-07:00] FATAL: Chef::Exceptions::ChildConvergeError: Chef run process exited unsuccessfully (exit code 1)

Remote and Process Monitoring for elk components

Need to add:

  • process monitors for the two java processes and nginx
  • tcp connect check for elasticsearch transport protocol and logstash input plugins (e.g. syslog sink)
  • http checks on elasticsearch http interface and nginx site for kibana

Document Java and Berksfile dependencies

@hhoover Elkstack's dependency on Platformstack requires cookbooks to be included in the Kitchen's Berksfile that are not documented (except through Berks errors), such as rackspace_cloudbackup and rackspace_gluster. These cookbooks also have to be dealt with by overriding attributes, either in roles or in app cookbooks.

Firewall rules for single stack and cluster

Cluster will need search to open :9300 for clustered elasticsearch as well as any sink plugins for logstash. Single instance should just need sinks. Kibana will be covered in another ticket.

Elasticsearch version 1.3 is not compatible with Kibana 4

The default version for Elasticsearch is currently set to 1.3.4, and the default version for Kibana (kibana_lwrp) is 4.0.0-beta3. Kibana 4 does not support Elasticsearch lower than version 1.4. Unless you override the version, you are presented with the following error:

Kibana: This version of Kibana requires Elasticsearch 1.4.0 or higher on all nodes. I found the following incompatible nodes in your cluster: Elasticsearch 1.3.4 @ inet/xx.xx.xx.xx:9300

The default needs to be updated:
https://github.com/rackspace-cookbooks/elkstack/blob/master/attributes/elasticsearch.rb#L3

Solution constraints can't be solved due to Kibana

Trying to create a wrapper cookbook for this, and getting the following error upon berks install:

Unable to satisfy constraints on package kibana due to solution constraint (elkstack = 3.2.1). Solution constraints that may result in a constraint on kibana: [(elkstack = 3.2.1) -> (kibana ~> 1.3)], [(odin-elk = 0.1.0) -> (elkstack = {3.2.1,0.1.1,0.3.0}) -> (kibana ~> 1.3)]
Demand that cannot be met: (elkstack = 3.2.1)
Artifacts for which there are conflicting dependencies: kibana = 0.1.6 -> [(build-essential >= 0.0.0)]
Unable to find a solution for demands: elkstack (3.2.1), odin-elk (0.1.0)

For reference, here are my Berksfile and metadata.rb:

source "https://api.berkshelf.com"

cookbook 'elkstack', github: 'rackspace-cookbooks/elkstack'

metadata
name             'odin-elk'
maintainer       'YOUR_NAME'
maintainer_email 'YOUR_EMAIL'
license          'All rights reserved'
description      'Installs/Configures odin-elk'
long_description 'Installs/Configures odin-elk'
version          '0.1.0'

depends 'elkstack'

Everything looks OK to me in your Berksfile and metadata.rb, so I suspect this is a problem with the Kibana cookbook itself. Not sure why the error message cites a conflicting dependency for Kibana 0.1.6, when the cookbook dependency is locked to 1.3.0. Anybody run into this issue, or have any insights?

iptables rules specified in acl recipe aren't applied

Description
When running the elkstack::single recipe against a single CentOS 6.5 server, it deploys everything fine; however, the iptables rules don't seem to be applied.
Looking at the _server recipe, it contains:

iptables_enabled = node.deep_fetch('elkstack', 'config', 'iptables')
if !iptables_enabled.nil? && iptables_enabled
  include_recipe 'elkstack::acl'
end

The default attribute to enable iptables is set to true:

default['elkstack']['iptables']['enabled'] = 'true'

However, after running the single recipe (which includes _server, which in turn includes acl based on the aforementioned condition), I can't see any of the unconditional iptables rules applied (for ports 9900, 5959, 5960, 80, 443).

Reproduction steps

  • deploy elkstack::single (include elkstack::java if required too) to a single node (I used chef-zero; prepped a databag, see below)
  • run iptables -nvL

Expected Results
You should see ports 9900, 5959, 5960, 80 and 443 in the INPUT chain

Actual Results
There are no rules in place

Additional information
You can use this data bag when running chef-zero:
lumberjack/secrets.json:

{
  "id": "secrets",
  "key": "LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tDQpNSUlFdmdJQkFEQU5CZ2txaGtpRzl3MEJBUUVGQUFTQ0JLZ3dnZ1NrQWdFQUFvSUJBUURZdEpCYngyNFJUclVODQpVdktKN015L01aeEVCb2k2azd4K0JLT3pyKzU0V3lOZWRCMnFGM0VCWnMwTHQyVkJCWEJ0UWdxVVVIWHpFRmdTDQo2ZUdwOC9YclpqOG9WNkI0ckpSL0lxbHJtbGdzb2Zvb0N3QVVZTzg1RUhrN1RIOWdXNDN6QmI4Y2pvbW5IR0ZyDQpjQmllRXZBRTZHL2VVaDdIMGNpaWxIblY1bytveEZkTjRFL2N4OWx6L3JLR0M3SnBVZ3BROVlIc1NOVHdJbWNlDQpFcVdLbFczS3hoTWFZUVJWb2FxSFRiMUxNbmxIcVk1MWg4S1g4WmtOV3ZyYjJLRU9FUW9QSUlGdGtoazVTdUJBDQp1ZXByVFh6amJYWitaUVpDTHo2ekNONmxYanBVRTRDY0dCUTZydStsOXZ2RXc4dWdaNUExa1o2ZU9Kdm5GUHR6DQpuaTJKV3lJekFnTUJBQUVDZ2dFQUptUEdkeHZiV2VHUm5XeW1YdHkrWU5pUEVGWC8vdDJSTk5ucGpqbUtpM1BKDQp6ai9QeVlRaGx0ZjVWeXdFR2dLMnFnUmJEMjg1bGZlOVFveUFWN1ZLU1l0eGdOb1ZLWXVaT3ZTUEF1cnkxK1ZTDQpCYW45TjU3OGdpVnk5SmhXc0dGSHdsZXdSWVRTeWZIektDOVJqUjladWVUYmZJMGJ1dFpsTHJnUzVWdlU4MWFSDQpZYVZtVnh4YkY5UEpMc3V3ZXlCd1hwR296eDl5QWp3ZGhkRWJ2OTE1MTB5aFU4eVg5b1doSWxkMjE2WmJ1eXVZDQoxVkxnRVZseGN5bklEa0ZMWDFVcFdmTVZjSFlZbVF2UjNCb1dUMWpnS0YwcWdJV0VCbW13Vm5RUFlWaWhZM3NSDQpkZDBtd0NNZ3VXeWMralc2U281czhadkszU2RnVXpaTVR4a3FoR0psWVFLQmdRRDhkT056LytRTW94Tk5ab0VKDQo5RExhcHpOWWpITDIrRDFRWFQwbU4yc09YZFZQSjMveUNKSFpSSTZCT21GbjhSdGkzRjdUNFFGLzlxa1JGUmltDQpMdE51MnY2aVRyU3JXMkgveGxSTjZDeS85SitTMHNPcmRzUDhWeDYrWm1PTFJHMUt3VU5HTExSTVBzN2NSSUpWDQp5QTYvVW1zNVpyNkpIdUF0WGhxWFhhank2d0tCZ1FEYnZ6ZFY2NDlNMVlmVzJCU2VWek9vM3g1alpJRWdUVG00DQowbXA1OTZHVTBENDhNeUp1ekV5TTgxcWQycy9ZdVdYb01YekhveHFNY003Y0poQnRLa2wwUFNuSFo2UGEvU3JVDQprcWN0eWpEWWJJRUJERUM0UG5BWCtkUGI4c1k3bXVnZWRoZjBuUllCWGlGL3pBeER5UktUYVVraXZMM1NnL2hQDQpmSXNBL1pkcjJRS0JnRlRzaFM4YVgzNll4UDkrZ2QzMVZiNFVETU8yeTEyaytBczJza0ZPMXlhSURoK0liQlBoDQpLaDdxWHYyOXc5S1JXdU1RdXAyUHpVOWNqRmNBdjcvM2RJeVFBcVJhMFkvck43WXc0MThwd1JQNW1FeHR0Z0RSDQovTysvNXNtcDY4YUhpRFJqZXR2NllkUmNOSDRJVmNmQmVxU0ZkeWhpRmFwT3hwNjhUem1uK2hOdEFvR0JBTnN2DQozaWdyc0dJNVV5ZHZuUkZiZGNIcDI4dlVRaUJRSjFVOVBNdXZ2MUpLYk9sckw5dEltTXEzS2hudVdnZVkxaGRHDQoxV25rUE9UODJMa3FscFBzN0J1dnJtNmg2QVRWSmRXbSttNW9FVlN1MWZhUG5EYXF3UENKVjFNNjA1UThyVzlFDQo2QndzVy9pOVJiak5kU1pmOTlGbDRYZHV3QUN0ZGc4QzhUdnB0eEh4QW9HQkFQQnJJQm92eEV3WlZUcVl0Y0hFDQphclBhZXlsSmN5TEkxOUo0cC8vSzU4UFQ2NnNxU0lJWFVva1NDUnF5WFNvV0FsMnNKUUtHUUNhcHBmQnk4ZGZVDQpMeHRTRkdqR04xQ1Z0d2YrV2dGeVI1NU1wK3VRZkZ0TUpnMWxUWE9ZMzdrNm9aZ3pmN0t6b1NacHBxdmhUQkljDQpJdzcvN21rTVQvN1hnelRYbVBuT2Z3UHoNCi0tLS0tRU5EIFBSSVZBVEUgS0VZLS0tLS0=",
  "certificate": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tDQpNSUlEWFRDQ0FrV2dBd0lCQWdJSkFJRk5oYjFxQ0FVOE1BMEdDU3FHU0liM0RRRUJCUVVBTUVVeEN6QUpCZ05WDQpCQVlUQWtGVk1STXdFUVlEVlFRSURBcFRiMjFsTFZOMFlYUmxNU0V3SHdZRFZRUUtEQmhKYm5SbGNtNWxkQ0JYDQphV1JuYVhSeklGQjBlU0JNZEdRd0hoY05NVFF4TURBNU1Ea3pOVEkwV2hjTk1UY3dOekExTURrek5USTBXakJGDQpNUXN3Q1FZRFZRUUdFd0pCVlRFVE1CRUdBMVVFQ0F3S1UyOXRaUzFUZEdGMFpURWhNQjhHQTFVRUNnd1lTVzUwDQpaWEp1WlhRZ1YybGtaMmwwY3lCUWRIa2dUSFJrTUlJQklqQU5CZ2txaGtpRzl3MEJBUUVGQUFPQ0FROEFNSUlCDQpDZ0tDQVFFQTJMU1FXOGR1RVU2MURWTHlpZXpNdnpHY1JBYUl1cE84ZmdTanM2L3VlRnNqWG5RZHFoZHhBV2JODQpDN2RsUVFWd2JVSUtsRkIxOHhCWUV1bmhxZlAxNjJZL0tGZWdlS3lVZnlLcGE1cFlMS0g2S0FzQUZHRHZPUkI1DQpPMHgvWUZ1Tjh3Vy9ISTZKcHh4aGEzQVluaEx3Qk9odjNsSWV4OUhJb3BSNTFlYVBxTVJYVGVCUDNNZlpjLzZ5DQpoZ3V5YVZJS1VQV0I3RWpVOENKbkhoS2xpcFZ0eXNZVEdtRUVWYUdxaDAyOVN6SjVSNm1PZFlmQ2wvR1pEVnI2DQoyOWloRGhFS0R5Q0JiWklaT1VyZ1FMbnFhMDE4NDIxMmZtVUdRaTgrc3dqZXBWNDZWQk9BbkJnVU9xN3ZwZmI3DQp4TVBMb0dlUU5aR2VuamliNXhUN2M1NHRpVnNpTXdJREFRQUJvMUF3VGpBZEJnTlZIUTRFRmdRVXV0OHMyTjZuDQpITHRiNXVYUXI5d0htQ0M3YVR3d0h3WURWUjBqQkJnd0ZvQVV1dDhzMk42bkhMdGI1dVhRcjl3SG1DQzdhVHd3DQpEQVlEVlIwVEJBVXdBd0VCL3pBTkJna3Foa2lHOXcwQkFRVUZBQU9DQVFFQVVyTlpWSG81ak9zLzIxZHIzYks5DQpvT3ZiLzZNR2RFeUZUZTRxV3FRZTNsR0hnTWovRUI1MExXcXpwVzFQc29EWUp4a2p1Nk10bWtIeHlwNHpaWFJ4DQpGdEF3RUZpY2VISm1TcXFGd0ttblEyY1pVcEVaTVdDUzNQN2phTGpVTFUrMzl4SEY3WFpPUk9zd3BZNTFsTjRQDQo5KzRobm1RQjdaMG1QQUMva2VaN1VJcmppd21NTGt6UzlRczNndkxoY2NWcklsTC9zK1lUYm0wWms0dHhhcmQyDQpPVFArQmRDWDR1aUtET1hlWjg1NjdIRTFSM0hRRkV1T1UvNDk1TnB0SC9zSUZvN3FtMGs5Y1ZYamc0VG5oalNuDQpHZ2V0VDd5YkpzbTdBTDBOaHJOay9iMUpyNm5GSlZ5N21vM2ovSEFHV3pqcnFxSUhXYU9hV0ZLYUhya0NSN3p3DQpUdz09DQotLS0tLUVORCBDRVJUSUZJQ0FURS0tLS0t"
}

Logstash fails to run properly

Logstash fails to run properly after a new install (currently using ELKStack ver. 4.0.0). The following error is being generated:

Unable to load "text!config" modules because of an unknown error.

According to discussions found in elastic/kibana#1653, the following needs to be added to the "location" block in nginx to resolve issues with CORS on Elasticsearch:

proxy_set_header Host $http_host;

After fixing that issue, the following error is now being generated:

ReferenceError: ZeroClipboard is not defined

Not too clear on what is needed to fix this one, though.

Disable _base in Java recipe

The java recipe seems to be there to install Java if it is not already installed, but it also includes _base, which is already included by other recipes (cluster, etc.). I'd say it should only include the java cookbook, since that is its only purpose.

Python's setuptools fails to install in CentOS 6.5

I'm trying to make a wrapper for this cookbook using CentOS 6.5 but am having an issue with python. I've got my Berksfile configured to point the python cookbook to

'python', git: 'git@github.com:racker/python.git'

My recipe simply installs java using my own java wrapper and then calls include_recipe 'elkstack::single'. The only attributes I've changed are:

default['kibana']['webserver_listen'] = '127.0.0.1'
default['kibana']['webserver_hostname'] = 'localhost'
default['elkstack']['config']['backups']['enabled'] = false

Using chef_zero as the provisioner.

The error I'm getting is:

           - upgrade python_pip[setuptools] version from 0.6rc11 to latest

           - upgrade python_pip[setuptools] version from 0.6rc11 to latest
       Recipe: stack_commons::python


           ================================================================================
           Error executing action `run` on resource 'bash[manually upgrade setuptools]'
           ================================================================================

           Mixlib::ShellOut::ShellCommandFailed
       ------------------------------------
           Expected process to exit with [0], but received '1'
           ---- Begin output of "bash"  "/tmp/chef-script20150204-5869-fqxh25" ----
           STDOUT: 
           STDERR: usage: easy_install [options] requirement_or_url ...
       or: easy_install --help

           error: invalid command 'easy_install'
           ---- End output of "bash"  "/tmp/chef-script20150204-5869-fqxh25" ----
           Ran "bash"  "/tmp/chef-script20150204-5869-fqxh25" returned 1

           Resource Declaration:
           ---------------------
           # In /tmp/kitchen/cache/cookbooks/stack_commons/recipes/python.rb

            30: bash 'manually upgrade setuptools' do
            31:   user 'root'
            32:   cwd '/tmp'
            33:   code <<-EOH
            34:   easy_install --upgrade setuptools
            35:   EOH
            36:   only_if { rhel? }
            37: end
            38: 

           Compiled Resource:
       ------------------
           # Declared in /tmp/kitchen/cache/cookbooks/stack_commons/recipes/python.rb:30:in `from_file'

           bash("manually upgrade setuptools") do
             action "run"
             retries 0
             retry_delay 2
             default_guard_interpreter :default
             command "\"bash\"  \"/tmp/chef-script20150204-5869-fqxh25\""
             backup 5
             cwd "/tmp"
             returns 0
             user "root"
             code "  easy_install --upgrade setuptools\n"
             interpreter "bash"
             declared_type :bash
             cookbook_name "stack_commons"
             recipe_name "python"
             only_if { #code block }
           end


       Running handlers:
       [2015-02-04T10:15:03-06:00] ERROR: Running exception handlers
       Running handlers complete
       [2015-02-04T10:15:03-06:00] ERROR: Exception handlers complete
       [2015-02-04T10:15:03-06:00] FATAL: Stacktrace dumped to /tmp/kitchen/cache/chef-stacktrace.out
       Chef Client failed. 22 resources updated in 653.70782717 seconds
       [2015-02-04T10:15:03-06:00] ERROR: bash[manually upgrade setuptools] (stack_commons::python line 30) had an error: Mixlib::ShellOut::ShellCommandFailed: Expected process to exit with [0], but received '1'
       ---- Begin output of "bash"  "/tmp/chef-script20150204-5869-fqxh25" ----
       STDOUT: 
       STDERR: usage: easy_install [options] requirement_or_url ...
          or: easy_install --help

       error: invalid command 'easy_install'
       ---- End output of "bash"  "/tmp/chef-script20150204-5869-fqxh25" ----
       Ran "bash"  "/tmp/chef-script20150204-5869-fqxh25" returned 1
       [2015-02-04T10:15:04-06:00] FATAL: Chef::Exceptions::ChildConvergeError: Chef run process exited unsuccessfully (exit code 1)

I tried a few things manually after the converge failed and didn't seem to have much luck:

$ python /usr/lib/python2.6/site-packages/easy_install.py --upgrade setuptools
usage: easy_install.py [options] requirement_or_url ...
   or: easy_install.py --help

error: invalid command 'easy_install'

$ sudo pip install -U setuptools
Collecting setuptools from https://pypi.python.org/packages/3.4/s/setuptools/setuptools-12.0.5-py2.py3-none-any.whl#md5=87647eb8380c5da10e63fdd7274d8b49
  Using cached setuptools-12.0.5-py2.py3-none-any.whl
Installing collected packages: setuptools
  Found existing installation: setuptools 0.6rc11
    Can't uninstall 'setuptools'. No files were found to uninstall.

Successfully installed setuptools-0.6rc11

I repeated that last command a couple of times because someone on Stack Overflow said it might help, but no joy. Ideas?

Ensure logs are rotated appropriately

Ensure logs are rotated regularly by default, with a monitor to ensure the cluster never fills up. Once we're doing a lot of logging with the above items, this will be critical. Related to #6.

Logstash syslog input should use UDP instead of TCP

By default, syslog should always use UDP instead of TCP. The problem with TCP is that if the syslog (logstash) server is unavailable, the sending server will start blocking or caching, which can be very problematic. This can even crash your whole infrastructure if the logstash server is down for too long.

Document from "knife cookbook site install" onward

It would be helpful for pre-Supermarket users of Chef if there existed documentation showing how to build a wrapper cookbook for elkstack.

I believe this goes something like:

knife cookbook site install elkstack
git init
git add .

Not sure what comes after!

Thanks for any assistance.
M.

Issues when updating kibana

If moving a system between 1.3.x versions, this cookbook fails to converge. This is because the kibana install doesn't refresh the file in https://github.com/lusis/chef-kibana/blob/KIBANA3/provider/install.rb#L60. Unfortunately, the symlink is changed, as it does include the kibana version (https://github.com/lusis/chef-kibana/blob/KIBANA3/providers/install.rb#L75-L76). This then results in a failure, as https://github.com/rackspace-cookbooks/elkstack/blob/master/recipes/kibana.rb#L55 fails (the 'current' symlink is broken).

Secure Kibana by default

Kibana will point at localhost for elasticsearch, but it needs full access to the RESTful http endpoint of elasticsearch. This can be a major security issue if wide open, so we need to figure out a nice default way to keep this closed off but still usable when requested. Maybe SSL+basic auth?

better integration tests for kibana UI

The kibana UI is JavaScript, so the current curl tests miss errors printed within the page.
It would be great if we could find a command-line JavaScript browser to test it properly.
Some options:

  • HTMLUnit (I use that one for unit testing)
  • PhantomJS
  • Zombie.js

PhantomJS looks the best but it takes > 30mins to compile, and there is no binary available for linux yet.

Tuning pass and go public with stack

Before going public, we should do one more pass to make sure all the settings and tunables for all three components are good, and also to be sure that we document plenty of examples for customizing inputs and outputs for logstash.

Error executing action `restart` on resource 'service[newrelic-plugin-agent]'

The newrelic-plugin-agent won't start after the latest change:

Mixlib::ShellOut::ShellCommandFailed
------------------------------------
Expected process to exit with [0], but received '1'
---- Begin output of /sbin/service newrelic-plugin-agent restart ----
STDOUT: Stopping /usr/bin/newrelic-plugin-agent: cannot stop /usr/bin/newrelic-plugin-agent: not running[FAILED]
Starting /usr/bin/newrelic-plugin-agent: /usr/bin/newrelic-plugin-agent -c /etc/newrelic/newrelic-plugin-agent.cfg
[FAILED]
STDERR: ERROR: Startup of newrelic-plugin-agent Failed
.
Error starting /usr/bin/newrelic-plugin-agent: zero length field name in format
---- End output of /sbin/service newrelic-plugin-agent restart ----
Ran /sbin/service newrelic-plugin-agent restart returned 1

From the shell:

# /usr/bin/newrelic-plugin-agent -c /etc/newrelic/newrelic-plugin-agent.cfg
ERROR: Startup of newrelic-plugin-agent Failed
Error starting /usr/bin/newrelic-plugin-agent: zero length field name in format

The /etc/newrelic/newrelic-plugin-agent.cfg specifies:

Daemon:
  user: newrelic
  pidfile: /var/run/newrelic/newrelic-plugin-agent.pid

The user newrelic is not created on the box, and/or the specified user needs write access to /var/run/newrelic.

Reference:
MeetMe/newrelic-plugin-agent#197

Elkstack logstash-forwarder (AKA lumberjack) SSL auth

@hhoover It didn't generate my certificates properly, and after manually putting certificates in place (on the shipper and the ELK server) it would still not work. I ended up writing a very small cookbook to upload a cert and key to the filesystem, and even then they could not be verified by logstash-forwarder.

Cannot allocate memory on elkstack::single

Hi,
Some tests were failing while trying to integrate the elkstack::single recipe. The node was not able to converge due to a lack of memory, which prevented installing a simple nginx package.

    Recipe: nginx::package
         * package[nginx] action install[2014-09-30T08:25:11+00:00] INFO: Processing package[nginx] action install (nginx::package line 40)

           Error executing action `install` on resource 'package[nginx]'

           Errno::ENOMEM
           -------------
           Cannot allocate memory - fork(2)

           Resource Declaration:
           ---------------------

           # In /tmp/kitchen/cache/cookbooks/nginx/recipes/package.rb

            40: package node['nginx']['package_name'] do
            41:   options package_install_opts

The stack is integrated like that:

include_recipe 'java'
include_recipe 'elkstack::single'

Tests were done on a Rackspace performance1-2 flavor instance running only the recipe above. It works well after the second chef run, so I suspect something is consuming too much memory during the first run?

A workaround for now has been to reduce ES memory usage:

+es_mem = (node['memory']['total'].to_i * 0.2).floor / 1024
+default['elasticsearch']['allocated_memory'] = "#{es_mem}m"

logstash-forwarder requires proper CN names in lumberjack certs

see elastic/logstash-forwarder#221 for reference.
Since Go 1.3+, TLS requires proper CN hostnames; if you use IPs to connect to the logstash server, you need to add the IP as a subjectAlternativeName.

We will need to generate certificates per forwarder node to fix this. We could maybe leverage this: elastic/logstash-forwarder#221 (comment).

I tried this (elastic/logstash-forwarder#221 (comment)) but still had problems; I think it may only work with Go 1.2 and below.
