terraform-ibm-modules / terraform-ibm-icd-elasticsearch
Implements an instance of the IBM Cloud Databases for Elasticsearch service.
License: Apache License 2.0
When provisioning from the IBM Cloud Catalog, we can enable both the public and private endpoints of an Elasticsearch instance, but this option is not available for selection/configuration in the DA, which enables only the private endpoints when provisioned.
Please add some type of selection/option for enabling public endpoints.
This option is also needed for use in the Gen AI RAG DA Stack: #204
It is currently set to 1GB by default, but the minimum appears to be 4GB, so this immediately results in a validation error for anyone trying to consume the DA.
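The fix could be as simple as raising the default and adding a guard. A minimal sketch (the variable name member_memory_mb is illustrative, not necessarily the actual DA input name):

```hcl
# Illustrative only: the actual variable name in the DA may differ.
variable "member_memory_mb" {
  description = "Memory allocated per member, in MB. ICD Elasticsearch requires at least 4096 MB (4 GB) per member."
  type        = number
  default     = 4096 # was 1024, which fails ICD's 4 GB minimum

  validation {
    condition     = var.member_memory_mb >= 4096
    error_message = "ICD Elasticsearch requires at least 4096 MB of memory per member."
  }
}
```

The validation block surfaces the constraint at plan time instead of failing mid-apply.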
Deploying Kibana is a common use case - I'd suggest adding automation to auto-deploy it as part of the DA (possibly as a stack). Some existing material:
https://cloud.ibm.com/docs/databases-for-elasticsearch?topic=databases-for-elasticsearch-kibana-code-engine-icd-elasticsearch
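A rough sketch of what that automation might look like using the IBM provider's Code Engine resources (assuming an existing Code Engine project; the image tag, variable names, and module output wiring are illustrative assumptions):

```hcl
# Sketch only: deploy Kibana as a Code Engine app pointed at the ICD instance.
resource "ibm_code_engine_app" "kibana" {
  project_id      = var.code_engine_project_id # assumed input
  name            = "kibana"
  image_reference = "docker.elastic.co/kibana/kibana:8.12.0"

  run_env_variables {
    type  = "literal"
    name  = "ELASTICSEARCH_HOSTS"
    value = var.elasticsearch_private_url # assumed output of the ES DA
  }
}
```

Credentials would also need to be injected (ideally as secret-type env variables), per the linked IBM Cloud doc.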
At first glance, but not exclusively:
"solutions/secure"
-> "solutions/standard"
Error deploying
2024/05/27 13:46:38 Terraform apply | module.elasticsearch.module.elasticsearch.ibm_database.elasticsearch: Still creating... [1m50s elapsed]
2024/05/27 13:46:48 Terraform apply | module.elasticsearch.module.elasticsearch.ibm_database.elasticsearch: Still creating... [2m0s elapsed]
[... repeated "Still creating..." messages (2m10s through 5m50s) omitted ...]
2024/05/27 13:50:58 Terraform apply | module.elasticsearch.module.elasticsearch.ibm_database.elasticsearch: Still creating... [6m10s elapsed]
2024/05/27 13:51:04 Terraform apply |
2024/05/27 13:51:04 Terraform apply | Error: [ERROR] Error waiting for create database instance (crn:v1:bluemix:public:databases-for-elasticsearch:eu-de:a/190c293e9fda4c6684b5acf4b17871b8:bbb1eabb-0422-442c-b66a-b73978897e7b::) to complete: [ERROR] Error ICD interface not ready after create: crn:v1:bluemix:public:databases-for-elasticsearch:eu-de:a/190c293e9fda4c6684b5acf4b17871b8:bbb1eabb-0422-442c-b66a-b73978897e7b:: with error [ERROR] Error getting database config for: crn:v1:bluemix:public:databases-for-elasticsearch:eu-de:a%!F(MISSING)190c293e9fda4c6684b5acf4b17871b8:bbb1eabb-0422-442c-b66a-b73978897e7b:: with error Request failed with status code: 500, ServerErrorResponse: {"status":500,"error":"Internal Server Error"}
2024/05/27 13:51:04 Terraform apply |
2024/05/27 13:51:04 Terraform apply |
2024/05/27 13:51:04 Terraform apply |
2024/05/27 13:51:04 Terraform apply | with module.elasticsearch.module.elasticsearch.ibm_database.elasticsearch,
2024/05/27 13:51:04 Terraform apply | on ../../main.tf line 45, in resource "ibm_database" "elasticsearch":
2024/05/27 13:51:04 Terraform apply | 45: resource "ibm_database" "elasticsearch" {
2024/05/27 13:51:04 Terraform apply |
2024/05/27 13:51:04 Terraform APPLY error: exit status 1
Group related inputs together.
Put the most frequently used inputs at the top.
From the customer's point of view, we want to create the admin_pass automatically as part of Terraform, instead of requiring users to enter a string.
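This could be handled with the random provider; a minimal sketch (assumes admin_pass is made a nullable optional variable; note ICD has its own password character rules, so override_special may need tuning):

```hcl
# Generate an admin password only when the user does not supply one.
resource "random_password" "admin_pass" {
  count            = var.admin_pass == null ? 1 : 0
  length           = 32
  special          = true
  override_special = "-_" # keep to characters ICD accepts (assumption)
}

locals {
  admin_pass = var.admin_pass != null ? var.admin_pass : random_password.admin_pass[0].result
}
```

The module would then reference local.admin_pass everywhere it currently uses var.admin_pass.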
existing_resource_group
-> use_existing_resource_group
to be consistent with other DAs.
elasticsearch_version - since this has a default value, it means we need to maintain it. I suggest that we remove the default value, but make it a required variable in the ibm_catalog.json, with a dropdown list of the supported versions.
The RAG DA Stack should provision ICD Elasticsearch.
The Elasticsearch instance provisioning will require some input (edition, version, sizing, encryption, etc.). Upon provisioning, the Elasticsearch instance information (host, port, private and public URL, admin credentials, certificate, etc.) must be provided as DA outputs. These output values will be used to configure integration in other services and DAs that use this Elasticsearch instance.
Here are some initial high-level requirements:
• Provision Elasticsearch of a selected edition, version, and size
• If own-key encryption is selected, integrate with Key Protect - create/identify the key used for encryption (requires a service cross authorization)
• Add ELSER model support - ELSER 1 and 2
• Create the ES admin user password, get certificates, etc. (generate and set a new random password)
• Add credentials to Secrets Manager for use by other DAs
• Set DA and Stack level outputs
• Option to deploy Kibana (on Code Engine) and connect the Elasticsearch instance
o Add Kibana credentials to Secrets Manager
Input: (draft)
o IBM Cloud Admin API Key (and IAM Bearer Token)
o Region, ES Name, Resource Group
o Database Edition/Plan (Default – Enterprise/Platinum TBD)
o Version (Default - 8.12)
o Sizing
Default for Enterprise
• Small. 0.5 vCPUs; 4 GB RAM; 5GB Disk
• Hosting model - Shared
Default for Platinum
• XS 4x16
• Hosting model - Isolated
o If customer-managed encryption is selected:
Key Protect Key (from Key Protect)
o Endpoints – Both Public and Private
Output: (draft)
o ICD ES Deployment ID
o ES Hostname
o ES Port
o ES Version
o ES Edition
o ES Public Endpoint and port
o ES Private Endpoint and port
o ES admin user password (in Secrets Manager)
o ES certificate (in Secrets Manager)
o Kibana URL
The complete example shows how to use the elasticsearch provider to create indexes and configure cluster settings. Can we add that support directly to the module?
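For reference, index creation via the community elasticstack provider looks roughly like this (a sketch; the module output name and admin credentials wiring are assumptions):

```hcl
terraform {
  required_providers {
    elasticstack = {
      source = "elastic/elasticstack"
    }
  }
}

provider "elasticstack" {
  elasticsearch {
    endpoints = [module.elasticsearch.hostname_port_url] # assumed module output
    username  = "admin"
    password  = var.admin_pass
  }
}

# Create an index with a simple mapping on the provisioned instance.
resource "elasticstack_elasticsearch_index" "example" {
  name = "example-index"
  mappings = jsonencode({
    properties = {
      title = { type = "text" }
    }
  })
}
```

Building this into the module would mean the module takes on a dependency on a second provider, which is worth weighing against keeping it as an example.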
< Placeholder for now > details to come
2024/05/27 13:26:53 Terraform plan | on ../../modules/fscloud/main.tf line 8, in module "elasticsearch":
2024/05/27 13:26:53 Terraform plan | 8: elasticsearch_version = var.elasticsearch_version
2024/05/27 13:26:53 Terraform plan | ├────────────────
2024/05/27 13:26:53 Terraform plan | │ var.elasticsearch_version is "8.12"
2024/05/27 13:26:53 Terraform plan |
2024/05/27 13:26:53 Terraform plan | Version must be 8.7 or 8.10 (Enterprise plan only or Platinum if 8.10 or
2024/05/27 13:26:53 Terraform plan | above). If no value passed, the current ICD preferred version is used.
2024/05/27 13:26:53 Terraform plan |
The Elasticsearch DA currently has 1 flavor.
The following items will need to be reviewed and updated where required:
"The GUID of the Hyper Protect Crypto Services instance. The value is used only to create an authorization policy."
To be consistent with other DAs, the DA should ask for existing_kms_instance_crn (instead of existing_kms_instance_guid). The instance GUID can be parsed from existing_kms_instance_crn, and the kms_region variable can be removed, since the region can also be parsed from the CRN. Changes should go into solutions/secure only.
Add new optional variable ibmcloud_kms_api_key. This should be used in a new provider block like so:
provider "ibm" {
alias = "kms"
ibmcloud_api_key = var.ibmcloud_kms_api_key != null ? var.ibmcloud_kms_api_key : var.ibmcloud_api_key
region = local.kms_region # this value should be parsed from the existing KMS CRN
}
provider "ibm" {
ibmcloud_api_key = var.ibmcloud_api_key
region = var.region
}
The kms module block should use the kms provider alias (it's already set up like this in the code actually).
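Deriving local.kms_region from the CRN is straightforward, since the region is the sixth colon-separated segment of a CRN; a minimal sketch:

```hcl
locals {
  # CRN format: crn:v1:bluemix:public:<service-name>:<region>:a/<account-id>:<instance-id>::
  # Index 5 of the colon-split list is the region segment.
  kms_region = split(":", var.existing_kms_instance_crn)[5]
}
```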
Support creating a cross account s2s auth policy (in the KMS account):
If ibmcloud_kms_api_key is passed and skip_iam_authorization_policy is set to false, create a cross account s2s auth policy in the KMS account to allow all Elasticsearch instances in the given resource group reader access to the KMS instance GUID in the KMS account (that's the best scoping we can do, since the policy has to exist before Elasticsearch can be created). The skip_iam_authorization_policy value that's passed to the Elasticsearch module itself should be set to true, since we will create the cross account policy in the DA itself.
Review all of the variable descriptions and readme markdowns to ensure it's clear that KMS in a different account is supported using the ibmcloud_kms_api_key variable.
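The cross account policy could be created with ibm_iam_authorization_policy under the kms provider alias; a sketch (the locals and exact scoping arguments are assumptions to verify against the provider docs):

```hcl
# Created in the KMS account (provider alias "kms"), so the policy must name
# the account that hosts the Elasticsearch instance as the source account.
resource "ibm_iam_authorization_policy" "kms_cross_account" {
  count    = var.ibmcloud_kms_api_key != null && !var.skip_iam_authorization_policy ? 1 : 0
  provider = ibm.kms

  source_service_account   = local.es_account_id # assumed: account ID hosting Elasticsearch
  source_service_name      = "databases-for-elasticsearch"
  source_resource_group_id = var.resource_group_id

  target_service_name         = "kms"                   # "hs-crypto" for Hyper Protect
  target_resource_instance_id = local.kms_instance_guid # parsed from the KMS CRN
  roles                       = ["Reader"]
}
```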
Review the diagram(s) in the reference-architectures directory...
Once terraform-ibm-modules/terraform-ibm-secrets-manager#157 is merged, we should be able to call that module from the DA to add service credentials to a Secrets Manager secret.
The official documentation specifies that the user type property within the users variable should be optional. Our current module configuration enforces the type property as a required field for each user object. This should be addressed to align with the documentation.
Currently, the users variable is defined as follows, making the type property required:
variable "users" {
type = list(object({
name = string
password = string # pragma: allowlist secret
type = string # "type" is required to generate the connection string for the outputs.
role = optional(string)
}))
}
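Aligning with the documentation would mean something like the following (a sketch; the output logic that builds connection strings from type would need a fallback when type is unset):

```hcl
variable "users" {
  type = list(object({
    name     = string
    password = string           # pragma: allowlist secret
    type     = optional(string) # optional, per the official documentation
    role     = optional(string)
  }))
  default = []
}
```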
The below Elasticsearch DA variables are complex object types, and will be hard for consumers in Projects to know what format should be used for the values. I suggest that we add some supporting documentation for them, the same way we did here. Then we can update the variable descriptions to point to the supporting doc:
service_credential_names
users
auto_scaling
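The supporting doc could lead with concrete example values; for instance (shapes are illustrative and would need to be checked against the module's actual type definitions):

```hcl
# Example values for the complex-typed DA inputs (illustrative).
service_credential_names = {
  "es_admin"  = "Administrator"
  "es_viewer" = "Viewer"
}

users = [
  {
    name     = "example_user"
    password = "replace-me" # pragma: allowlist secret
    type     = "database"
  }
]
```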
The existing elasticsearch standard DA flavor/variation is secure by default and complies with fscloud requirements: it uses KMS encryption and exposes only private endpoints.
We need an elasticsearch basic DA variation that can provision Elasticsearch without KMS encryption (i.e., using IBM-managed default encryption), with an option to expose public endpoints.
We need this for use in the basic variation of the Gen AI RAG DA Stack, where we don't have Key Protect. Reference: #204
There are several configuration items related to the use of a key management system/KMS (e.g. Key Protect), but it is unclear how to provision Elasticsearch without a KMS, i.e., using only IBM-managed encryption.
It appears we just need to leave "existing_kms_instance_crn" blank if we don't want to use KMS.
If so, please update the tooltip description of this item to be clearer. Currently it says "If not specified, a root key is created.", which is unclear.
The current tooltip text for existing_kms_instance_crn: "The CRN of a Hyper Protect Crypto Services or Key Protect instance in the same account as the Databases for Elasticsearch instance. This value is used to create an authorization policy if skip_iam_authorization_policy is false. If not specified, a root key is created."
Also need this option for use in the Gen AI RAG DA Stack: #204