
hcl2's Introduction

HCL

HCL is a toolkit for creating structured configuration languages that are both human- and machine-friendly, for use with command-line tools. Although intended to be generally useful, it is primarily targeted towards devops tools, servers, etc.

HCL has both a native syntax, intended to be pleasant to read and write for humans, and a JSON-based variant that is easier for machines to generate and parse.

The HCL native syntax is inspired by libucl, nginx configuration, and others.

It includes an expression syntax that allows basic inline computation and, with support from the calling application, use of variables and functions for more dynamic configuration languages.

HCL provides a set of constructs that can be used by a calling application to construct a configuration language. The application defines which attribute names and nested block types are expected, and HCL parses the configuration file, verifies that it conforms to the expected structure, and returns high-level objects that the application can use for further processing.

Experimental HCL2

This repository was used for the initial development of version 2 of HCL, but is now archived because the work from this repository was incorporated back into the main HCL repository.

This repository remains here for the moment for compatibility with existing callers that imported the experimental version. We strongly encourage switching to a stable release of HCL 2 as soon as possible. For more information on installing HCL 1 and/or HCL 2, please see the Version Selection guide.

There will be no further development in this temporary experimental HCL 2 repository.

Why?

Newcomers to HCL often ask: why not JSON, YAML, etc?

Whereas JSON and YAML are formats for serializing data structures, HCL is a syntax and API specifically designed for building structured configuration formats.

HCL attempts to strike a compromise between generic serialization formats such as JSON and configuration formats built around full programming languages such as Ruby. HCL syntax is designed to be easily read and written by humans, and allows declarative logic to permit its use in more complex applications.

HCL is intended as a base syntax for configuration formats built around key-value pairs and hierarchical blocks whose structure is well-defined by the calling application, and this definition of the configuration structure allows for better error messages and more convenient definition within the calling application.

It can't be denied that JSON is very convenient as a lingua franca for interoperability between different pieces of software. Because of this, HCL defines a common configuration model that can be parsed from either its native syntax or from a well-defined equivalent JSON structure. This allows configuration to be provided as a mixture of human-authored configuration files in the native syntax and machine-generated files in JSON.

Information Model and Syntax

HCL is built around two primary concepts: attributes and blocks. In native syntax, a configuration file for a hypothetical application might look something like this:

io_mode = "async"

service "http" "web_proxy" {
  listen_addr = "127.0.0.1:8080"
  
  process "main" {
    command = ["/usr/local/bin/awesome-app", "server"]
  }

  process "mgmt" {
    command = ["/usr/local/bin/awesome-app", "mgmt"]
  }
}

The JSON equivalent of this configuration is the following:

{
  "io_mode": "async",
  "service": {
    "http": {
      "web_proxy": {
        "listen_addr": "127.0.0.1:8080",
        "process": {
          "main": {
            "command": ["/usr/local/bin/awesome-app", "server"]
          },
          "mgmt": {
            "command": ["/usr/local/bin/awesome-app", "mgmt"]
          }
        }
      }
    }
  }
}

Regardless of which syntax is used, the API within the calling application is the same. It can either work directly with the low-level attributes and blocks, for more advanced use-cases, or it can use one of the decoder packages to declaratively extract into either Go structs or dynamic value structures.
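
For illustration (a sketch not present in the original README), decoding the example configuration above with the gohcl decoder package might look roughly like this; the Go type and field names are invented for the example, and the import paths are those of this repository:

package main

import (
	"fmt"
	"log"

	"github.com/hashicorp/hcl2/gohcl"
	"github.com/hashicorp/hcl2/hclparse"
)

// Illustrative types mirroring the example configuration above.
type Process struct {
	Name    string   `hcl:"name,label"`
	Command []string `hcl:"command"`
}

type Service struct {
	Protocol   string    `hcl:"protocol,label"`
	Name       string    `hcl:"name,label"`
	ListenAddr string    `hcl:"listen_addr"`
	Processes  []Process `hcl:"process,block"`
}

type Config struct {
	IOMode   string    `hcl:"io_mode"`
	Services []Service `hcl:"service,block"`
}

func main() {
	parser := hclparse.NewParser()
	f, diags := parser.ParseHCLFile("config.hcl") // or parser.ParseJSONFile for the JSON variant
	if diags.HasErrors() {
		log.Fatal(diags)
	}

	var config Config
	if diags := gohcl.DecodeBody(f.Body, nil, &config); diags.HasErrors() {
		log.Fatal(diags)
	}
	fmt.Printf("%+v\n", config)
}

Either syntax produces the same kind of hcl.Body, so the same decode call works for a native-syntax file or its JSON equivalent.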

Attribute values can be expressions as well as just literal values:

# Arithmetic with literals and application-provided variables
sum = 1 + addend

# String interpolation and templates
message = "Hello, ${name}!"

# Application-provided functions
shouty_message = upper(message)

Although JSON syntax doesn't permit direct use of expressions, the interpolation syntax allows use of arbitrary expressions within JSON strings:

{
  "sum": "${1 + addend}",
  "message": "Hello, ${name}!",
  "shouty_message": "${upper(message)}"
}
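
To evaluate expressions like these, the calling application supplies its variables and functions through an hcl.EvalContext. The following is a minimal sketch, not taken from the original README, using the go-cty value and function packages; it applies upper to the name variable directly, since wiring a reference to the sibling message attribute (as in the README example) is the calling application's responsibility:

package main

import (
	"fmt"
	"log"
	"strings"

	"github.com/hashicorp/hcl2/gohcl"
	"github.com/hashicorp/hcl2/hcl"
	"github.com/hashicorp/hcl2/hclparse"
	"github.com/zclconf/go-cty/cty"
	"github.com/zclconf/go-cty/cty/function"
)

const src = `
sum            = 1 + addend
message        = "Hello, ${name}!"
shouty_message = upper(name)
`

// upperFunc is an illustrative application-provided function.
var upperFunc = function.New(&function.Spec{
	Params: []function.Parameter{{Name: "str", Type: cty.String}},
	Type:   function.StaticReturnType(cty.String),
	Impl: func(args []cty.Value, retType cty.Type) (cty.Value, error) {
		return cty.StringVal(strings.ToUpper(args[0].AsString())), nil
	},
})

type result struct {
	Sum           int    `hcl:"sum"`
	Message       string `hcl:"message"`
	ShoutyMessage string `hcl:"shouty_message"`
}

func main() {
	parser := hclparse.NewParser()
	f, diags := parser.ParseHCL([]byte(src), "example.hcl")
	if diags.HasErrors() {
		log.Fatal(diags)
	}

	// The EvalContext carries the application-provided variables and functions
	// that the expressions refer to.
	ctx := &hcl.EvalContext{
		Variables: map[string]cty.Value{
			"addend": cty.NumberIntVal(41),
			"name":   cty.StringVal("world"),
		},
		Functions: map[string]function.Function{
			"upper": upperFunc,
		},
	}

	var r result
	if diags := gohcl.DecodeBody(f.Body, ctx, &r); diags.HasErrors() {
		log.Fatal(diags)
	}
	fmt.Printf("%+v\n", r) // e.g. {Sum:42 Message:Hello, world! ShoutyMessage:WORLD}
}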

For more information, see the detailed specifications included in the repository.

Acknowledgements

HCL was heavily inspired by libucl, by Vsevolod Stakhov.

HCL and HIL originate in HashiCorp Terraform, with the original parsers for each written by Mitchell Hashimoto.

The original HCL parser was ported to pure Go (from yacc) by Fatih Arslan. The structure-related portions of the new native syntax parser build on that work.

The original HIL parser was ported to pure Go (from yacc) by Martin Atkins. The expression-related portions of the new native syntax parser build on that work.

HCL2, which merged the original HCL and HIL languages into this single new language, builds on design and prototyping work by Martin Atkins in zcl.

hcl2's People

Contributors

acburdine, akupila, apparentlymart, backspace, ceh, delatrie, dmcneil, endocrimes, fatih, habnabit, jbardin, mildwonkey, minamijoyo, nicholasjackson, notnoop, paultyng, radeksimko, wata727


hcl2's Issues

Missing newline at the end of a file causes error

Due to some details of how the attribute and block parsers terminate, an error is currently produced if a file does not end with a newline character:

Error: Missing newline after block definition

  on .terraform/init-from-module/root/alibaba-terraform-alicloud-ecs-instance-063c382/variables.tf line 156:
 153: variable "number_of_instances" {
 154:   description = "The number of launching instances one time."
 155:   default = 1
 156: }

A newline is required after a block definition at the end of a file.

This was a stopgap solution made during the prototyping phase, but we need to accommodate files with missing trailing newlines now in order to read real Terraform configurations in the wild.

Missing newline after comment at EOF causes error

This may be just another mutation of #18

main.tf

# comment
$ hexdump -C main.tf
00000000  23 20 63 6f 6d 6d 65 6e  74                       |# comment|
00000009
$ terraform validate

Error: Invalid character

  on main.tf line 1:
   1: # comment

This character is not used within the language.


Error: Argument or block definition required

  on main.tf line 1:
   1: # comment

An argument or block definition is required here.

Question: how to decode a custom type and validate it?

Hey, given the following example code:

package main

import (
        "fmt"

        "github.com/hashicorp/hcl2/gohcl"
        "github.com/hashicorp/hcl2/hclparse"
)

const configSrc = `
config {
        type = "foo"
}
`

type MyType string

const (
        Foo MyType = "foo"
        Bar        = "bar"
)

type Config struct {
        Type string `hcl:"type,attr"`
}

type Root struct {
        Config Config `hcl:"config,block"`
}

func main() {
        parser := hclparse.NewParser()

        file, diags := parser.ParseHCL([]byte(configSrc), "demo.hcl")
        if len(diags) != 0 {
                for _, diag := range diags {
                        fmt.Printf("%+v\n", diag)
                }
                return
        }

        configBody := file.Body

        var root Root

        diags = gohcl.DecodeBody(configBody, nil, &root)
        if len(diags) != 0 {
                for _, diag := range diags {
                        fmt.Printf("%+v\n", diag)
                }
                return
        }

        fmt.Printf("%+v\n", root)
}

If only Foo or Bar should be accepted values for Config.Type, is there a way to enforce that through HCL means (e.g. hcldec.Decode with an AttrSpec?), or would I need to validate it myself after decoding into a string?
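
One possible approach, sketched here rather than taken from an official answer, is to keep the field as a string for decoding and validate it afterwards, reporting the problem as an hcl.Diagnostic so it renders like other HCL errors. The helper below is hypothetical; it assumes the Root, Config, MyType, Foo and Bar definitions from the snippet above plus imports of fmt and github.com/hashicorp/hcl2/hcl:

// validateRoot is a hypothetical helper that runs after gohcl.DecodeBody.
func validateRoot(root *Root, body hcl.Body) hcl.Diagnostics {
	var diags hcl.Diagnostics
	switch MyType(root.Config.Type) {
	case Foo, Bar:
		// accepted values
	default:
		diags = append(diags, &hcl.Diagnostic{
			Severity: hcl.DiagError,
			Summary:  "Invalid type value",
			Detail:   fmt.Sprintf("The type %q is not supported; must be %q or %q.", root.Config.Type, Foo, Bar),
			// MissingItemRange gives only an approximate location; pointing at
			// the exact attribute would require the lower-level hcl.Body API.
			Subject: body.MissingItemRange().Ptr(),
		})
	}
	return diags
}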

JSON syntax is too strict about nested block structures

The JSON syntax is currently defined such that multiple blocks of a given type are defined by using a JSON array as the value of the innermost label, like this: (using a Terraform-like structure as an example)

{
  "provisioner": {
    "local-exec": [
      {
        "command": "echo command number 1"
      },
      {
        "command": "echo command number 2"
      }
    ]
  }
}

The above example is equivalent to the following in the native syntax, assuming a suitable schema that defines provisioner as a block with a single label:

provisioner "local-exec" {
  command = "echo command number 1"
}
provisioner "local-exec" {
  command = "echo command number 2"
}

To minimize the number of different possible expressions of a single HCL structure in JSON, the spec currently requires that array to be innermost, but in doing this the JSON syntax is unable to express a structure like the following:

provisioner "local-exec" {
  command = "echo command number 1"
}
provisioner "file" {
  source      = "foo"
  destination = "bar"
}
provisioner "local-exec" {
  command = "echo command number 2"
}

The closest we can get with the current JSON syntax specification is the following:

{
  "provisioner": {
    "local-exec": [
      {
        "command": "echo command number 1"
      },
      {
        "command": "echo command number 2"
      }
    ],
    "file": [
      {
        "source": "foo",
        "destination": "bar"
      }
    ]
  }
}

This is not equivalent because the relative ordering of the blocks has been lost in lowering to JSON. To represent the ordering unambiguously requires a more complex structure, which the current specification considers invalid:

{
  "provisioner": [
    {
      "local-exec": {
        "command": "echo command number 1"
      }
    },
    {
      "file": {
        "source": "foo",
        "destination": "bar",
      }
    },
    {
      "local-exec": {
        "command": "echo command number 2"
      }
    }
  ]
}

Here the array is sandwiched between the key representing the block type and the keys representing the labels, with each block in its own object so that the local-exec key can appear twice without forcing the two to merge.

This more-flexible model has the unfortunate side-effect that there is no longer a single canonical structure, but being able to properly represent all possible native syntax structures is the more important requirement.


To address this, the spec (and along with it, the reference implementation) must be amended so that any object whose keys are interpreted as block labels may instead be an array of such objects, with each array visited in order and contributing zero or more blocks to the final set.

Each of these listed label-defining objects may define multiple label keys if desired, in which case they are interpreted just as a single label-defining object is today, producing blocks in an undefined order. To ensure an exact ordering, the author must use only a single label key in each object and let the array define the ordering.

Splat on a null should return an empty sequence

The splat operators already have a special case: if they are applied to a non-sequence value, they behave as if the LHS was a single-element list. This is intended to make it easy to work with values that may or may not be lists.

To complete this convenient abstraction, there should be a further special case where applying splat to a null value produces an empty tuple, thus making splat expressions a convenient way to work with values that might be null.

To avoid breaking the existing abstraction, this should apply only for null values of types we would not normally be able to iterate over. That is, a null value of list, tuple or set type should produce an error, not an empty tuple.

Confusing error in HCL1-style map

provider "aws" {
  region = "us-west-2"
  profile = "tf"
}

resource "aws_elasticsearch_domain" "example" {
  domain_name           = "example"
  elasticsearch_version = "1.5"

  cluster_config {
    instance_type = "r4.large.elasticsearch"
  }

  advanced_options {
    "rest.action.multi.allow_explicit_index" = true
  }

  snapshot_options {
    automated_snapshot_start_hour = 23
  }

  tags = {
    Domain = "TestDomain"
  }
}
Terraform v0.12.0-dev
$ terraform plan
Error: Invalid argument name

  on main.tf line 15, in resource "aws_elasticsearch_domain" "example":
  15:     "rest.action.multi.allow_explicit_index" = true

Argument names must not be quoted.

advanced_options here is TypeMap, which means we expect =, but that's not what the error suggests.

Tabs are not allowed

This repo inherited a restriction from an early prototype that tabs are not allowed.

This was done as part of an experiment that didn't end up working out, so this restriction should be removed and instead the scanner should permit tabs as a funny sort of space, counting them as a single "column" for source-location-reference purposes.

The canonical style is still to indent with pairs of spaces, so the auto-formatter in hclwrite will continue to replace tabs with spaces.

Define "lexicographic sort" for map iteration purposes

Currently the spec has the following to say about iterating objects/maps:

For object and map types, the key is the string attribute name or element key, and the value is the attribute or element value. The elements are visited in the order defined by a lexicographic sort of the attribute names or keys.

It doesn't currently state what exactly is being lexicographically sorted. The current implementation is sorting the normalized UTF-8 sequences, which likely produces a counter-intuitive result on the boundaries between the different UTF-8 length prefixes.

Should figure out what we really want to do here, specify it, and then implement it.

for expression with keyword on newline doesn't seem to parse correctly

Given input like this:

foo = {
  for net in networks:
  net.foo => net.bar
}

...the parser seems to not be detecting that for keyword as a for expression introducer and instead treating it as a map key expression.

It seems like the peek-ahead for that keyword isn't working correctly with the intervening newline, even though we turn off newline-sensitive tokenization before peeking it:

hcl2/hcl/hclsyntax/parser.go

Lines 1023 to 1028 in 3e4b7e0

p.PushIncludeNewlines(false)
defer p.PopIncludeNewlines()
if forKeyword.TokenMatches(p.Peek()) {
	return p.finishParsingForExpr(open)
}

Perhaps somehow the newline token is ending up in the peek slot before PushIncludeNewlines(false) takes effect, and so it's still there when we subsequently call Peek. Will need to dig into the parser code some more to see what's going on here.

Allow leading underscores in identifiers

HCL2 uses the Unicode definition of identifiers, which permits underscores only in the ID_Continue set, and not in ID_Start.

It is a reasonable expectation from other languages that identifiers can begin with an underscore, so we'll extend the Unicode definition here to include underscore in ID_Start, in a similar vein to how we extended ID_Continue to include dashes.

gohcl decoder and entity indexes

The gohcl package is nearly perfect for my requirements. I have found, however, that I need to know the index of a given entity within a document following decode, because the items are decoded into arrays of different types.

Example:

Task "snarf" {
  TypeOne {
     prop = "whatevs"
  }
  TypeTwo {
    another_prop = "something"
  }
}

If, for example, the high-level decoder could dump the start of the Range into a named field, that would be ideal. Otherwise I will need to move to the next-highest-level decoder.

I am reluctant to decode the sub-entities on the fly.

What are your thoughts on discovering entity ordering using the high-level decoder?

Using HCL in other software (developer guide)

I want to use HCL in my own application... right now I'm looking through Terraform's source code to try to get a handle on how it's used in code.

Is there a guide for developers? I've been using Terraform long enough that I'm familiar with its syntax, but I have no idea how to go about using HCL as a dependency, or whether I'm in for an uphill battle by trying.

Any advice is appreciated, thank you!

EDIT: I think I found what I was looking for, but I'm leaving this open for a few days for advice anyway. The link below is what I'm working with now. Thanks again!
https://media.readthedocs.org/pdf/hcl/guide/hcl.pdf

Custom function, create more HCL blocks

Sorry about the terrible issue title, I have no idea how to make this concise.

I want to include a forEach function in the HCL eval context (or elsewhere) that lets me take a list of strings and expand the block the forEach call appears in into one copy per element of the list (not sure how to say that better).

As an example... given this block:

resource "some_resource" {
  dns_server = forEach(["8.8.8.8", "8.8.4.4"])
}

I want to be able to get some near-equivalent of:

resource "some_resource" {
  dns_server = "8.8.8.8"
}
resource "some_resource" {
  dns_server = "8.8.4.4"
}

I found this example usage of a custom function which makes me think it might not be possible to do what I want to do (this way at least).

Can anyone steer me in the right direction?

Edit: I can't just require, in documentation, that this be written as multiple blocks, because the list is likely to be a variable/output from another block.

Thanks!

Clarification on the future of HCL?

I'm wondering if the Terraform team could provide some clarification on the future of HCL and HCL2. I haven't seen any official mention of HCL2, and only found out about it via vague references in terraform github issues. These usually sound something like:

We're not going to address this critical Terraform bug/deficiency/feature, because we're focusing all of our efforts on a new version of the HCL language. Hopefully, your bug/deficiency/feature will be addressed in HCL2.

On the one hand, I'm very excited about a new version of HCL. As much as I love Terraform, it quickly became clear that the current iteration of HCL is a little rough around the edges. I'd love to put all my hopes and dreams into a promised shiny new version of HCL.

On the other hand, the prospect of a backwards-incompatible language change kind of freaks me out. Should developers hold off on using Terraform in production until HCL v2 is released? Do we need to budget time for a full rewrite of all existing modules? Will the Terraform team provide some kind of tooling to make the transition smoother?

I hope I'm not coming across as too snarky. I really love Terraform, and despite all of its rough edges, it is still the best thing out there for what it does. It would just be nice to see some official communication from the Terraform team on this matter, so users can plan accordingly.

Thank you!

Incorrect error message for type mismatch in conditionals

The error message for a conditional expression type mismatch uses a direct string representation of the types, rather than calling the FriendlyName method on those types to get the user-facing name:

return cty.DynamicVal, hcl.Diagnostics{
	{
		Severity: hcl.DiagError,
		Summary:  "Inconsistent conditional result types",
		Detail: fmt.Sprintf(
			// FIXME: Need a helper function for showing natural-language type diffs,
			// since this will generate some useless messages in some cases, like
			// "These expressions are object and object respectively" if the
			// object types don't exactly match.
			"The true and false result expressions must have consistent types. The given expressions are %s and %s, respectively.",
			trueResult.Type(), falseResult.Type(),
		),
		Subject: hcl.RangeBetween(e.TrueResult.Range(), e.FalseResult.Range()).Ptr(),
		Context: &e.SrcRange,
	},
}

As a result, the detail message is useless:

The true and false result expressions must have consistent types. The given
expressions are {{{} [{{{} %!s(cty.primitiveTypeKind=83)}}]}} and {{{}
%!s(cty.primitiveTypeKind=83)}}, respectively.

"Incorrect key type" error reports incorrect source location

When giving a non-primitive value as a key in an object constructor expression, an error is returned as expected:

Error: Incorrect key type

  on list-as-map-key.tf line 7, in locals:
   7:     "${null_resource.x.*.id}" = {
   8:       username = "foo"
   9:       password = "bar"
  10:     }

Can't use this value as a key: string required.

However, the error message indicates the value expression as the subject of the error, rather than the key expression. (That is, the part on the right hand side of the equals is what is underlined in the rendered message.)

Inaccurate diagnostics context for nested fields

Terraform Version

369d512e22bf18fd810b8583e6914458abc569aa

Terraform Configuration Files

resource "azurerm_kubernetes_cluster" "aks_with_rbac" {
  name                = "something"
  location            = "test"
  resource_group_name = "tada"
  dns_prefix          = "radek"
  kubernetes_version  = "1.11"

  role_based_access_control {
    enabled = true
  }

  agent_pool_profile {}
  service_principal {
    client_id     = "00000000-0000-0000-0000-000000000000"
    client_secret = "00000000000000000000000000000000"
  }
}

Expected Behavior

Snippet of code pointing to the right LOC - i.e. to the definition of the parent block which is missing arguments.

Actual Behavior

$ terraform plan
Error: Missing required argument

  on main.tf line 1, in resource "azurerm_kubernetes_cluster" "aks_with_rbac":
   1: resource "azurerm_kubernetes_cluster" "aks_with_rbac" {

The argument "name" is required, but no definition was found.


Error: Missing required argument

  on main.tf line 1, in resource "azurerm_kubernetes_cluster" "aks_with_rbac":
   1: resource "azurerm_kubernetes_cluster" "aks_with_rbac" {

The argument "vm_size" is required, but no definition was found.

Steps to Reproduce

$ terraform plan

Additional Context

As documented, the arguments reported as missing are nested under the agent_pool_profile block, which is not obvious from the error output.

Formatter uses incorrect spacing for `for` expression constructs

Given a for expression like this:

  for_each = [for x in [var.versioning]: x if x.enabled]

...the auto-formatter will shuffle the spacing around incorrectly like this:

  for_each = [for x in[var.versioning] : x if x.enabled]

The extra space before the colon should be easy to address. The missing space after the in keyword may be trickier, since to the formatting rules it looks like an index expression applied to an identifier named in. The formatter might need a special case for this one as a result.

Spurious "Invalid character" errors at EOF

In a file in the Google Cloud SQL example on Terraform Registry, the parser seems to be getting confused about something, since it's reporting dozens of error messages like the following (two variants seen):

Error: Invalid character

  on .terraform/init-from-module/root/GoogleCloudPlatform-terraform-google-sql-db-688a320/variables.tf line 136:

This character is not used within the language.
Error: Invalid character

  on .terraform/init-from-module/root/GoogleCloudPlatform-terraform-google-sql-db-688a320/variables.tf line 136:

The "`" character is not valid. To create a multi-line string, use the
"heredoc" syntax, like "<<EOT".

In both cases, the source reference is at the end of the file and so this suggests that the parser is entering recovery mode at some earlier point and then some other parser component is getting confused and not aborting properly in that mode.

The mention of the backtick character in some of the error messages may be a clue, since backticks are used in some of the description strings. Perhaps the string parser is getting confused by those backticks.


Errors like the following are also being reported, due to the new parser no longer accepting the non-idiomatic form of unquoted block labels:

Error: Attribute or block definition required

  on .terraform/init-from-module/root/GoogleCloudPlatform-terraform-google-sql-db-688a320/variables.tf line 17:
  17: variable project {

An attribute or block definition is required here. To define an attribute, use
the equals sign "=" to introduce the attribute value.


Error: Attribute or block definition required

  on .terraform/init-from-module/root/GoogleCloudPlatform-terraform-google-sql-db-688a320/variables.tf line 22:
  22: variable region {

An attribute or block definition is required here. To define an attribute, use
the equals sign "=" to introduce the attribute value.


Error: Attribute or block definition required

  on .terraform/init-from-module/root/GoogleCloudPlatform-terraform-google-sql-db-688a320/variables.tf line 27:
  27: variable name {

An attribute or block definition is required here. To define an attribute, use
the equals sign "=" to introduce the attribute value.

However, this particular error is also visible in that repository's outputs.tf file and yet it doesn't also exhibit the strange "Invalid character" error behavior, so possibly there is something else going on here. There are no backticks in outputs.tf, so that lends further credence to the idea that the backticks are triggering the weird behavior.

Splat evaluation is not concurrency-safe

Splat expressions require some odd gymnastics to evaluate, since they are similar to a for expression but with an anonymous iterator symbol.

That anonymous iterator is handled internally via a special expression node type whose value changes dynamically (per EvalContext) during evaluation. Due to how the internals work here, the current implementation cannot safely support concurrent evaluation of the same splat expression from two callers with separate EvalContexts: a map is modified concurrently, which can cause a crash.

As a general rule HCL is not expected to be concurrency-safe in most cases -- that's the caller's responsibility -- but for splat expressions in particular this implementation detail is pretty hard for the caller to properly manage, so we should make an exception here and make AnonSymbolExpr (the expression node that represents the anonymous iterator) use a mutex to allow concurrent evaluations of the same splat expression.
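
For illustration only, and not the actual hclsyntax internals, the general shape of that fix is a mutex guarding the per-EvalContext value map. The type below is hypothetical and assumes imports of sync, github.com/hashicorp/hcl2/hcl and github.com/zclconf/go-cty/cty:

// anonSymbol is a hypothetical stand-in for AnonSymbolExpr's internal state:
// one value per EvalContext, guarded by a mutex so two callers can evaluate
// the same splat expression concurrently with their own contexts.
type anonSymbol struct {
	mu     sync.Mutex
	values map[*hcl.EvalContext]cty.Value
}

func (s *anonSymbol) setValue(ctx *hcl.EvalContext, v cty.Value) {
	s.mu.Lock()
	defer s.mu.Unlock()
	if s.values == nil {
		s.values = make(map[*hcl.EvalContext]cty.Value)
	}
	s.values[ctx] = v
}

func (s *anonSymbol) value(ctx *hcl.EvalContext) cty.Value {
	s.mu.Lock()
	defer s.mu.Unlock()
	return s.values[ctx]
}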

HCL template

As we've seen, there have been quite a few discussions around the use of a "template engine" or something similar in order to introduce more logic to the current template interpolation.
https://github.com/hashicorp/hcl2/blob/master/hcl/hclsyntax/spec.md#templates

I'd like to know the motivation for defining yet another template engine/syntax when so many have been created before (and simply work) and could be re-used?

Marshal/Unmarshal like in encoding/json

I'm looking for two simple functions:

func Marshal(v interface{}) ([]byte, error)
func Unmarshal(data []byte, v interface{}) error

Compare with encoding/json.Marshal and encoding/json.Unmarshal.

Unmarshal ...

I was able to get the Unmarshal working with:

func Unmarshal(data []byte, v interface{}) error {
	parser := hclparse.NewParser()
	file, diag := parser.ParseHCL(data, "<data>")
	if diag.HasErrors() {
		return diag
	}

	diag = gohcl.DecodeBody(file.Body, nil, v)
	if diag.HasErrors() {
		return diag
	}
	return nil
}

Here, v interface{} must point to a struct with hcl struct tags, or errors will be returned.

Marshal?

Can someone point me in the right direction for the Marshal function?
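
One possible starting point, sketched here rather than an official API, is to mirror the Unmarshal above using hclwrite and gohcl.EncodeIntoBody; as with Unmarshal, v must carry hcl struct tags:

package main

import (
	"fmt"

	"github.com/hashicorp/hcl2/gohcl"
	"github.com/hashicorp/hcl2/hclwrite"
)

// Marshal encodes a tagged struct into HCL native syntax.
func Marshal(v interface{}) ([]byte, error) {
	f := hclwrite.NewEmptyFile()
	gohcl.EncodeIntoBody(v, f.Body())
	return f.Bytes(), nil
}

type service struct {
	Name       string `hcl:"name"`
	ListenAddr string `hcl:"listen_addr"`
}

func main() {
	out, err := Marshal(&service{Name: "web_proxy", ListenAddr: "127.0.0.1:8080"})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}

Note the separate report later in this list that gohcl.EncodeIntoBody currently skips fields tagged as optional.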

Incorrect handling of "heredocs"

Error: Missing newline after attribute definition

  on .terraform/init-from-module/root/Azure-terraform-azurerm-computegroup-c1a7c62/main.tf line 89:
  84:     settings = <<SETTINGS
  85:     {
  86:         "commandToExecute": "${var.cmd_extension}"
  87:     }
  88:     SETTINGS
  89:   }

An attribute definition must end with a newline.

Looks like this issue arises because the closing SETTINGS marker consumes the newline that the attribute parser normally uses to mark the end of the attribute definition.

gohcl.EncodeIntoBody ignores optional elements

type config struct {
    Other string `hcl:"other"`
    Field bool `hcl:"field,optional"`
}

input := &config{Field: true, Other: "hello"}
fw := hclwrite.NewEmptyFile()
gohcl.EncodeIntoBody(input, fw.Body())
println(string(fw.Bytes()))

The output only contains the required fields and does not contain the optional ones, even though they are set.

String literal parsing is incorrect

The syntax for template directives changed during prototyping, but there is some remnant handling of the old syntax in the string literal parser that needs to be updated to support the current syntax.

In particular, it currently trips over single ! characters believing them to be the start of a directive sequence, and conversely does not correctly handle escaping of the new introduction symbol %.

Format doesn't correctly handle block content after a newline

My apologies if this is a known issue.

using github.com/hashicorp/hcl2 v0.0.0-20190130225218-89dbc5eb3d9e

Given the following HCL

service "foo"
{
	content = "bar"
}

hclwrite.Format does not correctly move the opening brace '{' to the same line as the block header, making the following example fail:

package main

import (
	"fmt"

	"github.com/hashicorp/hcl2/hclparse"
	"github.com/hashicorp/hcl2/hclwrite"
)

func main() {
	testContent := []byte(`service "foo"
{
    content = "bar"
}`)

	formattedContent := hclwrite.Format(testContent)
	p := hclparse.NewParser()
	_, diags := p.ParseHCL(formattedContent, "test-file.hcl")
	if diags.HasErrors() {
		fmt.Printf("error parsing hcl file: %v", diags.Error())
	}
}

result:

error parsing hcl file: test-file.hcl:1,14-2,1: Invalid block definition; A block definition must have block content delimited by "{" and "}", starting on the same line as the block header.

Using HCL v1, this works with

formattedContent, _ := printer.Format(testContent)

maps values string parsing conflict

When testing out terraform-0.12.0 for forwards compatibility, I wanted to see if our scripts need to be changed. We've started changing some things until we encountered the issue below, which seems like underscores need to be processed, and it looks like it may be a minor bug.

$ terraform init
...
Error: Missing key/value separator

  on .terraform/modules/dcos-bootstrap-instance.dcos-bootstrap-instance.dcos-tested-oses/dcos-terraform-terraform-aws-tested-oses-94a4f4c/variables.tf line 35:
  33:   default = {
  34:     # Centos 7.2
  35:     centos_7.2_ap-south-1     = "ami-95cda6fa"

Expected an equals sign ("=") to mark the beginning of the attribute value.
$ cat variables.tf
...
# AWS recommends all HVM vs PV. HVM Below.
variable "aws_ami" {
  description = "AMI that will be used for the instances instead of Mesosphere provided AMIs"
  type        = "map"

  default = {
    # Centos 7.2
    centos_7.2_ap-south-1     = "ami-95cda6fa"
    centos_7.2_eu-west-2      = "ami-bb373ddf"
    centos_7.2_eu-west-1      = "ami-7abd0209"
    centos_7.2_ap-northeast-2 = "ami-c74789a9"
    centos_7.2_ap-northeast-1 = "ami-eec1c380"
    centos_7.2_sa-east-1      = "ami-26b93b4a"
    centos_7.2_ca-central-1   = "ami-af62d0cb"
    centos_7.2_ap-southeast-1 = "ami-f068a193"
    centos_7.2_ap-southeast-2 = "ami-fedafc9d"
    centos_7.2_eu-central-1   = "ami-9bf712f4"
    centos_7.2_us-east-1      = "ami-6d1c2007"
    centos_7.2_us-east-2      = "ami-6a2d760f"
    centos_7.2_us-west-1      = "ami-af4333cf"
    centos_7.2_us-west-2      = "ami-d2c924b2"
    }
}

Language design: take a look at Nix lang

Though I understand it is too late to make any language design changes, take a look at the Nix language and how it solved the problem of "configuration lang" + "template lang". Basically, it defines both as the same thing.

This allows substituting "templates" with "functions", making "variables" (let-bindings in Nix terminology), having JSON-ish dictionaries, a simple syntax, and so on.

For example, here are the examples from README.md translated to the Nix language. You can see that it is slightly more verbose than HCL and a lot less verbose than JSON, but it has built-in function declarations, named function arguments, namespaces, and interpolation.

{
  io_mode = "async";
  service.http.web_proxy = {
    listen_addr = "127.0.0.1:8080";
    process.main = {
      command = [ "/usr/local/bin/awesome-app" "server" ];
    };
    process.mgmt = {
      command = [ "/usr/local/bin/awesome-app" "mgmt" ];
    };
  };
}

and

{ addend, upper, name }:
rec {
  # Arithmetic with literals and application-provided variables
  sum = 1 + addend;

  # String interpolation and templates
  message = "Hello, ${name}!";

  # Application-provided functions
  shouty_message = upper message;
}

Nix lang proved to be very useful both for package specification and configuration. For example, take a look at Consul package expression file: https://github.com/NixOS/nixpkgs/blob/32340793aafec24dcef95fee46a21e634dd63457/pkgs/servers/consul/default.nix and system profile configuration: https://github.com/NixOS/nixpkgs/blob/32340793aafec24dcef95fee46a21e634dd63457/nixos/modules/profiles/hardened.nix

Variables not detected in JSON object keys

From just reviewing the code here, it looks like we'd fail to detect a reference to a variable inside a key of an object that is being treated as an expression:

hcl2/hcl/json/structure.go

Lines 527 to 531 in 6558d83

case *objectVal:
	for _, jsonAttr := range v.Attrs {
		vars = append(vars, (&expression{src: jsonAttr.Value}).Variables()...)
	}
}

Not confirmed yet, but we should write a test for this and then fix it if it is indeed broken.

hclpack: Attributes being mixed up in content()

When multiple blocks have the same type, the attributes within the block are somehow set on the previously decoded blocks too. This is a bit tricky to explain so an example is better:

package main

import (
	"fmt"
	"log"

	"github.com/hashicorp/hcl2/gohcl"
	"github.com/hashicorp/hcl2/hcl"
	"github.com/hashicorp/hcl2/hclpack"
)

func main() {
	src := `
foo {
	bar = "123"
}

foo {
	bar = "abc"
}
`
	
	body, diags := hclpack.PackNativeFile([]byte(src), "file.hcl", hcl.Pos{Line: 1, Column: 1})
	if diags.HasErrors() {
		log.Fatal(diags)
	}

	got := struct {
		Foos []struct {
			Bar string `hcl:"bar"`
		} `hcl:"foo,block"`
	}{}

	diags = gohcl.DecodeBody(body, nil, &got)
	if diags.HasErrors() {
		log.Fatal(diags)
	}

	fmt.Printf("Result: %+v\n", got)
	
	// Output:
	// Result: {Foos:[{Bar:abc} {Bar:abc}]}
}

Here I would expect the output to be Result: {Foos:[{Bar:123} {Bar:abc}]}.

If we change the hclpack.PackNativeFile call above (lines 23-26) to this:

file, diags := hclsyntax.ParseConfig([]byte(src), "file.hcl", hcl.Pos{Byte: 0, Line: 1, Column: 1})
if diags.HasErrors() {
	log.Fatal(diags)
}
body := file.Body

The result is what I would expect.

This seems like a bug to me but it's possible I've misunderstood something.

Here's a failing test case:

diff --git hclpack/structure_test.go hclpack/structure_test.go
index 0d5f11c..78b1037 100644
--- a/hclpack/structure_test.go
+++ b/hclpack/structure_test.go
@@ -75,6 +75,73 @@ func TestBodyContent(t *testing.T) {
 				},
 			},
 		},
+		"block attributes": {
+			&Body{
+				ChildBlocks: []Block{
+					{
+						Type: "foo",
+						Body: Body{
+							Attributes: map[string]Attribute{
+								"bar": {
+									Expr: Expression{
+										Source:     []byte(`"hello"`),
+										SourceType: ExprNative,
+									},
+								},
+							},
+						},
+					},
+					{
+						Type: "foo",
+						Body: Body{
+							Attributes: map[string]Attribute{
+								"bar": {
+									Expr: Expression{
+										Source:     []byte(`"world"`),
+										SourceType: ExprNative,
+									},
+								},
+							},
+						},
+					},
+				},
+			},
+			&hcl.BodySchema{
+				Blocks: []hcl.BlockHeaderSchema{
+					{Type: "foo"},
+				},
+			},
+			&hcl.BodyContent{
+				Blocks: hcl.Blocks{
+					{
+						Type: "foo",
+						Body: &Body{
+							Attributes: map[string]Attribute{
+								"bar": {
+									Expr: Expression{
+										Source:     []byte(`"hello"`),
+										SourceType: ExprNative,
+									},
+								},
+							},
+						},
+					},
+					{
+						Type: "foo",
+						Body: &Body{
+							Attributes: map[string]Attribute{
+								"bar": {
+									Expr: Expression{
+										Source:     []byte(`"world"`),
+										SourceType: ExprNative,
+									},
+								},
+							},
+						},
+					},
+				},
+			},
+		},
 	}
 
 	for name, test := range tests {
@@ -85,7 +152,13 @@ func TestBodyContent(t *testing.T) {
 			}
 
 			if !cmp.Equal(test.Want, got) {
-				t.Errorf("wrong result\n%s", cmp.Diff(test.Want, got))
+				bytesAsString := func(s []byte) string {
+					return string(s)
+				}
+				t.Errorf("wrong result\n%s", cmp.Diff(
+					test.Want, got,
+					cmp.Transformer("bytesAsString", bytesAsString),
+				))
 			}
 		})
 	}

Format introduces space before index bracket

The whitespace rules in hclwrite.Format are not getting index brackets right:

aws_instance.foo.*.ip [count.index]

Idiomatic style is for the bracket to immediately follow what it is indexing, with no intervening space.

Splat operators should work with Set values

Currently splat operators only work with List and Tuple values. Given that in other contexts we are able to automatically convert between set and list, we should be able to splat over sets too for consistency, with the caveat that the result will not be in any particular order.

why not an alternative to a custom language

With the announcement of Terraform 0.12, there are many extensions to the HCL configuration language.

It looks more like a programming language than a data configuration language.

HCL covers a domain-specific language. It seems that maintaining and building both a programming language and a DSL can be counterproductive; for example, see the breaking changes listed for hcl2.

What alternatives were looked at? For example, Lua is a language designed for embedding and DSLs. Even nginx has a (community-provided) extension to support it.

From a purely educational standpoint, I'd like to learn why HCL is still the supported choice.

Legacy numeric index syntax

Due to limitations of HIL, Terraform has for a long time been supporting a weird index syntax using numeric "attributes", like data.alicloud_zones.default.zones.0.id. From HIL's perspective, that was all just one long variable name with dots and digits in it.

Our expression parser currently rejects this because 0 is not a valid attribute name. Since identifiers cannot start with zero, it's safe for us to accept this as an alias for data.alicloud_zones.default.zones[0].id, though we may generate a warning in this case because that odd old form is deprecated now that we support proper attribute and index operators.

Error: Invalid attribute name

  on .terraform/init-from-module/root/alibaba-terraform-alicloud-ecs-instance-063c382/main.tf line 33:
  33:   instance_type = "${var.instance_type == "" ? data.alicloud_instance_types.default.instance_types.0.id : var.instance_type}"

An attribute name is required after a dot.

Formatter inserts unwanted space between two consecutive interpolation sequences

Given a string like this:

  "${foo}${bar}"

...the auto-formatter wants to insert a space between the two sequences, changing the meaning:

  "${foo} ${bar}"

Spacing should never be adjusted when we're between OQuote/OHeredoc tokens and their corresponding CQuote/CHeredoc tokens, except within interpolation/directive sequences.

Forbid zero-prefixed number literals to avoid octal confusion

Those coming from C-based languages may bring an expectation that zero-prefixed number literals are to be interpreted as octal. HCL 2 does not do that, and currently seems to just interpret them as decimal numbers after stripping the leading zero, which can lead to confusing, hard-to-explain results.

As a compromise then, we should detect when a numeric literal has leading zeros and produce an error about it which includes a specific note about octal numbers. Something like:

Leading zeroes are not allowed on number literals. If you were trying to write a number in octal, please note that only decimal number literals are allowed.

Support HCL1-style single-line one-argument block

HCL1 supported a special single-line block form when only one argument was present:

foo { bar = baz }

This was initially excluded from the new language spec intentionally, but given how commonly this form is used in the wild with HCL1-based languages we should support it in HCL2 so it doesn't become another point of friction during a migration.

The specification should be updated to allow for a different form of block syntax where the opening brace is directly followed by an identifier; then the key = value can be parsed as normal and we'll require the closing brace to also appear on the same line. This is stricter than what was permitted in HCL1 but accepts the common idiom while only excluding degenerate forms such as having the closing brace on the next line or listing out more attributes on the following lines as if it were a multi-line block.

We also need to teach the formatter in hclwrite how to deal with this form, since it currently has no rules to deal with an opening brace with more content after it, or a closing brace that isn't on its own line.

Spurious error parsing existing Terraform configuration

When asked to parse a file from Terraform's public module registry, the HCL native syntax parser produced the following error:

Error: Missing newline after attribute definition at main.tf:104,3

An attribute definition must end with a newline.

This message seems to be a parsing bug, since line 104 is the following valid attribute definition:

  availability_zone       = "${element(var.azs, count.index)}"

It's referring to column 3 here, which is the location of the availability_zone identifier. Possibly there's some leftover state bleeding through from the previous line, causing the parser to get confused when it encounters the next token. Previous line is:

 cidr_block              = "${var.public_subnets[count.index]}"

(This also appears to be valid.)

Attribute or block definition required --> do not see any attribute missing.

While making the Terraform Registry resync for a module update, it failed with this error:
Attribute or block definition required: An attribute or block definition is required here. To define an attribute, use the equals sign "=" to introduce the attribute value.

But it is not giving a proper message explaining what exactly the issue is.

Below is a snippet of the issue (screenshot not included).

I hope I am on the correct repo for this bug. If it is a known issue, let me know the resolution.

Is dynamic walk of a file.Body possible without schema?

Hi,

I have an HCL file with custom blocks and user functions. I would like to walk the file to collect all variable references in the form of var.foo. These references can start with any kind of prefix, such as foo.bar, resource.bar, server.ip, etc. This is needed so I can create a DAG to compute the references first and later pass them via an EvalContext to each block (so decoding works and the expressions are called). It seems like this is not possible without knowing the structure of the file? Here is an example of what I'm trying to do (diagnostics errors are neglected for the sake of simplicity):

var src = `
variable "url" {
 default = "https://httpbin.org/ip"
}

foo "bar" {
  method = "get"
  url    = var.url
}

bar "example" {
  result = "${foo.result}"
}
`

parser := hclparse.NewParser()
file, _ := parser.ParseHCL([]byte(src), "demo.hcl")

var raw map[string]interface{}
_ = gohcl.DecodeBody(file.Body, nil, &raw)

// doesn't work unfortunately
fmt.Printf("raw = %+v\n", raw)

Is there a way to walk the tree without knowing the structure and without using Go structs to define it? Or maybe a helper function that would allow me to collect all references?

Thanks
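
One possible approach, sketched here under the assumption that the input uses HCL native syntax rather than JSON, is to type-assert file.Body to *hclsyntax.Body and walk its attributes and nested blocks, collecting the traversals reported by each expression's Variables method:

package main

import (
	"fmt"
	"log"

	"github.com/hashicorp/hcl2/hcl"
	"github.com/hashicorp/hcl2/hcl/hclsyntax"
	"github.com/hashicorp/hcl2/hclparse"
)

// collectVariables walks a native-syntax body and gathers every variable
// traversal (var.url, foo.result, ...) referenced by any attribute, at any
// nesting depth.
func collectVariables(body *hclsyntax.Body) []hcl.Traversal {
	var vars []hcl.Traversal
	for _, attr := range body.Attributes {
		vars = append(vars, attr.Expr.Variables()...)
	}
	for _, block := range body.Blocks {
		vars = append(vars, collectVariables(block.Body)...)
	}
	return vars
}

func main() {
	parser := hclparse.NewParser()
	file, diags := parser.ParseHCLFile("demo.hcl") // e.g. the configuration shown above
	if diags.HasErrors() {
		log.Fatal(diags)
	}

	// This only works for the native syntax; a JSON body would need a
	// schema-driven walk instead.
	body, ok := file.Body.(*hclsyntax.Body)
	if !ok {
		log.Fatal("not a native syntax body")
	}

	for _, traversal := range collectVariables(body) {
		fmt.Printf("reference rooted at %q (%s)\n", traversal.RootName(), traversal.SourceRange())
	}
}

The root names of the collected traversals (var, foo, ...) should be enough to build the dependency DAG before constructing an EvalContext for each block.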

Example usage of custom defined function

Hi,

I see that this project will replace HCL and HIL going forward. Are there any examples of how to use it with custom defined functions (like how Terraform injects them into HIL's eval context)? The README and docs are written in a highly technical way, and it seems like it's not easily discoverable until I read the source code.

Thanks

Incorrect parsing of modulo operator

In an expression in the alibaba/ecs-instance/alicloud Terraform Registry module there is a modulo operator that is not parsing correctly:

Error: Missing argument separator

  on .terraform/init-from-module/root/alibaba-terraform-alicloud-ecs-instance-063c382/main.tf line 79:
  79:   instance_id = "${element(alicloud_instance.instances.*.id, count.index%var.number_of_instances)}"

A comma is required to separate each function argument from the next.

It seems like % is not being recognized as an operator here and instead the parser thinks it has found the end of the expression.
