Automatic builds of container images for Docker Hub
hashicorp / docker-hub-images
License: Mozilla Public License 2.0
With Packer version 1.4.0 having been available for two days now, can the Docker Hub image get an update? Perhaps this could be automated so the image is released at the same time as the GitHub release is created?
With the recent addition of arm64 cloud instances in providers like AWS, running images on arm architecture is becoming more and more attractive. Docker now also supports multi-arch image tags, which make deploying anywhere much more convenient. It would be great to be able to run official hashicorp images on arm64 cloud instances.
I'm not too familiar with Docker, so my apologies in advance if this issue is misguided...
So as I understand it, the official Docker images define an ENTRYPOINT, which makes it difficult to use Terraform in CI: you have to explicitly override it. To exacerbate the issue, GitLab CI doesn't even allow you to override it, leading to various forks being created with varying degrees of up-to-date-ness.
Now, I'm not sure whether there's a way to make the current image more suitable for CI use, or whether an additional image would be a feasible solution, but... it would be really nice to be able to easily use Terraform in CI without having to fork the image yourself or having to trust someone other than HashiCorp.
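One common workaround (a sketch, not an official image — the tag and CMD here are illustrative) is to build a thin derived image that clears the entrypoint, so CI systems that can't override ENTRYPOINT can still run arbitrary commands:

```Dockerfile
# Hypothetical derived image: reuse the official binary but drop the
# ENTRYPOINT so CI runners can exec their own shell commands.
FROM hashicorp/terraform:light
ENTRYPOINT []
CMD ["terraform", "--help"]
```

The downside, as noted above, is that someone has to keep this fork current with upstream releases.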
Packer light builds have been failing on Docker Hub for ~2 weeks:
I've investigated a bit and the build fails during this step:
RUN sed -i '/packer_${PACKER_VERSION}_linux_amd64.zip/!d' packer_${PACKER_VERSION}_SHA256SUMS
with the following error:
sha256sum: packer_1.0.1_SHA256SUMS: no checksum lines found
This appears to be due to the single quotes; the ${PACKER_VERSION}
is not replaced.
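The underlying shell behavior is easy to reproduce outside Docker: single quotes suppress parameter expansion, double quotes do not, so the sed pattern above never matches a real filename.

```shell
PACKER_VERSION=1.0.1

# Single quotes: ${PACKER_VERSION} reaches sed literally, unexpanded.
echo 'packer_${PACKER_VERSION}_linux_amd64.zip'
# -> packer_${PACKER_VERSION}_linux_amd64.zip

# Double quotes: the shell expands the variable first.
echo "packer_${PACKER_VERSION}_linux_amd64.zip"
# -> packer_1.0.1_linux_amd64.zip
```

So the fix would be to switch the sed pattern to double quotes (escaping sed metacharacters as needed).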
Version: latest Packer.
What I'm trying to do: run a Docker container which uses Packer to create a KVM/qemu image.
So far so good.
Everything works fine, but the built KVM image fails whenever it tries to access the internet while the build is in progress.
Packer is able to connect via SSH and run commands, but when Packer tries to update the created image, there is no internet connection.
Just to be clear:
We use this docker/packer image to create qemu images.
We don't generate the Dockerfile with Packer!
So as of today (7/29), there is an assortment of Docker images for various tools.
Also, Nomad appears to be missing completely, which I guess makes sense, since running a Docker task within a Docker container would be a bit odd.
So is the plan to eventually have each tool live as an official Docker image? Or will the hashicorp images still be maintained alongside the Docker official images? I only ask because it looks like certain efforts are being duplicated and other things are inconsistent.
Not sure what I'm doing wrong:
docker run --rm -v $(pwd):/app/ -w /app/ hashicorp/terraform:light init my.tf
Error checking configuration: configuration path must be a directory: my.tf
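As the error says, init expects a directory of .tf files rather than a single file, so pointing it at the mounted working directory (assuming my.tf lives there) avoids the error:

```shell
# init operates on a directory; "." here is the mounted /app
docker run --rm -v "$(pwd)":/app/ -w /app/ hashicorp/terraform:light init .
```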
$ docker run -i -t hashicorp/terraform:light terraform
standard_init_linux.go:175: exec user process caused "no such file or directory"
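For context, this "no such file or directory" from exec usually means the binary's interpreter (the dynamic linker) is missing inside the image. One way to check, sketched here and assuming a shell and ldd are present in the image, is:

```shell
# If the terraform binary is dynamically linked against glibc, the
# musl-based Alpine image has no glibc loader and exec fails this way.
docker run --rm --entrypoint sh hashicorp/terraform:light -c 'ldd /bin/terraform'
```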
My system
$ docker info
Containers: 4
Running: 0
Paused: 0
Stopped: 4
Images: 23
Server Version: 1.12.0
Storage Driver: aufs
Root Dir: /var/lib/docker/aufs
Backing Filesystem: extfs
Dirs: 33
Dirperm1 Supported: true
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
Volume: local
Network: null host bridge overlay
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Security Options: apparmor seccomp
Kernel Version: 4.4.0-31-generic
Operating System: Ubuntu 16.04.1 LTS
OSType: linux
Architecture: x86_64
CPUs: 4
Total Memory: 15.57 GiB
Name: jackbox
ID: NWSH:ZHPA:KETS:UIKE:AMTZ:G2HJ:NOS3:TUKP:CUM3:Q4OD:W3GO:DDEJ
Docker Root Dir: /var/lib/docker
Debug Mode (client): false
Debug Mode (server): false
Registry: https://index.docker.io/v1/
WARNING: No swap limit support
Insecure Registries:
127.0.0.0/8
This is a physical machine. I also get the same error using a MacBook and Docker for Mac.
You pushed the wrong version of Alpine:
$ docker pull hashicorp/packer:latest
latest: Pulling from hashicorp/packer
Digest: sha256:001e6d7f728d0a672c4b695c60b26b56402d430e528f2c3995b69949e5cc4a0c
Status: Image is up to date for hashicorp/packer:latest
$ docker pull alpine:latest
latest: Pulling from library/alpine
Digest: sha256:ca1c944a4f8486a153024d9965aafbe24f5723c1d5c02f4964c045a16d19dc54
Status: Image is up to date for alpine:latest
$ docker run --rm --entrypoint /usr/bin/env hashicorp/packer:latest cat /etc/alpine-release
3.7.0
$ docker run --rm --entrypoint /usr/bin/env alpine:latest cat /etc/alpine-release
3.10.0
I feel like the obvious answer to this should be no... but I have to ask to solve my problem.
Wercker, my CI/CD solution, apparently executes some of its code within the container. I can't really explain the problem, because I believe the error message I get is a red herring. I'm currently trying to work with their support to resolve the issue, but so far that's not going well.
Their code seems to have problems executing in Alpine specifically. After some trials with installing bash into an extension image, changing the entrypoint, etc., I've only been able to deduce that they are trying something Alpine can't do, and that it has nothing to do with mkdir -p /pipeline (the "failing command").
So this request is to create a fat image based on Ubuntu, Debian, CentOS, etc.; any of those should work. I wrote the Dockerfile below based on your image, but I imagine maintaining it will be hard, since I'll have to pay attention to updates.
A side improvement that would make maintaining my own image easier would be a LATEST (or CURRENT) file, so that instead of needing to fetch a VERSION I could simply refer to LATEST.
FROM ubuntu:latest
RUN apt-get update && apt-get --assume-yes install wget unzip && apt-get clean
ENV TERRAFORM_VERSION=0.8.8
ENV TERRAFORM_SHA256SUM=403d65b8a728b8dffcdd829262b57949bce9748b91f2e82dfd6d61692236b376
RUN wget https://releases.hashicorp.com/terraform/${TERRAFORM_VERSION}/terraform_${TERRAFORM_VERSION}_linux_amd64.zip && \
    # note: GNU sha256sum -c expects two spaces between hash and filename
    echo "${TERRAFORM_SHA256SUM}  terraform_${TERRAFORM_VERSION}_linux_amd64.zip" > terraform_${TERRAFORM_VERSION}_SHA256SUMS && \
    sha256sum -c terraform_${TERRAFORM_VERSION}_SHA256SUMS && \
    unzip terraform_${TERRAFORM_VERSION}_linux_amd64.zip -d /bin && \
    rm -f terraform_${TERRAFORM_VERSION}_linux_amd64.zip
git and ssh are missing from the image, so module references that use git::ssh do not work.
git is in the full image but ssh is not.
I suggest removing the "apk del git" line and adding openssh to the apk add.
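Concretely, the suggestion amounts to something like the following fragment for the Alpine-based image (package names assumed from Alpine's repositories):

```Dockerfile
# keep git in the final image and add an ssh client so that
# git::ssh module sources can resolve
RUN apk add --no-cache git openssh-client
```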
The Terraform docker images are very helpful, but lack many tools that are helpful in a CI context. Providing an official Terraform image based on Debian would fill this void!
Looks like there's a 0.10.0 tag, but not 0.10.1: https://hub.docker.com/r/hashicorp/packer/tags/
We're running packer in CI and don't want to auto-upgrade using latest.
It looks like the hashicorp/terraform:light image wasn't updated with the v0.11.14 release; the current light image is still running v0.11.13.
$ docker run -ti --rm hashicorp/terraform:light -version
Terraform v0.11.13
Is that Docker image still being maintained in this repo? I only see Packer configs.
This is the repo listed on the Docker Hub page: https://hub.docker.com/r/hashicorp/terraform/dockerfile
Dear HashiCorp team,
Is there a way to auto-expose the Packer HTTP port on the Docker host, so the user-data / meta-data files can be served via HTTP from outside the Docker network?
That would be awesome, because I'm currently using the packer:light image as a GitLab runner, and the Packer HTTP server is only reachable inside the Docker network for serving the user-data / meta-data files.
It would be nice if you could implement this or show me a valid workaround.
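One possible workaround (a sketch; the builder type and port 8500 are illustrative choices) is to pin Packer's HTTP server to a fixed port instead of letting it pick one from its default range, and then publish that port from the runner container:

```json
{
  "builders": [{
    "type": "qemu",
    "http_directory": "http",
    "http_port_min": 8500,
    "http_port_max": 8500
  }]
}
```

With the port fixed, the container can be started with that port published (e.g. docker run -p 8500:8500, or the equivalent GitLab runner configuration), so the VM being built can fetch the user-data / meta-data via the host.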
When looking at https://github.com/hashicorp/docker-hub-images/blob/master/terraform/Dockerfile-full, I expected to find Terraform version 0.7.13 inside the container, but it looks like Terraform v0.8.0-dev (e90b70ee13aa1be07cf66ae4b3b337b067ababfa) is included. Is this expected?
$ docker pull hashicorp/terraform:full && docker run hashicorp/terraform:full version
full: Pulling from hashicorp/terraform
Digest: sha256:eb8ab38e16fed207a7bb0df3dd379cf7613964eda324d7408c36775df04e2c8e
Status: Image is up to date for hashicorp/terraform:full
Terraform v0.8.0-dev (e90b70ee13aa1be07cf66ae4b3b337b067ababfa)
As a side note: would it be possible to get tagged versions of the full image? I found it incredibly useful for building Terraform with custom plugins for internal use.
The Dockerfile has already been updated for 1.2.2 but there's no corresponding tag.
docker: Error response from daemon: invalid header field value "oci runtime error: container_linux.go:247: starting container process caused \"exec: \\\"bin/packer\\\": stat bin/packer: no such file or directory\"\n".
It seems like the Docker images on Docker Hub are no longer built from CI? The last build was 4 months ago. (Are versions pushed manually?)
https://hub.docker.com/r/hashicorp/terraform/
https://hub.docker.com/r/hashicorp/terraform/builds/
This introduces some trust issues regarding the images being pushed to Docker Hub versus the source code within the terraform folder.
The following Dockerfile still lists 0.10.0 as the latest version, but there are builds for 0.11.*.
Is this repo even still used? Are these Docker images built from CI, and if so, are the builds public?
This is also the issue that broke the layering links in the READMEs:
https://imagelayers.io/?images=hashicorp%2Fterraform:0.6.14
vs
https://imagelayers.io/?images=hashicorp%2Fterraform:latest
It seems like the last build for latest was automated. https://hub.docker.com/r/hashicorp/terraform/builds/bnfcu6rpkrae2iuzvzbbyrl/
I suspect any merge to this repository would downgrade hashicorp/terraform:latest from the current 0.11.1 to the 0.10 version in the git repo, depending on whether automated builds are still set up.
Hey,
I'm trying to validate and build a vmware-iso via gitlab-ci (template from https://gitlab.com/gitlab-org/gitlab-ce/blob/master/lib/gitlab/ci/templates/Packer.gitlab-ci.yml).
Version:
Error message:
ovftool validation error: fork/exec : no such file or directory;
Output:
Running with gitlab-runner 11.7.0 (8bb608ff)
  on gitLabRunner1 MmYe5HzG
Using Docker executor with image hashicorp/packer:1.4.1 ...
Pulling docker image hashicorp/packer:1.4.1 ...
Using docker image sha256:c312ba0fcdb843be74fb76b6fe5bd632d5ce6fd22685b8d1051ef02440d583cd for hashicorp/packer:1.4.1 ...
Running on runner-MmYe5HzG-project-41-concurrent-0 via gitLabRunner1...
Cloning repository...
Cloning into '/builds/smr/test'...
Checking out 65b09221 as master...
Skipping Git submodules setup
$ packer --version
2019/05/17 13:41:13 [INFO] Packer version: 1.4.1
2019/05/17 13:41:13 Packer Target OS/Arch: linux amd64
2019/05/17 13:41:13 Built with Go Version: go1.12.5
2019/05/17 13:41:13 Detected home directory from env var: /root
2019/05/17 13:41:13 Using internal plugin for digitalocean
2019/05/17 13:41:13 Using internal plugin for openstack
2019/05/17 13:41:13 Using internal plugin for azure-arm
2019/05/17 13:41:13 Using internal plugin for cloudstack
2019/05/17 13:41:13 Using internal plugin for triton
2019/05/17 13:41:13 Using internal plugin for lxd
2019/05/17 13:41:13 Using internal plugin for oracle-classic
2019/05/17 13:41:13 Using internal plugin for qemu
2019/05/17 13:41:13 Using internal plugin for tencentcloud-cvm
2019/05/17 13:41:13 Using internal plugin for amazon-ebsvolume
2019/05/17 13:41:13 Using internal plugin for oracle-oci
2019/05/17 13:41:13 Using internal plugin for hyperv-vmcx
2019/05/17 13:41:13 Using internal plugin for linode
2019/05/17 13:41:13 Using internal plugin for lxc
2019/05/17 13:41:13 Using internal plugin for virtualbox-ovf
2019/05/17 13:41:13 Using internal plugin for vmware-iso
2019/05/17 13:41:13 Using internal plugin for vmware-vmx
2019/05/17 13:41:13 Using internal plugin for amazon-ebssurrogate
2019/05/17 13:41:13 Using internal plugin for hcloud
2019/05/17 13:41:13 Using internal plugin for ncloud
2019/05/17 13:41:13 Using internal plugin for null
2019/05/17 13:41:13 Using internal plugin for virtualbox-iso
2019/05/17 13:41:13 Using internal plugin for yandex
2019/05/17 13:41:13 Using internal plugin for amazon-instance
2019/05/17 13:41:13 Using internal plugin for googlecompute
2019/05/17 13:41:13 Using internal plugin for scaleway
2019/05/17 13:41:13 Using internal plugin for oneandone
2019/05/17 13:41:13 Using internal plugin for proxmox
2019/05/17 13:41:13 Using internal plugin for amazon-ebs
2019/05/17 13:41:13 Using internal plugin for docker
2019/05/17 13:41:13 Using internal plugin for file
2019/05/17 13:41:13 Using internal plugin for hyperone
2019/05/17 13:41:13 Using internal plugin for hyperv-iso
2019/05/17 13:41:13 Using internal plugin for parallels-iso
2019/05/17 13:41:13 Using internal plugin for alicloud-ecs
2019/05/17 13:41:13 Using internal plugin for amazon-chroot
2019/05/17 13:41:13 Using internal plugin for vagrant
2019/05/17 13:41:13 Using internal plugin for parallels-pvm
2019/05/17 13:41:13 Using internal plugin for profitbricks
2019/05/17 13:41:13 Using internal plugin for chef-solo
2019/05/17 13:41:13 Using internal plugin for inspec
2019/05/17 13:41:13 Using internal plugin for windows-shell
2019/05/17 13:41:13 Using internal plugin for windows-restart
2019/05/17 13:41:13 Using internal plugin for ansible
2019/05/17 13:41:13 Using internal plugin for powershell
2019/05/17 13:41:13 Using internal plugin for shell
2019/05/17 13:41:13 Using internal plugin for puppet-server
2019/05/17 13:41:13 Using internal plugin for breakpoint
2019/05/17 13:41:13 Using internal plugin for chef-client
2019/05/17 13:41:13 Using internal plugin for converge
2019/05/17 13:41:13 Using internal plugin for salt-masterless
2019/05/17 13:41:13 Using internal plugin for shell-local
2019/05/17 13:41:13 Using internal plugin for sleep
2019/05/17 13:41:13 Using internal plugin for ansible-local
2019/05/17 13:41:13 Using internal plugin for file
2019/05/17 13:41:13 Using internal plugin for puppet-masterless
2019/05/17 13:41:13 Using internal plugin for docker-tag
2019/05/17 13:41:13 Using internal plugin for manifest
2019/05/17 13:41:13 Using internal plugin for vagrant-cloud
2019/05/17 13:41:13 Using internal plugin for vsphere
2019/05/17 13:41:13 Using internal plugin for vsphere-template
2019/05/17 13:41:13 Using internal plugin for docker-push
2019/05/17 13:41:13 Using internal plugin for vagrant
2019/05/17 13:41:13 Using internal plugin for digitalocean-import
2019/05/17 13:41:13 Using internal plugin for googlecompute-export
2019/05/17 13:41:13 Using internal plugin for googlecompute-import
2019/05/17 13:41:13 Using internal plugin for shell-local
2019/05/17 13:41:13 Using internal plugin for artifice
2019/05/17 13:41:13 Using internal plugin for amazon-import
2019/05/17 13:41:13 Using internal plugin for checksum
2019/05/17 13:41:13 Using internal plugin for compress
2019/05/17 13:41:13 Using internal plugin for docker-import
2019/05/17 13:41:13 Using internal plugin for docker-save
2019/05/17 13:41:13 Using internal plugin for alicloud-import
2019/05/17 13:41:13 Detected home directory from env var: /root
2019/05/17 13:41:13 Attempting to open config file: /root/.packerconfig
2019/05/17 13:41:13 [WARN] Config file doesn't exist: /root/.packerconfig
1.4.1
2019/05/17 13:41:13 Packer config: &{DisableCheckpoint:false DisableCheckpointSignature:false PluginMinPort:10000 PluginMaxPort:25000 Builders:map[alicloud-ecs:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-alicloud-ecs amazon-chroot:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-chroot amazon-ebs:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-ebs amazon-ebssurrogate:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-ebssurrogate amazon-ebsvolume:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-ebsvolume amazon-instance:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-instance azure-arm:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-azure-arm cloudstack:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-cloudstack digitalocean:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-digitalocean docker:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-docker file:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-file googlecompute:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-googlecompute hcloud:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hcloud hyperone:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hyperone hyperv-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hyperv-iso hyperv-vmcx:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hyperv-vmcx linode:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-linode lxc:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-lxc lxd:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-lxd ncloud:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-ncloud null:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-null oneandone:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-oneandone openstack:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-openstack 
oracle-classic:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-oracle-classic oracle-oci:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-oracle-oci parallels-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-parallels-iso parallels-pvm:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-parallels-pvm profitbricks:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-profitbricks proxmox:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-proxmox qemu:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-qemu scaleway:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-scaleway tencentcloud-cvm:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-tencentcloud-cvm triton:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-triton vagrant:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-vagrant virtualbox-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-virtualbox-iso virtualbox-ovf:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-virtualbox-ovf vmware-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-vmware-iso vmware-vmx:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-vmware-vmx yandex:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-yandex] PostProcessors:map[alicloud-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-alicloud-import amazon-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-amazon-import artifice:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-artifice checksum:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-checksum compress:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-compress digitalocean-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-digitalocean-import docker-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-import 
docker-push:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-push docker-save:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-save docker-tag:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-tag googlecompute-export:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-googlecompute-export googlecompute-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-googlecompute-import manifest:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-manifest shell-local:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-shell-local vagrant:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vagrant vagrant-cloud:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vagrant-cloud vsphere:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vsphere vsphere-template:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vsphere-template] Provisioners:map[ansible:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-ansible ansible-local:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-ansible-local breakpoint:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-breakpoint chef-client:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-chef-client chef-solo:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-chef-solo converge:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-converge file:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-file inspec:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-inspec powershell:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-powershell puppet-masterless:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-puppet-masterless puppet-server:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-puppet-server 
salt-masterless:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-salt-masterless shell:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-shell shell-local:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-shell-local sleep:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-sleep windows-restart:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-windows-restart windows-shell:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-windows-shell]}
2019/05/17 13:41:13 Detected home directory from env var: /root
2019/05/17 13:41:13 Setting cache directory: /builds/smr/test2/packer_cache
No tty available: open /dev/tty: no such device or address
2019/05/17 13:41:13 [INFO] (telemetry) Finalizing.
2019/05/17 13:41:13 Detected home directory from env var: /root
2019/05/17 13:41:14 waiting for all plugin processes to complete...
$ find / -name ovftool*
$ packer fix test2.json > test.json
2019/05/17 13:41:16 [INFO] Packer version: 1.4.1
2019/05/17 13:41:16 Packer Target OS/Arch: linux amd64
2019/05/17 13:41:16 Built with Go Version: go1.12.5
2019/05/17 13:41:16 Detected home directory from env var: /root
2019/05/17 13:41:16 Using internal plugin for parallels-iso
2019/05/17 13:41:16 Using internal plugin for lxd
2019/05/17 13:41:16 Using internal plugin for azure-arm
2019/05/17 13:41:16 Using internal plugin for hyperv-iso
2019/05/17 13:41:16 Using internal plugin for openstack
2019/05/17 13:41:16 Using internal plugin for scaleway
2019/05/17 13:41:16 Using internal plugin for virtualbox-iso
2019/05/17 13:41:16 Using internal plugin for amazon-ebssurrogate
2019/05/17 13:41:16 Using internal plugin for amazon-ebsvolume
2019/05/17 13:41:16 Using internal plugin for hcloud
2019/05/17 13:41:16 Using internal plugin for hyperv-vmcx
2019/05/17 13:41:16 Using internal plugin for vagrant
2019/05/17 13:41:16 Using internal plugin for alicloud-ecs
2019/05/17 13:41:16 Using internal plugin for digitalocean
2019/05/17 13:41:16 Using internal plugin for oneandone
2019/05/17 13:41:16 Using internal plugin for amazon-ebs
2019/05/17 13:41:16 Using internal plugin for hyperone
2019/05/17 13:41:16 Using internal plugin for proxmox
2019/05/17 13:41:16 Using internal plugin for qemu
2019/05/17 13:41:16 Using internal plugin for triton
2019/05/17 13:41:16 Using internal plugin for amazon-chroot
2019/05/17 13:41:16 Using internal plugin for ncloud
2019/05/17 13:41:16 Using internal plugin for tencentcloud-cvm
2019/05/17 13:41:16 Using internal plugin for virtualbox-ovf
2019/05/17 13:41:16 Using internal plugin for vmware-vmx
2019/05/17 13:41:16 Using internal plugin for file
2019/05/17 13:41:16 Using internal plugin for cloudstack
2019/05/17 13:41:16 Using internal plugin for lxc
2019/05/17 13:41:16 Using internal plugin for null
2019/05/17 13:41:16 Using internal plugin for oracle-classic
2019/05/17 13:41:16 Using internal plugin for oracle-oci
2019/05/17 13:41:16 Using internal plugin for parallels-pvm
2019/05/17 13:41:16 Using internal plugin for profitbricks
2019/05/17 13:41:16 Using internal plugin for amazon-instance
2019/05/17 13:41:16 Using internal plugin for googlecompute
2019/05/17 13:41:16 Using internal plugin for linode
2019/05/17 13:41:16 Using internal plugin for vmware-iso
2019/05/17 13:41:16 Using internal plugin for yandex
2019/05/17 13:41:16 Using internal plugin for docker
2019/05/17 13:41:16 Using internal plugin for ansible-local
2019/05/17 13:41:16 Using internal plugin for breakpoint
2019/05/17 13:41:16 Using internal plugin for converge
2019/05/17 13:41:16 Using internal plugin for windows-shell
2019/05/17 13:41:16 Using internal plugin for salt-masterless
2019/05/17 13:41:16 Using internal plugin for ansible
2019/05/17 13:41:16 Using internal plugin for chef-client
2019/05/17 13:41:16 Using internal plugin for chef-solo
2019/05/17 13:41:16 Using internal plugin for file
2019/05/17 13:41:16 Using internal plugin for inspec
2019/05/17 13:41:16 Using internal plugin for powershell
2019/05/17 13:41:16 Using internal plugin for puppet-masterless
2019/05/17 13:41:16 Using internal plugin for shell
2019/05/17 13:41:16 Using internal plugin for sleep
2019/05/17 13:41:16 Using internal plugin for windows-restart
2019/05/17 13:41:16 Using internal plugin for puppet-server
2019/05/17 13:41:16 Using internal plugin for shell-local
2019/05/17 13:41:16 Using internal plugin for artifice
2019/05/17 13:41:16 Using internal plugin for compress
2019/05/17 13:41:16 Using internal plugin for googlecompute-export
2019/05/17 13:41:16 Using internal plugin for googlecompute-import
2019/05/17 13:41:16 Using internal plugin for shell-local
2019/05/17 13:41:16 Using internal plugin for vagrant
2019/05/17 13:41:16 Using internal plugin for digitalocean-import
2019/05/17 13:41:16 Using internal plugin for vsphere
2019/05/17 13:41:16 Using internal plugin for amazon-import
2019/05/17 13:41:16 Using internal plugin for docker-import
2019/05/17 13:41:16 Using internal plugin for docker-save
2019/05/17 13:41:16 Using internal plugin for vagrant-cloud
2019/05/17 13:41:16 Using internal plugin for alicloud-import
2019/05/17 13:41:16 Using internal plugin for checksum
2019/05/17 13:41:16 Using internal plugin for docker-push
2019/05/17 13:41:16 Using internal plugin for docker-tag
2019/05/17 13:41:16 Using internal plugin for manifest
2019/05/17 13:41:16 Using internal plugin for vsphere-template
2019/05/17 13:41:16 Detected home directory from env var: /root
2019/05/17 13:41:16 Attempting to open config file: /root/.packerconfig
2019/05/17 13:41:16 [WARN] Config file doesn't exist: /root/.packerconfig
2019/05/17 13:41:16 Packer config: &{DisableCheckpoint:false DisableCheckpointSignature:false PluginMinPort:10000 PluginMaxPort:25000 Builders:map[alicloud-ecs:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-alicloud-ecs amazon-chroot:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-chroot amazon-ebs:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-ebs amazon-ebssurrogate:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-ebssurrogate amazon-ebsvolume:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-ebsvolume amazon-instance:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-instance azure-arm:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-azure-arm cloudstack:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-cloudstack digitalocean:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-digitalocean docker:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-docker file:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-file googlecompute:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-googlecompute hcloud:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hcloud hyperone:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hyperone hyperv-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hyperv-iso hyperv-vmcx:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hyperv-vmcx linode:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-linode lxc:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-lxc lxd:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-lxd ncloud:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-ncloud null:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-null oneandone:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-oneandone openstack:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-openstack 
oracle-classic:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-oracle-classic oracle-oci:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-oracle-oci parallels-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-parallels-iso parallels-pvm:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-parallels-pvm profitbricks:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-profitbricks proxmox:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-proxmox qemu:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-qemu scaleway:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-scaleway tencentcloud-cvm:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-tencentcloud-cvm triton:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-triton vagrant:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-vagrant virtualbox-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-virtualbox-iso virtualbox-ovf:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-virtualbox-ovf vmware-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-vmware-iso vmware-vmx:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-vmware-vmx yandex:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-yandex] PostProcessors:map[alicloud-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-alicloud-import amazon-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-amazon-import artifice:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-artifice checksum:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-checksum compress:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-compress digitalocean-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-digitalocean-import docker-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-import 
docker-push:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-push docker-save:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-save docker-tag:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-tag googlecompute-export:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-googlecompute-export googlecompute-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-googlecompute-import manifest:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-manifest shell-local:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-shell-local vagrant:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vagrant vagrant-cloud:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vagrant-cloud vsphere:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vsphere vsphere-template:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vsphere-template] Provisioners:map[ansible:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-ansible ansible-local:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-ansible-local breakpoint:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-breakpoint chef-client:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-chef-client chef-solo:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-chef-solo converge:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-converge file:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-file inspec:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-inspec powershell:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-powershell puppet-masterless:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-puppet-masterless puppet-server:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-puppet-server 
salt-masterless:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-salt-masterless shell:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-shell shell-local:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-shell-local sleep:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-sleep windows-restart:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-windows-restart windows-shell:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-windows-shell]}
2019/05/17 13:41:16 Detected home directory from env var: /root
2019/05/17 13:41:16 Setting cache directory: /builds/smr/test2/packer_cache
No tty available: open /dev/tty: no such device or address
2019/05/17 13:41:16 Running fixer: iso-md5
2019/05/17 13:41:16 Running fixer: createtime
2019/05/17 13:41:16 Running fixer: virtualbox-gaattach
2019/05/17 13:41:16 Running fixer: pp-vagrant-override
2019/05/17 13:41:16 Running fixer: virtualbox-rename
2019/05/17 13:41:16 Running fixer: vmware-rename
2019/05/17 13:41:16 Running fixer: parallels-headless
2019/05/17 13:41:16 Running fixer: parallels-deprecations
2019/05/17 13:41:16 Running fixer: sshkeypath
2019/05/17 13:41:16 Running fixer: sshdisableagent
2019/05/17 13:41:16 Running fixer: scaleway-access-key
2019/05/17 13:41:16 Running fixer: manifest-filename
2019/05/17 13:41:16 Running fixer: amazon-shutdown_behavior
2019/05/17 13:41:16 Running fixer: amazon-enhanced-networking
2019/05/17 13:41:16 Running fixer: amazon-private-ip
2019/05/17 13:41:16 Running fixer: amazon-temp-sec-cidrs
2019/05/17 13:41:16 Running fixer: docker-email
2019/05/17 13:41:16 Running fixer: powershell-escapes
2019/05/17 13:41:16 Running fixer: vmware-compaction
2019/05/17 13:41:16 Running fixer: hyperv-cpu-and-ram
2019/05/17 13:41:16 Running fixer: clean-image-name
"builders": [
{
"boot_command": [
"<tab> text ks=http://{{ .HTTPIP }}:{{ .HTTPPort }}/scripts/centos-7-kickstart.cfg<enter><wait>"
],
"boot_wait": "7s",
"communicator": "ssh",
"disk_size": 8192,
"disk_type_id": "zeroedthick",
"display_name": "test2 1.5.0",
"format": "ovf",
"headless": false,
"http_directory": ".",
"iso_checksum": "af5f788aee1b32c4b2634734309cc9e9",
"iso_checksum_type": "md5",
"iso_url": "http://old-releases.ubuntu.com/releases/precise/ubuntu-12.04.2-server-amd64.iso",
"keep_registered": true,
"name": "test2",
"ovftool_options": [
"--noSSLVerify"
],
"remote_cache_datastore": "datastore2",
"remote_datastore": "datastore2",
"remote_host": "**removed**",
"remote_password": "**removed**",
"remote_type": "esx5",
"remote_username": "**removed**",
"shutdown_command": "shutdown -P now",
"skip_compaction": "true",
"skip_export": "false",
"skip_validate_credentials": "false",
"ssh_password": "packer",
"ssh_pty": "true",
"ssh_username": "packer",
"ssh_wait_timeout": "60m",
"tools_upload_flavor": "linux",
"type": "vmware-iso",
"vm_name": "test2",
"vmdk_name": "test2",
"vmx_data": {
"ethernet0.addressType": "generated",
"ethernet0.generatedAddressOffset": "0",
"ethernet0.networkName": "{{user `packer_esxi_portgroup`}}",
"ethernet0.present": "TRUE",
"ethernet0.startConnected": "TRUE",
"ethernet0.virtualDev": "e1000",
"ethernet0.wakeOnPcktRcv": "FALSE",
"memsize": "4096",
"numvcpus": "2"
},
"vmx_data_post": {
"ethernet0.virtualDev": "vmxnet3",
"ide1:0.clientDevice": "TRUE",
"ide1:0.fileName": "emptyBackingString",
"ide1:0.startConnected": "FALSE"
},
"vnc_disable_password": true
}
]
}
2019/05/17 13:41:16 [INFO] (telemetry) Finalizing.
2019/05/17 13:41:16 Detected home directory from env var: /root
2019/05/17 13:41:17 waiting for all plugin processes to complete...
$ cat test.json
{
"builders": [
{
"boot_command": [
"<tab> text ks=http://{{ .HTTPIP }}:{{ .HTTPPort }}/scripts/centos-7-kickstart.cfg<enter><wait>"
],
"boot_wait": "7s",
"communicator": "ssh",
"disk_size": 8192,
"disk_type_id": "zeroedthick",
"display_name": "test2 1.5.0",
"format": "ovf",
"headless": false,
"http_directory": ".",
"iso_checksum": "af5f788aee1b32c4b2634734309cc9e9",
"iso_checksum_type": "md5",
"iso_url": "http://old-releases.ubuntu.com/releases/precise/ubuntu-12.04.2-server-amd64.iso",
"keep_registered": true,
"name": "test2",
"ovftool_options": [
"--noSSLVerify"
],
"remote_cache_datastore": "datastore2",
"remote_datastore": "datastore2",
"remote_host": "**removed**",
"remote_password": "**removed**",
"remote_type": "esx5",
"remote_username": "**removed**",
"shutdown_command": "shutdown -P now",
"skip_compaction": "true",
"skip_export": "false",
"skip_validate_credentials": "false",
"ssh_password": "packer",
"ssh_pty": "true",
"ssh_username": "packer",
"ssh_wait_timeout": "60m",
"tools_upload_flavor": "linux",
"type": "vmware-iso",
"vm_name": "test2",
"vmdk_name": "test2",
"vmx_data": {
"ethernet0.addressType": "generated",
"ethernet0.generatedAddressOffset": "0",
"ethernet0.networkName": "{{user `packer_esxi_portgroup`}}",
"ethernet0.present": "TRUE",
"ethernet0.startConnected": "TRUE",
"ethernet0.virtualDev": "e1000",
"ethernet0.wakeOnPcktRcv": "FALSE",
"memsize": "4096",
"numvcpus": "2"
},
"vmx_data_post": {
"ethernet0.virtualDev": "vmxnet3",
"ide1:0.clientDevice": "TRUE",
"ide1:0.fileName": "emptyBackingString",
"ide1:0.startConnected": "FALSE"
},
"vnc_disable_password": true
}
]
}
$ packer validate test2.json
2019/05/17 13:41:17 [INFO] Packer version: 1.4.1
2019/05/17 13:41:17 Packer Target OS/Arch: linux amd64
2019/05/17 13:41:17 Built with Go Version: go1.12.5
2019/05/17 13:41:17 Detected home directory from env var: /root
2019/05/17 13:41:17 Using internal plugin for oneandone
2019/05/17 13:41:17 Using internal plugin for tencentcloud-cvm
2019/05/17 13:41:17 Using internal plugin for amazon-instance
2019/05/17 13:41:17 Using internal plugin for googlecompute
2019/05/17 13:41:17 Using internal plugin for hcloud
2019/05/17 13:41:17 Using internal plugin for proxmox
2019/05/17 13:41:17 Using internal plugin for yandex
2019/05/17 13:41:17 Using internal plugin for digitalocean
2019/05/17 13:41:17 Using internal plugin for lxd
2019/05/17 13:41:17 Using internal plugin for parallels-pvm
2019/05/17 13:41:17 Using internal plugin for oracle-classic
2019/05/17 13:41:17 Using internal plugin for triton
2019/05/17 13:41:17 Using internal plugin for cloudstack
2019/05/17 13:41:17 Using internal plugin for file
2019/05/17 13:41:17 Using internal plugin for openstack
2019/05/17 13:41:17 Using internal plugin for lxc
2019/05/17 13:41:17 Using internal plugin for oracle-oci
2019/05/17 13:41:17 Using internal plugin for qemu
2019/05/17 13:41:17 Using internal plugin for virtualbox-iso
2019/05/17 13:41:17 Using internal plugin for virtualbox-ovf
2019/05/17 13:41:17 Using internal plugin for amazon-ebsvolume
2019/05/17 13:41:17 Using internal plugin for azure-arm
2019/05/17 13:41:17 Using internal plugin for hyperv-vmcx
2019/05/17 13:41:17 Using internal plugin for linode
2019/05/17 13:41:17 Using internal plugin for ncloud
2019/05/17 13:41:17 Using internal plugin for vmware-iso
2019/05/17 13:41:17 Using internal plugin for hyperone
2019/05/17 13:41:17 Using internal plugin for hyperv-iso
2019/05/17 13:41:17 Using internal plugin for parallels-iso
2019/05/17 13:41:17 Using internal plugin for profitbricks
2019/05/17 13:41:17 Using internal plugin for vagrant
2019/05/17 13:41:17 Using internal plugin for alicloud-ecs
2019/05/17 13:41:17 Using internal plugin for amazon-chroot
2019/05/17 13:41:17 Using internal plugin for docker
2019/05/17 13:41:17 Using internal plugin for amazon-ebssurrogate
2019/05/17 13:41:17 Using internal plugin for null
2019/05/17 13:41:17 Using internal plugin for amazon-ebs
2019/05/17 13:41:17 Using internal plugin for scaleway
2019/05/17 13:41:17 Using internal plugin for vmware-vmx
2019/05/17 13:41:17 Using internal plugin for puppet-server
2019/05/17 13:41:17 Using internal plugin for salt-masterless
2019/05/17 13:41:17 Using internal plugin for shell-local
2019/05/17 13:41:17 Using internal plugin for ansible
2019/05/17 13:41:17 Using internal plugin for inspec
2019/05/17 13:41:17 Using internal plugin for sleep
2019/05/17 13:41:17 Using internal plugin for puppet-masterless
2019/05/17 13:41:17 Using internal plugin for shell
2019/05/17 13:41:17 Using internal plugin for windows-restart
2019/05/17 13:41:17 Using internal plugin for ansible-local
2019/05/17 13:41:17 Using internal plugin for converge
2019/05/17 13:41:17 Using internal plugin for file
2019/05/17 13:41:17 Using internal plugin for powershell
2019/05/17 13:41:17 Using internal plugin for windows-shell
2019/05/17 13:41:17 Using internal plugin for breakpoint
2019/05/17 13:41:17 Using internal plugin for chef-client
2019/05/17 13:41:17 Using internal plugin for chef-solo
2019/05/17 13:41:17 Using internal plugin for digitalocean-import
2019/05/17 13:41:17 Using internal plugin for googlecompute-import
2019/05/17 13:41:17 Using internal plugin for vsphere-template
2019/05/17 13:41:17 Using internal plugin for googlecompute-export
2019/05/17 13:41:17 Using internal plugin for manifest
2019/05/17 13:41:17 Using internal plugin for shell-local
2019/05/17 13:41:17 Using internal plugin for alicloud-import
2019/05/17 13:41:17 Using internal plugin for amazon-import
2019/05/17 13:41:17 Using internal plugin for docker-tag
2019/05/17 13:41:17 Using internal plugin for vagrant-cloud
2019/05/17 13:41:17 Using internal plugin for vsphere
2019/05/17 13:41:17 Using internal plugin for checksum
2019/05/17 13:41:17 Using internal plugin for compress
2019/05/17 13:41:17 Using internal plugin for docker-push
2019/05/17 13:41:17 Using internal plugin for vagrant
2019/05/17 13:41:17 Using internal plugin for artifice
2019/05/17 13:41:17 Using internal plugin for docker-import
2019/05/17 13:41:17 Using internal plugin for docker-save
2019/05/17 13:41:17 Detected home directory from env var: /root
2019/05/17 13:41:17 Attempting to open config file: /root/.packerconfig
2019/05/17 13:41:17 [WARN] Config file doesn't exist: /root/.packerconfig
2019/05/17 13:41:17 Packer config: &{DisableCheckpoint:false DisableCheckpointSignature:false PluginMinPort:10000 PluginMaxPort:25000 Builders:map[alicloud-ecs:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-alicloud-ecs amazon-chroot:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-chroot amazon-ebs:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-ebs amazon-ebssurrogate:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-ebssurrogate amazon-ebsvolume:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-ebsvolume amazon-instance:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-amazon-instance azure-arm:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-azure-arm cloudstack:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-cloudstack digitalocean:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-digitalocean docker:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-docker file:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-file googlecompute:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-googlecompute hcloud:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hcloud hyperone:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hyperone hyperv-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hyperv-iso hyperv-vmcx:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-hyperv-vmcx linode:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-linode lxc:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-lxc lxd:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-lxd ncloud:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-ncloud null:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-null oneandone:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-oneandone openstack:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-openstack 
oracle-classic:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-oracle-classic oracle-oci:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-oracle-oci parallels-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-parallels-iso parallels-pvm:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-parallels-pvm profitbricks:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-profitbricks proxmox:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-proxmox qemu:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-qemu scaleway:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-scaleway tencentcloud-cvm:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-tencentcloud-cvm triton:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-triton vagrant:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-vagrant virtualbox-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-virtualbox-iso virtualbox-ovf:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-virtualbox-ovf vmware-iso:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-vmware-iso vmware-vmx:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-vmware-vmx yandex:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-builder-yandex] PostProcessors:map[alicloud-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-alicloud-import amazon-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-amazon-import artifice:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-artifice checksum:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-checksum compress:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-compress digitalocean-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-digitalocean-import docker-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-import 
docker-push:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-push docker-save:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-save docker-tag:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-docker-tag googlecompute-export:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-googlecompute-export googlecompute-import:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-googlecompute-import manifest:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-manifest shell-local:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-shell-local vagrant:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vagrant vagrant-cloud:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vagrant-cloud vsphere:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vsphere vsphere-template:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-post-processor-vsphere-template] Provisioners:map[ansible:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-ansible ansible-local:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-ansible-local breakpoint:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-breakpoint chef-client:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-chef-client chef-solo:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-chef-solo converge:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-converge file:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-file inspec:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-inspec powershell:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-powershell puppet-masterless:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-puppet-masterless puppet-server:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-puppet-server 
salt-masterless:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-salt-masterless shell:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-shell shell-local:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-shell-local sleep:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-sleep windows-restart:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-windows-restart windows-shell:/bin/packer-PACKERSPACE-plugin-PACKERSPACE-packer-provisioner-windows-shell]}
2019/05/17 13:41:17 Detected home directory from env var: /root
2019/05/17 13:41:17 Setting cache directory: /builds/smr/test2/packer_cache
No tty available: open /dev/tty: no such device or address
2019/05/17 13:41:17 Detected home directory from env var: /root
2019/05/17 13:41:17 Loading builder: vmware-iso
2019/05/17 13:41:17 Plugin could not be found. Checking same directory as executable.
2019/05/17 13:41:17 Current exe path: /bin/packer
2019/05/17 13:41:17 Creating plugin client for path: /bin/packer
2019/05/17 13:41:17 Starting plugin: /bin/packer []string{"/bin/packer", "plugin", "packer-builder-vmware-iso"}
2019/05/17 13:41:17 Waiting for RPC address for: /bin/packer
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 [INFO] Packer version: 1.4.1
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Packer Target OS/Arch: linux amd64
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Built with Go Version: go1.12.5
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Detected home directory from env var: /root
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Attempting to open config file: /root/.packerconfig
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 [WARN] Config file doesn't exist: /root/.packerconfig
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Packer config: &{DisableCheckpoint:false DisableCheckpointSignature:false PluginMinPort:10000 PluginMaxPort:25000 Builders:map[] PostProcessors:map[] Provisioners:map[]}
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Detected home directory from env var: /root
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Setting cache directory: /builds/smr/test2/packer_cache
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Detected home directory from env var: /root
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 args: []string{"packer-builder-vmware-iso"}
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Plugin minimum port: 10000
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Plugin maximum port: 25000
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Plugin address: unix /tmp/packer-plugin286126886
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Waiting for connection...
2019/05/17 13:41:17 packer: 2019/05/17 13:41:17 Serving a plugin connection...
2019/05/17 13:41:17 Preparing build: test2
2019/05/17 13:41:17 Build 'test2' prepare failure: 1 error(s) occurred:
* ovftool validation error: fork/exec : no such file or directory;
2019/05/17 13:41:17 ui error: Template validation failed. Errors are shown below.
2019/05/17 13:41:17 ui error: Errors validating build 'test2'. 1 error(s) occurred:
* ovftool validation error: fork/exec : no such file or directory;
2019/05/17 13:41:17 [INFO] (telemetry) Finalizing.
Template validation failed. Errors are shown below.
Errors validating build 'test2'. 1 error(s) occurred:
* ovftool validation error: fork/exec : no such file or directory;
2019/05/17 13:41:18 waiting for all plugin processes to complete...
2019/05/17 13:41:18 /bin/packer: plugin process exited
ERROR: Job failed: exit code 1
Before I updated my Packer build pipeline to any version > 1.3.1, everything worked like a charm. Now I get the following cryptic error:
Status: Downloaded newer image for hashicorp/packer:1.4.3
amazon-ebs output will be in this color.
Build 'amazon-ebs' errored: error validating regions: UnauthorizedOperation: You are not authorized to perform this operation.
status code: 403, request id: 9126ba8a-3f99-4aec-a23a-155a46f5b385
==> Some builds didn't complete successfully and had errors:
--> amazon-ebs: error validating regions: UnauthorizedOperation: You are not authorized to perform this operation.
status code: 403, request id: 9126ba8a-3f99-4aec-a23a-155a46f5b385
==> Builds finished but no artifacts were created.
make: *** [Makefile:49: build] Error 1
script returned exit code 2
Adding PACKER_LOG=1 does not give much more information on the subject.
After a lot of googling and putting some debug traces into my own Packer executable, I found out what was wrong: the Alpine image is missing CA certificates :-(
Note that this seems to work on macOS without any change; I did not check other platforms.
Hi guys, it looks like your Docker Hub builds have been failing for a month or so?
Hello everyone,
sorry for posting this question, but I have been running in circles for a while.
We are behind a company proxy, hence I need to run the init command like this:
docker run \
--env TF_LOG="TRACE" \
--env HTTPS_PROXY="http://fqdn-of-company-proxy---:8081" \
--env HTTP_PROXY="http://fqdn-of-company-proxy---:8080" \
--volume '/terraform:/data' \
--workdir '/data' \
hashicorp/terraform:0.12.18 \
init
but I am getting back this error:
2020/01/06 18:35:09 [ERR] Checkpoint error: Get https://checkpoint-api.hashicorp.com/v1/check/terraform?arch=amd64&os=linux&signature=2efe935d-c53c-7034-5425-e26d96ca5cd0&version=0.12.18: x509: certificate signed by unknown authority
2020/01/06 18:35:09 [DEBUG] Failed to request discovery document: Get https://registry.terraform.io/.well-known/terraform.json: x509: certificate signed by unknown authority
It seems that the proxy behaves like a man-in-the-middle and re-signs the traffic.
I was considering adding our CA to the trusted list.
Do you have an idea how to specify an additional trusted CA for the terraform image?
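Since the terraform image is a thin wrapper around a Go binary, one possible workaround (a sketch, not an official mechanism of the image) is to mount the company CA certificate into the container and point the Go runtime at it via the SSL_CERT_FILE environment variable; the certificate path below is hypothetical:

```shell
# Sketch: trust an extra CA inside the container via SSL_CERT_FILE.
# /terraform/company-ca.pem is a hypothetical path to the proxy's CA cert;
# it appears as /data/company-ca.pem inside the container via the volume mount.
docker run \
  --env HTTPS_PROXY="http://fqdn-of-company-proxy---:8081" \
  --env HTTP_PROXY="http://fqdn-of-company-proxy---:8080" \
  --env SSL_CERT_FILE="/data/company-ca.pem" \
  --volume '/terraform:/data' \
  --workdir '/data' \
  hashicorp/terraform:0.12.18 \
  init
```

Go's crypto/x509 package honors SSL_CERT_FILE on Linux, so this avoids rebuilding the image.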
Currently the Docker tags that are available include 0.11.8, which maps to the light version of the image. In order to be able to make use of the full image and still keep versioning, I propose adding full-0.11.8 (or comparable).
Packer was updated recently (~18 hrs ago at the time of writing) to use govendor instead of godep (hashicorp/packer@0202950), and the image build is probably failing because of this -- it looks like the tracking info in the README is broken right now.
Adding these two lines fixed our personal Dockerfile, which uses similar commands to install Packer:
RUN go get github.com/kardianos/govendor
…
RUN govendor fetch packer
It seems as if the last few CI builds that we've done here are bumping the remote state (S3) to
"version": 3,
"terraform_version": "0.11.7",
after a successful plan / apply
This is a little strange, because when I log in to the container (terraform:light) directly and create a local, minimal state with an AWS module, I see
"version": 3,
"terraform_version": "0.11.5",
The result is that subsequent builds cannot be applied
Terraform doesn't allow running any operations against a state
deploy_1 | that was written by a future Terraform version. The state is
deploy_1 | reporting it is written by Terraform '0.11.7'
If I manually edit the remote state file, I can build again, but it will continue to bump it back up to latest. Am I doing something that would upgrade the stack, somewhere between plan and apply?
Thanks.
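One way to stop the version bumping (a sketch, assuming the CI job currently pulls terraform:light, whose underlying release moves over time) is to pin the image to the exact Terraform release used locally, so CI and local runs write the same terraform_version into the state:

```shell
# Pin the exact release instead of the moving :light tag;
# 0.11.5 here matches the version observed in the local container.
docker run --rm -v "$PWD:/data" -w /data hashicorp/terraform:0.11.5 plan
```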
Although the Dockerfile for packer already seems to be updated to version 1.0.3 there's no tag in docker hub yet for that version:
https://hub.docker.com/r/hashicorp/packer/tags/
Any chance that it can be pushed? :)
I would like to be able to pull images with commands like:
docker pull hashicorp/terraform:0.12
and have it pull the latest patch release for Terraform 0.12. This could be achieved with moving 0.11/0.12 tags that point at the latest release for that version (e.g. 0.11 == 0.11.14).
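For reference, publishing such a moving tag on the maintainer side could look roughly like this (a sketch; in practice this would be wired into the release automation):

```shell
# Re-point the 0.11 tag at the newest 0.11.x release and push it.
docker pull hashicorp/terraform:0.11.14
docker tag hashicorp/terraform:0.11.14 hashicorp/terraform:0.11
docker push hashicorp/terraform:0.11
```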
I have Packer running locally. It seemed an unfriendly, non-scalable, and non-repeatable setup. I saw a Docker image with Packer and thought I would check it out. It sounded like it would solve all of those problems if I could just spin up Packer in a Docker container and have it work. It is very unclear how to use it, though. Looking through the commits, I see that we added terraform examples for #8.
I started poking around, trying out various things with the Packer image: docker run -v /Users/user1/Documents/Projects/packer_base:/data hashicorp/packer:light build -on-error=ask -force -only CentOS7-VBOX /data/packer_templates/centos7/test7.json. Could not find VBoxManage. I tried hashicorp/packer:full -- same story. I messed around with the entrypoint and poked around the Alpine build a little: 100% correct, no VBoxManage! :)
I do not see Packer examples. Is my use case the proper way to use the Packer image? I'm guessing not, judging by my repeated failures!
The official Packer Docker images are missing the binaries that some builders depend on, such as docker, qemu-system-x86_64, ...
https://hub.docker.com/r/hashicorp/terraform/tags/ - looks like tags for 0.9.10 and 0.9.11 are missing. Are these tags pushed automatically on release?
The current terraform:full image is based on golang:1.11.3, which contains a nasty bug (golang/go#29278). Please upgrade to golang:1.11.4 (this should be a matter of just rebuilding, because the golang base image has already been updated to 1.11.4).
Hi!
Can an image based on the 'nightly' tag be added?
It would be great if we could add beta images to this repo as well.
docker run --rm -it hashicorp/packer:light
standard_init_linux.go:211: exec user process caused "no such file or directory"
The hashicorp/packer:full image contains an old dev version of Packer instead of the latest.
Actual Output:
$ docker run -it hashicorp/packer:full version
Packer v1.2.3-dev
Your version of Packer is out of date! The latest version
is 1.3.1. You can update by downloading from www.packer.io/downloads.html
Expected Output:
$ docker run -it hashicorp/packer:full version
Packer v1.3.1
The issue was initially described in the following ticket:
#37
The issue was closed because GitLab CI now allows overriding the entrypoint command.
However, Bamboo CI still does not seem to offer a way to override the entrypoint with the Docker Runner (not the Docker CLI, but when you run jobs/deployments inside the container, in a so-called "isolated" environment - https://confluence.atlassian.com/bamboo/docker-runner-946020207.html).
It is neither defined in the Bamboo Specs documentation nor available in the UI.
https://docs.atlassian.com/bamboo-specs/6.7.1/com/atlassian/bamboo/specs/api/builders/docker/DockerConfiguration.html
Thus, I would be happy to have this issue looked into.
$ docker run -ti hashicorp/terraform:light version
docker: Error response from daemon: Container command '/bin/terraform' not found or does not exist..
The full version runs as expected:
$ docker run -ti hashicorp/terraform:0.6.16 version
Terraform v0.6.16
This is a regression with the linux binaries - v0.6.15 binaries have no issue.
For local development we are using localstack and Terraform to manage resources.
This image creates the state file as the root user, so to delete the state and apply resources to a fresh localstack container, the state file has to be deleted with sudo rm, or its permissions have to be changed with sudo.
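A possible workaround (a sketch, assuming the state file lives in the mounted working directory) is to run the container as the host user, so the state file is created with your own UID/GID instead of root's:

```shell
# Run Terraform as the invoking host user rather than root.
docker run --rm \
  --user "$(id -u):$(id -g)" \
  -v "$PWD:/data" -w /data \
  hashicorp/terraform:light apply
```

Depending on the provider, you may also need to mount a writable HOME for plugin caches.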
Problem:
The example usage at https://hub.docker.com/r/hashicorp/terraform/ is outdated, which is likely to frustrate beginners.
How to reproduce it:
Using docker run -i -t hashicorp/terraform:light plan main.tf
results in sessions like the following:
thomas@w500:~/source/terraform$ ls -l
total 8
-rw-rw-r-- 1 thomas thomas 2331 Jun 3 10:12 account.json
-rw-rw-r-- 1 thomas thomas 937 Jun 3 10:14 main.tf
thomas@w500:~/source/terraform$ docker run -it hashicorp/terraform:light version
Terraform v0.11.7
thomas@w500:~/source/terraform$ docker run -it hashicorp/terraform:light plan main.tf
Failed to load Terraform configuration or plan: open main.tf: no such file or directory
The solution is to follow the examples in the GitHub README.md.
Proposed solution:
Please edit the docs at DockerHub, remove the broken example, and point to the up-to-date docs at GitHub instead.
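Until the docs are updated, an invocation along the lines of the GitHub README (a sketch, assuming the configuration sits in the current host directory) mounts that directory and makes it the working directory; note that terraform plan operates on a directory, not a single file:

```shell
# Mount the current directory into the container and run plan against it.
docker run -it \
  -v "$PWD:/workspace" -w /workspace \
  hashicorp/terraform:light plan
```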
Hi guys!
I'm using Packer on a new project in my company, version 1.4.5, but when I want to use the Docker Hub image in my CI/CD tool, I don't see a tag for that version of Packer.
Could you add a tag for this version on Docker Hub?
Thanks :)
Hi,
Are there any working examples of how to use the image? It would be great to see where Packer should read the JSON from.
It is not clear whether we should mount the JSON file when running, or whether something else needs to be done:
docker run -it -v /tmp:/tmp hashicorp/packer:light validate -syntax-only dummy-vbox.json -w="/tmp"
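For what it's worth, one way to make the template visible to the container (a sketch, assuming dummy-vbox.json sits in the current host directory) is to mount that directory and make it the working directory, so the relative path resolves inside the container:

```shell
# Mount the template directory and validate the template inside it.
docker run --rm \
  -v "$PWD:/data" -w /data \
  hashicorp/packer:light validate -syntax-only dummy-vbox.json
```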
When pulling the latest image, we see a failure installing openjdk11. It works fine with the previous image.
With hashicorp/terraform:latest
Step 20/29 : RUN apk add --no-cache openjdk11
fetch http://dl-cdn.alpinelinux.org/alpine/v3.9/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.9/community/x86_64/APKINDEX.tar.gz
ERROR: unsatisfiable constraints:
openjdk11 (missing):
required by: world[openjdk11]
The command '/bin/sh -c apk add --no-cache openjdk11' returned a non-zero code: 1
With hashicorp/terraform:0.12.26
Step 22/31 : RUN apk add --no-cache openjdk11
fetch http://dl-cdn.alpinelinux.org/alpine/v3.11/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.11/community/x86_64/APKINDEX.tar.gz
(1/27) Installing openjdk11-jmods (11.0.5_p10-r0)
(2/27) Installing openjdk11-demos (11.0.5_p10-r0)
(3/27) Installing openjdk11-doc (11.0.5_p10-r0)
(4/27) Installing java-common (0.2-r0)
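The failing build fetches the v3.9 APKINDEX while the working one fetches v3.11, which suggests the two tags are built on different Alpine releases (openjdk11 is not available in Alpine 3.9's repositories). One way to confirm which Alpine release an image is based on (a diagnostic sketch) is:

```shell
# Print the Alpine release each image is built on.
docker run --rm --entrypoint cat hashicorp/terraform:latest /etc/alpine-release
docker run --rm --entrypoint cat hashicorp/terraform:0.12.26 /etc/alpine-release
```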
Sorry, but after reading https://hub.docker.com/r/hashicorp/terraform/ I have no idea how to use Terraform as a Docker container.
alex@sun:~/some-path$ docker run -i -t hashicorp/terraform:light plan
Error: No configuration files found!
Plan requires configuration to be present. Planning without a configuration
would mark everything for destruction, which is normally not what is desired.
If you would like to destroy everything, please run plan with the "-destroy"
flag or create a single empty configuration file. Otherwise, please create
a Terraform configuration file in the path being executed and try again.
I would rather expect the same output as with terraform plan. So is something missing?
Hi,
I am currently trying to use the terraform image with OpenNebula. OpenNebula is not an officially supported provider, therefore I use the runtastic provider from GitHub:
https://github.com/runtastic/terraform-provider-opennebula/commits/master
Using this provider works fine on my Mac with the locally installed terraform.
Now I am trying to use it on a CentOS machine in a Jenkins pipeline with the terraform image. I tried both :light and :full.
Running terraform init works fine:
Initializing the backend...
Initializing provider plugins...
The following providers do not have any version constraints in configuration,
so the latest version was installed.
To prevent automatic upgrades to new major versions that may contain breaking
changes, it is recommended to add version = "..." constraints to the
corresponding provider blocks in configuration, with the constraint strings
suggested below.
* provider.powerdns: version = "~> 0.1"
* provider.template: version = "~> 1.0"
Terraform has been successfully initialized!
You may now begin working with Terraform. Try running "terraform plan" to see
any changes that are required for your infrastructure. All Terraform commands
should now work.
If you ever set or change modules or backend configuration for Terraform,
rerun this command to reinitialize your working directory. If you forget, other
commands will detect it and remind you to do so if necessary.
As you can see, it does not initialise the opennebula provider.
If I remove the file, however:
Initializing the backend...
Initializing provider plugins...
- Checking for available provider plugins on https://releases.hashicorp.com...
Provider "opennebula" not available for installation.
A provider named "opennebula" could not be found in the official repository.
This may result from mistyping the provider name, or the given provider may
be a third-party provider that cannot be installed automatically.
In the latter case, the plugin must be installed manually by locating and
downloading a suitable distribution package and placing the plugin's executable
file in the following directory:
terraform.d/plugins/linux_amd64
Terraform detects necessary plugins by inspecting the configuration and state.
To view the provider versions requested by each module, run
"terraform providers".
So of course I put the provider back into terraform.d/plugins/linux_amd64 so that terraform initialises successfully again.
Running terraform plan, however, does not work as expected:
`Error: provider.opennebula: fork/exec /app/terraform.d/plugins/linux_amd64/terraform-provider-opennebula: no such file or directory`
The file is clearly present though:
file terraform.d/plugins/linux_amd64/terraform-provider-opennebula
terraform.d/plugins/linux_amd64/terraform-provider-opennebula: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked (uses shared libs), not stripped
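One possible explanation, offered as a sketch rather than a confirmed diagnosis: the file output above says the plugin is dynamically linked, and the hashicorp/terraform images are Alpine-based, which uses musl libc rather than glibc. When the dynamic loader a binary requests is absent, the kernel reports ENOENT for the exec itself, producing exactly this misleading "no such file or directory" on a file that plainly exists. A minimal reproduction of the effect:

```shell
# Create an executable whose interpreter does not exist, then run it.
# exec fails with "no such file or directory" (exact wording varies by
# shell) even though the file itself is clearly present.
printf '#!/no/such/interpreter\n' > /tmp/demo-plugin
chmod +x /tmp/demo-plugin
ls -l /tmp/demo-plugin
/tmp/demo-plugin || echo "exec failed even though the file exists"

# A common fix (assumption: the provider is written in Go) is to rebuild
# it as a static binary with no glibc dependency:
#   CGO_ENABLED=0 go build -o terraform-provider-opennebula
```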
I first thought the problem was recursive mapping of volumes not working, so I attached each subfolder separately:
docker run --network host -w /app -v /var/lib/jenkins/.aws:/root/.aws \
-v /var/lib/jenkins/.ssh:/root/.ssh \
-v /var/lib/jenkins/workspace/PROJECTNAME:/app \
-v /var/lib/jenkins/workspace/PROJECTNAME/terraform.d:/app/terraform.d \
-v /var/lib/jenkins/workspace/PROJECTNAME/terraform.d/plugins:/app/terraform.d/plugins \
-v /var/lib/jenkins/workspace/PROJECTNAME/terraform.d/plugins/linux_amd64:/app/terraform.d/plugins/linux_amd64 \
-v /var/lib/jenkins/workspace/PROJECTNAME/terraform.d/plugins/linux_amd64/terraform-provider-opennebula:/app/terraform.d/plugins/linux_amd64/terraform-provider-opennebula \
hashicorp/terraform:full plan -out=tfplan -input=false
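(Note: Docker bind mounts expose a directory's contents recursively, so mounting the PROJECTNAME workspace already makes terraform.d and everything below it visible inside the container; the nested -v flags should be redundant. A simplified sketch of the same invocation:)

```shell
docker run --rm --network host -w /app \
  -v /var/lib/jenkins/.aws:/root/.aws \
  -v /var/lib/jenkins/.ssh:/root/.ssh \
  -v /var/lib/jenkins/workspace/PROJECTNAME:/app \
  hashicorp/terraform:full plan -out=tfplan -input=false
```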
Still no luck, however.
Is there anything Go-specific I may be missing? Maybe a different approach to using the third-party provider? Or is this just not supported currently?
Thanks in advance, and thank you for providing Terraform as a container image.
Hi there,
I'm new to Terraform and have tried running a simple init
command as follows…
$ docker run --rm -v $PWD:/data --workdir=/data hashicorp/terraform:light init
… but I get the error
Initializing provider plugins...
- Checking for available provider plugins on https://releases.hashicorp.com...
Error installing provider "aws": Get https://releases.hashicorp.com/terraform-provider-aws/: proxyconnect tcp: dial tcp 192.168.65.1:62123: getsockopt: connection refused.
I've also tried running it on the host network…
$ docker run --rm -v $PWD:/data --workdir=/data --network host hashicorp/terraform:light init
… but that didn't work either. Any ideas why not?
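The proxyconnect error suggests the Docker daemon (here Docker Desktop, reaching out via 192.168.65.1) is configured to use a proxy that cannot be reached. One thing to try, as a sketch: if your network genuinely requires a proxy, pass the proxy settings into the container explicitly so Terraform's HTTPS requests to releases.hashicorp.com go through a reachable address; otherwise, clear the proxy configuration in Docker's preferences. Here proxy.example.com:3128 is a hypothetical placeholder, not a real endpoint:

```shell
docker run --rm \
  -e HTTP_PROXY=http://proxy.example.com:3128 \
  -e HTTPS_PROXY=http://proxy.example.com:3128 \
  -e NO_PROXY=localhost,127.0.0.1 \
  -v "$PWD":/data --workdir=/data \
  hashicorp/terraform:light init
```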
Hi,
Since the last update, my GitLab runner seems to encounter a problem.
I'm using the packer :light image, and bash isn't found anymore.
build_image:
stage: build
image:
name: hashicorp/packer:light
entrypoint: ["/bin/sh", "-c"]
script:
- /bin/bash scripts/build.sh
Result:
sh: eval: line 99: /bin/bash: not found
bash was present in version 1.5.1.
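The hashicorp/packer images are Alpine-based, and Alpine ships only /bin/sh by default, so /bin/bash is not included. A sketch of one workaround, assuming the job is allowed to install packages: install bash within the job itself (alternatively, make scripts/build.sh POSIX-sh compatible and invoke it with /bin/sh):

```yaml
build_image:
  stage: build
  image:
    name: hashicorp/packer:light
    entrypoint: ["/bin/sh", "-c"]
  script:
    - apk add --no-cache bash   # Alpine images do not ship bash
    - /bin/bash scripts/build.sh
```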
The provided packer image is built on Alpine, which doesn't appear to have session-manager-plugin
support: hashicorp/packer#8242. It'd be nice if the provided image, or some variant of it, supported that plugin.