containers / ansible-podman-collections
Repository for Ansible content that can include playbooks, roles, modules, and plugins for use with the Podman tool
License: GNU General Public License v3.0
/kind feature
Description
Allow loading images from tarballs, the way the docker_image module does.
The docker_image module (https://docs.ansible.com/ansible/latest/collections/community/general/docker_image_module.html) allows loading an image into Docker by providing a path to a tarball.
Basically, this, but in Ansible:
podman image load my_image.tar
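Until the module gains native support, one possible interim sketch is to call the CLI directly from a playbook; the image name and tarball path below are placeholders, and the existence check keeps the task idempotent:

```yaml
# Workaround sketch: load a tarball via the podman CLI.
# "my_image" and the tarball path are illustrative placeholders.
- name: Check whether the image already exists
  ansible.builtin.command: podman image exists my_image
  register: image_check
  failed_when: image_check.rc not in [0, 1]
  changed_when: false

- name: Load image from tarball
  ansible.builtin.command: podman image load -i /tmp/my_image.tar
  when: image_check.rc != 0
```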
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
When using the podman_volume module with the options option, the string seems to break idempotency.
options: "device={{ data_device }}"
Steps to reproduce the issue:
---
- hosts: all
  tasks:
    - name: Create test volume on external volume
      containers.podman.podman_volume:
        name: test
        options: "device=/dev/nvme3n1"
Describe the results you received:
Volume gets recreated on every run.
Describe the results you expected:
Volume gets created only once.
Additional information you deem important (e.g. issue happens only occasionally):
Maybe the string device=something is not being properly parsed? It is also weird that one has to specify device= at all.
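As a stopgap until the comparison is fixed, the creation can be guarded manually so the module only runs when the volume is absent; `podman volume inspect` exits non-zero for a missing volume, and the volume name and device are taken from the reproduction above:

```yaml
# Workaround sketch: skip the module entirely when the volume already
# exists, bypassing the broken options comparison.
- name: Check whether the volume already exists
  ansible.builtin.command: podman volume inspect test
  register: volume_check
  failed_when: false
  changed_when: false

- name: Create test volume on external device
  containers.podman.podman_volume:
    name: test
    options: "device=/dev/nvme3n1"
  when: volume_check.rc != 0
```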
Output of ansible --version:
ansible 2.9.13
config file = /home/usr/repos/audriga-infra/ansible.cfg
configured module search path = ['/home/usr/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.5 (default, Sep 5 2020, 10:50:12) [GCC 10.2.0]
Output of podman version:
Version: 2.0.5
API Version: 1
Go Version: go1.14.2
Built: Thu Jan 1 00:00:00 1970
OS/Arch: linux/amd64
Output of podman info --debug:
host:
arch: amd64
buildahVersion: 1.15.1
cgroupVersion: v1
conmon:
package: 'conmon: /usr/libexec/podman/conmon'
path: /usr/libexec/podman/conmon
version: 'conmon version 2.0.20, commit: '
cpus: 2
distribution:
distribution: ubuntu
version: "20.04"
eventLogger: file
hostname: hostname
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 5.4.0-1022-aws
linkmode: dynamic
memFree: 150753280
memTotal: 4064808960
ociRuntime:
name: runc
package: 'cri-o-runc: /usr/lib/cri-o-runc/sbin/runc'
path: /usr/lib/cri-o-runc/sbin/runc
version: 'runc version spec: 1.0.2-dev'
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: 'slirp4netns: /usr/bin/slirp4netns'
version: |-
slirp4netns version 1.1.4
commit: unknown
libslirp: 4.3.1-git
SLIRP_CONFIG_VERSION_MAX: 3
swapFree: 0
swapTotal: 0
uptime: 166h 36m 39.73s (Approximately 6.92 days)
registries:
search:
- docker.io
- quay.io
store:
configFile: /home/ubuntu/.config/containers/storage.conf
containerStore:
number: 0
paused: 0
running: 0
stopped: 0
graphDriverName: vfs
graphOptions: {}
graphRoot: /home/ubuntu/.local/share/containers/storage
graphStatus: {}
imageStore:
number: 0
runRoot: /run/user/1000/containers
volumePath: /home/ubuntu/.local/share/containers/storage/volumes
version:
APIVersion: 1
Built: 0
BuiltTime: Thu Jan 1 00:00:00 1970
GitCommit: ""
GoVersion: go1.14.2
OsArch: linux/amd64
Version: 2.0.5
Package info (e.g. output of rpm -q podman or apt list podman):
podman/unknown 2.0.6~1 amd64 [upgradable from: 2.0.5~2]
podman/unknown 2.0.6~1 arm64
podman/unknown 2.0.6~1 armhf
podman/unknown 2.0.6~1 s390x
Playbook you run with ansible (e.g. content of playbook.yaml):
See above
Command line and output of ansible run with high verbosity:
$ ansible-playbook -v test.yml
Using /home/usr/repos/repo/ansible.cfg as config file
PLAY [all] ***********************************************************************************************************************************************************************************
TASK [Gathering Facts] ***********************************************************************************************************************************************************************
ok: [testhost]
TASK [Create test volume on external volume] *************************************************************************************************************************************************
changed: [testhost] => {"actions": ["created test"], "changed": true, "podman_actions": ["podman volume create test --opt device=/dev/nvme3n1"], "stderr": "", "stderr_lines": [], "stdout": "test\n", "stdout_lines": ["test"], "volume": {"Anonymous": false, "CreatedAt": "2020-09-08T06:11:36.195055342Z", "Driver": "local", "GID": 0, "Labels": {}, "Mountpoint": "/var/lib/containers/storage/volumes/test/_data", "Name": "test", "Options": {"device": "/dev/nvme3n1"}, "Scope": "local", "UID": 0}}
PLAY RECAP ***********************************************************************************************************************************************************************************
testhost : ok=2 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
$ ansible-playbook -v test.yml
Using /home/usr/repos/audriga-infra/ansible.cfg as config file
PLAY [all] ***********************************************************************************************************************************************************************************
TASK [Gathering Facts] ***********************************************************************************************************************************************************************
ok: [testhost]
TASK [Create test volume on external volume] *************************************************************************************************************************************************
changed: [testhost] => {"actions": ["recreated test"], "changed": true, "podman_actions": ["podman volume rm -f test", "podman volume create test --opt device=/dev/nvme3n1"], "stderr": "", "stderr_lines": [], "stdout": "test\n", "stdout_lines": ["test"], "volume": {"Anonymous": false, "CreatedAt": "2020-09-08T06:11:49.197938207Z", "Driver": "local", "GID": 0, "Labels": {}, "Mountpoint": "/var/lib/containers/storage/volumes/test/_data", "Name": "test", "Options": {"device": "/dev/nvme3n1"}, "Scope": "local", "UID": 0}}
PLAY RECAP ***********************************************************************************************************************************************************************************
testhost : ok=2 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Additional environment details (AWS, VirtualBox, physical, etc.):
AWS
/kind feature
Description
Use the same structure as docker_container for volumes.
Additional information you deem important (e.g. issue happens only occasionally):
This should be done now, so users can switch from docker_container to podman_container with ease (#99)
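For illustration, the Docker-style structure being requested, next to the current podman_container form; the container name, image, and paths are made up for the example:

```yaml
# docker_container accepts a list of host:container[:mode] strings:
- name: Start container with Docker-style volumes
  community.general.docker_container:
    name: web
    image: nginx:alpine
    volumes:
      - /srv/www:/usr/share/nginx/html:ro

# The request is for podman_container to accept the same structure,
# so the task body can be ported with only the module name changed:
- name: Start container with Podman
  containers.podman.podman_container:
    name: web
    image: nginx:alpine
    volume:
      - /srv/www:/usr/share/nginx/html:ro
```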
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
Idempotency does not work when specifying log_opt.
Steps to reproduce the issue:
---
- name: Debug
  hosts: podman_hosts
  become: false
  gather_facts: false
  collections:
    - ansible.builtin
    - containers.podman
  tasks:
    - name: Ensure container is started
      podman_container:
        name: test1
        image: alpine:latest
        network:
          - bridge
        volume:
          - test:/opt/test
        log_driver: journald
        log_opt: "tag=XXX"
        state: started
        command: 'sleep 1d'
    - name: Ensure container is started
      podman_container:
        name: test1
        image: alpine:latest
        network:
          - bridge
        volume:
          - test:/opt/test
        log_driver: journald
        log_opt: "tag=XXX"
        state: started
        command: 'sleep 1d'
Describe the results you received:
The second task recreates the container on every run.
Describe the results you expected:
The second task should report no change.
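Later releases of the collection rework log_opt into a dictionary of suboptions, which may compare more reliably than the raw string; the suboption name below (tag) is an assumption based on later module documentation, so check the docs for the installed version:

```yaml
# Possible dictionary form of log_opt in newer collection releases
# (suboption names assumed, not verified against this version):
- name: Ensure container is started
  containers.podman.podman_container:
    name: test1
    image: alpine:latest
    log_driver: journald
    log_opt:
      tag: XXX
    state: started
    command: 'sleep 1d'
```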
Additional information you deem important (e.g. issue happens only occasionally):
Output of ansible --version:
ansible 2.10.2
config file = /Users/XXX/Documents/ansible/ansible.cfg
configured module search path = ['/Users/XXX/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/XXX/Documents/ansible/.venv/lib/python3.9/site-packages/ansible
executable location = /Users/XXX/Documents/ansible/.venv/bin/ansible
python version = 3.9.0 (default, Oct 27 2020, 14:15:17) [Clang 12.0.0 (clang-1200.0.32.21)]
Output of podman version:
Version: 2.0.5
API Version: 1
Go Version: go1.14.7
Built: Wed Sep 23 16:18:02 2020
OS/Arch: linux/amd64
Output of podman info --debug:
host:
arch: amd64
buildahVersion: 1.15.1
cgroupVersion: v1
conmon:
package: conmon-2.0.20-2.module+el8.3.0+8221+97165c3f.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.20, commit: 77ce9fd1e61ea89bd6cdc621b07446dd9e80e5b6'
cpus: 2
distribution:
distribution: '"rhel"'
version: "8.3"
eventLogger: file
hostname: podman-dev-01
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 4.18.0-240.1.1.el8_3.x86_64
linkmode: dynamic
memFree: 2812526592
memTotal: 3854749696
ociRuntime:
name: runc
package: runc-1.0.0-68.rc92.module+el8.3.0+8221+97165c3f.x86_64
path: /usr/bin/runc
version: 'runc version spec: 1.0.2-dev'
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: slirp4netns-1.1.4-2.module+el8.3.0+8221+97165c3f.x86_64
version: |-
slirp4netns version 1.1.4
commit: b66ffa8e262507e37fca689822d23430f3357fe8
libslirp: 4.3.1
SLIRP_CONFIG_VERSION_MAX: 3
swapFree: 0
swapTotal: 0
uptime: 26h 51m 9.8s (Approximately 1.08 days)
registries:
search:
- registry.access.redhat.com
- registry.redhat.io
- docker.io
store:
configFile: /home/ec2-user/.config/containers/storage.conf
containerStore:
number: 1
paused: 0
running: 1
stopped: 0
graphDriverName: overlay
graphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-1.1.2-3.module+el8.3.0+8221+97165c3f.x86_64
Version: |-
fuse-overlayfs: version 1.1.0
FUSE library version 3.2.1
using FUSE kernel interface version 7.26
graphRoot: /home/ec2-user/.local/share/containers/storage
graphStatus:
Backing Filesystem: xfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 2
runRoot: /run/user/1000/containers
volumePath: /home/ec2-user/.local/share/containers/storage/volumes
version:
APIVersion: 1
Built: 1600877882
BuiltTime: Wed Sep 23 16:18:02 2020
GitCommit: ""
GoVersion: go1.14.7
OsArch: linux/amd64
Version: 2.0.5
Package info (e.g. output of rpm -q podman or apt list podman):
podman-2.0.5-5.module+el8.3.0+8221+97165c3f.x86_64
Playbook you run with ansible (e.g. content of playbook.yaml):
- name: Debug
  hosts: podman_hosts
  become: false
  gather_facts: false
  collections:
    - ansible.builtin
    - containers.podman
  tasks:
    - name: Ensure container is started
      podman_container:
        name: test1
        image: alpine:latest
        network:
          - bridge
        volume:
          - test:/opt/test
        log_driver: journald
        log_opt: "tag=XXX"
        state: started
        command: 'sleep 1d'
    - name: Ensure container is started
      podman_container:
        name: test1
        image: alpine:latest
        network:
          - bridge
        volume:
          - test:/opt/test
        log_driver: journald
        log_opt: "tag=XXX"
        state: started
        command: 'sleep 1d'
Command line and output of ansible run with high verbosity:
ansible-playbook 2.10.2
config file = /Users/dermfhen/Documents/research/development/ansible/ansible.cfg
configured module search path = ['/Users/dermfhen/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/dermfhen/Documents/research/development/ansible/.venv/lib/python3.9/site-packages/ansible
executable location = /Users/dermfhen/Documents/research/development/ansible/.venv/bin/ansible-playbook
python version = 3.9.0 (default, Oct 27 2020, 14:15:17) [Clang 12.0.0 (clang-1200.0.32.21)]
Using /Users/dermfhen/Documents/research/development/ansible/ansible.cfg as config file
setting up inventory plugins
host_list declined parsing /Users/dermfhen/Documents/research/development/ansible/inventory as it did not pass its verify_file() method
script declined parsing /Users/dermfhen/Documents/research/development/ansible/inventory as it did not pass its verify_file() method
auto declined parsing /Users/dermfhen/Documents/research/development/ansible/inventory as it did not pass its verify_file() method
Set default localhost to localhost
Parsed /Users/dermfhen/Documents/research/development/ansible/inventory inventory source with ini plugin
Loading collection containers.podman from /Users/dermfhen/Documents/research/development/ansible/.venv/lib/python3.9/site-packages/ansible_collections/containers/podman
redirecting (type: callback) ansible.builtin.yaml to community.general.yaml
Loading collection community.general from /Users/dermfhen/Documents/research/development/ansible/.collections/ansible_collections/community/general
redirecting (type: callback) ansible.builtin.yaml to community.general.yaml
Loading callback plugin community.general.yaml of type stdout, v2.0 from /Users/dermfhen/Documents/research/development/ansible/.collections/ansible_collections/community/general/plugins/callback/yaml.py
PLAYBOOK: play-_debug.yml ************************************************************************************************************************************************************************************************************
Positional arguments: play-_debug.yml
verbosity: 4
private_key_file: /Users/dermfhen/.ssh/dermfhen_aws
remote_user: dermfhen
connection: smart
timeout: 10
become_method: sudo
tags: ('all',)
inventory: ('/Users/dermfhen/Documents/research/development/ansible/inventory',)
forks: 250
1 plays in play-_debug.yml
PLAY [Debug] *************************************************************************************************************************************************************************************************************************
META: ran handlers
TASK [Ensure container is started] ***************************************************************************************************************************************************************************************************
task path: /Users/dermfhen/Documents/research/development/ansible/play-_debug.yml:10
<13.48.204.72> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<13.48.204.72> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe 13.48.204.72 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /tmp `"&& mkdir "` echo /tmp/ansible-tmp-1605105213.939492-16484-116379475390970 `" && echo ansible-tmp-1605105213.939492-16484-116379475390970="` echo /tmp/ansible-tmp-1605105213.939492-16484-116379475390970 `" ) && sleep 0'"'"''
<13.48.204.72> (0, b'ansible-tmp-1605105213.939492-16484-116379475390970=/tmp/ansible-tmp-1605105213.939492-16484-116379475390970\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<podman-dev-01> Attempting python interpreter discovery
<13.48.204.72> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<13.48.204.72> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe 13.48.204.72 '/bin/sh -c '"'"'echo PLATFORM; uname; echo FOUND; command -v '"'"'"'"'"'"'"'"'/usr/bin/python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.5'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/libexec/platform-python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/bin/python3'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python'"'"'"'"'"'"'"'"'; echo ENDFOUND && sleep 0'"'"''
<13.48.204.72> (0, b'PLATFORM\nLinux\nFOUND\n/usr/libexec/platform-python\nENDFOUND\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<13.48.204.72> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<13.48.204.72> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe 13.48.204.72 '/bin/sh -c '"'"'/usr/libexec/platform-python && sleep 0'"'"''
<13.48.204.72> (0, b'{"platform_dist_result": ["redhat", "8.3", "Ootpa"], "osrelease_content": "NAME=\\"Red Hat Enterprise Linux\\"\\nVERSION=\\"8.3 (Ootpa)\\"\\nID=\\"rhel\\"\\nID_LIKE=\\"fedora\\"\\nVERSION_ID=\\"8.3\\"\\nPLATFORM_ID=\\"platform:el8\\"\\nPRETTY_NAME=\\"Red Hat Enterprise Linux 8.3 (Ootpa)\\"\\nANSI_COLOR=\\"0;31\\"\\nCPE_NAME=\\"cpe:/o:redhat:enterprise_linux:8.3:GA\\"\\nHOME_URL=\\"https://www.redhat.com/\\"\\nBUG_REPORT_URL=\\"https://bugzilla.redhat.com/\\"\\n\\nREDHAT_BUGZILLA_PRODUCT=\\"Red Hat Enterprise Linux 8\\"\\nREDHAT_BUGZILLA_PRODUCT_VERSION=8.3\\nREDHAT_SUPPORT_PRODUCT=\\"Red Hat Enterprise Linux\\"\\nREDHAT_SUPPORT_PRODUCT_VERSION=\\"8.3\\"\\n"}\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /Users/dermfhen/Documents/research/development/ansible/.venv/lib/python3.9/site-packages/ansible_collections/containers/podman/plugins/modules/podman_container.py
<13.48.204.72> PUT /Users/dermfhen/.ansible/tmp/ansible-local-16477vv5f9xxn/tmp2immwm67 TO /tmp/ansible-tmp-1605105213.939492-16484-116379475390970/AnsiballZ_podman_container.py
<13.48.204.72> SSH: EXEC sftp -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe '[13.48.204.72]'
<13.48.204.72> (0, b'sftp> put /Users/dermfhen/.ansible/tmp/ansible-local-16477vv5f9xxn/tmp2immwm67 /tmp/ansible-tmp-1605105213.939492-16484-116379475390970/AnsiballZ_podman_container.py\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug3: Sent message fd 3 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . 
-> /home/ec2-user size 0\r\ndebug3: Looking up /Users/dermfhen/.ansible/tmp/ansible-local-16477vv5f9xxn/tmp2immwm67\r\ndebug3: Sent message fd 3 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/tmp/ansible-tmp-1605105213.939492-16484-116379475390970/AnsiballZ_podman_container.py\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 32768 bytes at 0\r\ndebug3: Sent message SSH2_FXP_WRITE I:5 O:32768 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:6 O:65536 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:7 O:98304 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:8 O:131072 S:8087\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 5 32768 bytes at 32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 6 32768 bytes at 65536\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 7 32768 bytes at 98304\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 8 8087 bytes at 131072\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<13.48.204.72> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<13.48.204.72> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe 13.48.204.72 '/bin/sh -c '"'"'chmod u+x /tmp/ansible-tmp-1605105213.939492-16484-116379475390970/ /tmp/ansible-tmp-1605105213.939492-16484-116379475390970/AnsiballZ_podman_container.py && sleep 0'"'"''
<13.48.204.72> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<13.48.204.72> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<13.48.204.72> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe -tt 13.48.204.72 '/bin/sh -c '"'"'/usr/libexec/platform-python /tmp/ansible-tmp-1605105213.939492-16484-116379475390970/AnsiballZ_podman_container.py && sleep 0'"'"''
<13.48.204.72> (0, b'\r\n{"changed": true, "actions": ["recreated test1"], "container": {"Id": "c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2", "Created": "2020-11-11T14:33:46.382624933Z", "Path": "sleep", "Args": ["1d"], "State": {"OciVersion": "1.0.2-dev", "Status": "running", "Running": true, "Paused": false, "Restarting": false, "OOMKilled": false, "Dead": false, "Pid": 71530, "ConmonPid": 71519, "ExitCode": 0, "Error": "", "StartedAt": "2020-11-11T14:33:46.765988285Z", "FinishedAt": "0001-01-01T00:00:00Z", "Healthcheck": {"Status": "", "FailingStreak": 0, "Log": null}}, "Image": "d6e46aa2470df1d32034c6707c8041158b652f38d2a9ae3d7ad7e7532d22ebe0", "ImageName": "docker.io/library/alpine:latest", "Rootfs": "", "Pod": "", "ResolvConfPath": "/run/user/1000/containers/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata/resolv.conf", "HostnamePath": "/run/user/1000/containers/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata/hostname", "HostsPath": "/run/user/1000/containers/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata/hosts", "StaticDir": "/home/ec2-user/.local/share/containers/storage/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata", "OCIConfigPath": "/home/ec2-user/.local/share/containers/storage/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata/config.json", "OCIRuntime": "runc", "LogPath": "", "LogTag": "XXX", "ConmonPidFile": "/run/user/1000/containers/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata/conmon.pid", "Name": "test1", "RestartCount": 0, "Driver": "overlay", "MountLabel": "system_u:object_r:container_file_t:s0:c307,c682", "ProcessLabel": "system_u:system_r:container_t:s0:c307,c682", "AppArmorProfile": "", "EffectiveCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", 
"CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "BoundingCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "ExecIDs": [], "GraphDriver": {"Name": "overlay", "Data": {"LowerDir": "/home/ec2-user/.local/share/containers/storage/overlay/ace0eda3e3be35a979cec764a3321b4c7d0b9e4bb3094d20d3ff6782961a8d54/diff", "MergedDir": "/home/ec2-user/.local/share/containers/storage/overlay/e70c9093463126cf714da91bbf922295df7e9c461707992061056af706409283/merged", "UpperDir": "/home/ec2-user/.local/share/containers/storage/overlay/e70c9093463126cf714da91bbf922295df7e9c461707992061056af706409283/diff", "WorkDir": "/home/ec2-user/.local/share/containers/storage/overlay/e70c9093463126cf714da91bbf922295df7e9c461707992061056af706409283/work"}}, "Mounts": [{"Type": "volume", "Name": "test", "Source": "/home/ec2-user/.local/share/containers/storage/volumes/test/_data", "Destination": "/opt/test", "Driver": "local", "Mode": "", "Options": ["nosuid", "nodev", "rbind"], "RW": true, "Propagation": "rprivate"}], "Dependencies": [], "NetworkSettings": {"EndpointID": "", "Gateway": "", "IPAddress": "", "IPPrefixLen": 0, "IPv6Gateway": "", "GlobalIPv6Address": "", "GlobalIPv6PrefixLen": 0, "MacAddress": "", "Bridge": "", "SandboxID": "", "HairpinMode": false, "LinkLocalIPv6Address": "", "LinkLocalIPv6PrefixLen": 0, "Ports": {}, "SandboxKey": "/run/user/1000/netns/cni-5167f623-e31d-8de6-b0d5-62faa433ead0"}, "ExitCommand": ["/usr/bin/podman", "--root", "/home/ec2-user/.local/share/containers/storage", "--runroot", "/run/user/1000/containers", "--log-level", "error", "--cgroup-manager", "cgroupfs", "--tmpdir", "/run/user/1000/libpod/tmp", "--runtime", "runc", "--storage-driver", "overlay", "--storage-opt", 
"overlay.mount_program=/usr/bin/fuse-overlayfs", "--events-backend", "file", "container", "cleanup", "c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2"], "Namespace": "", "IsInfra": false, "Config": {"Hostname": "c4650884fa37", "Domainname": "", "User": "", "AttachStdin": false, "AttachStdout": false, "AttachStderr": false, "Tty": false, "OpenStdin": false, "StdinOnce": false, "Env": ["PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "TERM=xterm", "container=podman", "HOSTNAME=c4650884fa37", "HOME=/root"], "Cmd": ["sleep", "1d"], "Image": "docker.io/library/alpine:latest", "Volumes": null, "WorkingDir": "/", "Entrypoint": "", "OnBuild": null, "Labels": null, "Annotations": {"io.container.manager": "libpod", "io.kubernetes.cri-o.Created": "2020-11-11T14:33:46.382624933Z", "io.kubernetes.cri-o.TTY": "false", "io.podman.annotations.autoremove": "FALSE", "io.podman.annotations.init": "FALSE", "io.podman.annotations.privileged": "FALSE", "io.podman.annotations.publish-all": "FALSE", "org.opencontainers.image.stopSignal": "15"}, "StopSignal": 15, "CreateCommand": ["podman", "container", "run", "--name", "test1", "--network", "bridge", "--volume", "test:/opt/test", "--log-driver", "journald", "--log-opt", "tag=XXX", "--detach=True", "alpine:latest", "sleep", "1d"]}, "HostConfig": {"Binds": ["test:/opt/test:rw,rprivate,nosuid,nodev,rbind"], "CgroupMode": "host", "ContainerIDFile": "", "LogConfig": {"Type": "journald", "Config": null}, "NetworkMode": "bridge", "PortBindings": {}, "RestartPolicy": {"Name": "", "MaximumRetryCount": 0}, "AutoRemove": false, "VolumeDriver": "", "VolumesFrom": null, "CapAdd": [], "CapDrop": [], "Dns": [], "DnsOptions": [], "DnsSearch": [], "ExtraHosts": [], "GroupAdd": [], "IpcMode": "private", "Cgroup": "", "Cgroups": "default", "Links": null, "OomScoreAdj": 0, "PidMode": "private", "Privileged": false, "PublishAllPorts": false, "ReadonlyRootfs": false, "SecurityOpt": [], "Tmpfs": {}, "UTSMode": "private", 
"UsernsMode": "", "ShmSize": 65536000, "Runtime": "oci", "ConsoleSize": [0, 0], "Isolation": "", "CpuShares": 0, "Memory": 0, "NanoCpus": 0, "CgroupParent": "", "BlkioWeight": 0, "BlkioWeightDevice": null, "BlkioDeviceReadBps": null, "BlkioDeviceWriteBps": null, "BlkioDeviceReadIOps": null, "BlkioDeviceWriteIOps": null, "CpuPeriod": 0, "CpuQuota": 0, "CpuRealtimePeriod": 0, "CpuRealtimeRuntime": 0, "CpusetCpus": "", "CpusetMems": "", "Devices": [], "DiskQuota": 0, "KernelMemory": 0, "MemoryReservation": 0, "MemorySwap": 0, "MemorySwappiness": 0, "OomKillDisable": false, "PidsLimit": 0, "Ulimits": [], "CpuCount": 0, "CpuPercent": 0, "IOMaximumIOps": 0, "IOMaximumBandwidth": 0}}, "podman_actions": ["podman rm -f test1", "podman run --name test1 --network bridge --volume test:/opt/test --log-driver journald --log-opt tag=XXX --detach=True alpine:latest sleep 1d"], "stdout": "c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2\\n", "stderr": "", "invocation": {"module_args": {"name": "test1", "image": "alpine:latest", "network": ["bridge"], "volume": ["test:/opt/test"], "log_driver": "journald", "log_opt": "tag=XXX", "state": "started", "command": "sleep 1d", "executable": "podman", "detach": true, "debug": false, "force_restart": false, "image_strict": false, "recreate": false, "annotation": null, "authfile": null, "blkio_weight": null, "blkio_weight_device": null, "cap_add": null, "cap_drop": null, "cgroup_parent": null, "cgroupns": null, "cgroups": null, "cidfile": null, "cmd_args": null, "conmon_pidfile": null, "cpu_period": null, "cpu_rt_period": null, "cpu_rt_runtime": null, "cpu_shares": null, "cpus": null, "cpuset_cpus": null, "cpuset_mems": null, "detach_keys": null, "device": null, "device_read_bps": null, "device_read_iops": null, "device_write_bps": null, "device_write_iops": null, "dns": null, "dns_option": null, "dns_search": null, "entrypoint": null, "env": null, "env_file": null, "env_host": null, "etc_hosts": null, "expose": null, 
"gidmap": null, "group_add": null, "healthcheck": null, "healthcheck_interval": null, "healthcheck_retries": null, "healthcheck_start_period": null, "healthcheck_timeout": null, "hostname": null, "http_proxy": null, "image_volume": null, "init": null, "init_path": null, "interactive": null, "ip": null, "ipc": null, "kernel_memory": null, "label": null, "label_file": null, "memory": null, "memory_reservation": null, "memory_swap": null, "memory_swappiness": null, "mount": null, "no_hosts": null, "oom_kill_disable": null, "oom_score_adj": null, "pid": null, "pids_limit": null, "pod": null, "privileged": null, "publish": null, "publish_all": null, "read_only": null, "read_only_tmpfs": null, "restart_policy": null, "rm": null, "rootfs": null, "security_opt": null, "shm_size": null, "sig_proxy": null, "stop_signal": null, "stop_timeout": null, "subgidname": null, "subuidname": null, "sysctl": null, "systemd": null, "tmpfs": null, "tty": null, "uidmap": null, "ulimit": null, "user": null, "userns": null, "uts": null, "volumes_from": null, "workdir": null}}}\r\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to 13.48.204.72 
closed.\r\n')
<13.48.204.72> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<13.48.204.72> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe 13.48.204.72 '/bin/sh -c '"'"'rm -f -r /tmp/ansible-tmp-1605105213.939492-16484-116379475390970/ > /dev/null 2>&1 && sleep 0'"'"''
<13.48.204.72> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
changed: [podman-dev-01] => changed=true
  actions:
  - recreated test1
  ansible_facts:
    discovered_interpreter_python: /usr/libexec/platform-python
  container:
    AppArmorProfile: ''
    Args:
    - 1d
    BoundingCaps:
    - CAP_AUDIT_WRITE
    - CAP_CHOWN
    - CAP_DAC_OVERRIDE
    - CAP_FOWNER
    - CAP_FSETID
    - CAP_KILL
    - CAP_MKNOD
    - CAP_NET_BIND_SERVICE
    - CAP_NET_RAW
    - CAP_SETFCAP
    - CAP_SETGID
    - CAP_SETPCAP
    - CAP_SETUID
    - CAP_SYS_CHROOT
    Config:
      Annotations:
        io.container.manager: libpod
        io.kubernetes.cri-o.Created: '2020-11-11T14:33:46.382624933Z'
        io.kubernetes.cri-o.TTY: 'false'
        io.podman.annotations.autoremove: 'FALSE'
        io.podman.annotations.init: 'FALSE'
        io.podman.annotations.privileged: 'FALSE'
        io.podman.annotations.publish-all: 'FALSE'
        org.opencontainers.image.stopSignal: '15'
      AttachStderr: false
      AttachStdin: false
      AttachStdout: false
      Cmd:
      - sleep
      - 1d
      CreateCommand:
      - podman
      - container
      - run
      - --name
      - test1
      - --network
      - bridge
      - --volume
      - test:/opt/test
      - --log-driver
      - journald
      - --log-opt
      - tag=XXX
      - --detach=True
      - alpine:latest
      - sleep
      - 1d
      Domainname: ''
      Entrypoint: ''
      Env:
      - PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
      - TERM=xterm
      - container=podman
      - HOSTNAME=c4650884fa37
      - HOME=/root
      Hostname: c4650884fa37
      Image: docker.io/library/alpine:latest
      Labels: null
      OnBuild: null
      OpenStdin: false
      StdinOnce: false
      StopSignal: 15
      Tty: false
      User: ''
      Volumes: null
      WorkingDir: /
    ConmonPidFile: /run/user/1000/containers/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata/conmon.pid
    Created: '2020-11-11T14:33:46.382624933Z'
    Dependencies: []
    Driver: overlay
    EffectiveCaps:
    - CAP_AUDIT_WRITE
    - CAP_CHOWN
    - CAP_DAC_OVERRIDE
    - CAP_FOWNER
    - CAP_FSETID
    - CAP_KILL
    - CAP_MKNOD
    - CAP_NET_BIND_SERVICE
    - CAP_NET_RAW
    - CAP_SETFCAP
    - CAP_SETGID
    - CAP_SETPCAP
    - CAP_SETUID
    - CAP_SYS_CHROOT
    ExecIDs: []
    ExitCommand:
    - /usr/bin/podman
    - --root
    - /home/ec2-user/.local/share/containers/storage
    - --runroot
    - /run/user/1000/containers
    - --log-level
    - error
    - --cgroup-manager
    - cgroupfs
    - --tmpdir
    - /run/user/1000/libpod/tmp
    - --runtime
    - runc
    - --storage-driver
    - overlay
    - --storage-opt
    - overlay.mount_program=/usr/bin/fuse-overlayfs
    - --events-backend
    - file
    - container
    - cleanup
    - c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2
    GraphDriver:
      Data:
        LowerDir: /home/ec2-user/.local/share/containers/storage/overlay/ace0eda3e3be35a979cec764a3321b4c7d0b9e4bb3094d20d3ff6782961a8d54/diff
        MergedDir: /home/ec2-user/.local/share/containers/storage/overlay/e70c9093463126cf714da91bbf922295df7e9c461707992061056af706409283/merged
        UpperDir: /home/ec2-user/.local/share/containers/storage/overlay/e70c9093463126cf714da91bbf922295df7e9c461707992061056af706409283/diff
        WorkDir: /home/ec2-user/.local/share/containers/storage/overlay/e70c9093463126cf714da91bbf922295df7e9c461707992061056af706409283/work
      Name: overlay
    HostConfig:
      AutoRemove: false
      Binds:
      - test:/opt/test:rw,rprivate,nosuid,nodev,rbind
      BlkioDeviceReadBps: null
      BlkioDeviceReadIOps: null
      BlkioDeviceWriteBps: null
      BlkioDeviceWriteIOps: null
      BlkioWeight: 0
      BlkioWeightDevice: null
      CapAdd: []
      CapDrop: []
      Cgroup: ''
      CgroupMode: host
      CgroupParent: ''
      Cgroups: default
      ConsoleSize:
      - 0
      - 0
      ContainerIDFile: ''
      CpuCount: 0
      CpuPercent: 0
      CpuPeriod: 0
      CpuQuota: 0
      CpuRealtimePeriod: 0
      CpuRealtimeRuntime: 0
      CpuShares: 0
      CpusetCpus: ''
      CpusetMems: ''
      Devices: []
      DiskQuota: 0
      Dns: []
      DnsOptions: []
      DnsSearch: []
      ExtraHosts: []
      GroupAdd: []
      IOMaximumBandwidth: 0
      IOMaximumIOps: 0
      IpcMode: private
      Isolation: ''
      KernelMemory: 0
      Links: null
      LogConfig:
        Config: null
        Type: journald
      Memory: 0
      MemoryReservation: 0
      MemorySwap: 0
      MemorySwappiness: 0
      NanoCpus: 0
      NetworkMode: bridge
      OomKillDisable: false
      OomScoreAdj: 0
      PidMode: private
      PidsLimit: 0
      PortBindings: {}
      Privileged: false
      PublishAllPorts: false
      ReadonlyRootfs: false
      RestartPolicy:
        MaximumRetryCount: 0
        Name: ''
      Runtime: oci
      SecurityOpt: []
      ShmSize: 65536000
      Tmpfs: {}
      UTSMode: private
      Ulimits: []
      UsernsMode: ''
      VolumeDriver: ''
      VolumesFrom: null
    HostnamePath: /run/user/1000/containers/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata/hostname
    HostsPath: /run/user/1000/containers/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata/hosts
    Id: c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2
    Image: d6e46aa2470df1d32034c6707c8041158b652f38d2a9ae3d7ad7e7532d22ebe0
    ImageName: docker.io/library/alpine:latest
    IsInfra: false
    LogPath: ''
    LogTag: XXX
    MountLabel: system_u:object_r:container_file_t:s0:c307,c682
    Mounts:
    - Destination: /opt/test
      Driver: local
      Mode: ''
      Name: test
      Options:
      - nosuid
      - nodev
      - rbind
      Propagation: rprivate
      RW: true
      Source: /home/ec2-user/.local/share/containers/storage/volumes/test/_data
      Type: volume
    Name: test1
    Namespace: ''
    NetworkSettings:
      Bridge: ''
      EndpointID: ''
      Gateway: ''
      GlobalIPv6Address: ''
      GlobalIPv6PrefixLen: 0
      HairpinMode: false
      IPAddress: ''
      IPPrefixLen: 0
      IPv6Gateway: ''
      LinkLocalIPv6Address: ''
      LinkLocalIPv6PrefixLen: 0
      MacAddress: ''
      Ports: {}
      SandboxID: ''
      SandboxKey: /run/user/1000/netns/cni-5167f623-e31d-8de6-b0d5-62faa433ead0
    OCIConfigPath: /home/ec2-user/.local/share/containers/storage/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata/config.json
    OCIRuntime: runc
    Path: sleep
    Pod: ''
    ProcessLabel: system_u:system_r:container_t:s0:c307,c682
    ResolvConfPath: /run/user/1000/containers/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata/resolv.conf
    RestartCount: 0
    Rootfs: ''
    State:
      ConmonPid: 71519
      Dead: false
      Error: ''
      ExitCode: 0
      FinishedAt: '0001-01-01T00:00:00Z'
      Healthcheck:
        FailingStreak: 0
        Log: null
        Status: ''
      OOMKilled: false
      OciVersion: 1.0.2-dev
      Paused: false
      Pid: 71530
      Restarting: false
      Running: true
      StartedAt: '2020-11-11T14:33:46.765988285Z'
      Status: running
    StaticDir: /home/ec2-user/.local/share/containers/storage/overlay-containers/c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2/userdata
  invocation:
    module_args:
      annotation: null
      authfile: null
      blkio_weight: null
      blkio_weight_device: null
      cap_add: null
      cap_drop: null
      cgroup_parent: null
      cgroupns: null
      cgroups: null
      cidfile: null
      cmd_args: null
      command: sleep 1d
      conmon_pidfile: null
      cpu_period: null
      cpu_rt_period: null
      cpu_rt_runtime: null
      cpu_shares: null
      cpus: null
      cpuset_cpus: null
      cpuset_mems: null
      debug: false
      detach: true
      detach_keys: null
      device: null
      device_read_bps: null
      device_read_iops: null
      device_write_bps: null
      device_write_iops: null
      dns: null
      dns_option: null
      dns_search: null
      entrypoint: null
      env: null
      env_file: null
      env_host: null
      etc_hosts: null
      executable: podman
      expose: null
      force_restart: false
      gidmap: null
      group_add: null
      healthcheck: null
      healthcheck_interval: null
      healthcheck_retries: null
      healthcheck_start_period: null
      healthcheck_timeout: null
      hostname: null
      http_proxy: null
      image: alpine:latest
      image_strict: false
      image_volume: null
      init: null
      init_path: null
      interactive: null
      ip: null
      ipc: null
      kernel_memory: null
      label: null
      label_file: null
      log_driver: journald
      log_opt: tag=XXX
      memory: null
      memory_reservation: null
      memory_swap: null
      memory_swappiness: null
      mount: null
      name: test1
      network:
      - bridge
      no_hosts: null
      oom_kill_disable: null
      oom_score_adj: null
      pid: null
      pids_limit: null
      pod: null
      privileged: null
      publish: null
      publish_all: null
      read_only: null
      read_only_tmpfs: null
      recreate: false
      restart_policy: null
      rm: null
      rootfs: null
      security_opt: null
      shm_size: null
      sig_proxy: null
      state: started
      stop_signal: null
      stop_timeout: null
      subgidname: null
      subuidname: null
      sysctl: null
      systemd: null
      tmpfs: null
      tty: null
      uidmap: null
      ulimit: null
      user: null
      userns: null
      uts: null
      volume:
      - test:/opt/test
      volumes_from: null
      workdir: null
  podman_actions:
  - podman rm -f test1
  - podman run --name test1 --network bridge --volume test:/opt/test --log-driver journald --log-opt tag=XXX --detach=True alpine:latest sleep 1d
  stderr: ''
  stderr_lines: <omitted>
  stdout: |-
    c4650884fa375bd67afe6ba0e5cc8852bb2c09ae9fd9c86354aea0fbdd3c04e2
  stdout_lines: <omitted>
TASK [Ensure container is started] ***************************************************************************************************************************************************************************************************
task path: /Users/dermfhen/Documents/research/development/ansible/play-_debug.yml:23
<13.48.204.72> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<13.48.204.72> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe 13.48.204.72 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /tmp `"&& mkdir "` echo /tmp/ansible-tmp-1605105227.113611-16494-153459944160434 `" && echo ansible-tmp-1605105227.113611-16494-153459944160434="` echo /tmp/ansible-tmp-1605105227.113611-16494-153459944160434 `" ) && sleep 0'"'"''
<13.48.204.72> (0, b'ansible-tmp-1605105227.113611-16494-153459944160434=/tmp/ansible-tmp-1605105227.113611-16494-153459944160434\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /Users/dermfhen/Documents/research/development/ansible/.venv/lib/python3.9/site-packages/ansible_collections/containers/podman/plugins/modules/podman_container.py
<13.48.204.72> PUT /Users/dermfhen/.ansible/tmp/ansible-local-16477vv5f9xxn/tmp7f4ntjx1 TO /tmp/ansible-tmp-1605105227.113611-16494-153459944160434/AnsiballZ_podman_container.py
<13.48.204.72> SSH: EXEC sftp -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe '[13.48.204.72]'
<13.48.204.72> (0, b'sftp> put /Users/dermfhen/.ansible/tmp/ansible-local-16477vv5f9xxn/tmp7f4ntjx1 /tmp/ansible-tmp-1605105227.113611-16494-153459944160434/AnsiballZ_podman_container.py\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug3: Sent message fd 3 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . 
-> /home/ec2-user size 0\r\ndebug3: Looking up /Users/dermfhen/.ansible/tmp/ansible-local-16477vv5f9xxn/tmp7f4ntjx1\r\ndebug3: Sent message fd 3 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/tmp/ansible-tmp-1605105227.113611-16494-153459944160434/AnsiballZ_podman_container.py\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 32768 bytes at 0\r\ndebug3: Sent message SSH2_FXP_WRITE I:5 O:32768 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:6 O:65536 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:7 O:98304 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:8 O:131072 S:8087\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 5 32768 bytes at 32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 6 32768 bytes at 65536\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 7 32768 bytes at 98304\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 8 8087 bytes at 131072\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<13.48.204.72> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<13.48.204.72> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe 13.48.204.72 '/bin/sh -c '"'"'chmod u+x /tmp/ansible-tmp-1605105227.113611-16494-153459944160434/ /tmp/ansible-tmp-1605105227.113611-16494-153459944160434/AnsiballZ_podman_container.py && sleep 0'"'"''
<13.48.204.72> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<13.48.204.72> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<13.48.204.72> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe -tt 13.48.204.72 '/bin/sh -c '"'"'/usr/libexec/platform-python /tmp/ansible-tmp-1605105227.113611-16494-153459944160434/AnsiballZ_podman_container.py && sleep 0'"'"''
<13.48.204.72> (0, b'\r\n{"changed": true, "actions": ["recreated test1"], "container": {"Id": "829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf", "Created": "2020-11-11T14:33:59.068423608Z", "Path": "sleep", "Args": ["1d"], "State": {"OciVersion": "1.0.2-dev", "Status": "running", "Running": true, "Paused": false, "Restarting": false, "OOMKilled": false, "Dead": false, "Pid": 71736, "ConmonPid": 71726, "ExitCode": 0, "Error": "", "StartedAt": "2020-11-11T14:33:59.403230597Z", "FinishedAt": "0001-01-01T00:00:00Z", "Healthcheck": {"Status": "", "FailingStreak": 0, "Log": null}}, "Image": "d6e46aa2470df1d32034c6707c8041158b652f38d2a9ae3d7ad7e7532d22ebe0", "ImageName": "docker.io/library/alpine:latest", "Rootfs": "", "Pod": "", "ResolvConfPath": "/run/user/1000/containers/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata/resolv.conf", "HostnamePath": "/run/user/1000/containers/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata/hostname", "HostsPath": "/run/user/1000/containers/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata/hosts", "StaticDir": "/home/ec2-user/.local/share/containers/storage/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata", "OCIConfigPath": "/home/ec2-user/.local/share/containers/storage/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata/config.json", "OCIRuntime": "runc", "LogPath": "", "LogTag": "XXX", "ConmonPidFile": "/run/user/1000/containers/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata/conmon.pid", "Name": "test1", "RestartCount": 0, "Driver": "overlay", "MountLabel": "system_u:object_r:container_file_t:s0:c364,c602", "ProcessLabel": "system_u:system_r:container_t:s0:c364,c602", "AppArmorProfile": "", "EffectiveCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", 
"CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "BoundingCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "ExecIDs": [], "GraphDriver": {"Name": "overlay", "Data": {"LowerDir": "/home/ec2-user/.local/share/containers/storage/overlay/ace0eda3e3be35a979cec764a3321b4c7d0b9e4bb3094d20d3ff6782961a8d54/diff", "MergedDir": "/home/ec2-user/.local/share/containers/storage/overlay/a06b95a9cae13d65d1773dcad532ea0391cde7fba8d32551eb72db46b587a26e/merged", "UpperDir": "/home/ec2-user/.local/share/containers/storage/overlay/a06b95a9cae13d65d1773dcad532ea0391cde7fba8d32551eb72db46b587a26e/diff", "WorkDir": "/home/ec2-user/.local/share/containers/storage/overlay/a06b95a9cae13d65d1773dcad532ea0391cde7fba8d32551eb72db46b587a26e/work"}}, "Mounts": [{"Type": "volume", "Name": "test", "Source": "/home/ec2-user/.local/share/containers/storage/volumes/test/_data", "Destination": "/opt/test", "Driver": "local", "Mode": "", "Options": ["nosuid", "nodev", "rbind"], "RW": true, "Propagation": "rprivate"}], "Dependencies": [], "NetworkSettings": {"EndpointID": "", "Gateway": "", "IPAddress": "", "IPPrefixLen": 0, "IPv6Gateway": "", "GlobalIPv6Address": "", "GlobalIPv6PrefixLen": 0, "MacAddress": "", "Bridge": "", "SandboxID": "", "HairpinMode": false, "LinkLocalIPv6Address": "", "LinkLocalIPv6PrefixLen": 0, "Ports": {}, "SandboxKey": "/run/user/1000/netns/cni-d791e67e-0cb1-b85a-8a9a-fde65a1918be"}, "ExitCommand": ["/usr/bin/podman", "--root", "/home/ec2-user/.local/share/containers/storage", "--runroot", "/run/user/1000/containers", "--log-level", "error", "--cgroup-manager", "cgroupfs", "--tmpdir", "/run/user/1000/libpod/tmp", "--runtime", "runc", "--storage-driver", "overlay", "--storage-opt", 
"overlay.mount_program=/usr/bin/fuse-overlayfs", "--events-backend", "file", "container", "cleanup", "829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf"], "Namespace": "", "IsInfra": false, "Config": {"Hostname": "829ae9349e53", "Domainname": "", "User": "", "AttachStdin": false, "AttachStdout": false, "AttachStderr": false, "Tty": false, "OpenStdin": false, "StdinOnce": false, "Env": ["PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "TERM=xterm", "container=podman", "HOSTNAME=829ae9349e53", "HOME=/root"], "Cmd": ["sleep", "1d"], "Image": "docker.io/library/alpine:latest", "Volumes": null, "WorkingDir": "/", "Entrypoint": "", "OnBuild": null, "Labels": null, "Annotations": {"io.container.manager": "libpod", "io.kubernetes.cri-o.Created": "2020-11-11T14:33:59.068423608Z", "io.kubernetes.cri-o.TTY": "false", "io.podman.annotations.autoremove": "FALSE", "io.podman.annotations.init": "FALSE", "io.podman.annotations.privileged": "FALSE", "io.podman.annotations.publish-all": "FALSE", "org.opencontainers.image.stopSignal": "15"}, "StopSignal": 15, "CreateCommand": ["podman", "container", "run", "--name", "test1", "--network", "bridge", "--volume", "test:/opt/test", "--log-driver", "journald", "--log-opt", "tag=XXX", "--detach=True", "alpine:latest", "sleep", "1d"]}, "HostConfig": {"Binds": ["test:/opt/test:rw,rprivate,nosuid,nodev,rbind"], "CgroupMode": "host", "ContainerIDFile": "", "LogConfig": {"Type": "journald", "Config": null}, "NetworkMode": "bridge", "PortBindings": {}, "RestartPolicy": {"Name": "", "MaximumRetryCount": 0}, "AutoRemove": false, "VolumeDriver": "", "VolumesFrom": null, "CapAdd": [], "CapDrop": [], "Dns": [], "DnsOptions": [], "DnsSearch": [], "ExtraHosts": [], "GroupAdd": [], "IpcMode": "private", "Cgroup": "", "Cgroups": "default", "Links": null, "OomScoreAdj": 0, "PidMode": "private", "Privileged": false, "PublishAllPorts": false, "ReadonlyRootfs": false, "SecurityOpt": [], "Tmpfs": {}, "UTSMode": "private", 
"UsernsMode": "", "ShmSize": 65536000, "Runtime": "oci", "ConsoleSize": [0, 0], "Isolation": "", "CpuShares": 0, "Memory": 0, "NanoCpus": 0, "CgroupParent": "", "BlkioWeight": 0, "BlkioWeightDevice": null, "BlkioDeviceReadBps": null, "BlkioDeviceWriteBps": null, "BlkioDeviceReadIOps": null, "BlkioDeviceWriteIOps": null, "CpuPeriod": 0, "CpuQuota": 0, "CpuRealtimePeriod": 0, "CpuRealtimeRuntime": 0, "CpusetCpus": "", "CpusetMems": "", "Devices": [], "DiskQuota": 0, "KernelMemory": 0, "MemoryReservation": 0, "MemorySwap": 0, "MemorySwappiness": 0, "OomKillDisable": false, "PidsLimit": 0, "Ulimits": [], "CpuCount": 0, "CpuPercent": 0, "IOMaximumIOps": 0, "IOMaximumBandwidth": 0}}, "podman_actions": ["podman rm -f test1", "podman run --name test1 --network bridge --volume test:/opt/test --log-driver journald --log-opt tag=XXX --detach=True alpine:latest sleep 1d"], "stdout": "829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf\\n", "stderr": "", "invocation": {"module_args": {"name": "test1", "image": "alpine:latest", "network": ["bridge"], "volume": ["test:/opt/test"], "log_driver": "journald", "log_opt": "tag=XXX", "state": "started", "command": "sleep 1d", "executable": "podman", "detach": true, "debug": false, "force_restart": false, "image_strict": false, "recreate": false, "annotation": null, "authfile": null, "blkio_weight": null, "blkio_weight_device": null, "cap_add": null, "cap_drop": null, "cgroup_parent": null, "cgroupns": null, "cgroups": null, "cidfile": null, "cmd_args": null, "conmon_pidfile": null, "cpu_period": null, "cpu_rt_period": null, "cpu_rt_runtime": null, "cpu_shares": null, "cpus": null, "cpuset_cpus": null, "cpuset_mems": null, "detach_keys": null, "device": null, "device_read_bps": null, "device_read_iops": null, "device_write_bps": null, "device_write_iops": null, "dns": null, "dns_option": null, "dns_search": null, "entrypoint": null, "env": null, "env_file": null, "env_host": null, "etc_hosts": null, "expose": null, 
"gidmap": null, "group_add": null, "healthcheck": null, "healthcheck_interval": null, "healthcheck_retries": null, "healthcheck_start_period": null, "healthcheck_timeout": null, "hostname": null, "http_proxy": null, "image_volume": null, "init": null, "init_path": null, "interactive": null, "ip": null, "ipc": null, "kernel_memory": null, "label": null, "label_file": null, "memory": null, "memory_reservation": null, "memory_swap": null, "memory_swappiness": null, "mount": null, "no_hosts": null, "oom_kill_disable": null, "oom_score_adj": null, "pid": null, "pids_limit": null, "pod": null, "privileged": null, "publish": null, "publish_all": null, "read_only": null, "read_only_tmpfs": null, "restart_policy": null, "rm": null, "rootfs": null, "security_opt": null, "shm_size": null, "sig_proxy": null, "stop_signal": null, "stop_timeout": null, "subgidname": null, "subuidname": null, "sysctl": null, "systemd": null, "tmpfs": null, "tty": null, "uidmap": null, "ulimit": null, "user": null, "userns": null, "uts": null, "volumes_from": null, "workdir": null}}}\r\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to 13.48.204.72 
closed.\r\n')
<13.48.204.72> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<13.48.204.72> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o ServerAliveInterval=30s -o ServerAliveCountMax=20 -o StrictHostKeyChecking=no -o 'IdentityFile="/Users/dermfhen/.ssh/dermfhen_aws"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="ec2-user"' -o ConnectTimeout=10 -o ControlPath=/Users/dermfhen/.ansible/cp/5c6d9d6bbe 13.48.204.72 '/bin/sh -c '"'"'rm -f -r /tmp/ansible-tmp-1605105227.113611-16494-153459944160434/ > /dev/null 2>&1 && sleep 0'"'"''
<13.48.204.72> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/dermfhen/.ssh/config\r\ndebug1: /Users/dermfhen/.ssh/config line 31: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 13.48.204.72 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 16448\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
changed: [podman-dev-01] => changed=true
actions:
- recreated test1
container:
AppArmorProfile: ''
Args:
- 1d
BoundingCaps:
- CAP_AUDIT_WRITE
- CAP_CHOWN
- CAP_DAC_OVERRIDE
- CAP_FOWNER
- CAP_FSETID
- CAP_KILL
- CAP_MKNOD
- CAP_NET_BIND_SERVICE
- CAP_NET_RAW
- CAP_SETFCAP
- CAP_SETGID
- CAP_SETPCAP
- CAP_SETUID
- CAP_SYS_CHROOT
Config:
Annotations:
io.container.manager: libpod
io.kubernetes.cri-o.Created: '2020-11-11T14:33:59.068423608Z'
io.kubernetes.cri-o.TTY: 'false'
io.podman.annotations.autoremove: 'FALSE'
io.podman.annotations.init: 'FALSE'
io.podman.annotations.privileged: 'FALSE'
io.podman.annotations.publish-all: 'FALSE'
org.opencontainers.image.stopSignal: '15'
AttachStderr: false
AttachStdin: false
AttachStdout: false
Cmd:
- sleep
- 1d
CreateCommand:
- podman
- container
- run
- --name
- test1
- --network
- bridge
- --volume
- test:/opt/test
- --log-driver
- journald
- --log-opt
- tag=XXX
- --detach=True
- alpine:latest
- sleep
- 1d
Domainname: ''
Entrypoint: ''
Env:
- PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
- TERM=xterm
- container=podman
- HOSTNAME=829ae9349e53
- HOME=/root
Hostname: 829ae9349e53
Image: docker.io/library/alpine:latest
Labels: null
OnBuild: null
OpenStdin: false
StdinOnce: false
StopSignal: 15
Tty: false
User: ''
Volumes: null
WorkingDir: /
ConmonPidFile: /run/user/1000/containers/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata/conmon.pid
Created: '2020-11-11T14:33:59.068423608Z'
Dependencies: []
Driver: overlay
EffectiveCaps:
- CAP_AUDIT_WRITE
- CAP_CHOWN
- CAP_DAC_OVERRIDE
- CAP_FOWNER
- CAP_FSETID
- CAP_KILL
- CAP_MKNOD
- CAP_NET_BIND_SERVICE
- CAP_NET_RAW
- CAP_SETFCAP
- CAP_SETGID
- CAP_SETPCAP
- CAP_SETUID
- CAP_SYS_CHROOT
ExecIDs: []
ExitCommand:
- /usr/bin/podman
- --root
- /home/ec2-user/.local/share/containers/storage
- --runroot
- /run/user/1000/containers
- --log-level
- error
- --cgroup-manager
- cgroupfs
- --tmpdir
- /run/user/1000/libpod/tmp
- --runtime
- runc
- --storage-driver
- overlay
- --storage-opt
- overlay.mount_program=/usr/bin/fuse-overlayfs
- --events-backend
- file
- container
- cleanup
- 829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf
GraphDriver:
Data:
LowerDir: /home/ec2-user/.local/share/containers/storage/overlay/ace0eda3e3be35a979cec764a3321b4c7d0b9e4bb3094d20d3ff6782961a8d54/diff
MergedDir: /home/ec2-user/.local/share/containers/storage/overlay/a06b95a9cae13d65d1773dcad532ea0391cde7fba8d32551eb72db46b587a26e/merged
UpperDir: /home/ec2-user/.local/share/containers/storage/overlay/a06b95a9cae13d65d1773dcad532ea0391cde7fba8d32551eb72db46b587a26e/diff
WorkDir: /home/ec2-user/.local/share/containers/storage/overlay/a06b95a9cae13d65d1773dcad532ea0391cde7fba8d32551eb72db46b587a26e/work
Name: overlay
HostConfig:
AutoRemove: false
Binds:
- test:/opt/test:rw,rprivate,nosuid,nodev,rbind
BlkioDeviceReadBps: null
BlkioDeviceReadIOps: null
BlkioDeviceWriteBps: null
BlkioDeviceWriteIOps: null
BlkioWeight: 0
BlkioWeightDevice: null
CapAdd: []
CapDrop: []
Cgroup: ''
CgroupMode: host
CgroupParent: ''
Cgroups: default
ConsoleSize:
- 0
- 0
ContainerIDFile: ''
CpuCount: 0
CpuPercent: 0
CpuPeriod: 0
CpuQuota: 0
CpuRealtimePeriod: 0
CpuRealtimeRuntime: 0
CpuShares: 0
CpusetCpus: ''
CpusetMems: ''
Devices: []
DiskQuota: 0
Dns: []
DnsOptions: []
DnsSearch: []
ExtraHosts: []
GroupAdd: []
IOMaximumBandwidth: 0
IOMaximumIOps: 0
IpcMode: private
Isolation: ''
KernelMemory: 0
Links: null
LogConfig:
Config: null
Type: journald
Memory: 0
MemoryReservation: 0
MemorySwap: 0
MemorySwappiness: 0
NanoCpus: 0
NetworkMode: bridge
OomKillDisable: false
OomScoreAdj: 0
PidMode: private
PidsLimit: 0
PortBindings: {}
Privileged: false
PublishAllPorts: false
ReadonlyRootfs: false
RestartPolicy:
MaximumRetryCount: 0
Name: ''
Runtime: oci
SecurityOpt: []
ShmSize: 65536000
Tmpfs: {}
UTSMode: private
Ulimits: []
UsernsMode: ''
VolumeDriver: ''
VolumesFrom: null
HostnamePath: /run/user/1000/containers/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata/hostname
HostsPath: /run/user/1000/containers/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata/hosts
Id: 829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf
Image: d6e46aa2470df1d32034c6707c8041158b652f38d2a9ae3d7ad7e7532d22ebe0
ImageName: docker.io/library/alpine:latest
IsInfra: false
LogPath: ''
LogTag: XXX
MountLabel: system_u:object_r:container_file_t:s0:c364,c602
Mounts:
- Destination: /opt/test
Driver: local
Mode: ''
Name: test
Options:
- nosuid
- nodev
- rbind
Propagation: rprivate
RW: true
Source: /home/ec2-user/.local/share/containers/storage/volumes/test/_data
Type: volume
Name: test1
Namespace: ''
NetworkSettings:
Bridge: ''
EndpointID: ''
Gateway: ''
GlobalIPv6Address: ''
GlobalIPv6PrefixLen: 0
HairpinMode: false
IPAddress: ''
IPPrefixLen: 0
IPv6Gateway: ''
LinkLocalIPv6Address: ''
LinkLocalIPv6PrefixLen: 0
MacAddress: ''
Ports: {}
SandboxID: ''
SandboxKey: /run/user/1000/netns/cni-d791e67e-0cb1-b85a-8a9a-fde65a1918be
OCIConfigPath: /home/ec2-user/.local/share/containers/storage/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata/config.json
OCIRuntime: runc
Path: sleep
Pod: ''
ProcessLabel: system_u:system_r:container_t:s0:c364,c602
ResolvConfPath: /run/user/1000/containers/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata/resolv.conf
RestartCount: 0
Rootfs: ''
State:
ConmonPid: 71726
Dead: false
Error: ''
ExitCode: 0
FinishedAt: '0001-01-01T00:00:00Z'
Healthcheck:
FailingStreak: 0
Log: null
Status: ''
OOMKilled: false
OciVersion: 1.0.2-dev
Paused: false
Pid: 71736
Restarting: false
Running: true
StartedAt: '2020-11-11T14:33:59.403230597Z'
Status: running
StaticDir: /home/ec2-user/.local/share/containers/storage/overlay-containers/829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf/userdata
invocation:
module_args:
annotation: null
authfile: null
blkio_weight: null
blkio_weight_device: null
cap_add: null
cap_drop: null
cgroup_parent: null
cgroupns: null
cgroups: null
cidfile: null
cmd_args: null
command: sleep 1d
conmon_pidfile: null
cpu_period: null
cpu_rt_period: null
cpu_rt_runtime: null
cpu_shares: null
cpus: null
cpuset_cpus: null
cpuset_mems: null
debug: false
detach: true
detach_keys: null
device: null
device_read_bps: null
device_read_iops: null
device_write_bps: null
device_write_iops: null
dns: null
dns_option: null
dns_search: null
entrypoint: null
env: null
env_file: null
env_host: null
etc_hosts: null
executable: podman
expose: null
force_restart: false
gidmap: null
group_add: null
healthcheck: null
healthcheck_interval: null
healthcheck_retries: null
healthcheck_start_period: null
healthcheck_timeout: null
hostname: null
http_proxy: null
image: alpine:latest
image_strict: false
image_volume: null
init: null
init_path: null
interactive: null
ip: null
ipc: null
kernel_memory: null
label: null
label_file: null
log_driver: journald
log_opt: tag=XXX
memory: null
memory_reservation: null
memory_swap: null
memory_swappiness: null
mount: null
name: test1
network:
- bridge
no_hosts: null
oom_kill_disable: null
oom_score_adj: null
pid: null
pids_limit: null
pod: null
privileged: null
publish: null
publish_all: null
read_only: null
read_only_tmpfs: null
recreate: false
restart_policy: null
rm: null
rootfs: null
security_opt: null
shm_size: null
sig_proxy: null
state: started
stop_signal: null
stop_timeout: null
subgidname: null
subuidname: null
sysctl: null
systemd: null
tmpfs: null
tty: null
uidmap: null
ulimit: null
user: null
userns: null
uts: null
volume:
- test:/opt/test
volumes_from: null
workdir: null
podman_actions:
- podman rm -f test1
- podman run --name test1 --network bridge --volume test:/opt/test --log-driver journald --log-opt tag=XXX --detach=True alpine:latest sleep 1d
stderr: ''
stderr_lines: <omitted>
stdout: |-
829ae9349e539924591f3a2d0c5a0315d699f0db0bf6235738921b2d4a465ddf
stdout_lines: <omitted>
META: ran handlers
META: ran handlers
PLAY RECAP ***************************************************************************************************************************************************************************************************************************
podman-dev-01 : ok=2 changed=2 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Additional environment details (AWS, VirtualBox, physical, etc.):
Target OS is: Red Hat Enterprise Linux release 8.3 (Ootpa)
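A rough sketch of one way the volume options comparison could be made robust (illustrative only, not the module's actual code): parse the `options` list into a dict before comparing, so the requested options and the ones reported by `podman volume inspect` compare equal regardless of order or formatting.

```python
# Illustrative only (not the module's actual code): parse a podman volume
# options list like ["device=/dev/nvme3n1"] into a dict so requested and
# inspected options can be compared order- and whitespace-insensitively.
def parse_options(options):
    parsed = {}
    for item in options:
        key, sep, value = item.strip().partition("=")
        # Flag-style options without "=" are stored with a None value.
        parsed[key] = value if sep else None
    return parsed


# Two differently ordered option lists compare equal after parsing:
a = parse_options(["device=/dev/nvme3n1", "o=bind"])
b = parse_options(["o=bind", "device=/dev/nvme3n1"])
# a == b
```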
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
Steps to reproduce the issue:
podman run --rm -it centos:8
ansible -vvi a935777de543, -c podman -m setup all
Describe the results you received:
Failed to mount container a935777de543: b'Error: cannot mount using driver overlay in rootless mode'
[WARNING]: Unhandled error in Python interpreter discovery for host a935777de543: Expecting value: line 1 column 1
(char 0)
a935777de543 | FAILED! => {
"msg": "Failed to set execute bit on remote files (rc: 1, err: chmod: cannot access '/root/.ansible/tmp/ansible-tmp-1588529245.4183488-80743-226755747598850/AnsiballZ_setup.py': No such file or directory\nError: non zero exit code: 1: OCI runtime error\n)"
}
Describe the results you expected:
Ansible should connect to the host.
If this is not supposed to work, the error message should be improved. Without extra verbosity it fails with the message about the temp file in /root. I guess the failure to mount the container should be a hard error instead of a warning.
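A minimal sketch of that suggestion (the names here are illustrative, not the connection plugin's actual code): raise on a failed `podman mount` instead of warning, so the real cause surfaces immediately.

```python
# Sketch only: treat a failed `podman mount` as a hard error rather than
# a warning. run_cmd is any callable that runs a command and returns
# (rc, stdout, stderr) — here it stands in for the plugin's runner.
class PodmanMountError(Exception):
    pass


def mount_container(run_cmd, container_id):
    """Mount a container's filesystem; raise if podman reports an error."""
    rc, stdout, stderr = run_cmd(["podman", "mount", container_id])
    if rc != 0:
        raise PodmanMountError(
            "Failed to mount container %s: %s" % (container_id, stderr.strip())
        )
    return stdout.strip()
```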
Additional information you deem important (e.g. issue happens only occasionally):
Output of ansible --version:
ansible 2.9.7
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/till/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.2 (default, Feb 28 2020, 00:00:00) [GCC 10.0.1 20200216 (Red Hat 10.0.1-0.8)]
Output of podman version:
1.8.2
Output of podman info --debug:
debug:
compiler: gc
git commit: ""
go version: go1.14
podman version: 1.8.2
host:
BuildahVersion: 1.14.3
CgroupVersion: v2
Conmon:
package: conmon-2.0.15-1.fc32.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.15, commit: 33da5ef83bf2abc7965fc37980a49d02fdb71826'
Distribution:
distribution: fedora
version: "32"
IDMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 2131616
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 2131616
size: 65536
MemFree: 1978478592
MemTotal: 8217530368
OCIRuntime:
name: crun
package: crun-0.13-2.fc32.x86_64
path: /usr/bin/crun
version: |-
crun version 0.13
commit: e79e4de4ac16da0ce48777afb72c6241de870525
spec: 1.0.0
+SYSTEMD +SELINUX +APPARMOR +CAP +SECCOMP +EBPF +YAJL
SwapFree: 8321495040
SwapTotal: 8321495040
arch: amd64
cpus: 4
eventlogger: journald
hostname: excalibur
kernel: 5.6.7-300.fc32.x86_64
os: linux
rootless: true
slirp4netns:
Executable: /usr/bin/slirp4netns
Package: slirp4netns-1.0.0-1.fc32.x86_64
Version: |-
slirp4netns version 1.0.0
commit: a3be729152a33e692cd28b52f664defbf2e7810a
libslirp: 4.2.0
uptime: 28h 23m 11.46s (Approximately 1.17 days)
registries:
search:
- registry.fedoraproject.org
- registry.access.redhat.com
- registry.centos.org
- docker.io
store:
ConfigFile: /home/till/.config/containers/storage.conf
ContainerStore:
number: 2
GraphDriverName: overlay
GraphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-1.0.0-1.fc32.x86_64
Version: |-
fusermount3 version: 3.9.1
fuse-overlayfs: version 1.0.0
FUSE library version 3.9.1
using FUSE kernel interface version 7.31
GraphRoot: /home/till/.local/share/containers/storage
GraphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
ImageStore:
number: 3
RunRoot: /run/user/1000
VolumePath: /home/till/.local/share/containers/storage/volumes
Package info (e.g. output of rpm -q podman or apt list podman):
podman-1.8.2-2.fc32.x86_64
Playbook you run with ansible (e.g. content of playbook.yaml):
(paste your output here)
Command line and output of ansible run with high verbosity:
ansible 2.9.7
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/till/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.2 (default, Feb 28 2020, 00:00:00) [GCC 10.0.1 20200216 (Red Hat 10.0.1-0.8)]
Using /etc/ansible/ansible.cfg as config file
setting up inventory plugins
Parsed a935777de543, inventory source with host_list plugin
Loading callback plugin minimal of type stdout, v2.0 from /usr/lib/python3.8/site-packages/ansible/plugins/callback/minimal.py
META: ran handlers
<a935777de543> RUN [b'podman', b'mount', b'a935777de543']
STDOUT b''
STDERR b'Error: cannot mount using driver overlay in rootless mode\n'
RC CODE 125
Failed to mount container a935777de543: b'Error: cannot mount using driver overlay in rootless mode'
<a935777de543> RUN [b'podman', b'exec', b'a935777de543', b'/bin/sh', b'-c', b'echo ~ && sleep 0']
STDOUT b'/root\n'
STDERR b''
RC CODE 0
STDOUT b'' STDERR b''
<a935777de543> RUN [b'podman', b'exec', b'a935777de543', b'/bin/sh', b'-c', b'( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir /root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188 && echo ansible-tmp-1588529460.721404-81242-28143645726188="` echo /root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188 `" ) && sleep 0']
STDOUT b'ansible-tmp-1588529460.721404-81242-28143645726188=/root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188\n'
STDERR b''
RC CODE 0
STDOUT b'' STDERR b''
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/namespace.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/ansible_collector.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/basic.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/default_collectors.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/timeout.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/collector.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/parsing/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/_utils.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/validation.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/process.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/parsing/convert_bool.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/parameters.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/sys_info.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/text/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/file.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/_text.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/six/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/pycompat24.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/_collections_compat.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/_json_compat.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/text/converters.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/text/formatters.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/common/collections.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/distro/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/distro/_distro.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/compat.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/python.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/other/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/freebsd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/platform.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/other/ohai.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/hurd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/freebsd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/caps.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/other/facter.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/date_time.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/aix.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/sunos.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/base.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/cmdline.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/pkg_mgr.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/base.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/openbsd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/nvme.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/service_mgr.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/hpux.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/fc_wwn.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/chroot.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/darwin.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/dragonfly.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/ssh_pub_keys.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/netbsd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/dns.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/local.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/linux.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/selinux.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/iscsi.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/freebsd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/aix.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/fips.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/hurd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/sunos.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/sunos.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/env.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/lsb.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/distribution.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/__init__.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/hpux.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/openbsd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/base.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/openbsd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/netbsd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/hpux.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/dragonfly.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/netbsd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/apparmor.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/dragonfly.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/linux.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/system/user.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/linux.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/darwin.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/network/generic_bsd.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/utils.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/sysctl.py
Using module_utils file /usr/lib/python3.8/site-packages/ansible/module_utils/facts/sysctl.py
<a935777de543> Attempting python interpreter discovery
<a935777de543> RUN [b'podman', b'exec', b'a935777de543', b'/bin/sh', b'-c', b"echo PLATFORM; uname; echo FOUND; command -v '/usr/bin/python'; command -v 'python3.7'; command -v 'python3.6'; command -v 'python3.5'; command -v 'python2.7'; command -v 'python2.6'; command -v '/usr/libexec/platform-python'; command -v '/usr/bin/python3'; command -v 'python'; echo ENDFOUND && sleep 0"]
STDOUT b'PLATFORM\nLinux\nFOUND\n/usr/libexec/platform-python\nENDFOUND\n'
STDERR b''
RC CODE 0
STDOUT b'' STDERR b''
<a935777de543> RUN [b'podman', b'exec', b'a935777de543', b'/bin/sh', b'-c', b'/usr/libexec/platform-python && sleep 0']
STDOUT b''
STDERR b''
RC CODE 0
STDOUT b'' STDERR b''
[WARNING]: Unhandled error in Python interpreter discovery for host a935777de543: Expecting value: line 1 column 1
(char 0)
Using module file /usr/lib/python3.8/site-packages/ansible/modules/system/setup.py
<a935777de543> PUT /home/till/.ansible/tmp/ansible-local-81239icyqidip/tmp_54n02qf TO /root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188/AnsiballZ_setup.py
<a935777de543> RUN [b'podman', b'cp', b'/home/till/.ansible/tmp/ansible-local-81239icyqidip/tmp_54n02qf', b'a935777de543:/root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188/AnsiballZ_setup.py']
STDOUT b''
STDERR b'writing file `/sys/fs/cgroup//cgroup.freeze`: Permission denied\nError: `/usr/bin/crun pause a935777de543ade936bfcd412343f3038e538513ce397e5349b3c7eaa17f508e` failed: exit status 1\n'
RC CODE 125
<a935777de543> RUN [b'podman', b'exec', b'a935777de543', b'/bin/sh', b'-c', b'chmod u+x /root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188/ /root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188/AnsiballZ_setup.py && sleep 0']
STDOUT b''
STDERR b"chmod: cannot access '/root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188/AnsiballZ_setup.py': No such file or directory\nError: non zero exit code: 1: OCI runtime error\n"
RC CODE 1
STDOUT b"chmod: cannot access '/root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188/AnsiballZ_setup.py': No such file or directory\nError: non zero exit code: 1: OCI runtime error\n" STDERR b"chmod: cannot access '/root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188/AnsiballZ_setup.py': No such file or directory\nError: non zero exit code: 1: OCI runtime error\n"
<a935777de543> RUN [b'podman', b'exec', b'a935777de543', b'/bin/sh', b'-c', b'rm -f -r /root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188/ > /dev/null 2>&1 && sleep 0']
STDOUT b''
STDERR b''
RC CODE 0
STDOUT b'' STDERR b''
a935777de543 | FAILED! => {
"msg": "Failed to set execute bit on remote files (rc: 1, err: chmod: cannot access '/root/.ansible/tmp/ansible-tmp-1588529460.721404-81242-28143645726188/AnsiballZ_setup.py': No such file or directory\nError: non zero exit code: 1: OCI runtime error\n)"
}
Additional environment details (AWS, VirtualBox, physical, etc.):
Hi,
When testing the podman_container.py module, I noticed it kept recreating my containers somewhat unexpectedly. It turns out that, for the container I was testing with (quay.io/prometheus/prometheus), I needed to set the workdir
argument, since the module's default was different from what the container used. To find this I had to print out diff_func from the is_different function to see which argument I needed to add. I think it may be useful to include which arguments the module sees as different in the return values, for easier debugging.
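Something along these lines could work (an illustrative sketch, not the module's actual implementation): collect the names of the parameters the comparison considers changed and report them in the module's return values.

```python
# Illustrative sketch (not the module's actual code): return the names of
# requested parameters whose actual values differ, so the module can
# report *why* it decided to recreate a container.
def changed_params(requested, actual):
    """Return sorted names of requested parameters that differ on the host."""
    return sorted(
        name for name, value in requested.items()
        if value is not None and actual.get(name) != value
    )


# Example: only the working directory differs from the module default,
# as with the prometheus container described above (values illustrative):
diffs = changed_params({"workdir": "/prometheus", "user": None},
                       {"workdir": "/", "user": "root"})
# diffs == ["workdir"]
```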
Thanks,
David
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
I see you can install the collection using pip since #100. Great! I use https://python-poetry.org/ to handle my dependencies, but Ansible doesn't seem to see the Podman collection when it is pip-installed in the venv.
Steps to reproduce the issue:
python3 -m venv .venv
poetry env use .venv/bin/python
poetry init
poetry add ansible
poetry add 'git+https://github.com/containers/[email protected]'
poetry run ansible-doc containers.podman.podman_container
Describe the results you received:
[WARNING]: module containers.podman.podman_container not found in: /var/home/yajo/.ansible/plugins/modules:/usr/share/ansible/plugins/modules:/var/home/yajo/mydevel/mando/.venv/lib64/python3.8/site-
packages/ansible/modules
Describe the results you expected:
> PODMAN_CONTAINER (/var/home/yajo/mydevel/mando/collections/ansible_collections/containers/podman/plugins/modules/podman_container.py)
Start, stop, restart and manage Podman containers
[...]
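One possible workaround (untested here, and the venv path below is just an example): Ansible only searches its configured collections paths, not the venv's site-packages, so pointing the collections path at the directory that contains the pip-installed ansible_collections tree in ansible.cfg might make the collection visible:

```ini
; Hypothetical workaround — the venv path is an example, adjust to yours
[defaults]
collections_paths = .venv/lib64/python3.8/site-packages:~/.ansible/collections
```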
Additional information you deem important (e.g. issue happens only occasionally):
Output of ansible --version:
ansible 2.9.11
config file = /var/home/yajo/mydevel/mando/ansible.cfg
configured module search path = ['/var/home/yajo/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /var/home/yajo/mydevel/mando/.venv/lib64/python3.8/site-packages/ansible
executable location = /var/home/yajo/mydevel/mando/.venv/bin/ansible
python version = 3.8.6 (default, Sep 25 2020, 00:00:00) [GCC 10.2.1 20200723 (Red Hat 10.2.1-1)]
Output of podman version:
podman version 2.1.1
Output of podman info --debug:
host:
arch: amd64
buildahVersion: 1.16.1
cgroupManager: cgroupfs
cgroupVersion: v1
conmon:
package: conmon-2.0.21-2.fc32.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.21, commit: 81d18b6c3ffc266abdef7ca94c1450e669a6a388'
cpus: 12
distribution:
distribution: fedora
version: "32"
eventLogger: journald
hostname: yajolap-tecnativa-com
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 5.8.16-200.fc32.x86_64
linkmode: dynamic
memFree: 3325992960
memTotal: 16616603648
ociRuntime:
name: runc
package: runc-1.0.0-144.dev.gite6555cc.fc32.x86_64
path: /usr/bin/runc
version: |-
runc version 1.0.0-rc10+dev
commit: fbdbaf85ecbc0e077f336c03062710435607dbf1
spec: 1.0.1-dev
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: slirp4netns-1.1.4-1.fc32.x86_64
version: |-
slirp4netns version 1.1.4
commit: b66ffa8e262507e37fca689822d23430f3357fe8
libslirp: 4.3.1
SLIRP_CONFIG_VERSION_MAX: 2
swapFree: 8373198848
swapTotal: 8376020992
uptime: 18h 16m 52.25s (Approximately 0.75 days)
registries:
search:
- registry.fedoraproject.org
- registry.access.redhat.com
- registry.centos.org
- docker.io
store:
configFile: /var/home/yajo/.config/containers/storage.conf
containerStore:
number: 2
paused: 0
running: 0
stopped: 2
graphDriverName: overlay
graphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-1.2.0-1.fc32.x86_64
Version: |-
fusermount3 version: 3.9.1
fuse-overlayfs: version 1.1.0
FUSE library version 3.9.1
using FUSE kernel interface version 7.31
graphRoot: /var/home/yajo/.local/share/containers/storage
graphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 29
runRoot: /run/user/1000/containers
volumePath: /var/home/yajo/.local/share/containers/storage/volumes
version:
APIVersion: 2.0.0
Built: 1601494271
BuiltTime: Wed Sep 30 20:31:11 2020
GitCommit: ""
GoVersion: go1.14.9
OsArch: linux/amd64
Version: 2.1.1
Package info (e.g. output of rpm -q podman or apt list podman):
podman-2.1.1-7.fc32.x86_64
Additional environment details (AWS, VirtualBox, physical, etc.):
physical
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
When creating a Podman container with UDP ports using containers.podman.podman_container, it is always recreated, even when it already exists on the target in the expected state.
Steps to reproduce the issue:
Use podman_container to deploy a container with UDP ports
Re-run the playbook
Describe the results you received:
The container is re-created
Describe the results you expected:
The container should be left alone
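A hypothetical illustration of where idempotency can break here (not the module's actual code): podman inspect reports published ports with an explicit protocol suffix, so a port mapping must be compared in a normalized form rather than as the raw string the playbook supplied.

```python
# Illustrative only: normalize a "host:container[/proto]" publish spec
# into a tuple so "8080:80" and "8080:80/tcp" compare equal, while a
# /udp mapping stays distinct from its /tcp counterpart.
def normalize_port(spec):
    """Split a publish spec into a comparable (host, container, proto) tuple."""
    host, _, rest = spec.partition(":")
    container, _, proto = rest.partition("/")
    # Podman defaults to TCP when no protocol is given.
    return (host, container, (proto or "tcp").lower())


# Same mapping, different spellings:
# normalize_port("8080:80") == normalize_port("8080:80/tcp")
```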
Additional information you deem important (e.g. issue happens only occasionally):
Output of ansible --version:
ansible 2.9.10
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/xdbob/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.3 (default, May 17 2020, 18:15:42) [GCC 10.1.0]
Output of podman version:
Version: 1.9.3
RemoteAPI Version: 1
Go Version: go1.14.2
OS/Arch: linux/arm64
Output of podman info --debug:
debug:
compiler: gc
gitCommit: ""
goVersion: go1.14.2
podmanVersion: 1.9.3
host:
arch: arm64
buildahVersion: 1.14.9
cgroupVersion: v2
conmon:
package: conmon-2.0.18-1.fc32.aarch64
path: /usr/bin/conmon
version: 'conmon version 2.0.18, commit: d524a9da2a836de897ccb260f1afe1cae44f1cb4'
cpus: 4
distribution:
distribution: fedora
version: "32"
eventLogger: file
hostname: paradise
idMappings:
gidmap: null
uidmap: null
kernel: 5.6.19-300.fc32.aarch64
memFree: 28033024
memTotal: 984604672
ociRuntime:
name: crun
package: crun-0.13-2.fc32.aarch64
path: /usr/bin/crun
version: |-
crun version 0.13
commit: e79e4de4ac16da0ce48777afb72c6241de870525
spec: 1.0.0
+SYSTEMD +SELINUX +APPARMOR +CAP +SECCOMP +EBPF +YAJL
os: linux
rootless: false
slirp4netns:
executable: ""
package: ""
version: ""
swapFree: 391966720
swapTotal: 468996096
uptime: 166h 48m 36.49s (Approximately 6.92 days)
registries:
search:
- registry.fedoraproject.org
- registry.access.redhat.com
- registry.centos.org
- docker.io
store:
configFile: /etc/containers/storage.conf
containerStore:
number: 1
paused: 0
running: 1
stopped: 0
graphDriverName: overlay
graphOptions:
overlay.mountopt: nodev,metacopy=on
graphRoot: /var/lib/containers/storage
graphStatus:
Backing Filesystem: xfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "true"
imageStore:
number: 2
runRoot: /var/run/containers/storage
volumePath: /var/lib/containers/storage/volumes
Package info (e.g. output of rpm -q podman or apt list podman):
podman-1.9.3-1.fc32.aarch64
[...snip]
Task information
The task looks like this:
containers.podman.podman_container:
name: unifi-controller
image: 'linuxserver/unifi-controller:latest'
state: present
ports:
- '3478:3478/udp'
- '10001:10001/udp'
- '8080:8080'
- '8443:8443'
- '1900:1900/udp'
volumes:
- '{{ unifi_dir }}:/config'
security_opt:
- 'seccomp=unconfined'
Bug tracking
I've tracked the bug down to diffparam_publish; see the content of the diffs dict from the different method:
{
"before": {
"publish": [
"10001:10001",
"1900:1900",
"3478:3478",
"8080:8080",
"8443:8443"
]
},
"after": {
"publish": [
"10001:10001/udp",
"1900:1900/udp",
"3478:3478/udp",
"8080:8080",
"8443:8443"
]
}
}
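The diff above points at the root cause: the "before" list, rebuilt from the container's inspect data, has lost the /udp suffix, so it can never equal the user-supplied "after" list. A minimal sketch of rebuilding the "before" side from inspect's PortBindings while keeping the protocol (the helper name and handling are illustrative, not the module's actual code):

```python
def publish_from_inspect(port_bindings):
    # Rebuild 'hostport:containerport[/proto]' strings from inspect's
    # HostConfig.PortBindings, keeping non-TCP protocol suffixes so the
    # result matches user-supplied values like '3478:3478/udp'.
    result = []
    for container_port, bindings in port_bindings.items():
        port, proto = container_port.split('/')
        for binding in bindings or []:
            spec = '%s:%s' % (binding.get('HostPort', port), port)
            if proto != 'tcp':
                spec += '/' + proto
            result.append(spec)
    return sorted(result)
```

With that normalization, both sides of the publish diff would carry the same protocol suffixes and the comparison would be idempotent.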
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
containers.podman.podman_container
always recreates the elasticsearch container but the ansible diff (-D) is empty.
It works fine for my other containers (mongoDB, grafana, ...).
I'm using the version 1.0.3 of ansible-podman-collections.
Steps to reproduce the issue:
- name: Update Elasticsearch Container
containers.podman.podman_container:
env:
bootstrap.memory_lock: "true"
cluster.name: my_cluster
discovery.zen.minimum_master_nodes: 1
discovery.zen.ping.unicast.hosts: "10.0.0.4,10.0.0.10"
network.host: 0.0.0.0
network.publish_host: "10.0.0.10"
node.name: "elastic2"
ES_JAVA_OPTS: "-Xms7881m -Xmx7881m"
image: "docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.10"
name: elasticsearch
ports:
- "9200:9200"
- "9300:9300"
state: present
volumes:
- "es-data:/usr/share/elasticsearch/data"
ulimit:
- 'nofile=65535:65535'
- 'memlock=-1:-1'
Describe the results you received:
TASK [elastic : Update Elasticsearch Container] ***********************************************************************************************************
changed: [elastic2]
Describe the results you expected:
TASK [elastic : Update Elasticsearch Container] ***********************************************************************************************************
ok: [elastic2]
Additional information you deem important (e.g. issue happens only occasionally):
Output of ansible --version:
ansible 2.9.6
config file = /home/julien/dev/audriga-update/ansible.cfg
configured module search path = ['/home/julien/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.2 (default, Apr 27 2020, 15:53:34) [GCC 9.3.0]
Output of podman version:
Version: 1.9.3
RemoteAPI Version: 1
Go Version: go1.13.8
OS/Arch: linux/amd64
Output of podman info --debug:
debug:
compiler: gc
gitCommit: ""
goVersion: go1.13.8
podmanVersion: 1.9.3
host:
arch: amd64
buildahVersion: 1.14.9
cgroupVersion: v1
conmon:
package: 'conmon: /usr/libexec/podman/conmon'
path: /usr/libexec/podman/conmon
version: 'conmon version 2.0.16, commit: '
cpus: 2
distribution:
distribution: ubuntu
version: "20.04"
eventLogger: file
hostname: elastic2
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 5.4.0-1015-aws
memFree: 5526822912
memTotal: 16528183296
ociRuntime:
name: runc
package: 'runc: /usr/sbin/runc'
path: /usr/sbin/runc
version: 'runc version spec: 1.0.1-dev'
os: linux
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: 'slirp4netns: /usr/bin/slirp4netns'
version: |-
slirp4netns version 1.0.0
commit: unknown
libslirp: 4.2.0
swapFree: 0
swapTotal: 0
uptime: 153h 2m 4.23s (Approximately 6.38 days)
registries:
search:
- docker.io
- quay.io
store:
configFile: /home/ubuntu/.config/containers/storage.conf
containerStore:
number: 0
paused: 0
running: 0
stopped: 0
graphDriverName: vfs
graphOptions: {}
graphRoot: /home/ubuntu/.local/share/containers/storage
graphStatus: {}
imageStore:
number: 0
runRoot: /run/user/1000/containers
volumePath: /home/ubuntu/.local/share/containers/storage/volumes
Package info (e.g. output of rpm -q podman or apt list podman):
podman/unknown,now 1.9.3~1 amd64 [installed]
Playbook you run with ansible (e.g. content of playbook.yaml):
As specified above
Command line and output of ansible run with high verbosity:
Note: I only included the verbose output for the "Update Elasticsearch Container" task
$ ansible-playbook -D
Using /home/julien/dev/audriga-update/ansible.cfg as config file
PLAY [all] ************************************************************************************************************************************************
TASK [Gathering Facts] ************************************************************************************************************************************
ok: [elastic2.exp]
PLAY [elastic_servers] ************************************************************************************************************************************
TASK [Gathering Facts] ************************************************************************************************************************************
ok: [elastic2.exp]
TASK [common : Check if reboot is required] ***************************************************************************************************************
ok: [elastic2.exp]
TASK [common : Read new versions] *************************************************************************************************************************
ok: [elastic2.exp -> localhost]
TASK [elastic : Set common variables] *********************************************************************************************************************
ok: [elastic2.exp]
TASK [elastic : Recalculate minimum masters if more than two nodes] ***************************************************************************************
skipping: [elastic2.exp]
TASK [elastic : Check current Elasticsearch version] ******************************************************************************************************
ok: [elastic2.exp]
TASK [elastic : Update Elasticsearch Container] ***********************************************************************************************************
changed: [elastic2.exp] => {"actions": ["recreated elasticsearch"], "changed": true, "container": {"AppArmorProfile": "container-default-1.9.3", "Args": ["eswrapper"], "BoundingCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "Config": {"Annotations": {"io.container.manager": "libpod", "io.kubernetes.cri-o.Created": "2020-06-16T15:44:43.817609438Z", "io.kubernetes.cri-o.TTY": "false", "io.podman.annotations.autoremove": "FALSE", "io.podman.annotations.init": "FALSE", "io.podman.annotations.privileged": "FALSE", "io.podman.annotations.publish-all": "FALSE", "org.opencontainers.image.stopSignal": "15"}, "AttachStderr": false, "AttachStdin": false, "AttachStdout": false, "Cmd": ["eswrapper"], "CreateCommand": ["podman", "container", "run", "--name", "elasticsearch", "--env", "bootstrap.memory_lock=true", "--env", "cluster.name=graylog", "--env", "discovery.zen.minimum_master_nodes=1", "--env", "discovery.zen.ping.unicast.hosts=10.0.0.4,10.0.0.10", "--env", "network.host=0.0.0.0", "--env", "network.publish_host=10.0.0.10", "--env", "node.name=elastic2", "--env", "ES_JAVA_OPTS=-Xms7881m -Xmx7881m", "--ulimit", "nofile=65535:65535", "--ulimit", "memlock=-1:-1", "--publish", "9200:9200", "--publish", "9300:9300", "--volume", "es-data:/usr/share/elasticsearch/data", "--detach=True", "docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.10"], "Domainname": "", "Entrypoint": "/usr/local/bin/docker-entrypoint.sh", "Env": ["PATH=/usr/share/elasticsearch/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "TERM=xterm", "HOSTNAME=94ac88929299", "node.name=elastic2", "JAVA_HOME=/opt/jdk-14.0.1+7", "cluster.name=graylog", "discovery.zen.minimum_master_nodes=1", "discovery.zen.ping.unicast.hosts=10.0.0.4,10.0.0.10", "container=podman", "network.publish_host=10.0.0.10", 
"ES_JAVA_OPTS=-Xms7881m -Xmx7881m", "network.host=0.0.0.0", "ELASTIC_CONTAINER=true", "bootstrap.memory_lock=true", "HOME=/root"], "Hostname": "94ac88929299", "Image": "docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.10", "Labels": {"org.label-schema.build-date": "2020-05-28T14:47:19.882936Z", "org.label-schema.license": "Apache-2.0", "org.label-schema.name": "Elasticsearch", "org.label-schema.schema-version": "1.0", "org.label-schema.url": "https://www.elastic.co/products/elasticsearch", "org.label-schema.usage": "https://www.elastic.co/guide/en/elasticsearch/reference/index.html", "org.label-schema.vcs-ref": "537cb22e7ffb62ee541e1af1881d18e3a78e6133", "org.label-schema.vcs-url": "https://github.com/elastic/elasticsearch", "org.label-schema.vendor": "Elastic", "org.label-schema.version": "6.8.10", "org.opencontainers.image.created": "2020-05-28T14:47:19.882936Z", "org.opencontainers.image.documentation": "https://www.elastic.co/guide/en/elasticsearch/reference/index.html", "org.opencontainers.image.licenses": "Apache-2.0", "org.opencontainers.image.revision": "537cb22e7ffb62ee541e1af1881d18e3a78e6133", "org.opencontainers.image.source": "https://github.com/elastic/elasticsearch", "org.opencontainers.image.title": "Elasticsearch", "org.opencontainers.image.url": "https://www.elastic.co/products/elasticsearch", "org.opencontainers.image.vendor": "Elastic", "org.opencontainers.image.version": "6.8.10"}, "OnBuild": null, "OpenStdin": false, "StdinOnce": false, "StopSignal": 15, "Tty": false, "User": "", "Volumes": null, "WorkingDir": "/usr/share/elasticsearch"}, "ConmonPidFile": "/var/run/containers/storage/overlay-containers/94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f/userdata/conmon.pid", "Created": "2020-06-16T15:44:43.817609438Z", "Dependencies": [], "Driver": "overlay", "EffectiveCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", 
"CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "ExecIDs": [], "ExitCommand": ["/usr/bin/podman", "--root", "/var/lib/containers/storage", "--runroot", "/var/run/containers/storage", "--log-level", "error", "--cgroup-manager", "systemd", "--tmpdir", "/var/run/libpod", "--runtime", "runc", "--events-backend", "file", "container", "cleanup", "94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f"], "GraphDriver": {"Data": {"LowerDir": "/var/lib/containers/storage/overlay/8b3ceaceff0e1730ad6dbd172ce546981f9ed44208850d9ba6c8239e67ebcbb2/diff:/var/lib/containers/storage/overlay/94c4471c83f00c50d2f6721481d0b6b5af2fb97b7ba8211397249fc48d015214/diff:/var/lib/containers/storage/overlay/f78b266b760048caaac675559e8afe4a58941a4599eb1cfb7d419cf086f78e53/diff:/var/lib/containers/storage/overlay/ff81995b8d18e7efea828d1b7e355637bad222034750675f342af233d6032258/diff:/var/lib/containers/storage/overlay/7369641e1639af306ffb9095461a0e6b0620d85649cdd2c412dfaa70239b3eae/diff:/var/lib/containers/storage/overlay/d349e29a2cde4fe1b85829be81169f91a9527d7673a70b685ae31caad6c99f41/diff:/var/lib/containers/storage/overlay/edf3aa290fb3c255a84fe836109093fbfeef65c08544f655fad8d6afb53868ba/diff", "MergedDir": "/var/lib/containers/storage/overlay/1491c7cfaca48bf1d664cc2f539902fe4fdd6ad545c7b3dde2506b2e44120b41/merged", "UpperDir": "/var/lib/containers/storage/overlay/1491c7cfaca48bf1d664cc2f539902fe4fdd6ad545c7b3dde2506b2e44120b41/diff", "WorkDir": "/var/lib/containers/storage/overlay/1491c7cfaca48bf1d664cc2f539902fe4fdd6ad545c7b3dde2506b2e44120b41/work"}, "Name": "overlay"}, "HostConfig": {"AutoRemove": false, "Binds": ["es-data:/usr/share/elasticsearch/data:rw,rprivate,nosuid,nodev,rbind"], "BlkioDeviceReadBps": null, "BlkioDeviceReadIOps": null, "BlkioDeviceWriteBps": null, "BlkioDeviceWriteIOps": null, "BlkioWeight": 0, "BlkioWeightDevice": null, "CapAdd": [], "CapDrop": [], "Cgroup": "", "CgroupParent": "", "Cgroups": "default", "ConsoleSize": [0, 0], 
"ContainerIDFile": "", "CpuCount": 0, "CpuPercent": 0, "CpuPeriod": 0, "CpuQuota": 0, "CpuRealtimePeriod": 0, "CpuRealtimeRuntime": 0, "CpuShares": 0, "CpusetCpus": "", "CpusetMems": "", "Devices": [], "DiskQuota": 0, "Dns": [], "DnsOptions": [], "DnsSearch": [], "ExtraHosts": [], "GroupAdd": [], "IOMaximumBandwidth": 0, "IOMaximumIOps": 0, "IpcMode": "", "Isolation": "", "KernelMemory": 0, "Links": null, "LogConfig": {"Config": null, "Type": "k8s-file"}, "Memory": 0, "MemoryReservation": 0, "MemorySwap": 0, "MemorySwappiness": -1, "NanoCpus": 0, "NetworkMode": "default", "OomKillDisable": false, "OomScoreAdj": 0, "PidMode": "", "PidsLimit": 4096, "PortBindings": {"9200/tcp": [{"HostIp": "", "HostPort": "9200"}], "9300/tcp": [{"HostIp": "", "HostPort": "9300"}]}, "Privileged": false, "PublishAllPorts": false, "ReadonlyRootfs": false, "RestartPolicy": {"MaximumRetryCount": 0, "Name": ""}, "Runtime": "oci", "SecurityOpt": [], "ShmSize": 65536000, "Tmpfs": {}, "UTSMode": "", "Ulimits": [{"Hard": 65535, "Name": "RLIMIT_NOFILE", "Soft": 65535}, {"Hard": 4194304, "Name": "RLIMIT_NPROC", "Soft": 4194304}, {"Hard": 18446744073709551615, "Name": "RLIMIT_MEMLOCK", "Soft": 18446744073709551615}], "UsernsMode": "", "VolumeDriver": "", "VolumesFrom": null}, "HostnamePath": "/var/run/containers/storage/overlay-containers/94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f/userdata/hostname", "HostsPath": "/var/run/containers/storage/overlay-containers/94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f/userdata/hosts", "Id": "94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f", "Image": "cd93b3ec0f11a43528f0774860efa7f069197aa480788971e11b4af6d637db01", "ImageName": "docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.10", "IsInfra": false, "LogPath": "/var/lib/containers/storage/overlay-containers/94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f/userdata/ctr.log", "LogTag": "", "MountLabel": "", "Mounts": 
[{"Destination": "/usr/share/elasticsearch/data", "Driver": "local", "Mode": "", "Name": "es-data", "Options": ["nosuid", "nodev", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/es-data/_data", "Type": "volume"}], "Name": "elasticsearch", "Namespace": "", "NetworkSettings": {"Bridge": "", "EndpointID": "", "Gateway": "10.88.0.1", "GlobalIPv6Address": "", "GlobalIPv6PrefixLen": 0, "HairpinMode": false, "IPAddress": "10.88.0.15", "IPPrefixLen": 16, "IPv6Gateway": "", "LinkLocalIPv6Address": "", "LinkLocalIPv6PrefixLen": 0, "MacAddress": "a6:f2:9d:e1:10:85", "Ports": [{"containerPort": 9200, "hostIP": "", "hostPort": 9200, "protocol": "tcp"}, {"containerPort": 9300, "hostIP": "", "hostPort": 9300, "protocol": "tcp"}], "SandboxID": "", "SandboxKey": "/var/run/netns/cni-486c756c-34fd-c21a-f2b8-427d6a8cc3f2"}, "OCIConfigPath": "/var/lib/containers/storage/overlay-containers/94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f/userdata/config.json", "OCIRuntime": "runc", "Path": "/usr/local/bin/docker-entrypoint.sh", "Pod": "", "ProcessLabel": "", "ResolvConfPath": "/var/run/containers/storage/overlay-containers/94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f/userdata/resolv.conf", "RestartCount": 0, "Rootfs": "", "State": {"ConmonPid": 99523, "Dead": false, "Error": "", "ExitCode": 0, "FinishedAt": "0001-01-01T00:00:00Z", "Healthcheck": {"FailingStreak": 0, "Log": null, "Status": ""}, "OOMKilled": false, "OciVersion": "1.0.1-dev", "Paused": false, "Pid": 99553, "Restarting": false, "Running": true, "StartedAt": "2020-06-16T15:44:48.641791352Z", "Status": "running"}, "StaticDir": "/var/lib/containers/storage/overlay-containers/94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f/userdata"}, "podman_actions": ["podman rm -f elasticsearch", "podman run --name elasticsearch --env bootstrap.memory_lock=true --env cluster.name=graylog --env discovery.zen.minimum_master_nodes=1 --env 
discovery.zen.ping.unicast.hosts=10.0.0.4,10.0.0.10 --env network.host=0.0.0.0 --env network.publish_host=10.0.0.10 --env node.name=elastic2 --env ES_JAVA_OPTS=-Xms7881m -Xmx7881m --ulimit nofile=65535:65535 --ulimit memlock=-1:-1 --publish 9200:9200 --publish 9300:9300 --volume es-data:/usr/share/elasticsearch/data --detach=True docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.10"], "stderr": "", "stderr_lines": [], "stdout": "94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f\n", "stdout_lines": ["94ac88929299de7b80799ea00d3dabdbd5f385154f3b0db17be64d3bed1e936f"]}
PLAY RECAP ************************************************************************************************************************************************
elastic2.exp : ok=7 changed=1 unreachable=0 failed=0 skipped=1 rescued=0 ignored=0
Additional environment details (AWS, VirtualBox, physical, etc.):
AWS
My application needs to run from a user account, not from root.
Despite instructing Ansible to use a user account with the buildah connection plugin, it still runs commands as root.
config file = /etc/ansible/ansible.cfg
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.6/site-packages/ansible
executable location = /bin/ansible
python version = 3.6.8 (default, Nov 21 2019, 19:31:34) [GCC 8.3.1 20190507 (Red Hat 8.3.1-4)]
DEFAULT_REMOTE_USER(env: ANSIBLE_REMOTE_USER) = my_user
CentOS Linux release 8.1.1911 (Core)
buildah version 1.11.6 (image-spec 1.0.1-dev, runtime-spec 1.0.1-dev)
- name: start my_app master
command: "/opt/my_company/my_app/bin/my_bin"
working_container ansible_connection=buildah remote_user=my_user ansible_user=my_user
/opt/my_company/my_app/bin/my_bin executed with my_user privileges
TASK [start my_app master] ********************************************************************************************************************
<working_container> RUN [b'buildah', b'mount', b'--', b'working_container']
<working_container> RUN [b'buildah', b'run', b'--', b'working_container', b'/bin/sh', b'-c', b'echo ~my_user && sleep 0']
<working_container> RUN [b'buildah', b'run', b'--', b'working_container', b'/bin/sh', b'-c', b'( umask 77 && mkdir -p "` echo /home/my_user/.ansible/tmp/ansible-tmp-1586627789.45108-92880286661115 `" && echo ansible-tmp-1586627789.45108-92880286661115="` echo /home/my_user/.ansible/tmp/ansible-tmp-1586627789.45108-92880286661115 `" ) && sleep 0']
Using module file /usr/lib/python3.6/site-packages/ansible/modules/commands/command.py
<working_container> PUT /root/.ansible/tmp/ansible-local-12264nv8oj4_2/tmpw659a3mr TO /home/my_user/.ansible/tmp/ansible-tmp-1586627789.45108-92880286661115/AnsiballZ_command.py
<working_container> RUN [b'buildah', b'run', b'--', b'working_container', b'/bin/sh', b'-c', b'chmod u+x /home/my_user/.ansible/tmp/ansible-tmp-1586627789.45108-92880286661115/ /home/my_user/.ansible/tmp/ansible-tmp-1586627789.45108-92880286661115/AnsiballZ_command.py && sleep 0']
<working_container> RUN [b'buildah', b'run', b'--', b'working_container', b'/bin/sh', b'-c', b'/usr/bin/python /home/my_user/.ansible/tmp/ansible-tmp-1586627789.45108-92880286661115/AnsiballZ_command.py && sleep 0']
<working_container> RUN [b'buildah', b'run', b'--', b'working_container', b'/bin/sh', b'-c', b'rm -f -r /home/my_user/.ansible/tmp/ansible-tmp-1586627789.45108-92880286661115/ > /dev/null 2>&1 && sleep 0']
<working_container> RUN [b'buildah', b'umount', b'--', b'working_container']
fatal: [working_container]: FAILED! => {
"changed": true,
"cmd": [
"/opt/my_compagny/my_app/bin/master",
],
"delta": "0:00:00.475125",
"end": "2020-04-11 19:56:31.646487",
"invocation": {
"module_args": {
"_raw_params": "/opt/my_compagny/my_app/bin/master",
"_uses_shell": false,
"argv": null,
"chdir": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": true
}
},
"msg": "non-zero return code",
"rc": 127,
"start": "2020-04-11 19:56:31.171362",
"stderr": "master[9]: fatal: {platform} cannot run as root",
"stderr_lines": [
"master[9]: fatal: {platform} cannot run as root"
],
"stdout": "",
"stdout_lines": []
}
Originally posted on ansible GitHub: ansible/ansible#68878
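For what it's worth, buildah run does accept a --user flag, so one possible fix is for the connection plugin to thread the configured remote user through to each buildah run invocation. A rough sketch (the helper name is illustrative, not the plugin's actual code):

```python
def buildah_run_cmd(container, cmd, user=None):
    # Build a 'buildah run' command line; when a user is configured,
    # pass it via --user so the command does not execute as root.
    args = ['buildah', 'run']
    if user:
        args.extend(['--user', user])
    args.append('--')
    args.append(container)
    args.extend(cmd)
    return args
```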
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind feature
Description
Create a module to manage podman networks.
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind feature
Description
It would be great to add a module for managing pods like the container module.
/kind bug
Description
Steps to reproduce the issue:
- name: Build and push image
podman_image:
path: '{{ path }}'
name: '{{ image_name }}'
push: yes
username: '{{ username }}'
password: '{{ password }}'
push_args:
dest: '{{ self_signed_image_registry }}'
validate_certs: no
Describe the results you received:
x509: certificate signed by unknown authority
Additional information you deem important (e.g. issue happens only occasionally):
I think it is necessary to add an else block that appends --tls-verify=false
in the following places.
ansible-podman-collections/plugins/modules/podman_image.py
Lines 565 to 567 in 6f7ab43
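As a sketch of the suggested change, the push-argument construction could always emit an explicit --tls-verify flag derived from validate_certs, instead of only appending it in the true case (the helper below is illustrative, not the module's actual code):

```python
def tls_verify_args(validate_certs):
    # Map the module's validate_certs option to an explicit
    # --tls-verify flag for 'podman push'. None means 'not set':
    # leave the flag out and let podman use its default.
    if validate_certs is None:
        return []
    return ['--tls-verify=%s' % str(bool(validate_certs)).lower()]
```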
Output of ansible --version:
ansible 2.9.9
config file = /home/vagrant/work/ocp4_infraci/ansible.cfg
configured module search path = [u'/home/vagrant/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /home/vagrant/.pyenv/versions/2.7.18/lib/python2.7/site-packages/ansible
executable location = /home/vagrant/.pyenv/versions/2.7.18/bin/ansible
python version = 2.7.18 (default, May 26 2020, 02:41:04) [GCC 8.3.1 20190507 (Red Hat 8.3.1-4)]
Output of podman version:
Version: 1.9.3
RemoteAPI Version: 1
Go Version: go1.12.12
OS/Arch: linux/amd64
Output of podman info --debug:
debug:
compiler: gc
gitCommit: ""
goVersion: go1.12.12
podmanVersion: 1.9.3
host:
arch: amd64
buildahVersion: 1.14.9
cgroupVersion: v1
conmon:
package: conmon-2.0.16-3.1.el8.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.16, commit: c2e3f3582e7da597a2e0f5ed9e028b415550d02b'
cpus: 1
distribution:
distribution: '"centos"'
version: "8"
eventLogger: file
hostname: localhost.localdomain
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 4.18.0-147.8.1.el8_1.x86_64
memFree: 95109120
memTotal: 500408320
ociRuntime:
name: runc
package: containerd.io-1.2.13-3.2.el7.x86_64
path: /usr/bin/runc
version: |-
runc version 1.0.0-rc10
commit: dc9208a3303feef5b3839f4323d9beb36df0a9dd
spec: 1.0.1-dev
os: linux
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: slirp4netns-1.1.0-10.2.el8.x86_64
version: |-
slirp4netns version 1.1.0-beta.1
commit: b703dd390bca7d029f8e68a9aca3f35c950dd78c
libslirp: 4.2.0
SLIRP_CONFIG_VERSION_MAX: 2
swapFree: 2076176384
swapTotal: 2147479552
uptime: 9h 1m 4.46s (Approximately 0.38 days)
registries: {}
store:
configFile: /home/vagrant/.config/containers/storage.conf
containerStore:
number: 0
paused: 0
running: 0
stopped: 0
graphDriverName: overlay
graphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-1.0.0-22.2.el8.x86_64
Version: |-
fusermount3 version: 3.2.1
fuse-overlayfs: version 1.0.0
FUSE library version 3.2.1
using FUSE kernel interface version 7.26
graphRoot: /home/vagrant/.local/share/containers/storage
graphStatus:
Backing Filesystem: xfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 2
runRoot: /run/user/1000
volumePath: /home/vagrant/.local/share/containers/storage/volumes
Output of rpm -q podman:
podman-1.9.3-2.1.el8.x86_64
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind feature
Description
The podman connection plugin is showing a lot of messages like
Failed to mount container with CGroups2: empty dir b'/home/till/.local/share/containers/storage/overlay/ce57833ce38f71131609ff6bd6419280bb2da69dc6026ae29a2701562b149d51/merged'
when using verbosity level 1. This seems to be a debug message, since there is no actionable task for the user and everything works as expected. Because verbosity level 1 is useful for other information, it would be nice to raise the required verbosity level, perhaps to 5 ("-vvvvv"), since that is what another message in the same if block uses.
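The proposed change amounts to gating the message behind a higher verbosity threshold. A minimal sketch, assuming level 5 is the right threshold (the function name is illustrative, not the plugin's actual code):

```python
def should_show_mount_warning(verbosity):
    # Only surface the CGroups2 mount-failure message at -vvvvv
    # (verbosity 5) instead of -v (verbosity 1), matching the other
    # debug message in the same branch of the connection plugin.
    REQUIRED_VERBOSITY = 5
    return verbosity >= REQUIRED_VERBOSITY
```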
Steps to reproduce the issue:
More info
Describe the results you received:
Using /home/till/.ansible.cfg as config file
Failed to mount container with CGroups2: empty dir b'/home/till/.local/share/containers/storage/overlay/ce57833ce38f71131609ff6bd6419280bb2da69dc6026ae29a2701562b149d51/merged'
[WARNING]: Unhandled error in Python interpreter discovery for host c8: Expecting value: line 1 column 1 (char 0)
[WARNING]: Platform linux on host c8 is using the discovered Python interpreter at /usr/bin/python, but future installation of another Python interpreter could change this. See
https://docs.ansible.com/ansible/2.9/reference_appendices/interpreter_discovery.html for more information.
c8 | SUCCESS => {
"ansible_facts": {
"ansible_all_ipv4_addresses": [
"192.0.2.1",
[...]
Describe the results you expected:
Using /home/till/.ansible.cfg as config file
[WARNING]: Unhandled error in Python interpreter discovery for host c8: Expecting value: line 1 column 1 (char 0)
[WARNING]: Platform linux on host c8 is using the discovered Python interpreter at /usr/bin/python, but future installation of another Python interpreter could change this. See
https://docs.ansible.com/ansible/2.9/reference_appendices/interpreter_discovery.html for more information.
c8 | SUCCESS => {
"ansible_facts": {
"ansible_all_ipv4_addresses": [
"192.0.2.1",
[...]
Additional information you deem important (e.g. issue happens only occasionally):
Output of ansible --version:
ansible 2.9.10
config file = /home/till/.ansible.cfg
configured module search path = ['/home/till/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.3 (default, May 29 2020, 00:00:00) [GCC 10.1.1 20200507 (Red Hat 10.1.1-1)]
Output of podman version:
Version: 2.0.1
API Version: 1
Go Version: go1.14.3
Built: Thu Jan 1 01:00:00 1970
OS/Arch: linux/amd64
Output of podman info --debug:
Version: 2.0.1
API Version: 1
Go Version: go1.14.3
Built: Thu Jan 1 01:00:00 1970
OS/Arch: linux/amd64
$ podman info --debug
host:
arch: amd64
buildahVersion: 1.15.0
cgroupVersion: v2
conmon:
package: conmon-2.0.18-1.fc32.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.18, commit: 6e8799f576f11f902cd8a8d8b45b2b2caf636a85'
cpus: 4
distribution:
distribution: fedora
version: "32"
eventLogger: file
hostname: caledvwlch
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 5.6.18-300.fc32.x86_64
linkmode: dynamic
memFree: 4088893440
memTotal: 16652611584
ociRuntime:
name: crun
package: crun-0.14-2.fc32.x86_64
path: /usr/bin/crun
version: |-
crun version 0.14
commit: ebc56fc9bcce4b3208bb0079636c80545122bf58
spec: 1.0.0
+SYSTEMD +SELINUX +APPARMOR +CAP +SECCOMP +EBPF +YAJL
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: slirp4netns-1.1.1-1.fc32.x86_64
version: |-
slirp4netns version 1.1.1
commit: bbf27c5acd4356edb97fa639b4e15e0cd56a39d5
libslirp: 4.2.0
SLIRP_CONFIG_VERSION_MAX: 2
swapFree: 15829610496
swapTotal: 17175670784
uptime: 464h 51m 57.33s (Approximately 19.33 days)
registries:
search:
- registry.fedoraproject.org
- registry.access.redhat.com
- registry.centos.org
- docker.io
store:
configFile: /home/till/.config/containers/storage.conf
containerStore:
number: 3
paused: 0
running: 1
stopped: 2
graphDriverName: overlay
graphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-1.1.1-1.fc32.x86_64
Version: |-
fusermount3 version: 3.9.1
fuse-overlayfs: version 1.1.0
FUSE library version 3.9.1
using FUSE kernel interface version 7.31
graphRoot: /home/till/.local/share/containers/storage
graphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 7
runRoot: /run/user/1000/containers
volumePath: /home/till/.local/share/containers/storage/volumes
version:
APIVersion: 1
Built: 0
BuiltTime: Thu Jan 1 01:00:00 1970
GitCommit: ""
GoVersion: go1.14.3
OsArch: linux/amd64
Version: 2.0.1
Package info (e.g. output of rpm -q podman or apt list podman):
podman-2.0.1-1.fc32.x86_64
Playbook you run with ansible (e.g. content of playbook.yaml):
(paste your output here)
Command line and output of ansible run with high verbosity:
(paste your output here)
Additional environment details (AWS, VirtualBox, physical, etc.):
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind feature
Description
Would like the ability to provide arbitrary args to the podman build invocation through the command constructor, or support for more of the args podman supports. In my particular use case, I need to provide ulimit args for RUN statements in a Dockerfile, which podman build supports natively.
Steps to reproduce the issue:
Include a RUN statement in your Dockerfile that requires a ulimit other than the default of 1024.
Use podman build /path --ulimit=nofile=4096:4096 -t imagename
Attempt to find a way to do the same with containers.podman.podman_image.
Describe the results you received:
podman build: works as expected.
containers.podman.podman_image: the podman build args are constructed incrementally from a fixed set of options, with no ability to specify either ulimit or arbitrary build args.
Describe the results you expected:
One of the following:
- containers.podman.podman_image provides a ulimit arg, as a suboption to build or in some other useful way.
- containers.podman.podman_image provides a way to pass arbitrary args on invocation of podman build, allowing me to use the --ulimit arg.
Additional information you deem important (e.g. issue happens only occasionally):
I can implement, but would like to know what the project's preferred implementation would be.
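A rough sketch of what such a pass-through could look like in the module's command construction. This is only an illustration of the second proposal: build_command and the extra_args key are hypothetical names, not the module's real API.

```python
# Hypothetical sketch: let callers append arbitrary flags to the
# `podman build` invocation alongside the modeled options.
def build_command(path, tag, build_opts):
    args = ['podman', 'build', path, '-t', tag]
    if build_opts.get('format'):
        args.extend(['--format', build_opts['format']])
    # pass-through for flags the module does not model explicitly,
    # e.g. '--ulimit=nofile=4096:4096'
    for extra in build_opts.get('extra_args', []):
        args.append(extra)
    return args
```

With a hypothetical extra_args suboption, the ulimit case above would just be extra_args: ['--ulimit=nofile=4096:4096'] in the build dict.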
Output of ansible --version:
$ ansible --version
ansible 2.9.10
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/james/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/james/.local/lib/python3.8/site-packages/ansible
executable location = /home/james/.local/bin/ansible
python version = 3.8.3 (default, May 29 2020, 00:00:00) [GCC 10.1.1 20200507 (Red Hat 10.1.1-1)]
Output of podman version:
$ podman version
Version: 1.9.3
RemoteAPI Version: 1
Go Version: go1.14.2
OS/Arch: linux/amd64
Output of podman info --debug:
$ podman info --debug
debug:
compiler: gc
gitCommit: ""
goVersion: go1.14.2
podmanVersion: 1.9.3
host:
arch: amd64
buildahVersion: 1.14.9
cgroupVersion: v2
conmon:
package: conmon-2.0.18-1.fc32.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.18, commit: 6e8799f576f11f902cd8a8d8b45b2b2caf636a85'
cpus: 32
distribution:
distribution: fedora
version: "32"
eventLogger: file
hostname: ws.jharmison.com
idMappings:
gidmap:
- container_id: 0
host_id: 752000001
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 752000001
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 5.6.19-300.fc32.x86_64
memFree: 18120613888
memTotal: 67414728704
ociRuntime:
name: crun
package: crun-0.13-2.fc32.x86_64
path: /usr/bin/crun
version: |-
crun version 0.13
commit: e79e4de4ac16da0ce48777afb72c6241de870525
spec: 1.0.0
+SYSTEMD +SELINUX +APPARMOR +CAP +SECCOMP +EBPF +YAJL
os: linux
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: slirp4netns-1.1.1-1.fc32.x86_64
version: |-
slirp4netns version 1.1.1
commit: bbf27c5acd4356edb97fa639b4e15e0cd56a39d5
libslirp: 4.2.0
SLIRP_CONFIG_VERSION_MAX: 2
swapFree: 34359209984
swapTotal: 34359734272
uptime: 25h 45m 22.25s (Approximately 1.04 days)
registries:
search:
- registry.fedoraproject.org
- registry.redhat.io
- registry.access.redhat.com
- registry.centos.org
- docker.io
store:
configFile: /home/james/.config/containers/storage.conf
containerStore:
number: 3
paused: 0
running: 1
stopped: 2
graphDriverName: overlay
graphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-1.1.1-1.fc32.x86_64
Version: |-
fusermount3 version: 3.9.1
fuse-overlayfs: version 1.1.0
FUSE library version 3.9.1
using FUSE kernel interface version 7.31
graphRoot: /home/james/.local/share/containers/storage
graphStatus:
Backing Filesystem: xfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 298
runRoot: /run/user/752000001/containers
volumePath: /home/james/.local/share/containers/storage/volumes
Package info (e.g. output of rpm -q podman or apt list podman):
$ rpm -q podman
podman-1.9.3-1.fc32.x86_64
Playbook you run with ansible (e.g. content of playbook.yaml):
---
- hosts: localhost
tasks:
- containers.podman.podman_image:
name: thing
path: '.'
build:
format: docker
Command line and output of ansible run with high verbosity:
ansible-playbook 2.9.10
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/james/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/james/.local/lib/python3.8/site-packages/ansible
executable location = /home/james/.local/bin/ansible-playbook
python version = 3.8.3 (default, May 29 2020, 00:00:00) [GCC 10.1.1 20200507 (Red Hat 10.1.1-1)]
Using /etc/ansible/ansible.cfg as config file
host_list declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
script declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
auto declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Parsed /etc/ansible/hosts inventory source with ini plugin
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
PLAYBOOK: playbook.yml **********************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************
1 plays in playbook.yml
PLAY [localhost] ****************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************
TASK [Gathering Facts] **********************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************
task path: /home/james/Projects/ansible-for-devops/example-ulimit/playbook.yml:2
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: james
<127.0.0.1> EXEC /bin/sh -c 'echo ~james && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/james/.ansible/tmp `"&& mkdir /home/james/.ansible/tmp/ansible-tmp-1593620553.5421648-251491-127085850747893 && echo ansible-tmp-1593620553.5421648-251491-127085850747893="` echo /home/james/.ansible/tmp/ansible-tmp-1593620553.5421648-251491-127085850747893 `" ) && sleep 0'
Using module file /home/james/.local/lib/python3.8/site-packages/ansible/modules/system/setup.py
<127.0.0.1> PUT /home/james/.ansible/tmp/ansible-local-251486cjccp_hj/tmposlz8_w1 TO /home/james/.ansible/tmp/ansible-tmp-1593620553.5421648-251491-127085850747893/AnsiballZ_setup.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /home/james/.ansible/tmp/ansible-tmp-1593620553.5421648-251491-127085850747893/ /home/james/.ansible/tmp/ansible-tmp-1593620553.5421648-251491-127085850747893/AnsiballZ_setup.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /home/james/.ansible/tmp/ansible-tmp-1593620553.5421648-251491-127085850747893/AnsiballZ_setup.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /home/james/.ansible/tmp/ansible-tmp-1593620553.5421648-251491-127085850747893/ > /dev/null 2>&1 && sleep 0'
ok: [localhost]
META: ran handlers
TASK [containers.podman.podman_image] *******************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************
task path: /home/james/Projects/ansible-for-devops/example-ulimit/playbook.yml:4
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: james
<127.0.0.1> EXEC /bin/sh -c 'echo ~james && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/james/.ansible/tmp `"&& mkdir /home/james/.ansible/tmp/ansible-tmp-1593620554.172542-251582-97793637764110 && echo ansible-tmp-1593620554.172542-251582-97793637764110="` echo /home/james/.ansible/tmp/ansible-tmp-1593620554.172542-251582-97793637764110 `" ) && sleep 0'
Using module file /home/james/.ansible/collections/ansible_collections/containers/podman/plugins/modules/podman_image.py
<127.0.0.1> PUT /home/james/.ansible/tmp/ansible-local-251486cjccp_hj/tmpn56_r5p0 TO /home/james/.ansible/tmp/ansible-tmp-1593620554.172542-251582-97793637764110/AnsiballZ_podman_image.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /home/james/.ansible/tmp/ansible-tmp-1593620554.172542-251582-97793637764110/ /home/james/.ansible/tmp/ansible-tmp-1593620554.172542-251582-97793637764110/AnsiballZ_podman_image.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python3 /home/james/.ansible/tmp/ansible-tmp-1593620554.172542-251582-97793637764110/AnsiballZ_podman_image.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /home/james/.ansible/tmp/ansible-tmp-1593620554.172542-251582-97793637764110/ > /dev/null 2>&1 && sleep 0'
fatal: [localhost]: FAILED! => {
"changed": false,
"invocation": {
"module_args": {
"auth_file": null,
"build": {
"annotation": null,
"cache": true,
"force_rm": null,
"format": "docker",
"rm": true,
"volume": null
},
"ca_cert_dir": null,
"executable": "podman",
"force": false,
"name": "thing",
"password": null,
"path": ".",
"pull": true,
"push": false,
"push_args": {
"compress": null,
"dest": null,
"format": null,
"remove_signatures": null,
"sign_by": null,
"transport": null
},
"state": "present",
"tag": "latest",
"username": null,
"validate_certs": true
}
},
"msg": "Failed to build image thing:latest: Error: UNABLE TO DO THING WITH LOW ULIMIT\nError: error building at STEP \"RUN /app/install-script\": error while running runtime: exit status 1\n"
}
PLAY RECAP **********************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************
localhost : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
Additional environment details (AWS, VirtualBox, physical, etc.):
Physical; runs on my host. This is a small example to demonstrate the particulars of my use case; my actual implementation is a complex playbook with many roles, a custom library, etc.
Dockerfile contents:
FROM registry.redhat.io/ubi8/ubi-init
COPY install-script /app/install-script
RUN /app/install-script
CMD ["/bin/bash", "-l"]
install-script contents:
#!/bin/bash
if [ $(ulimit -H -n) -lt 4096 ]; then
echo "Error: UNABLE TO DO THING WITH LOW ULIMIT" >&2
exit 1
fi
echo 'echo "I did the thing."' >> /root/.bashrc
Output of podman build running with the --ulimit arg:
$ podman build . -t thing
STEP 1: FROM registry.redhat.io/ubi8/ubi-init
STEP 2: COPY install-script /app/install-script
--> 05990234291
STEP 3: RUN /app/install-script
Error: UNABLE TO DO THING WITH LOW ULIMIT
Error: error building at STEP "RUN /app/install-script": error while running runtime: exit status 1
$ podman build . -t thing --ulimit=nofile=4096:4096
STEP 1: FROM registry.redhat.io/ubi8/ubi-init
STEP 2: COPY install-script /app/install-script
--> Using cache 0599023429196fc3ceb7c209070de0ff7c5beafd33fea0def632afd41098fe21
STEP 3: RUN /app/install-script
--> 441ed24adae
STEP 4: CMD ["/bin/bash", "-l"]
STEP 5: COMMIT thing
--> 306b5b50c8c
306b5b50c8c32c87e62664290eaf1a6be2205c1611dfc8b8b5854ebc74c3759a
$ podman run -it --rm thing
I did the thing.
[root@bd60e2d7101c /]# exit
logout
Hi,
I am interested in implementing systemd units for containers and especially pods, would you see that as part of podman_container or should that go into a new module?
I already looked into the podman command; I think podman generate systemd needs another output format for it to be easily parseable in Python, but I would welcome any suggestions.
Greetings
Klaas
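If the generated units end up as one concatenated text blob, one way to make them tractable from Python is to cut the output at each [Unit] header. This is a rough sketch under the assumption that the output is plain unit-file text; split_units is a hypothetical helper, not part of podman or the collection.

```python
import re

# Assumption: the input is the concatenated text of one or more
# systemd unit files, each beginning with a [Unit] section header.
def split_units(output):
    starts = [m.start() for m in re.finditer(r'(?m)^\[Unit\]', output)]
    return [output[a:b].strip()
            for a, b in zip(starts, starts[1:] + [len(output)])]
```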
Add podman_network_info module for retrieving network information
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
I am running the playbook below, and every time it runs the container is re-created.
This also applies when recreate is set to false.
Steps to reproduce the issue:
podman ps -a
Describe the results you received:
The described container will be recreated, even if recreate: false
Describe the results you expected:
The container should only be recreated when recreate: true
Additional information you deem important (e.g. issue happens only occasionally):
Output of ansible --version:
ansible 2.10.2
Output of podman version:
podman version 2.1.1
Output of podman info --debug:
host:
arch: amd64
buildahVersion: 1.16.1
cgroupManager: systemd
cgroupVersion: v2
conmon:
package: conmon-2.0.21-2.fc32.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.21, commit: 81d18b6c3ffc266abdef7ca94c1450e669a6a388'
cpus: 8
distribution:
distribution: fedora
version: "32"
eventLogger: journald
hostname: nb01
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 5.8.15-201.fc32.x86_64
linkmode: dynamic
memFree: 12203294720
memTotal: 16600129536
ociRuntime:
name: crun
package: crun-0.15-5.fc32.x86_64
path: /usr/bin/crun
version: |-
crun version 0.15
commit: 56ca95e61639510c7dbd39ff512f80f626404969
spec: 1.0.0
+SYSTEMD +SELINUX +APPARMOR +CAP +SECCOMP +EBPF +YAJL
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: slirp4netns-1.1.4-1.fc32.x86_64
version: |-
slirp4netns version 1.1.4
commit: b66ffa8e262507e37fca689822d23430f3357fe8
libslirp: 4.3.1
SLIRP_CONFIG_VERSION_MAX: 2
swapFree: 8371826688
swapTotal: 8371826688
uptime: 9m 46.43s
registries:
search:
- registry.fedoraproject.org
- registry.access.redhat.com
- registry.centos.org
- docker.io
store:
configFile: /var/home/dschier/.config/containers/storage.conf
containerStore:
number: 1
paused: 0
running: 0
stopped: 1
graphDriverName: overlay
graphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-1.2.0-1.fc32.x86_64
Version: |-
fusermount3 version: 3.9.1
fuse-overlayfs: version 1.1.0
FUSE library version 3.9.1
using FUSE kernel interface version 7.31
graphRoot: /var/home/dschier/.local/share/containers/storage
graphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 11
runRoot: /run/user/1000
volumePath: /var/home/dschier/.local/share/containers/storage/volumes
version:
APIVersion: 2.0.0
Built: 1601494271
BuiltTime: Wed Sep 30 21:31:11 2020
GitCommit: ""
GoVersion: go1.14.9
OsArch: linux/amd64
Version: 2.1.1
Package info (e.g. output of rpm -q podman or apt list podman):
podman-2.1.1-7.fc32.x86_64
Playbook you run with ansible (e.g. content of playbook.yaml):
- name: "Run httpd Container"
podman_container:
name: "web-test"
image: "docker.io/library/httpd:2"
state: "started"
force_restart: false
become: true
tags:
- "container"
- "podman"
- "httpd"
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind feature
Description
Create module to manage podman volumes
As I can see in podman_image.py, the podman_image module does not actually perform authentication when pulling an image:
def pull_image(self, image_name=None):
    if image_name is None:
        image_name = self.image_name
    args = ['pull', image_name, '-q']
    if self.auth_file:
        args.extend(['--authfile', self.auth_file])
    if self.validate_certs:
        args.append('--tls-verify')
    if self.ca_cert_dir:
        args.extend(['--cert-dir', self.ca_cert_dir])
    rc, out, err = self._run(args, ignore_errors=True)
    if rc != 0:
        self.module.fail_json(msg='Failed to pull image {image_name}'.format(image_name=image_name))
    return self.inspect_image(out.strip())
If I add username, it is ignored and Ansible tries to authenticate with an empty string as the username.
This behaviour is not clearly described in the documentation: https://docs.ansible.com/ansible/2.9/modules/podman_image_module.html#parameter-username
"Username to use when authenticating to remote registries" would, in my opinion, include authentication when pulling, not only when pushing.
In comparison, the push_image function does include authentication:
def push_image(self):
    args = ['push']
    if self.validate_certs:
        args.append('--tls-verify')
    if self.ca_cert_dir:
        args.extend(['--cert-dir', self.ca_cert_dir])
    if self.username and self.password:
        cred_string = '{user}:{password}'.format(user=self.username, password=self.password)
        args.extend(['--creds', cred_string])
Although the best option, of course, would be to fix the behaviour so that the pull_image function also includes the if self.username and self.password:... part.
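To illustrate the suggested fix, here is a sketch of the pull argument construction with the same --creds handling that push_image already has. build_pull_args is a hypothetical standalone helper for illustration, not the module's actual method.

```python
# Sketch: build `podman pull` args, adding --creds when both
# username and password are set (the piece pull_image is missing).
def build_pull_args(image_name, username=None, password=None,
                    auth_file=None, validate_certs=True, ca_cert_dir=None):
    args = ['pull', image_name, '-q']
    if auth_file:
        args.extend(['--authfile', auth_file])
    if validate_certs:
        args.append('--tls-verify')
    if ca_cert_dir:
        args.extend(['--cert-dir', ca_cert_dir])
    if username and password:
        args.extend(['--creds', '{0}:{1}'.format(username, password)])
    return args
```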
podman_image.py
ansible 2.9.14
config file = ..../ansible.cfg
configured module search path = ['..../modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.6/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.6.8 (default, Apr 16 2020, 01:36:27) [GCC 8.3.1 20191121 (Red Hat 8.3.1-5)]
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
The following play only works the first time:
- name: Make sure postgres container exists
containers.podman.podman_container:
name: postgres
image: docker.io/postgres:12-alpine
pod: guac
env:
POSTGRES_PASSWORD: "test"
The second time, I get:
The full traceback is:
Traceback (most recent call last):
File "/home/ubuntu/.ansible/tmp/ansible-tmp-1605166771.9013383-2968342-189581704990373/AnsiballZ_podman_container.py", line 102, in <module>
_ansiballz_main()
File "/home/ubuntu/.ansible/tmp/ansible-tmp-1605166771.9013383-2968342-189581704990373/AnsiballZ_podman_container.py", line 94, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/home/ubuntu/.ansible/tmp/ansible-tmp-1605166771.9013383-2968342-189581704990373/AnsiballZ_podman_container.py", line 40, in invoke_module
runpy.run_module(mod_name='ansible_collections.containers.podman.plugins.modules.podman_container', init_globals=None, run_name='__main__', alter_sys=True)
File "/usr/lib/python3.8/runpy.py", line 207, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/lib/python3.8/runpy.py", line 97, in _run_module_code
_run_code(code, mod_globals, init_globals,
File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/tmp/ansible_containers.podman.podman_container_payload_et5a7idb/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 2237, in <module>
File "/tmp/ansible_containers.podman.podman_container_payload_et5a7idb/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 2233, in main
File "/tmp/ansible_containers.podman.podman_container_payload_et5a7idb/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 2104, in execute
File "/tmp/ansible_containers.podman.podman_container_payload_et5a7idb/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 2041, in make_started
File "/tmp/ansible_containers.podman.podman_container_payload_et5a7idb/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1886, in different
File "/tmp/ansible_containers.podman.podman_container_payload_et5a7idb/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1814, in is_different
File "/tmp/ansible_containers.podman.podman_container_payload_et5a7idb/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1721, in diffparam_stop_signal
KeyError: 'SIGINT'
Steps to reproduce the issue:
ansible-playbook playbook.yml
This works as expected.
ansible-playbook playbook.yml
This fails with the error mentioned above.
Describe the results you received:
The full traceback is shown above.
Describe the results you expected:
I expect the second run to be idempotent.
Additional information you deem important (e.g. issue happens only occasionally):
This error happens on target hosts with CentOS 7, CentOS 8, Ubuntu 18.04 and Ubuntu 20.04.
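One plausible cause (an assumption, not confirmed from the traceback alone) is a mismatch between how the stop signal is stored and how podman inspect reports it: 'SIGINT' as a name versus 2 as a number. A sketch of a normalization that would make the comparison tolerant of both forms; normalize_signal is a hypothetical helper, not code from the module.

```python
import signal

# Convert a stop-signal value ('SIGINT', 'INT', '2', or 2) to its
# numeric form so both sides of the diff compare as integers.
def normalize_signal(value):
    if isinstance(value, int):
        return value
    s = str(value)
    if s.isdigit():
        return int(s)
    name = s if s.startswith('SIG') else 'SIG' + s
    return int(getattr(signal, name))
```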
Output of ansible --version:
ansible 2.10.2
config file = /home/deployment/sebastian/src/spsrc-vm-config/ansible.cfg
configured module search path = ['/home/deployment/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/deployment/sebastian/conda-venvs/spsrc-vm-config/conda-install/envs/ansible/lib/python3.9/site-packages/ansible
executable location = /home/deployment/sebastian/conda-venvs/spsrc-vm-config/conda-install/envs/ansible/bin/ansible
python version = 3.9.0 | packaged by conda-forge | (default, Oct 14 2020, 22:59:50) [GCC 7.5.0]
Output of podman version:
# Ubuntu 20.04
Version: 2.1.1
API Version: 2.0.0
Go Version: go1.15.2
Built: Thu Jan 1 00:00:00 1970
OS/Arch: linux/amd64
# Ubuntu 18.04
Version: 2.1.1
API Version: 2.0.0
Go Version: go1.15.2
Built: Thu Jan 1 00:00:00 1970
OS/Arch: linux/amd64
# CentOS 7
Version: 1.6.4
RemoteAPI Version: 1
Go Version: go1.12.12
OS/Arch: linux/amd64
# CentOS 8
Version: 1.6.4
RemoteAPI Version: 1
Go Version: go1.13.4
OS/Arch: linux/amd64
Output of podman info --debug:
# CentOS 7
debug:
compiler: gc
git commit: ""
go version: go1.12.12
podman version: 1.6.4
host:
BuildahVersion: 1.12.0-dev
CgroupVersion: v1
Conmon:
package: conmon-2.0.8-1.el7.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.8, commit: f85c8b1ce77b73bcd48b2d802396321217008762'
Distribution:
distribution: '"centos"'
version: "7"
IDMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
MemFree: 167780352
MemTotal: 3973292032
OCIRuntime:
name: runc
package: runc-1.0.0-67.rc10.el7_8.x86_64
path: /usr/bin/runc
version: 'runc version spec: 1.0.1-dev'
SwapFree: 0
SwapTotal: 0
arch: amd64
cpus: 2
eventlogger: journald
hostname: slv-centos-7.novalocal
kernel: 3.10.0-1160.2.2.el7.x86_64
os: linux
rootless: true
slirp4netns:
Executable: /usr/bin/slirp4netns
Package: slirp4netns-0.4.3-4.el7_8.x86_64
Version: |-
slirp4netns version 0.4.3
commit: 2244b9b6461afeccad1678fac3d6e478c28b4ad6
uptime: 160h 59m 40.29s (Approximately 6.67 days)
registries:
blocked: null
insecure: null
search:
- registry.access.redhat.com
- registry.redhat.io
- docker.io
store:
ConfigFile: /home/centos/.config/containers/storage.conf
ContainerStore:
number: 4
GraphDriverName: overlay
GraphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-0.7.2-6.el7_8.x86_64
Version: |-
fuse-overlayfs: version 0.7.2
FUSE library version 3.6.1
using FUSE kernel interface version 7.29
GraphRoot: /home/centos/.local/share/containers/storage
GraphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
ImageStore:
number: 4
RunRoot: /run/user/1000/containers
VolumePath: /home/centos/.local/share/containers/storage/volumes
# CentOS 8
debug:
compiler: gc
git commit: ""
go version: go1.13.4
podman version: 1.6.4
host:
BuildahVersion: 1.12.0-dev
CgroupVersion: v1
Conmon:
package: conmon-2.0.6-1.module_el8.2.0+305+5e198a41.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.6, commit: a2b11288060ebd7abd20e0b4eb1a834bbf0aec3e'
Distribution:
distribution: '"centos"'
version: "8"
IDMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
MemFree: 403824640
MemTotal: 3961720832
OCIRuntime:
name: runc
package: runc-1.0.0-65.rc10.module_el8.2.0+305+5e198a41.x86_64
path: /usr/bin/runc
version: 'runc version spec: 1.0.1-dev'
SwapFree: 0
SwapTotal: 0
arch: amd64
cpus: 2
eventlogger: journald
hostname: slv-centos-8.novalocal
kernel: 4.18.0-193.6.3.el8_2.x86_64
os: linux
rootless: true
slirp4netns:
Executable: /usr/bin/slirp4netns
Package: slirp4netns-0.4.2-3.git21fdece.module_el8.2.0+305+5e198a41.x86_64
Version: |-
slirp4netns version 0.4.2+dev
commit: 21fdece2737dc24ffa3f01a341b8a6854f8b13b4
uptime: 160h 19m 13.77s (Approximately 6.67 days)
registries:
blocked: null
insecure: null
search:
- registry.access.redhat.com
- registry.redhat.io
- docker.io
store:
ConfigFile: /home/centos/.config/containers/storage.conf
ContainerStore:
number: 4
GraphDriverName: overlay
GraphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-0.7.2-5.module_el8.2.0+305+5e198a41.x86_64
Version: |-
fuse-overlayfs: version 0.7.2
FUSE library version 3.2.1
using FUSE kernel interface version 7.26
GraphRoot: /home/centos/.local/share/containers/storage
GraphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
ImageStore:
number: 4
RunRoot: /run/user/1000
VolumePath: /home/centos/.local/share/containers/storage/volumes
# Ubuntu 18.04
host:
arch: amd64
buildahVersion: 1.16.1
cgroupManager: cgroupfs
cgroupVersion: v1
conmon:
package: 'conmon: /usr/libexec/podman/conmon'
path: /usr/libexec/podman/conmon
version: 'conmon version 2.0.20, commit: '
cpus: 2
distribution:
distribution: ubuntu
version: "18.04"
eventLogger: journald
hostname: slv-ubuntu-1804
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 165536
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 165536
size: 65536
kernel: 4.15.0-111-generic
linkmode: dynamic
memFree: 348729344
memTotal: 4136075264
ociRuntime:
name: runc
package: 'runc: /usr/sbin/runc'
path: /usr/sbin/runc
version: 'runc version spec: 1.0.1-dev'
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: 'slirp4netns: /usr/bin/slirp4netns'
version: |-
slirp4netns version 1.1.4
commit: unknown
libslirp: 4.3.1-git
SLIRP_CONFIG_VERSION_MAX: 3
swapFree: 0
swapTotal: 0
uptime: 206h 14m 48.27s (Approximately 8.58 days)
registries:
search:
- docker.io
- quay.io
store:
configFile: /home/ubuntu/.config/containers/storage.conf
containerStore:
number: 3
paused: 0
running: 3
stopped: 0
graphDriverName: vfs
graphOptions: {}
graphRoot: /home/ubuntu/.local/share/containers/storage
graphStatus: {}
imageStore:
number: 4
runRoot: /run/user/1000/containers
volumePath: /home/ubuntu/.local/share/containers/storage/volumes
version:
APIVersion: 2.0.0
Built: 0
BuiltTime: Thu Jan 1 00:00:00 1970
GitCommit: ""
GoVersion: go1.15.2
OsArch: linux/amd64
Version: 2.1.1
# Ubuntu 20.04
host:
arch: amd64
buildahVersion: 1.16.1
cgroupManager: cgroupfs
cgroupVersion: v1
conmon:
package: 'conmon: /usr/libexec/podman/conmon'
path: /usr/libexec/podman/conmon
version: 'conmon version 2.0.20, commit: '
cpus: 2
distribution:
distribution: ubuntu
version: "20.04"
eventLogger: journald
hostname: slv-ubuntu-2004
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 5.4.0-52-generic
linkmode: dynamic
memFree: 337821696
memTotal: 4127318016
ociRuntime:
name: runc
package: 'runc: /usr/sbin/runc'
path: /usr/sbin/runc
version: 'runc version spec: 1.0.1-dev'
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: 'slirp4netns: /usr/bin/slirp4netns'
version: |-
slirp4netns version 1.1.4
commit: unknown
libslirp: 4.3.1-git
SLIRP_CONFIG_VERSION_MAX: 3
swapFree: 0
swapTotal: 0
uptime: 134h 28m 58.62s (Approximately 5.58 days)
registries:
search:
- docker.io
- quay.io
store:
configFile: /home/ubuntu/.config/containers/storage.conf
containerStore:
number: 4
paused: 0
running: 4
stopped: 0
graphDriverName: overlay
graphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: 'fuse-overlayfs: /usr/bin/fuse-overlayfs'
Version: |-
fusermount3 version: 3.9.0
fuse-overlayfs: version 1.1.0
FUSE library version 3.9.0
using FUSE kernel interface version 7.31
graphRoot: /home/ubuntu/.local/share/containers/storage
graphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 4
runRoot: /run/user/1000/containers
volumePath: /home/ubuntu/.local/share/containers/storage/volumes
version:
APIVersion: 2.0.0
Built: 0
BuiltTime: Thu Jan 1 00:00:00 1970
GitCommit: ""
GoVersion: go1.15.2
OsArch: linux/amd64
Version: 2.1.1
Package info (e.g. output of rpm -q podman or apt list podman):
# CentOS 7
podman-1.6.4-18.el7_8.x86_64
# CentOS 8
podman-1.6.4-10.module_el8.2.0+305+5e198a41.x86_64
# Ubuntu 18.04
ii podman 2.1.1~2 amd64 Manage pods, containers and container images.
ii podman-plugins 0.0.0~1 amd64 Plugins for podman
# Ubuntu 20.04
ii podman 2.1.1~2 amd64 Manage pods, containers and container images.
ii podman-plugins 0.0.0~1 amd64 Plugins for podman
Playbook you run with ansible (e.g. content of playbook.yaml):
The play is pasted above.
Command line and output of ansible run with high verbosity:
The full traceback is pasted above.
Additional environment details (AWS, VirtualBox, physical, etc.):
Virtual machines inside an academic OpenStack deployment.
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
Assigning a string that includes a = character to an environment variable seems to break idempotency.
For example:
GRAYLOG_MONGODB_URI: "mongodb://graylogadmin:[email protected]:27017,host2.domain.com:27017,host3.domain.com:27017/graylog?replicaset=rs01"
Steps to reproduce the issue:
- name: Install Graylog Container
containers.podman.podman_container:
env:
GRAYLOG_MONGODB_URI: "mongodb://graylogadmin:[email protected]:27017,host2.domain.com:27017,host3.domain.com:27017/graylog?replicaset=rs01"
image: "graylog/graylog:3.3.2-1"
name: graylog2
- name: Install Graylog Container Again
containers.podman.podman_container:
env:
GRAYLOG_MONGODB_URI: "mongodb://graylogadmin:[email protected]:27017,host2.domain.com:27017,host3.domain.com:27017/graylog?replicaset=rs01"
image: "graylog/graylog:3.3.2-1"
name: graylog2
Describe the results you received:
Container is recreated
Describe the results you expected:
Container is not recreated.
Additional information you deem important (e.g. issue happens only occasionally):
This seems to be the only remaining idempotency issue for my containers :) .
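A plausible cause (an assumption on my part) is that the env comparison splits KEY=VALUE strings on every =, mangling values that themselves contain =. Splitting only at the first = avoids that. A minimal sketch; parse_env_entry is a hypothetical helper, not the module's actual code.

```python
# Split an env entry at the FIRST '=' only, so values containing
# '=' (query strings, connection URIs, base64) survive intact.
def parse_env_entry(entry):
    key, _, value = entry.partition('=')
    return key, value
```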
Output of ansible --version:
ansible 2.9.11
config file = /home/usr/repos/audriga-update/ansible.cfg
configured module search path = ['/home/usr/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.3 (default, May 17 2020, 18:15:42) [GCC 10.1.0]
Output of podman version:
Version: 2.0.3
API Version: 1
Go Version: go1.14.2
Built: Thu Jan 1 00:00:00 1970
OS/Arch: linux/amd64
Output of podman info --debug:
host:
arch: amd64
buildahVersion: 1.15.0
cgroupVersion: v1
conmon:
package: 'conmon: /usr/libexec/podman/conmon'
path: /usr/libexec/podman/conmon
version: 'conmon version 2.0.18, commit: '
cpus: 2
distribution:
distribution: ubuntu
version: "20.04"
eventLogger: file
hostname: host1
idMappings:
gidmap: null
uidmap: null
kernel: 5.4.0-1020-aws
linkmode: dynamic
memFree: 358780928
memTotal: 3876061184
ociRuntime:
name: runc
package: 'cri-o-runc: /usr/lib/cri-o-runc/sbin/runc'
path: /usr/lib/cri-o-runc/sbin/runc
version: 'runc version spec: 1.0.1-dev'
os: linux
remoteSocket:
path: /run/podman/podman.sock
rootless: false
slirp4netns:
executable: ""
package: ""
version: ""
swapFree: 0
swapTotal: 0
uptime: 101h 55m 38.7s (Approximately 4.21 days)
registries:
search:
- docker.io
- quay.io
store:
configFile: /etc/containers/storage.conf
containerStore:
number: 2
paused: 0
running: 2
stopped: 0
graphDriverName: overlay
graphOptions: {}
graphRoot: /var/lib/containers/storage
graphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "true"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 11
runRoot: /var/run/containers/storage
volumePath: /var/lib/containers/storage/volumes
version:
APIVersion: 1
Built: 0
BuiltTime: Thu Jan 1 00:00:00 1970
GitCommit: ""
GoVersion: go1.14.2
OsArch: linux/amd64
Version: 2.0.3
Package info (e.g. output of rpm -q podman or apt list podman):
Listing... Done
podman/unknown,now 2.0.3~1 amd64 [installed]
podman/unknown 2.0.3~1 arm64
podman/unknown 2.0.3~1 armhf
podman/unknown 2.0.3~1 s390x
Playbook you run with ansible (e.g. content of playbook.yaml):
See above
Command line and output of ansible run with high verbosity:
Using /home/usr/repos/repo/ansible.cfg as config file
PLAY [all] ***********************************************************************************************************************************************************************************
TASK [Gathering Facts] ***********************************************************************************************************************************************************************
ok: [host1]
PLAY [logging_servers] ***********************************************************************************************************************************************************************
TASK [Gathering Facts] ***********************************************************************************************************************************************************************
ok: [host1]
TASK [graylog : Install Graylog Container] ***************************************************************************************************************************************************
changed: [host1] => {"actions": ["started graylog2"], "changed": true, "container": {"AppArmorProfile": "containers-default-0.14.6", "Args": ["--", "/docker-entrypoint.sh", "graylog"], "BoundingCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "Config": {"Annotations": {"io.container.manager": "libpod", "io.kubernetes.cri-o.Created": "2020-07-27T14:12:42.29520322Z", "io.kubernetes.cri-o.TTY": "false", "io.podman.annotations.autoremove": "FALSE", "io.podman.annotations.init": "FALSE", "io.podman.annotations.privileged": "FALSE", "io.podman.annotations.publish-all": "FALSE", "org.opencontainers.image.stopSignal": "15"}, "AttachStderr": false, "AttachStdin": false, "AttachStdout": false, "Cmd": ["graylog"], "CreateCommand": ["podman", "container", "run", "--name", "graylog2", "--env", "GRAYLOG_MONGODB_URI=mongodb://graylogAdmin:[email protected]:27017/graylog?replicaSet=rs01", "--detach=True", "graylog/graylog:3.3.2-1"], "Domainname": "", "Entrypoint": "tini -- /docker-entrypoint.sh", "Env": ["PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "TERM=xterm", "container=podman", "GRAYLOG_MONGODB_URI=mongodb://graylogAdmin:[email protected]:27017/graylog?replicaSet=rs01", "LANG=C.UTF-8", "JAVA_HOME=/usr/local/openjdk-8", "JAVA_VERSION=8u252", "JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u252-b09/OpenJDK8U-jre_", "JAVA_URL_VERSION=8u252b09", "HOSTNAME=3fdb5f7e6a12", "HOME=/usr/share/graylog"], "Healthcheck": {"Interval": 10000000000, "Retries": 12, "Test": ["CMD-SHELL", "/health_check.sh"], "Timeout": 2000000000}, "Hostname": "3fdb5f7e6a12", "Image": "docker.io/graylog/graylog:3.3.2-1", "Labels": {"com.microscaling.docker.dockerfile": "/Dockerfile", "com.microscaling.license": "Apache 2.0", "maintainer": 
"Graylog, Inc. <[email protected]>", "org.label-schema.build-date": "", "org.label-schema.description": "Official Graylog Docker image", "org.label-schema.name": "Graylog Docker Image", "org.label-schema.schema-version": "1.0", "org.label-schema.url": "https://www.graylog.org/", "org.label-schema.vcs-ref": "3bb6c813bb39fd761c323f871cbf6eaaa90be638", "org.label-schema.vcs-url": "https://github.com/Graylog2/graylog-docker", "org.label-schema.vendor": "Graylog, Inc.", "org.label-schema.version": "3.3.2"}, "OnBuild": null, "OpenStdin": false, "StdinOnce": false, "StopSignal": 15, "Tty": false, "User": "graylog", "Volumes": null, "WorkingDir": "/usr/share/graylog"}, "ConmonPidFile": "/var/run/containers/storage/overlay-containers/3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d/userdata/conmon.pid", "Created": "2020-07-27T14:12:42.29520322Z", "Dependencies": [], "Driver": "overlay", "EffectiveCaps": null, "ExecIDs": [], "ExitCommand": ["/usr/bin/podman", "--root", "/var/lib/containers/storage", "--runroot", "/var/run/containers/storage", "--log-level", "error", "--cgroup-manager", "systemd", "--tmpdir", "/var/run/libpod", "--runtime", "runc", "--events-backend", "file", "container", "cleanup", "3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d"], "GraphDriver": {"Data": {"LowerDir": 
"/var/lib/containers/storage/overlay/62788dad9f291973faf0c8e780610fc81db4c4bc4c0983e73029952c0ee86605/diff:/var/lib/containers/storage/overlay/2fbda6e3dc0683f3dbf314d883dc21220099b6c3f6b8dbf689bd99f95c366213/diff:/var/lib/containers/storage/overlay/630a5b42a38e7aa03e2152234e031aeb1f01839a8937e2ba33125a71651f93b0/diff:/var/lib/containers/storage/overlay/07d3fe8ebc914e1dd280c0b516cb2d86de30290bc1bf298bddc9fd6a71a14f82/diff:/var/lib/containers/storage/overlay/1febe7e589f155969c4d806d905f331bc0ce817e16421f43de444281115f99cf/diff:/var/lib/containers/storage/overlay/8bef4e70c5a8de419524114452936e4d088afff61b34b8697d2102ad2c7781dd/diff:/var/lib/containers/storage/overlay/298524bacaec3c59c01deec164df3cf660a9ad2301f3885de2cfabda22ceb85b/diff:/var/lib/containers/storage/overlay/7ae5e2823227aee633e9e9b7adc3927af89f1840d9e5c2b0086f97438cccfbd3/diff:/var/lib/containers/storage/overlay/13cb14c2acd34e45446a50af25cb05095a17624678dbafbcc9e26086547c1d74/diff", "MergedDir": "/var/lib/containers/storage/overlay/703045316d3488675d8a3e585b0af7dbe07e645b19ac1c85da020ec48697cd52/merged", "UpperDir": "/var/lib/containers/storage/overlay/703045316d3488675d8a3e585b0af7dbe07e645b19ac1c85da020ec48697cd52/diff", "WorkDir": "/var/lib/containers/storage/overlay/703045316d3488675d8a3e585b0af7dbe07e645b19ac1c85da020ec48697cd52/work"}, "Name": "overlay"}, "HostConfig": {"AutoRemove": false, "Binds": ["d3bdd42dbb86a5174bd6285e059f555102f7d4021208d17f2d6f25f8cff7ef9f:/usr/share/graylog/data:rprivate,rw,nodev,exec,nosuid,rbind"], "BlkioDeviceReadBps": null, "BlkioDeviceReadIOps": null, "BlkioDeviceWriteBps": null, "BlkioDeviceWriteIOps": null, "BlkioWeight": 0, "BlkioWeightDevice": null, "CapAdd": [], "CapDrop": [], "Cgroup": "", "CgroupMode": "host", "CgroupParent": "", "Cgroups": "default", "ConsoleSize": [0, 0], "ContainerIDFile": "", "CpuCount": 0, "CpuPercent": 0, "CpuPeriod": 0, "CpuQuota": 0, "CpuRealtimePeriod": 0, "CpuRealtimeRuntime": 0, "CpuShares": 0, "CpusetCpus": "", "CpusetMems": "", 
"Devices": [], "DiskQuota": 0, "Dns": [], "DnsOptions": [], "DnsSearch": [], "ExtraHosts": [], "GroupAdd": [], "IOMaximumBandwidth": 0, "IOMaximumIOps": 0, "IpcMode": "private", "Isolation": "", "KernelMemory": 0, "Links": null, "LogConfig": {"Config": null, "Type": "k8s-file"}, "Memory": 0, "MemoryReservation": 0, "MemorySwap": 0, "MemorySwappiness": 0, "NanoCpus": 0, "NetworkMode": "bridge", "OomKillDisable": false, "OomScoreAdj": 0, "PidMode": "private", "PidsLimit": 2048, "PortBindings": {}, "Privileged": false, "PublishAllPorts": false, "ReadonlyRootfs": false, "RestartPolicy": {"MaximumRetryCount": 0, "Name": ""}, "Runtime": "oci", "SecurityOpt": [], "ShmSize": 65536000, "Tmpfs": {}, "UTSMode": "private", "Ulimits": [{"Hard": 1048576, "Name": "RLIMIT_NOFILE", "Soft": 1048576}, {"Hard": 4194304, "Name": "RLIMIT_NPROC", "Soft": 4194304}], "UsernsMode": "", "VolumeDriver": "", "VolumesFrom": null}, "HostnamePath": "/var/run/containers/storage/overlay-containers/3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d/userdata/hostname", "HostsPath": "/var/run/containers/storage/overlay-containers/3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d/userdata/hosts", "Id": "3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d", "Image": "087b038ded46066345c929e2298053c55a1ef2fd1456c60845868fc3e8b16339", "ImageName": "docker.io/graylog/graylog:3.3.2-1", "IsInfra": false, "LogPath": "/var/lib/containers/storage/overlay-containers/3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d/userdata/ctr.log", "LogTag": "", "MountLabel": "", "Mounts": [{"Destination": "/usr/share/graylog/data", "Driver": "local", "Mode": "", "Name": "d3bdd42dbb86a5174bd6285e059f555102f7d4021208d17f2d6f25f8cff7ef9f", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/d3bdd42dbb86a5174bd6285e059f555102f7d4021208d17f2d6f25f8cff7ef9f/_data", "Type": "volume"}], "Name": 
"graylog2", "Namespace": "", "NetworkSettings": {"Bridge": "", "EndpointID": "", "Gateway": "10.88.0.1", "GlobalIPv6Address": "", "GlobalIPv6PrefixLen": 0, "HairpinMode": false, "IPAddress": "10.88.0.140", "IPPrefixLen": 16, "IPv6Gateway": "", "LinkLocalIPv6Address": "", "LinkLocalIPv6PrefixLen": 0, "MacAddress": "fa:e4:4b:e5:25:b2", "Ports": {}, "SandboxID": "", "SandboxKey": "/var/run/netns/cni-458e23bd-96b7-5a37-3832-a3bb515c2084"}, "OCIConfigPath": "/var/lib/containers/storage/overlay-containers/3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d/userdata/config.json", "OCIRuntime": "runc", "Path": "tini", "Pod": "", "ProcessLabel": "", "ResolvConfPath": "/var/run/containers/storage/overlay-containers/3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d/userdata/resolv.conf", "RestartCount": 0, "Rootfs": "", "State": {"ConmonPid": 3100305, "Dead": false, "Error": "", "ExitCode": 0, "FinishedAt": "0001-01-01T00:00:00Z", "Healthcheck": {"FailingStreak": 0, "Log": null, "Status": "starting"}, "OOMKilled": false, "OciVersion": "1.0.2-dev", "Paused": false, "Pid": 3100329, "Restarting": false, "Running": true, "StartedAt": "2020-07-27T14:12:42.607092896Z", "Status": "running"}, "StaticDir": "/var/lib/containers/storage/overlay-containers/3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d/userdata"}, "podman_actions": ["podman run --name graylog2 --env GRAYLOG_MONGODB_URI=mongodb://graylogAdmin:[email protected]:27017/graylog?replicaSet=rs01 --detach=True graylog/graylog:3.3.2-1"], "stderr": "", "stderr_lines": [], "stdout": "3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d\n", "stdout_lines": ["3fdb5f7e6a12352d2b9f0a7e1091ac94b774ae6c44a09c960387b04667b1320d"]}
TASK [graylog : Install Graylog Container Again] *********************************************************************************************************************************************
changed: [host1] => {"actions": ["recreated graylog2"], "changed": true, "container": {"AppArmorProfile": "containers-default-0.14.6", "Args": ["--", "/docker-entrypoint.sh", "graylog"], "BoundingCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "Config": {"Annotations": {"io.container.manager": "libpod", "io.kubernetes.cri-o.Created": "2020-07-27T14:12:46.477088322Z", "io.kubernetes.cri-o.TTY": "false", "io.podman.annotations.autoremove": "FALSE", "io.podman.annotations.init": "FALSE", "io.podman.annotations.privileged": "FALSE", "io.podman.annotations.publish-all": "FALSE", "org.opencontainers.image.stopSignal": "15"}, "AttachStderr": false, "AttachStdin": false, "AttachStdout": false, "Cmd": ["graylog"], "CreateCommand": ["podman", "container", "run", "--name", "graylog2", "--env", "GRAYLOG_MONGODB_URI=mongodb://graylogAdmin:[email protected]:27017/graylog?replicaSet=rs01", "--detach=True", "graylog/graylog:3.3.2-1"], "Domainname": "", "Entrypoint": "tini -- /docker-entrypoint.sh", "Env": ["PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "TERM=xterm", "container=podman", "JAVA_URL_VERSION=8u252b09", "GRAYLOG_MONGODB_URI=mongodb://graylogAdmin:[email protected]:27017/graylog?replicaSet=rs01", "LANG=C.UTF-8", "JAVA_HOME=/usr/local/openjdk-8", "JAVA_VERSION=8u252", "JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u252-b09/OpenJDK8U-jre_", "HOSTNAME=16cbbefe6fe3", "HOME=/usr/share/graylog"], "Healthcheck": {"Interval": 10000000000, "Retries": 12, "Test": ["CMD-SHELL", "/health_check.sh"], "Timeout": 2000000000}, "Hostname": "16cbbefe6fe3", "Image": "docker.io/graylog/graylog:3.3.2-1", "Labels": {"com.microscaling.docker.dockerfile": "/Dockerfile", "com.microscaling.license": "Apache 2.0", 
"maintainer": "Graylog, Inc. <[email protected]>", "org.label-schema.build-date": "", "org.label-schema.description": "Official Graylog Docker image", "org.label-schema.name": "Graylog Docker Image", "org.label-schema.schema-version": "1.0", "org.label-schema.url": "https://www.graylog.org/", "org.label-schema.vcs-ref": "3bb6c813bb39fd761c323f871cbf6eaaa90be638", "org.label-schema.vcs-url": "https://github.com/Graylog2/graylog-docker", "org.label-schema.vendor": "Graylog, Inc.", "org.label-schema.version": "3.3.2"}, "OnBuild": null, "OpenStdin": false, "StdinOnce": false, "StopSignal": 15, "Tty": false, "User": "graylog", "Volumes": null, "WorkingDir": "/usr/share/graylog"}, "ConmonPidFile": "/var/run/containers/storage/overlay-containers/16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e/userdata/conmon.pid", "Created": "2020-07-27T14:12:46.477088322Z", "Dependencies": [], "Driver": "overlay", "EffectiveCaps": null, "ExecIDs": [], "ExitCommand": ["/usr/bin/podman", "--root", "/var/lib/containers/storage", "--runroot", "/var/run/containers/storage", "--log-level", "error", "--cgroup-manager", "systemd", "--tmpdir", "/var/run/libpod", "--runtime", "runc", "--events-backend", "file", "container", "cleanup", "16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e"], "GraphDriver": {"Data": {"LowerDir": 
"/var/lib/containers/storage/overlay/62788dad9f291973faf0c8e780610fc81db4c4bc4c0983e73029952c0ee86605/diff:/var/lib/containers/storage/overlay/2fbda6e3dc0683f3dbf314d883dc21220099b6c3f6b8dbf689bd99f95c366213/diff:/var/lib/containers/storage/overlay/630a5b42a38e7aa03e2152234e031aeb1f01839a8937e2ba33125a71651f93b0/diff:/var/lib/containers/storage/overlay/07d3fe8ebc914e1dd280c0b516cb2d86de30290bc1bf298bddc9fd6a71a14f82/diff:/var/lib/containers/storage/overlay/1febe7e589f155969c4d806d905f331bc0ce817e16421f43de444281115f99cf/diff:/var/lib/containers/storage/overlay/8bef4e70c5a8de419524114452936e4d088afff61b34b8697d2102ad2c7781dd/diff:/var/lib/containers/storage/overlay/298524bacaec3c59c01deec164df3cf660a9ad2301f3885de2cfabda22ceb85b/diff:/var/lib/containers/storage/overlay/7ae5e2823227aee633e9e9b7adc3927af89f1840d9e5c2b0086f97438cccfbd3/diff:/var/lib/containers/storage/overlay/13cb14c2acd34e45446a50af25cb05095a17624678dbafbcc9e26086547c1d74/diff", "MergedDir": "/var/lib/containers/storage/overlay/15ca6dbadccdea7faa1d3ef5adc4befd38ea00d4b42a0186f703405b4c3a6390/merged", "UpperDir": "/var/lib/containers/storage/overlay/15ca6dbadccdea7faa1d3ef5adc4befd38ea00d4b42a0186f703405b4c3a6390/diff", "WorkDir": "/var/lib/containers/storage/overlay/15ca6dbadccdea7faa1d3ef5adc4befd38ea00d4b42a0186f703405b4c3a6390/work"}, "Name": "overlay"}, "HostConfig": {"AutoRemove": false, "Binds": ["bb23bc0c19542c8989aadd15c9ae4216a72f99822149f4acd779676f0ac10e8c:/usr/share/graylog/data:rprivate,rw,nodev,exec,nosuid,rbind"], "BlkioDeviceReadBps": null, "BlkioDeviceReadIOps": null, "BlkioDeviceWriteBps": null, "BlkioDeviceWriteIOps": null, "BlkioWeight": 0, "BlkioWeightDevice": null, "CapAdd": [], "CapDrop": [], "Cgroup": "", "CgroupMode": "host", "CgroupParent": "", "Cgroups": "default", "ConsoleSize": [0, 0], "ContainerIDFile": "", "CpuCount": 0, "CpuPercent": 0, "CpuPeriod": 0, "CpuQuota": 0, "CpuRealtimePeriod": 0, "CpuRealtimeRuntime": 0, "CpuShares": 0, "CpusetCpus": "", "CpusetMems": "", 
"Devices": [], "DiskQuota": 0, "Dns": [], "DnsOptions": [], "DnsSearch": [], "ExtraHosts": [], "GroupAdd": [], "IOMaximumBandwidth": 0, "IOMaximumIOps": 0, "IpcMode": "private", "Isolation": "", "KernelMemory": 0, "Links": null, "LogConfig": {"Config": null, "Type": "k8s-file"}, "Memory": 0, "MemoryReservation": 0, "MemorySwap": 0, "MemorySwappiness": 0, "NanoCpus": 0, "NetworkMode": "bridge", "OomKillDisable": false, "OomScoreAdj": 0, "PidMode": "private", "PidsLimit": 2048, "PortBindings": {}, "Privileged": false, "PublishAllPorts": false, "ReadonlyRootfs": false, "RestartPolicy": {"MaximumRetryCount": 0, "Name": ""}, "Runtime": "oci", "SecurityOpt": [], "ShmSize": 65536000, "Tmpfs": {}, "UTSMode": "private", "Ulimits": [{"Hard": 1048576, "Name": "RLIMIT_NOFILE", "Soft": 1048576}, {"Hard": 4194304, "Name": "RLIMIT_NPROC", "Soft": 4194304}], "UsernsMode": "", "VolumeDriver": "", "VolumesFrom": null}, "HostnamePath": "/var/run/containers/storage/overlay-containers/16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e/userdata/hostname", "HostsPath": "/var/run/containers/storage/overlay-containers/16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e/userdata/hosts", "Id": "16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e", "Image": "087b038ded46066345c929e2298053c55a1ef2fd1456c60845868fc3e8b16339", "ImageName": "docker.io/graylog/graylog:3.3.2-1", "IsInfra": false, "LogPath": "/var/lib/containers/storage/overlay-containers/16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e/userdata/ctr.log", "LogTag": "", "MountLabel": "", "Mounts": [{"Destination": "/usr/share/graylog/data", "Driver": "local", "Mode": "", "Name": "bb23bc0c19542c8989aadd15c9ae4216a72f99822149f4acd779676f0ac10e8c", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/bb23bc0c19542c8989aadd15c9ae4216a72f99822149f4acd779676f0ac10e8c/_data", "Type": "volume"}], "Name": 
"graylog2", "Namespace": "", "NetworkSettings": {"Bridge": "", "EndpointID": "", "Gateway": "10.88.0.1", "GlobalIPv6Address": "", "GlobalIPv6PrefixLen": 0, "HairpinMode": false, "IPAddress": "10.88.0.141", "IPPrefixLen": 16, "IPv6Gateway": "", "LinkLocalIPv6Address": "", "LinkLocalIPv6PrefixLen": 0, "MacAddress": "92:44:89:35:c0:a9", "Ports": {}, "SandboxID": "", "SandboxKey": "/var/run/netns/cni-9c1da729-a366-72c0-3ec8-d87604fa7101"}, "OCIConfigPath": "/var/lib/containers/storage/overlay-containers/16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e/userdata/config.json", "OCIRuntime": "runc", "Path": "tini", "Pod": "", "ProcessLabel": "", "ResolvConfPath": "/var/run/containers/storage/overlay-containers/16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e/userdata/resolv.conf", "RestartCount": 0, "Rootfs": "", "State": {"ConmonPid": 3100814, "Dead": false, "Error": "", "ExitCode": 0, "FinishedAt": "0001-01-01T00:00:00Z", "Healthcheck": {"FailingStreak": 0, "Log": null, "Status": "starting"}, "OOMKilled": false, "OciVersion": "1.0.2-dev", "Paused": false, "Pid": 3100826, "Restarting": false, "Running": true, "StartedAt": "2020-07-27T14:12:46.735201065Z", "Status": "running"}, "StaticDir": "/var/lib/containers/storage/overlay-containers/16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e/userdata"}, "podman_actions": ["podman rm -f graylog2", "podman run --name graylog2 --env GRAYLOG_MONGODB_URI=mongodb://graylogAdmin:[email protected]:27017/graylog?replicaSet=rs01 --detach=True graylog/graylog:3.3.2-1"], "stderr": "", "stderr_lines": [], "stdout": "16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e\n", "stdout_lines": ["16cbbefe6fe3d6dea43686ee84f1a0c96930887e9e73540da8f207d5a1554d0e"]}
PLAY RECAP ***********************************************************************************************************************************************************************************
host1 : ok=2 changed=2 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Additional environment details (AWS, VirtualBox, physical, etc.):
AWS
Thank you for working on this project. I gave this repo a shot, and had to make some changes to get close to idempotence for my usage. This isn't meant to turn into a PR or anything, just a trip report of what I saw, in case it helps with development. Cheers!
diff --git a/plugins/modules/podman_container.py b/plugins/modules/podman_container.py
index 4849799..150c889 100644
--- a/plugins/modules/podman_container.py
+++ b/plugins/modules/podman_container.py
@@ -1293,7 +1293,7 @@ class PodmanDefaults:
"user": "",
"uts": "",
"volume": [],
- "workdir": "/",
+ # "workdir": "/", This is specified in image itself
}
def default_dict(self):
@@ -1327,7 +1327,7 @@ class PodmanContainerDiff:
return params_with_defaults
def _diff_update_and_compare(self, param_name, before, after):
- if before != after:
+ if before != after and ((before and before != 'None') or (after and after != '0')):
self.diff['before'].update({param_name: before})
self.diff['after'].update({param_name: after})
return True
@@ -1454,7 +1454,7 @@ class PodmanContainerDiff:
def diffparam_device(self):
before = [":".join([i['pathonhost'], i['pathincontainer']])
- for i in self.info['hostconfig']['devices']]
+ for i in self.info['hostconfig']['devices'] or []]
after = [":".join(i.split(":")[:2]) for i in self.params['device']]
before, after = sorted(list(set(before))), sorted(list(set(after)))
return self._diff_update_and_compare('devices', before, after)
@@ -1530,11 +1530,16 @@ class PodmanContainerDiff:
for repl in strip_from_name:
before = before.replace(repl, "")
after = after.replace(repl, "")
+
+ if before.count('/') == after.count('/') + 1:
+ before = before.split('/', 1)[1]
+
return self._diff_update_and_compare('image', before, after)
def diffparam_ipc(self):
before = self.info['hostconfig']['ipcmode']
after = self.params['ipc']
+ after = before
return self._diff_update_and_compare('ipc', before, after)
def diffparam_label(self):
@@ -1584,6 +1589,8 @@ class PodmanContainerDiff:
def diffparam_network(self):
before = [self.info['hostconfig']['networkmode']]
after = self.params['network']
+ if after == ['default']:
+ after = before
return self._diff_update_and_compare('network', before, after)
def diffparam_no_hosts(self):
@@ -1640,6 +1647,8 @@ class PodmanContainerDiff:
def diffparam_uts(self):
before = self.info['hostconfig']['utsmode']
after = self.params['uts']
+ if not after:
+ after = before
return self._diff_update_and_compare('uts', before, after)
def diffparam_volume(self):
@@ -1658,13 +1667,16 @@ class PodmanContainerDiff:
return self._diff_update_and_compare('volume', before, after)
def diffparam_volumes_from(self):
- before = self.info['hostconfig']['volumesfrom'] or []
+ # Key doesn't exist?
+ before = self.info['hostconfig'].get('volumesfrom', [])
after = self.params['volumes_from'] or []
return self._diff_update_and_compare('volumes_from', before, after)
def diffparam_workdir(self):
before = self.info['config']['workingdir']
after = self.params['workdir']
+ if after is None:
+ after = before
return self._diff_update_and_compare('workdir', before, after)
def is_different(self):
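The recurring pattern in the patch above — treating a parameter left unset in the playbook as "keep whatever the container currently reports" — can be sketched in isolation. This is a simplified illustration of the idea, not the collection's actual code:

```python
def diff_param(diff, name, before, after):
    """Record a difference only when the user actually pinned the parameter.

    `before` is the value read from the running container; `after` is the
    value from the playbook. None (or the placeholder ['default']) means
    the option was omitted, so it must not trigger a recreate.
    """
    if after is None or after == ['default']:
        # Unset in the playbook: fall back to the container's current value.
        after = before
    if before != after:
        diff['before'][name] = before
        diff['after'][name] = after
        return True
    return False


diff = {'before': {}, 'after': {}}
# An omitted workdir no longer forces recreation:
changed = diff_param(diff, 'workdir', '/usr/share/graylog', None)
# changed is False and diff stays empty
```

This matches the workdir, uts, and network hunks in the diff: each one collapses an unset `after` into `before` before comparing.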
Shall the above module (previously seen in TripleO-Ansible) be published here?
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
containers.podman.podman_container always recreates the container, because the stop_signal and volume properties differ.
Steps to reproduce the issue:
containers.podman.podman_container:
image: graphiteapp/graphite-statsd:latest
name: graphite
state: present
Run it
Run it again
Describe the results you received:
Using -D, one can see that it recreates the container:
--- before
+++ after
@@ -1,2 +1,2 @@
-stop_signal - 1
-volume - ['06772653142bd35ffca83ae2f9e699ec2cc411cd543cc29847e790021847101b:/opt/graphite/webapp/graphite/functions/custom', '555afa8eaeb589dac212ae263b7f9324cb11d7799190e46e3ce0535ec56de6d6:/opt/graphite/conf', '5ebcd4a298b2f88c38b5822837f687bf7cdf735f1bae3378eae6c0629505ed61:/etc/nginx', '8dd74e99acf2066d57ec073436ec39d31d63f78e58b73fc885147583e338699e:/etc/logrotate.d', '9c48cc57bd07793e29fbba2d9ff3d8d923603fdc19463b962bcfc06059f734c4:/opt/graphite/storage', 'b69ca8e9763b07a0a81123166650d40028a7a2d3bdf60e0824a4d64a19fee872:/opt/statsd/config', 'c96501d6576215c463234d0f3d8faf2127199e55282140cc2a7fa2736a53eb55:/var/lib/redis', 'f1268bcaff63744914c03982d43098fa0e4ab753d8aaf50594d7321985a578db:/var/log']
+stop_signal - 15
+volume - []
Describe the results you expected:
The container should have been OK
and not changed
.
Additional information you deem important (e.g. issue happens only occasionally):
One could add an additional parameter like comparisons to fix this.
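A comparisons option could work like the one in community.general's docker_container module: a per-parameter switch telling the module which attributes participate in the recreate decision. A hypothetical sketch (the parameter name and semantics are assumed here, not implemented in this collection):

```python
def should_compare(param, comparisons):
    """Return True if `param` should count toward recreating the container.

    `comparisons` maps parameter names to a mode; '*' sets the default.
    Mode 'ignore' excludes the parameter from the diff entirely.
    """
    mode = comparisons.get(param, comparisons.get('*', 'strict'))
    return mode != 'ignore'


# For this issue, the user could opt out of the two noisy parameters:
comparisons = {'*': 'strict', 'stop_signal': 'ignore', 'volume': 'ignore'}
# stop_signal and volume diffs would then be skipped, image would still compare
```

This keeps strict comparison as the default while letting users silence attributes the image itself defines (here, stop_signal and the anonymous volumes).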
Output of ansible --version:
ansible 2.9.7
config file = /home/jrsr/repos/cloudmovr-infra/ansible.cfg
configured module search path = ['/home/jrsr/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.2 (default, Apr 8 2020, 14:31:25) [GCC 9.3.0]
Output of podman version:
Version: 1.9.1
RemoteAPI Version: 1
Go Version: go1.10.1
OS/Arch: linux/amd64
Output of podman info --debug:
debug:
compiler: gc
gitCommit: ""
goVersion: go1.10.1
podmanVersion: 1.9.1
host:
arch: amd64
buildahVersion: 1.14.8
cgroupVersion: v1
conmon:
package: 'conmon: /usr/libexec/podman/conmon'
path: /usr/libexec/podman/conmon
version: 'conmon version 2.0.15, commit: '
cpus: 2
distribution:
distribution: ubuntu
version: "18.04"
eventLogger: file
hostname: graphite
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 165536
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 165536
size: 65536
kernel: 5.3.0-1017-aws
memFree: 1186320384
memTotal: 4125073408
ociRuntime:
name: runc
package: 'containerd.io: /usr/bin/runc'
path: /usr/bin/runc
version: |-
runc version 1.0.0-rc10
commit: dc9208a3303feef5b3839f4323d9beb36df0a9dd
spec: 1.0.1-dev
os: linux
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: 'slirp4netns: /usr/bin/slirp4netns'
version: |-
slirp4netns version 0.4.3
commit: unknown
swapFree: 0
swapTotal: 0
uptime: 19h 41m 43.13s (Approximately 0.79 days)
registries:
search:
- docker.io
- quay.io
store:
configFile: /home/ubuntu/.config/containers/storage.conf
containerStore:
number: 0
paused: 0
running: 0
stopped: 0
graphDriverName: vfs
graphOptions: {}
graphRoot: /home/ubuntu/.local/share/containers/storage
graphStatus: {}
imageStore:
number: 1
runRoot: /run/user/1000/containers
volumePath: /home/ubuntu/.local/share/containers/storage/volumes
Package info (e.g. output of rpm -q podman or apt list podman):
Listing... Done
podman/unknown,now 1.9.1~1 amd64 [installed]
Playbook you run with ansible (e.g. content of playbook.yaml):
See above
Command line and output of ansible run with high verbosity:
$ ansible-playbook site.yml -Dv --start-at-task 'Install Graphite container v2'
PLAY [all] ***********************************************************************************************************************************************************************************
PLAY [monitor] *******************************************************************************************************************************************************************************
TASK [Gathering Facts] ***********************************************************************************************************************************************************************
ok: [graphite.exp]
TASK [monitor : Install Graphite container v2] ***********************************************************************************************************************************************
--- before
+++ after
@@ -1,2 +1,2 @@
-stop_signal - 1
-volume - ['2db6bc0fed32019e43c7dacc6a291bd5a931e195ed86cee82e55a019af1caa25:/var/log', '2fb5dcf1bb6ed48ded7196bfc8e726a0b1b807f8e278ee4209b09cc60ecef5e6:/etc/logrotate.d', '38c38cef3ee721c92cf9f7d83728bae1cc4355ef4702740de7e898cf7353e810:/etc/nginx', '47297b3b08e4ff970625ea9515611f42fec8de8eb46ab9d6a2eaa543fda8a9e4:/opt/statsd/config', '6582362071a12554ef08ad47353c17a013a0afc4d97093333297b3705d1819bc:/opt/graphite/webapp/graphite/functions/custom', '84c67d4dcaaefdc3a14e75a26069d3ba61abff41ebc3c392b6b8149218ac2a61:/opt/graphite/conf', '90571483c885ed58be14cd1eac13550315e5fa1613e188c7c9764051b6f12321:/opt/graphite/storage', 'fb142a8102a329561717ae5b74a2e37944227083fcf2ad6172b1a876194595c5:/var/lib/redis']
+stop_signal - 15
+volume - []
changed: [graphite.exp] => {"actions": ["recreated graphite2"], "changed": true, "container": {"AppArmorProfile": "container-default-1.9.1", "Args": ["/entrypoint"], "BoundingCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "Config": {"Annotations": {"io.container.manager": "libpod", "io.kubernetes.cri-o.Created": "2020-05-05T10:21:02.469287227Z", "io.kubernetes.cri-o.TTY": "false", "io.podman.annotations.autoremove": "FALSE", "io.podman.annotations.init": "FALSE", "io.podman.annotations.privileged": "FALSE", "io.podman.annotations.publish-all": "FALSE", "org.opencontainers.image.stopSignal": "1"}, "AttachStderr": false, "AttachStdin": false, "AttachStdout": false, "Cmd": null, "CreateCommand": ["podman", "container", "run", "--name", "graphite2", "--detach=True", "graphiteapp/graphite-statsd:latest"], "Domainname": "", "Entrypoint": "/entrypoint", "Env": ["PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "TERM=xterm", "container=podman", "STATSD_INTERFACE=udp", "HOSTNAME=9f672d71a482", "HOME=/root"], "Hostname": "9f672d71a482", "Image": "docker.io/graphiteapp/graphite-statsd:latest", "Labels": {"maintainer": "Denys Zhdanov <[email protected]>"}, "OnBuild": null, "OpenStdin": false, "StdinOnce": false, "StopSignal": 1, "Tty": false, "User": "", "Volumes": null, "WorkingDir": "/"}, "ConmonPidFile": "/var/run/containers/storage/overlay-containers/9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506/userdata/conmon.pid", "Created": "2020-05-05T10:21:02.469287227Z", "Dependencies": [], "Driver": "overlay", "EffectiveCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "ExecIDs": [], 
"ExitCommand": ["/usr/bin/podman", "--root", "/var/lib/containers/storage", "--runroot", "/var/run/containers/storage", "--log-level", "error", "--cgroup-manager", "systemd", "--tmpdir", "/var/run/libpod", "--runtime", "runc", "--events-backend", "file", "container", "cleanup", "9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506"], "GraphDriver": {"Data": {"LowerDir": "/var/lib/containers/storage/overlay/07eb7a8f8b30c8d9f2a2be1d3169b06b350f9e2bb87dcc4e8585ef8403e64d75/diff:/var/lib/containers/storage/overlay/4ef8d93c3f7e0fcafcb8e91640739f254030b78e780de8ae3bac8be8b1ab4008/diff:/var/lib/containers/storage/overlay/858f6f478edde3c923ec55b03da94412f35468d2e7771a594eed39bdd3b10a07/diff:/var/lib/containers/storage/overlay/beee9f30bc1f711043e78d4a2be0668955d4b761d587d6f60c2c8dc081efb203/diff", "MergedDir": "/var/lib/containers/storage/overlay/099bf625189c4b35d3cae22ab364d687dbcae2edc2bf6f4d26086a3e4a80445b/merged", "UpperDir": "/var/lib/containers/storage/overlay/099bf625189c4b35d3cae22ab364d687dbcae2edc2bf6f4d26086a3e4a80445b/diff", "WorkDir": "/var/lib/containers/storage/overlay/099bf625189c4b35d3cae22ab364d687dbcae2edc2bf6f4d26086a3e4a80445b/work"}, "Name": "overlay"}, "HostConfig": {"AutoRemove": false, "Binds": ["2d288dd1dd14ff9ac154bb2823943a046db447a1ff6e68e77b1e50b8e3dbb1a4:/etc/nginx:rprivate,rw,nodev,exec,nosuid,rbind", "f2c6f689510660b7a0b994dcc6a0457f76e8aa38b913d2746fb1490741f8e834:/opt/graphite/conf:rprivate,rw,nodev,exec,nosuid,rbind", "875f77afdd72c2108256601aa01abb068b176c44138d230d51490f4352fc3ba8:/opt/graphite/storage:rprivate,rw,nodev,exec,nosuid,rbind", "93ee3eb80a4286979b15b1e163ad120dd518f7a3d080923bff9d60e41797b8f7:/opt/graphite/webapp/graphite/functions/custom:rprivate,rw,nodev,exec,nosuid,rbind", "756dac97bfeb69ab40f400ab0c21b9e90e00a4e14b5b7f1b1c2d4d284bbc37f3:/opt/statsd/config:rprivate,rw,nodev,exec,nosuid,rbind", 
"41a1f1a7ddd29354dd112d5231e7c19d0e4252145e21cd6fe3bbed8ffe3040f3:/var/lib/redis:rprivate,rw,nodev,exec,nosuid,rbind", "b576e54922cfaca9dd94886c1afe5850b5e38bf5f64c68b171dd3ca8d0e91bd8:/var/log:rprivate,rw,nodev,exec,nosuid,rbind", "a32a2e391736e8d1234bdc1bdf0171d86c3151b4837b8d321250078e8cda5ee0:/etc/logrotate.d:rprivate,rw,nodev,exec,nosuid,rbind"], "BlkioDeviceReadBps": null, "BlkioDeviceReadIOps": null, "BlkioDeviceWriteBps": null, "BlkioDeviceWriteIOps": null, "BlkioWeight": 0, "BlkioWeightDevice": null, "CapAdd": [], "CapDrop": [], "Cgroup": "", "CgroupParent": "", "Cgroups": "default", "ConsoleSize": [0, 0], "ContainerIDFile": "", "CpuCount": 0, "CpuPercent": 0, "CpuPeriod": 0, "CpuQuota": 0, "CpuRealtimePeriod": 0, "CpuRealtimeRuntime": 0, "CpuShares": 0, "CpusetCpus": "", "CpusetMems": "", "Devices": [], "DiskQuota": 0, "Dns": [], "DnsOptions": [], "DnsSearch": [], "ExtraHosts": [], "GroupAdd": [], "IOMaximumBandwidth": 0, "IOMaximumIOps": 0, "IpcMode": "", "Isolation": "", "KernelMemory": 0, "Links": null, "LogConfig": {"Config": null, "Type": "k8s-file"}, "Memory": 0, "MemoryReservation": 0, "MemorySwap": 0, "MemorySwappiness": -1, "NanoCpus": 0, "NetworkMode": "default", "OomKillDisable": false, "OomScoreAdj": 0, "PidMode": "", "PidsLimit": 4096, "PortBindings": {}, "Privileged": false, "PublishAllPorts": false, "ReadonlyRootfs": false, "RestartPolicy": {"MaximumRetryCount": 0, "Name": ""}, "Runtime": "oci", "SecurityOpt": [], "ShmSize": 65536000, "Tmpfs": {}, "UTSMode": "", "Ulimits": [{"Hard": 1048576, "Name": "RLIMIT_NOFILE", "Soft": 1048576}, {"Hard": 32768, "Name": "RLIMIT_NPROC", "Soft": 32768}], "UsernsMode": "", "VolumeDriver": "", "VolumesFrom": null}, "HostnamePath": "/var/run/containers/storage/overlay-containers/9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506/userdata/hostname", "HostsPath": "/var/run/containers/storage/overlay-containers/9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506/userdata/hosts", "Id": 
"9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506", "Image": "0869fbfe98ecbb9438e53c9bc733fcb2e73b652d519fe04912bae0f69126abf2", "ImageName": "docker.io/graphiteapp/graphite-statsd:latest", "IsInfra": false, "LogPath": "/var/lib/containers/storage/overlay-containers/9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506/userdata/ctr.log", "LogTag": "", "MountLabel": "", "Mounts": [{"Destination": "/etc/nginx", "Driver": "local", "Mode": "", "Name": "2d288dd1dd14ff9ac154bb2823943a046db447a1ff6e68e77b1e50b8e3dbb1a4", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/2d288dd1dd14ff9ac154bb2823943a046db447a1ff6e68e77b1e50b8e3dbb1a4/_data", "Type": "volume"}, {"Destination": "/opt/graphite/conf", "Driver": "local", "Mode": "", "Name": "f2c6f689510660b7a0b994dcc6a0457f76e8aa38b913d2746fb1490741f8e834", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/f2c6f689510660b7a0b994dcc6a0457f76e8aa38b913d2746fb1490741f8e834/_data", "Type": "volume"}, {"Destination": "/opt/graphite/storage", "Driver": "local", "Mode": "", "Name": "875f77afdd72c2108256601aa01abb068b176c44138d230d51490f4352fc3ba8", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/875f77afdd72c2108256601aa01abb068b176c44138d230d51490f4352fc3ba8/_data", "Type": "volume"}, {"Destination": "/opt/graphite/webapp/graphite/functions/custom", "Driver": "local", "Mode": "", "Name": "93ee3eb80a4286979b15b1e163ad120dd518f7a3d080923bff9d60e41797b8f7", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/93ee3eb80a4286979b15b1e163ad120dd518f7a3d080923bff9d60e41797b8f7/_data", "Type": "volume"}, {"Destination": "/opt/statsd/config", "Driver": "local", "Mode": "", "Name": 
"756dac97bfeb69ab40f400ab0c21b9e90e00a4e14b5b7f1b1c2d4d284bbc37f3", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/756dac97bfeb69ab40f400ab0c21b9e90e00a4e14b5b7f1b1c2d4d284bbc37f3/_data", "Type": "volume"}, {"Destination": "/var/lib/redis", "Driver": "local", "Mode": "", "Name": "41a1f1a7ddd29354dd112d5231e7c19d0e4252145e21cd6fe3bbed8ffe3040f3", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/41a1f1a7ddd29354dd112d5231e7c19d0e4252145e21cd6fe3bbed8ffe3040f3/_data", "Type": "volume"}, {"Destination": "/var/log", "Driver": "local", "Mode": "", "Name": "b576e54922cfaca9dd94886c1afe5850b5e38bf5f64c68b171dd3ca8d0e91bd8", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/b576e54922cfaca9dd94886c1afe5850b5e38bf5f64c68b171dd3ca8d0e91bd8/_data", "Type": "volume"}, {"Destination": "/etc/logrotate.d", "Driver": "local", "Mode": "", "Name": "a32a2e391736e8d1234bdc1bdf0171d86c3151b4837b8d321250078e8cda5ee0", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/a32a2e391736e8d1234bdc1bdf0171d86c3151b4837b8d321250078e8cda5ee0/_data", "Type": "volume"}], "Name": "graphite2", "Namespace": "", "NetworkSettings": {"Bridge": "", "EndpointID": "", "Gateway": "10.88.0.1", "GlobalIPv6Address": "", "GlobalIPv6PrefixLen": 0, "HairpinMode": false, "IPAddress": "10.88.0.45", "IPPrefixLen": 16, "IPv6Gateway": "", "LinkLocalIPv6Address": "", "LinkLocalIPv6PrefixLen": 0, "MacAddress": "e2:dc:01:ce:ee:1c", "Ports": [], "SandboxID": "", "SandboxKey": "/var/run/netns/cni-52d61d8b-f436-8b64-8786-af5cdca10319"}, "OCIConfigPath": "/var/lib/containers/storage/overlay-containers/9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506/userdata/config.json", 
"OCIRuntime": "runc", "Path": "/entrypoint", "Pod": "", "ProcessLabel": "", "ResolvConfPath": "/var/run/containers/storage/overlay-containers/9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506/userdata/resolv.conf", "RestartCount": 0, "Rootfs": "", "State": {"ConmonPid": 19056, "Dead": false, "Error": "", "ExitCode": 0, "FinishedAt": "0001-01-01T00:00:00Z", "Healthcheck": {"FailingStreak": 0, "Log": null, "Status": ""}, "OOMKilled": false, "OciVersion": "1.0.1-dev", "Paused": false, "Pid": 19091, "Restarting": false, "Running": true, "StartedAt": "2020-05-05T10:21:02.89654263Z", "Status": "running"}, "StaticDir": "/var/lib/containers/storage/overlay-containers/9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506/userdata"}, "podman_actions": ["podman rm -f graphite2", "podman run --name graphite2 --detach=True graphiteapp/graphite-statsd:latest"], "stderr": "", "stderr_lines": [], "stdout": "9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506\n", "stdout_lines": ["9f672d71a4828d1717e476c28cd9299f8d503378b4d61e34f79779c90b382506"]}
PLAY RECAP ***********************************************************************************************************************************************************************************
graphite.exp : ok=2 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Additional environment details (AWS, VirtualBox, physical, etc.): AWS
Issue:
Right now a lot of Ansible users deploy Docker containers using the docker_* modules of Ansible. Because Docker is not officially supported on RHEL-8/CentOS-8, users may still want to use their existing playbooks and roles instead of rewriting them. Podman is supposed to replace Docker in RHEL-8 distros. Currently there are a few podman_* modules.
How can we make the migration of docker_* users as easy and transparent as possible?
For example task:
- name: Run some docker container
  docker_container:
    name: grafana
    image: grafana
    detach: true
    ports:
      - 3000:3000
    env:
      GF_INSTALL_PLUGINS: "grafana-clock-panel,grafana-simple-json-datasource"
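For comparison, the closest direct translation with the existing Podman module would be something like this (a sketch; parameter names per the containers.podman.podman_container module):

```yaml
- name: Run some container with Podman
  containers.podman.podman_container:
    name: grafana
    image: grafana
    detach: true
    ports:
      - 3000:3000
    env:
      GF_INSTALL_PLUGINS: "grafana-clock-panel,grafana-simple-json-datasource"
```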
What is the best solution (considering collections) that can allow users to continue using this task on a Podman-only host?
The ideal situation is:
- hosts: all
  collections:
    - containers.whatever
  tasks:
    ... all tasks of users as is ...
Currently the options on the table are (additional ones are welcome!):
1. A containers.whatever collection. It would pick up the podman and docker collections, and each action module would trigger either the docker or the podman module, depending on what is on the host; a kind of package module model.
2. containers.docker. Most of the usual things would work; some, like pods and other Podman-specific features, would be available for Podman only.
3. A generic container module based on the podman API v2, which should be a copy of Docker-Py. Write it from scratch with support for both Docker and Podman as underlying engines.
There is also an open question of what to do with Podman-specific features like pods, play-kube and others, and what to do with the Ansible Docker Compose feature.
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
The podman_container module should inspect the image of a container to better understand which options will be set for the container. This is required for better idempotency of the podman_container module.
Steps to reproduce the issue:
Set workdir, volume or any other supported option to a value that is non-default in the container inspection, e.g.:
- containers.podman.podman_container:
    image: graphiteapp/graphite-statsd:latest
    name: graphite
    state: present
- containers.podman.podman_container:
    name: mysql
    image: mysql
    state: present
    pod: telemetry
    env:
      MYSQL_ROOT_PASSWORD: "2211"
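As a sketch of the idea (not the module's actual code), the comparison could consult the image's inspect data so that values merely inherited from the image are not treated as user-set diffs; the field names below follow podman's inspect JSON format:

```python
import json

# Sample inspect data, shaped like podman's "Config" section.
image_inspect = json.loads(
    '{"Config": {"WorkingDir": "/opt/graphite", "Env": ["PATH=/usr/bin"]}}'
)
container_inspect = json.loads(
    '{"Config": {"WorkingDir": "/opt/graphite"}}'
)

def inherited_from_image(container, image, field):
    """True if the container's value simply matches the image default."""
    return container["Config"].get(field) == image["Config"].get(field)

# WorkingDir matches the image default, so it should not count as a diff.
print(inherited_from_image(container_inspect, image_inspect, "WorkingDir"))
```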
For strict image idempotency we'll need to check the SHAs of the various images.
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind feature
Description
Create a podman logout module, which logs out from container registries.
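Usage could mirror the existing login flow, e.g. (a hypothetical task sketch for the proposed module):

```yaml
- name: Log out of a container registry
  containers.podman.podman_logout:
    registry: quay.io
```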
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
The publish option will always trigger a container recreation if it's defined in the short-hand form, for example:
publish:
- "3306"
But the below works correctly:
publish:
- "3306:3306"
Steps to reproduce the issue:
- name: Create mysql container
  containers.podman.podman_container:
    name: db
    image: docker.io/mysql
    state: present
    publish:
      - "3306"
Describe the results you received:
Container is always recreated despite configuration not being changed.
Describe the results you expected:
Container should not be recreated if the configuration is the same.
Output of ansible --version
:
ansible 2.9.13
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/roxifas/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.7/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.7.9 (default, Aug 19 2020, 17:05:11) [GCC 9.3.1 20200408 (Red Hat 9.3.1-2)]
Output of podman version
:
Version: 2.1.1
API Version: 2.0.0
Go Version: go1.13.15
Built: Sun Sep 27 23:37:44 2020
OS/Arch: linux/amd64
Output of podman info --debug
:
host:
arch: amd64
buildahVersion: 1.16.1
cgroupManager: cgroupfs
cgroupVersion: v1
conmon:
package: conmon-2.0.21-1.el8.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.21, commit: 3460cd1ad859a79bd27df1714f39c76926ac1b39-dirty'
cpus: 1
distribution:
distribution: '"centos"'
version: "8"
eventLogger: journald
hostname: **REDACTED**
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 4.18.0-193.19.1.el8_2.x86_64
linkmode: dynamic
memFree: 171679744
memTotal: 1915719680
ociRuntime:
name: runc
package: runc-1.0.0-145.rc91.git24a3cf8.el8.x86_64
path: /usr/bin/runc
version: 'runc version spec: 1.0.2-dev'
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: slirp4netns-1.1.4-2.el8.x86_64
version: |-
slirp4netns version 1.1.4
commit: b66ffa8e262507e37fca689822d23430f3357fe8
libslirp: 4.3.1
SLIRP_CONFIG_VERSION_MAX: 3
swapFree: 0
swapTotal: 0
uptime: 3h 22m 41.11s (Approximately 0.12 days)
registries:
search:
- registry.fedoraproject.org
- registry.access.redhat.com
- registry.centos.org
- docker.io
store:
configFile: /home/roxifas/.config/containers/storage.conf
containerStore:
number: 4
paused: 0
running: 4
stopped: 0
graphDriverName: overlay
graphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-1.1.2-1.el8.x86_64
Version: |-
fusermount3 version: 3.2.1
fuse-overlayfs: version 1.1.0
FUSE library version 3.2.1
using FUSE kernel interface version 7.26
graphRoot: /home/roxifas/.local/share/containers/storage
graphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 6
runRoot: /run/user/1000
volumePath: /home/roxifas/.local/share/containers/storage/volumes
version:
APIVersion: 2.0.0
Built: 1601239064
BuiltTime: Sun Sep 27 23:37:44 2020
GitCommit: ""
GoVersion: go1.13.15
OsArch: linux/amd64
Version: 2.1.1
Package info (e.g. output of rpm -q podman
or apt list podman
):
podman-2.1.1-4.el8.x86_64
Playbook you run with ansible (e.g. content of playbook.yaml
):
- name: Create mysql container
  containers.podman.podman_container:
    name: db
    image: docker.io/mysql
    state: present
    publish:
      - "3306"
Command line and output of ansible run with high verbosity:
Redacted because it contained sensitive information, but I think this diff tells the story rather accurately:
--- before
+++ after
@@ -1 +1 @@
-publish - ['3306:3306']
+publish - ['3306']
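The inspect data reports the expanded host:container form, so a raw string comparison of the two lists always differs. One possible normalization before diffing could look like this (a sketch, not the collection's actual code, assuming a bare port should compare equal to its expanded form as the diff above suggests):

```python
def normalize_publish(entry):
    """Expand a bare container port ("3306") to the host:container
    form ("3306:3306") that inspect reports, so idempotent runs
    compare equal. Entries that already have a colon pass through."""
    parts = str(entry).split(":")
    if len(parts) == 1:
        return f"{parts[0]}:{parts[0]}"
    return str(entry)

print(normalize_publish("3306"))     # "3306:3306"
print(normalize_publish("8080:80"))  # "8080:80"
```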
Additional environment details (AWS, VirtualBox, physical, etc.):
Target environment is a CentOS VPS.
/kind feature
Add podman play kube support.
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
The Podman Container Module fails with TypeError: string indices must be integers
when rerunning my playbooks on a host with podman 2.0.1 installed.
Steps to reproduce the issue:
Make sure you are using one of the latest podman 2.0.x releases.
Start graphite container with below playbook code snippet:
Run the code snippet again
Describe the results you received:
Ansible failed with:
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: TypeError: string indices must be integers
fatal: [graphite.exp]: FAILED! => {"changed": false, "module_stderr": "Shared connection to XXX.XXX.XXX.XXX closed.\r\n", "module_stdout": "Traceback (most recent call last):\r\n File \"/home/ubuntu/.ansible/tmp/ansible-tmp-1593416552.0115376-5594-153592447270755/AnsiballZ_podman_container.py\", line 102, in <module>\r\n _ansiballz_main()\r\n File \"/home/ubuntu/.ansible/tmp/ansible-tmp-1593416552.0115376-5594-153592447270755/AnsiballZ_podman_container.py\", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File \"/home/ubuntu/.ansible/tmp/ansible-tmp-1593416552.0115376-5594-153592447270755/AnsiballZ_podman_container.py\", line 40, in invoke_module\r\n runpy.run_module(mod_name='ansible_collections.containers.podman.plugins.modules.podman_container', init_globals=None, run_name='__main__', alter_sys=True)\r\n File \"/usr/lib/python3.8/runpy.py\", line 206, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File \"/usr/lib/python3.8/runpy.py\", line 96, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File \"/usr/lib/python3.8/runpy.py\", line 86, in _run_code\r\n exec(code, run_globals)\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_t0vfb91k/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 2070, in <module>\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_t0vfb91k/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 2066, in main\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_t0vfb91k/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 2046, in execute\r\n File 
\"/tmp/ansible_containers.podman.podman_container_payload_t0vfb91k/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 1983, in make_started\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_t0vfb91k/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 1828, in different\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_t0vfb91k/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 1755, in is_different\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_t0vfb91k/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 1625, in diffparam_publish\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_t0vfb91k/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 1626, in <listcomp>\r\nTypeError: string indices must be integers\r\n", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
Describe the results you expected:
Ansible ran through when using version < 2.0.1
Additional information you deem important (e.g. issue happens only occasionally):
2.0.0 broke idempotency once again (might be a related or a different issue), like in #61. Now 2.0.1 completely breaks the collection when rerunning the playbook.
Output of ansible --version
:
ansible 2.9.10
config file = /home/jrsr/repos/audriga-infra/ansible.cfg
configured module search path = ['/home/jrsr/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.3 (default, May 17 2020, 18:15:42) [GCC 10.1.0]
Output of podman version
:
Version: 2.0.1
API Version: 1
Go Version: go1.13.8
Built: Thu Jan 1 00:00:00 1970
OS/Arch: linux/amd64
Output of podman info --debug
:
host:
arch: amd64
buildahVersion: 1.15.0
cgroupVersion: v1
conmon:
package: 'conmon: /usr/libexec/podman/conmon'
path: /usr/libexec/podman/conmon
version: 'conmon version 2.0.18, commit: '
cpus: 2
distribution:
distribution: ubuntu
version: "20.04"
eventLogger: file
hostname: graphite
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 5.4.0-1017-aws
linkmode: dynamic
memFree: 159105024
memTotal: 2044592128
ociRuntime:
name: runc
package: 'runc: /usr/sbin/runc'
path: /usr/sbin/runc
version: 'runc version spec: 1.0.1-dev'
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: 'slirp4netns: /usr/bin/slirp4netns'
version: |-
slirp4netns version 1.0.0
commit: unknown
libslirp: 4.2.0
swapFree: 0
swapTotal: 0
uptime: 96h 40m 11.24s (Approximately 4.00 days)
registries:
search:
- docker.io
- quay.io
store:
configFile: /home/ubuntu/.config/containers/storage.conf
containerStore:
number: 0
paused: 0
running: 0
stopped: 0
graphDriverName: vfs
graphOptions: {}
graphRoot: /home/ubuntu/.local/share/containers/storage
graphStatus: {}
imageStore:
number: 0
runRoot: /run/user/1000/containers
volumePath: /home/ubuntu/.local/share/containers/storage/volumes
version:
APIVersion: 1
Built: 0
BuiltTime: Thu Jan 1 00:00:00 1970
GitCommit: ""
GoVersion: go1.13.8
OsArch: linux/amd64
Version: 2.0.1
Package info (e.g. output of rpm -q podman
or apt list podman
):
Listing... Done
podman/unknown,now 2.0.1~1 amd64 [installed]
podman/unknown 2.0.1~1 arm64
podman/unknown 2.0.1~1 armhf
podman/unknown 2.0.1~1 s390x
Playbook you run with ansible (e.g. content of playbook.yaml
):
- name: Install Graphite container
  containers.podman.podman_container:
    image: "graphiteapp/graphite-statsd:1.1.7-3"
    name: graphite
    ports:
      - "80:80"
      - "443:443"
      - "2003-2004:2003-2004"
    state: present
    volumes:
      - "graphite-config:/opt/graphite/conf"
      - "graphite-data:/opt/graphite/storage"
      - "nginx-config:/etc/nginx"
      - "/etc/letsencrypt:/etc/letsencrypt"
Command line and output of ansible run with high verbosity:
ansible-playbook site.yml -CD --start-at-task 'Install Graphite container' -vvv
TASK [graphite : Install Graphite container] *************************************************************************************************************************************************
...
Escalation succeeded
<graphite.exp> (1, b'Traceback (most recent call last):\r\n File "/home/ubuntu/.ansible/tmp/ansible-tmp-1593416248.8279798-5071-6472735981908/AnsiballZ_podman_container.py", line 102, in <module>\r\n _ansiballz_main()\r\n File "/home/ubuntu/.ansible/tmp/ansible-tmp-1593416248.8279798-5071-6472735981908/AnsiballZ_podman_container.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/home/ubuntu/.ansible/tmp/ansible-tmp-1593416248.8279798-5071-6472735981908/AnsiballZ_podman_container.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible_collections.containers.podman.plugins.modules.podman_container\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib/python3.8/runpy.py", line 206, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib/python3.8/runpy.py", line 96, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib/python3.8/runpy.py", line 86, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 2070, in <module>\r\n File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 2066, in main\r\n File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 2046, in execute\r\n File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1983, in make_started\r\n File 
"/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1828, in different\r\n File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1755, in is_different\r\n File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1625, in diffparam_publish\r\n File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1626, in <listcomp>\r\nTypeError: string indices must be integers\r\n', b'Shared connection to 34.251.65.188 closed.\r\n')
<graphite.exp> Failed to connect to the host via ssh: Shared connection to XXX.XXX.XXX.XXX closed.
<graphite.exp> ESTABLISH SSH CONNECTION FOR USER: None
<graphite.exp> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/jrsr/.ansible/cp/8193496df9 graphite.exp '/bin/sh -c '"'"'rm -f -r /home/ubuntu/.ansible/tmp/ansible-tmp-1593416248.8279798-5071-6472735981908/ > /dev/null 2>&1 && sleep 0'"'"''
<graphite.exp> (0, b'', b'')
The full traceback is:
Traceback (most recent call last):
File "/home/ubuntu/.ansible/tmp/ansible-tmp-1593416248.8279798-5071-6472735981908/AnsiballZ_podman_container.py", line 102, in <module>
_ansiballz_main()
File "/home/ubuntu/.ansible/tmp/ansible-tmp-1593416248.8279798-5071-6472735981908/AnsiballZ_podman_container.py", line 94, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/home/ubuntu/.ansible/tmp/ansible-tmp-1593416248.8279798-5071-6472735981908/AnsiballZ_podman_container.py", line 40, in invoke_module
runpy.run_module(mod_name='ansible_collections.containers.podman.plugins.modules.podman_container', init_globals=None, run_name='__main__', alter_sys=True)
File "/usr/lib/python3.8/runpy.py", line 206, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/lib/python3.8/runpy.py", line 96, in _run_module_code
_run_code(code, mod_globals, init_globals,
File "/usr/lib/python3.8/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 2070, in <module>
File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 2066, in main
File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 2046, in execute
File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1983, in make_started
File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1828, in different
File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1755, in is_different
File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1625, in diffparam_publish
File "/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 1626, in <listcomp>
TypeError: string indices must be integers
fatal: [graphite.exp]: FAILED! => {
"changed": false,
"module_stderr": "Shared connection to XXX.XXX.XXX.XXX closed.\r\n",
"module_stdout": "Traceback (most recent call last):\r\n File \"/home/ubuntu/.ansible/tmp/ansible-tmp-1593416248.8279798-5071-6472735981908/AnsiballZ_podman_container.py\", line 102, in <module>\r\n _ansiballz_main()\r\n File \"/home/ubuntu/.ansible/tmp/ansible-tmp-1593416248.8279798-5071-6472735981908/AnsiballZ_podman_container.py\", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File \"/home/ubuntu/.ansible/tmp/ansible-tmp-1593416248.8279798-5071-6472735981908/AnsiballZ_podman_container.py\", line 40, in invoke_module\r\n runpy.run_module(mod_name='ansible_collections.containers.podman.plugins.modules.podman_container', init_globals=None, run_name='__main__', alter_sys=True)\r\n File \"/usr/lib/python3.8/runpy.py\", line 206, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File \"/usr/lib/python3.8/runpy.py\", line 96, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File \"/usr/lib/python3.8/runpy.py\", line 86, in _run_code\r\n exec(code, run_globals)\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 2070, in <module>\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 2066, in main\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 2046, in execute\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 1983, in 
make_started\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 1828, in different\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 1755, in is_different\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 1625, in diffparam_publish\r\n File \"/tmp/ansible_containers.podman.podman_container_payload_2vkjv7be/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 1626, in <listcomp>\r\nTypeError: string indices must be integers\r\n",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
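The traceback ends in a list comprehension inside diffparam_publish. A plausible minimal reproduction (an assumption about the cause, not the module's actual code) is that code written for dict-shaped port entries now receives plain strings from podman 2.0.x inspect output, and indexing a string with a key raises exactly this TypeError:

```python
ports_v1 = [{"hostPort": "80"}]  # shape the code apparently expected
ports_v2 = ["80/tcp"]            # string shape assumed for podman 2.0.x

def host_ports(ports):
    # Indexing each entry with a key works for dicts but raises
    # TypeError ("string indices must be integers") for strings.
    return [p["hostPort"] for p in ports]

print(host_ports(ports_v1))  # ['80']
try:
    host_ports(ports_v2)
except TypeError as exc:
    print(exc)
```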
Additional environment details (AWS, VirtualBox, physical, etc.):
AWS
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind feature
Description
The podman_container module doesn't offer mac-address as a field.
Additional information you deem important (e.g. issue happens only occasionally):
Without a mac-address, I can't get a stable IP address through DHCP.
I attached a patched version of the module; it is untested yet, but the change is rather straightforward:
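For illustration, the intended usage might look like this (mac_address is a hypothetical parameter name here; podman itself accepts --mac-address):

```yaml
- name: Container with a fixed MAC address
  containers.podman.podman_container:
    name: dhcp-client
    image: alpine
    mac_address: "42:43:44:45:46:47"
```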
I have an error
- name: Pull an cardano image
  containers.podman.podman_image:
    name: "hub:9080/cardano"
    tag: 3.2.0-13
    auth_file: /root/.podman-auth.json
    state: present
TASK [currency/cardano-podman : Pull an cardano image] *************************************************
fatal: [ada-03]: FAILED! => {"changed": false, "msg": "Failed to pull image hub:9080/cardano:3.2.0-13. Error: Error: error pulling image "hub:9080/cardano:3.2.0-13": unable to pull hub:9080/cardano:3.2.0-13: unable to pull image: Error initializing source docker://hub:9080/cardano:3.2.0-13: error pinging docker registry hub:9080: Get https://hub:9080/v2/: http: server gave HTTP response to HTTPS client\n"}
But if I run podman pull manually, it works fine.
root@ada-03:~# podman pull -q hub:9080/cardano:3.2.0-13
151ea90a6aa201aa601e29d19807ccd3726156808a6473fce5209f6d73b24e5e
root@ada-03:~#
Does podman_image
ignore /etc/containers/registries.conf
file?
# cat /etc/containers/registries.conf
[registries.search]
registries = ['docker.io', 'quay.io']
[registries.insecure]
registries = ['hub:9080', 'hub:9090']
/kind bug
Description
If I create a simple podman network, it is always marked as changed, even when I run the playbook a second time and the network already exists.
Steps to reproduce the issue:
Describe the results you received:
After the first ansible run, the network is created, and after the second run the task is still marked as changed.
Describe the results you expected:
I would expect the second run to report the task as unchanged (ok).
Output of ansible --version
:
ansible 2.9.6
config file = /etc/ansible/ansible.cfg
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.6/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.6.10 (default, Jan 16 2020, 09:12:04) [GCC]
Output of podman version
:
Version: 2.0.6
API Version: 1
Go Version: go1.13.15
Built: Tue Sep 8 14:00:00 2020
OS/Arch: linux/amd64
Output of podman info --debug
:
host:
arch: amd64
buildahVersion: 1.15.1
cgroupVersion: v1
conmon:
package: conmon-2.0.20-lp152.4.3.1.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.20, commit: unknown'
cpus: 4
distribution:
distribution: '"opensuse-leap"'
version: "15.2"
eventLogger: file
hostname: marvin
idMappings:
gidmap: null
uidmap: null
kernel: 5.3.18-lp152.44-default
linkmode: dynamic
memFree: 5742788608
memTotal: 33301647360
ociRuntime:
name: runc
package: runc-1.0.0~rc10-lp152.1.2.x86_64
path: /usr/bin/runc
version: |-
runc version 1.0.0-rc10
spec: 1.0.1-dev
os: linux
remoteSocket:
exists: true
path: /run/podman/podman.sock
rootless: false
slirp4netns:
executable: ""
package: ""
version: ""
swapFree: 8563019776
swapTotal: 8574554112
uptime: 52h 28m 26.26s (Approximately 2.17 days)
registries:
search:
- registry.opensuse.org
- docker.io
store:
configFile: /etc/containers/storage.conf
containerStore:
number: 0
paused: 0
running: 0
stopped: 0
graphDriverName: overlay
graphOptions: {}
graphRoot: /dataDisk/var/lib/containers/storage
graphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "true"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 4
runRoot: /dataDisk/tmp/containers/storage
volumePath: /dataDisk/var/lib/containers/storage/volumes
version:
APIVersion: 1
Built: 1599566400
BuiltTime: Tue Sep 8 14:00:00 2020
GitCommit: ""
GoVersion: go1.13.15
OsArch: linux/amd64
Version: 2.0.6
Package info (e.g. output of rpm -q podman
or apt list podman
):
podman-2.0.6-lp152.4.3.1.x86_64
Playbook you run with ansible (e.g. content of playbook.yaml
):
- name: create mobile_vpn network
containers.podman.podman_network:
name: "mobile_vpn"
subnet: "10.100.0.208/29"
gateway: "10.100.0.209"
Command line and output of ansible run with high verbosity:
TASK [ssh_vpn : create mobile_vpn network] ***********************************************************************************************************************************************************************************************************************************
--- before
+++ after
@@ -1 +1 @@
-disable_dns - True
+disable_dns - False
changed: [marvin] => {"actions": ["recreated mobile_vpn"], "changed": true, "network": {"cniVersion": "0.4.0", "name": "mobile_vpn", "plugins": [{"bridge": "cni-podman2", "hairpinMode": true, "ipMasq": true, "ipam": {"ranges": [[{"gateway": "10.100.0.209", "subnet": "10.100.0.208/29"}]], "routes": [{"dst": "0.0.0.0/0"}], "type": "host-local"}, "isGateway": true, "type": "bridge"}, {"capabilities": {"portMappings": true}, "type": "portmap"}, {"backend": "", "type": "firewall"}]}, "podman_actions": ["podman network rm -f mobile_vpn", "podman network create mobile_vpn --subnet 10.100.0.208/29 --gateway 10.100.0.209"], "stderr": "", "stderr_lines": [], "stdout": "/etc/cni/net.d/mobile_vpn.conflist\n", "stdout_lines": ["/etc/cni/net.d/mobile_vpn.conflist"]}
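The diff suggests the module compares a defaulted disable_dns=False against the existing network's state and recreates it every time. A sketch of a comparison rule that would keep such a task idempotent (the function name is mine; treating an unset option as "keep whatever exists" is the key idea):

```python
def param_differs(desired, actual):
    """Compare one module parameter against the inspected network state.

    Treating an unspecified parameter (None) as "keep whatever exists"
    avoids a perpetual before/after diff like the disable_dns one above,
    where a defaulted value is compared against the existing network.
    """
    if desired is None:  # user did not set the option in the task
        return False
    return desired != actual
```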
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
The podman connection plugin no longer works in non-root mode with Podman 2.
Steps to reproduce the issue:
podman run --name c8 --rm --privileged -v /sys/fs/cgroup:/sys/fs/cgroup:ro docker.io/nmstate/centos8-nmstate-dev
ansible -i c8, -c containers.podman.podman -m setup all
Describe the results you received:
[WARNING]: Unhandled error in Python interpreter discovery for host c8: Expecting value: line 1 column 1 (char 0)
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: FileNotFoundError: [Errno 2] No such file or directory: b'/home/till/.local/share/containers/storage/overlay/ce57833ce38f71131609ff6bd6419280bb2da69dc6026ae29a2701562b149d51/merged/root/.ansible/tmp/ansible-tmp-1593806813.5026007-1934528-92979037090521/AnsiballZ_setup.py'
c8 | FAILED! => {
"msg": "Unexpected failure during module execution.",
"stdout": ""
}
Describe the results you expected:
Successful Ansible command
Additional information you deem important (e.g. issue happens only occasionally):
Output of ansible --version
:
ansible 2.9.10
config file = /home/till/.ansible.cfg
configured module search path = ['/home/till/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.3 (default, May 29 2020, 00:00:00) [GCC 10.1.1 20200507 (Red Hat 10.1.1-1)]
Output of podman version
:
podman version 2.0.1
Output of podman info --debug
:
(paste your output here)
Package info (e.g. output of rpm -q podman
or apt list podman
):
podman-2.0.1-1.fc32.x86_64
Command line and output of ansible run with high verbosity:
ansible -vvvvi c8, -c containers.podman.podman -m setup all
ansible 2.9.10
config file = /home/till/.ansible.cfg
configured module search path = ['/home/till/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.3 (default, May 29 2020, 00:00:00) [GCC 10.1.1 20200507 (Red Hat 10.1.1-1)]
Using /home/till/.ansible.cfg as config file
setting up inventory plugins
Parsed c8, inventory source with host_list plugin
Loading callback plugin minimal of type stdout, v2.0 from /usr/lib/python3.8/site-packages/ansible/plugins/callback/minimal.py
META: ran handlers
Using podman connection from collection
<c8> RUN [b'/usr/bin/podman', b'mount', b'c8']
<c8> RUN [b'/usr/bin/podman', b'exec', b'c8', b'/bin/sh', b'-c', b'echo ~ && sleep 0']
<c8> RUN [b'/usr/bin/podman', b'exec', b'c8', b'/bin/sh', b'-c', b'( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir /root/.ansible/tmp/ansible-tmp-1593806986.5295613-1935085-94392905227443 && echo ansible-tmp-1593806986.5295613-1935085-94392905227443="` echo /root/.ansible/tmp/ansible-tmp-1593806986.5295613-1935085-94392905227443 `" ) && sleep 0']
<c8> Attempting python interpreter discovery
<c8> RUN [b'/usr/bin/podman', b'exec', b'c8', b'/bin/sh', b'-c', b"echo PLATFORM; uname; echo FOUND; command -v '/usr/bin/python'; command -v 'python3.7'; command -v 'python3.6'; command -v 'python3.5'; command -v 'python2.7'; command -v 'python2.6'; command -v '/usr/libexec/platform-python'; command -v '/usr/bin/python3'; command -v 'python'; echo ENDFOUND && sleep 0"]
<c8> RUN [b'/usr/bin/podman', b'exec', b'c8', b'/bin/sh', b'-c', b'/usr/bin/python && sleep 0']
[WARNING]: Unhandled error in Python interpreter discovery for host c8: Expecting value: line 1 column 1 (char 0)
Using module file /usr/lib/python3.8/site-packages/ansible/modules/system/setup.py
<c8> PUT /home/till/.ansible/tmp/ansible-local-1935082f_sfc05r/tmpzicpeliy TO /root/.ansible/tmp/ansible-tmp-1593806986.5295613-1935085-94392905227443/AnsiballZ_setup.py
<c8> RUN [b'/usr/bin/podman', b'exec', b'c8', b'/bin/sh', b'-c', b'rm -f -r /root/.ansible/tmp/ansible-tmp-1593806986.5295613-1935085-94392905227443/ > /dev/null 2>&1 && sleep 0']
The full traceback is:
Traceback (most recent call last):
File "/usr/lib/python3.8/site-packages/ansible/executor/task_executor.py", line 146, in run
res = self._execute()
File "/usr/lib/python3.8/site-packages/ansible/executor/task_executor.py", line 654, in _execute
result = self._handler.run(task_vars=variables)
File "/usr/lib/python3.8/site-packages/ansible/plugins/action/normal.py", line 46, in run
result = merge_hash(result, self._execute_module(task_vars=task_vars, wrap_async=wrap_async))
File "/usr/lib/python3.8/site-packages/ansible/plugins/action/__init__.py", line 858, in _execute_module
self._transfer_data(remote_module_path, module_data)
File "/usr/lib/python3.8/site-packages/ansible/plugins/action/__init__.py", line 469, in _transfer_data
self._transfer_file(afile, remote_path)
File "/usr/lib/python3.8/site-packages/ansible/plugins/action/__init__.py", line 446, in _transfer_file
self._connection.put_file(local_path, remote_path)
File "/home/till/.ansible/collections/ansible_collections/containers/podman/plugins/connection/podman.py", line 184, in put_file
shutil.copyfile(
File "/usr/lib64/python3.8/shutil.py", line 261, in copyfile
with open(src, 'rb') as fsrc, open(dst, 'wb') as fdst:
FileNotFoundError: [Errno 2] No such file or directory: b'/home/till/.local/share/containers/storage/overlay/ce57833ce38f71131609ff6bd6419280bb2da69dc6026ae29a2701562b149d51/merged/root/.ansible/tmp/ansible-tmp-1593806986.5295613-1935085-94392905227443/AnsiballZ_setup.py'
c8 | FAILED! => {
"msg": "Unexpected failure during module execution.",
"stdout": ""
}
Additional environment details (AWS, VirtualBox, physical, etc.):
Downgrading to podman-2:1.8.2-2.fc32 fixes the problem.
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind feature
Description
The CNI bridge plugin/driver supports setting custom MTU value, but it's not possible to set this value via containers.podman.podman_network. Please add support for MTU setting directly or via some general custom fields. Thanks!
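For reference, the CNI bridge plugin takes the MTU as an "mtu" field in the generated conflist; a sketch of the plugin entry the requested option would need to emit (the helper name is mine, and the field layout mirrors the conflist output quoted in the network issue above):

```python
import json


def bridge_plugin(mtu=None):
    """Build a CNI bridge plugin entry; the bridge plugin accepts an
    optional "mtu" field, which is what a podman_network mtu option
    would need to write into the generated conflist."""
    plugin = {
        "type": "bridge",
        "bridge": "cni-podman2",
        "isGateway": True,
        "ipMasq": True,
    }
    if mtu:
        plugin["mtu"] = mtu
    return plugin


conf = json.dumps(bridge_plugin(mtu=9000))
```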
Podman modules podman_container and podman_network_info have tests; import them into the repo.
/kind feature
Description
Use same structure as docker_container for mounts
Additional information you deem important (e.g. issue happens only occasionally):
This should be done now, so users can switch from docker_container to podman_container with ease (#99 )
Changes
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
When using podman_container, I am getting the below situation.
Steps to reproduce the issue:
ansible-galaxy collection install containers.podman
Describe the results you received:
On the first run, everything worked and the playbook was executed.
Every run after the initial one shows issues:
the container is not started.
Describe the results you expected:
A container is started on the target machine and running busybox.
Additional information you deem important (e.g. issue happens only occasionally):
Output of ansible --version
:
ansible 2.10.1
config file = /var/home/dschier/Projects/laplab/ansible.cfg
configured module search path = ['/var/home/dschier/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /var/home/dschier/.venv-python3/lib64/python3.8/site-packages/ansible
executable location = /var/home/dschier/.venv-python3/bin/ansible
python version = 3.8.5 (default, Aug 12 2020, 00:00:00) [GCC 10.2.1 20200723 (Red Hat 10.2.1-1)]
Output of podman version
:
podman version 2.1.1
Output of podman info --debug
:
host:
arch: amd64
buildahVersion: 1.16.1
cgroupManager: systemd
cgroupVersion: v2
conmon:
package: conmon-2.0.21-2.fc32.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.21, commit: 81d18b6c3ffc266abdef7ca94c1450e669a6a388'
cpus: 8
distribution:
distribution: fedora
version: "32"
eventLogger: journald
hostname: nb01
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 5.8.12-200.fc32.x86_64
linkmode: dynamic
memFree: 286470144
memTotal: 16600121344
ociRuntime:
name: crun
package: crun-0.15-5.fc32.x86_64
path: /usr/bin/crun
version: |-
crun version 0.15
commit: 56ca95e61639510c7dbd39ff512f80f626404969
spec: 1.0.0
+SYSTEMD +SELINUX +APPARMOR +CAP +SECCOMP +EBPF +YAJL
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: slirp4netns-1.1.4-1.fc32.x86_64
version: |-
slirp4netns version 1.1.4
commit: b66ffa8e262507e37fca689822d23430f3357fe8
libslirp: 4.3.1
SLIRP_CONFIG_VERSION_MAX: 2
swapFree: 8364486656
swapTotal: 8371826688
uptime: 84h 27m 17.54s (Approximately 3.50 days)
registries:
search:
- registry.fedoraproject.org
- registry.access.redhat.com
- registry.centos.org
- docker.io
store:
configFile: /var/home/dschier/.config/containers/storage.conf
containerStore:
number: 3
paused: 0
running: 3
stopped: 0
graphDriverName: overlay
graphOptions:
overlay.mount_program:
Executable: /usr/bin/fuse-overlayfs
Package: fuse-overlayfs-1.1.2-1.fc32.x86_64
Version: |-
fusermount3 version: 3.9.1
fuse-overlayfs: version 1.1.0
FUSE library version 3.9.1
using FUSE kernel interface version 7.31
graphRoot: /var/home/dschier/.local/share/containers/storage
graphStatus:
Backing Filesystem: extfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 18
runRoot: /run/user/1000
volumePath: /var/home/dschier/.local/share/containers/storage/volumes
version:
APIVersion: 2.0.0
Built: 1601494271
BuiltTime: Wed Sep 30 21:31:11 2020
GitCommit: ""
GoVersion: go1.14.9
OsArch: linux/amd64
Version: 2.1.1
Package info (e.g. output of rpm -q podman
or apt list podman
):
podman-2.1.1-7.fc32.x86_64
Playbook you run with ansible (e.g. content of playbook.yaml
):
---
- hosts: localhost
connection: local
tasks:
- name: Create container
containers.podman.podman_container:
name: mydata
image: busybox
command: sleep 3000
Command line and output of ansible run with high verbosity:
ansible-playbook 2.10.1
config file = None
configured module search path = ['/var/home/dschier/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /var/home/dschier/.venv-python3/lib64/python3.8/site-packages/ansible
executable location = /var/home/dschier/.venv-python3/bin/ansible-playbook
python version = 3.8.5 (default, Aug 12 2020, 00:00:00) [GCC 10.2.1 20200723 (Red Hat 10.2.1-1)]
No config file found; using defaults
Parsed localhost, inventory source with host_list plugin
PLAYBOOK: container-iot.yml *****************************************************************************
1 plays in container-iot.yml
PLAY [localhost] ****************************************************************************************
TASK [Gathering Facts] **********************************************************************************
task path: /var/home/dschier/Projects/container-iot.yml:2
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: dschier
<localhost> EXEC /bin/sh -c 'echo ~dschier && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /var/home/dschier/.ansible/tmp `"&& mkdir "` echo /var/home/dschier/.ansible/tmp/ansible-tmp-1601970919.5963953-210560-34787772624167 `" && echo ansible-tmp-1601970919.5963953-210560-34787772624167="` echo /var/home/dschier/.ansible/tmp/ansible-tmp-1601970919.5963953-210560-34787772624167 `" ) && sleep 0'
<localhost> Attempting python interpreter discovery
<localhost> EXEC /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'python3.5'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'python2.6'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python && sleep 0'
Using module file /var/home/dschier/.venv-python3/lib64/python3.8/site-packages/ansible/modules/setup.py
<localhost> PUT /var/home/dschier/.ansible/tmp/ansible-local-2105504st2r_v5/tmp7pbm3d6c TO /var/home/dschier/.ansible/tmp/ansible-tmp-1601970919.5963953-210560-34787772624167/AnsiballZ_setup.py
<localhost> EXEC /bin/sh -c 'chmod u+x /var/home/dschier/.ansible/tmp/ansible-tmp-1601970919.5963953-210560-34787772624167/ /var/home/dschier/.ansible/tmp/ansible-tmp-1601970919.5963953-210560-34787772624167/AnsiballZ_setup.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /var/home/dschier/.ansible/tmp/ansible-tmp-1601970919.5963953-210560-34787772624167/AnsiballZ_setup.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /var/home/dschier/.ansible/tmp/ansible-tmp-1601970919.5963953-210560-34787772624167/ > /dev/null 2>&1 && sleep 0'
[DEPRECATION WARNING]: Distribution fedora 32 on host localhost should use /usr/bin/python3, but is
using /usr/bin/python for backward compatibility with prior Ansible releases. A future Ansible release
will default to using the discovered platform python for this host. See
https://docs.ansible.com/ansible/2.10/reference_appendices/interpreter_discovery.html for more
information. This feature will be removed in version 2.12. Deprecation warnings can be disabled by
setting deprecation_warnings=False in ansible.cfg.
ok: [localhost]
META: ran handlers
TASK [Create container] *********************************************************************************
task path: /var/home/dschier/Projects/container-iot.yml:7
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: dschier
<localhost> EXEC /bin/sh -c 'echo ~dschier && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /var/home/dschier/.ansible/tmp `"&& mkdir "` echo /var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805 `" && echo ansible-tmp-1601970920.596172-210634-176221820881805="` echo /var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805 `" ) && sleep 0'
Using module file /var/home/dschier/.ansible/collections/ansible_collections/containers/podman/plugins/modules/podman_container.py
<localhost> PUT /var/home/dschier/.ansible/tmp/ansible-local-2105504st2r_v5/tmppxgvmy3i TO /var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/AnsiballZ_podman_container.py
<localhost> EXEC /bin/sh -c 'chmod u+x /var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/ /var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/AnsiballZ_podman_container.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/AnsiballZ_podman_container.py && sleep 0'
<localhost> EXEC /bin/sh -c 'rm -f -r /var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/ > /dev/null 2>&1 && sleep 0'
The full traceback is:
Traceback (most recent call last):
File "/var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/AnsiballZ_podman_container.py", line 102, in <module>
_ansiballz_main()
File "/var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/AnsiballZ_podman_container.py", line 94, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/AnsiballZ_podman_container.py", line 40, in invoke_module
runpy.run_module(mod_name='ansible_collections.containers.podman.plugins.modules.podman_container', init_globals=None, run_name='__main__', alter_sys=True)
File "/usr/lib64/python3.8/runpy.py", line 207, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/lib64/python3.8/runpy.py", line 97, in _run_module_code
_run_code(code, mod_globals, init_globals,
File "/usr/lib64/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/tmp/ansible_containers.podman.podman_container_payload_8bysk9p_/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py", line 832, in <module>
ModuleNotFoundError: No module named 'yaml'
fatal: [localhost]: FAILED! => {
"changed": false,
"module_stderr": "Traceback (most recent call last):\n File \"/var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/AnsiballZ_podman_container.py\", line 102, in <module>\n _ansiballz_main()\n File \"/var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/AnsiballZ_podman_container.py\", line 94, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File \"/var/home/dschier/.ansible/tmp/ansible-tmp-1601970920.596172-210634-176221820881805/AnsiballZ_podman_container.py\", line 40, in invoke_module\n runpy.run_module(mod_name='ansible_collections.containers.podman.plugins.modules.podman_container', init_globals=None, run_name='__main__', alter_sys=True)\n File \"/usr/lib64/python3.8/runpy.py\", line 207, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File \"/usr/lib64/python3.8/runpy.py\", line 97, in _run_module_code\n _run_code(code, mod_globals, init_globals,\n File \"/usr/lib64/python3.8/runpy.py\", line 87, in _run_code\n exec(code, run_globals)\n File \"/tmp/ansible_containers.podman.podman_container_payload_8bysk9p_/ansible_containers.podman.podman_container_payload.zip/ansible_collections/containers/podman/plugins/modules/podman_container.py\", line 832, in <module>\nModuleNotFoundError: No module named 'yaml'\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
PLAY RECAP **********************************************************************************************
localhost : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
Additional environment details (AWS, VirtualBox, physical, etc.):
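The ModuleNotFoundError means PyYAML is missing from the interpreter that actually ran the module (/usr/bin/python, not the venv that launched ansible-playbook). Ansible modules conventionally guard such optional imports and fail with a readable message instead of a raw traceback; a sketch of that pattern:

```python
# Guarded-import pattern commonly used by Ansible modules for optional deps.
HAS_YAML = True
try:
    import yaml  # noqa: F401  -- PyYAML, which podman_container imports
except ImportError:
    HAS_YAML = False


def require_yaml():
    """Return a friendly error message when PyYAML is missing; a real
    module would pass this to module.fail_json(msg=...)."""
    if not HAS_YAML:
        return "Missing required Python library on the target host: yaml (PyYAML)"
    return None
```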
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind feature
Description
Steps to reproduce the issue:
Want to know how to use podman_container
Search for docs.
Describe the results you received:
Not found!
Describe the results you expected:
Found.
Additional information you deem important (e.g. issue happens only occasionally):
Some modules docs can be found in https://docs.ansible.com/ansible/latest/modules/list_of_all_modules.html but are possibly outdated.
Output of ansible --version
:
ansible 2.9.9
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/yajo/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.3 (default, May 15 2020, 00:00:00) [GCC 10.1.1 20200507 (Red Hat 10.1.1-1)]
In all *Diff: classes where we test the configuration for changes we dump the current config to json, convert it to lower case and load it again into an dict object.
e.g: podman_container.py => PodmanContainerDiff =>
self.info = yaml.safe_load(json.dumps(info).lower())
self.image_info = yaml.safe_load(json.dumps(image_info).lower())
My guess is that the reason is to get lower-case config keys, but a bad side effect is that the values are converted to lowercase as well. I now have two scenarios where this breaks:
We have a config value, such as a volume path, where upper/lower case matters, and we are unable to change it from uppercase to lowercase. No change is detected, because the original config is already lowercased before the comparison and the new value is lowercase too.
The old value is uppercase and the new value is the same uppercase string, so there should be no change. Here a change is always detected, because the old value is lowercased before being compared with the (still uppercase) new value.
My question is what a proper fix would look like. My recommendation is to convert only the keys to lowercase, not the values.
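The key-only lowercasing suggested above can be sketched as follows (the function name is mine):

```python
def lowercase_keys(obj):
    """Recursively lower-case dict keys while leaving values untouched,
    as a replacement for yaml.safe_load(json.dumps(info).lower()),
    which also mangles case-sensitive values such as volume paths."""
    if isinstance(obj, dict):
        return {k.lower(): lowercase_keys(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [lowercase_keys(v) for v in obj]
    return obj
```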
using current migration scenario:
collection_migration/scenarios/containers/containers.yml
podman:
connection:
- buildah.py
- podman.py
modules:
- cloud/podman/__init__.py
- cloud/podman/podman_container_info.py
- cloud/podman/podman_image_info.py
- cloud/podman/podman_image.py
- cloud/podman/podman_volume_info.py
module_utils:
- podman/common.py
- podman/__init__.py
integration:
- connection_podman/*
- podman_*/*
- podman_*/*/*
- setup_podman/*
- setup_podman/*/*
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
The publish
option on podman_container breaks idempotency, recreating the container every time.
Steps to reproduce the issue:
Run a playbook that specifies a container with a published port twice.
Describe the results you received:
The container is recreated on every run.
Describe the results you expected:
The container is not recreated.
Additional information you deem important (e.g. issue happens only occasionally):
The ports are actually published, even though the gathered info indicates otherwise.
Output of ansible --version
:
ansible 2.9.13
config file = None
configured module search path = ['/Users/mka/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/Cellar/ansible/2.9.13/libexec/lib/python3.8/site-packages/ansible
executable location = /usr/local/bin/ansible
python version = 3.8.5 (default, Jul 21 2020, 10:48:26) [Clang 11.0.3 (clang-1103.0.32.62)]
Output of podman version
:
podman version 2.1.0
Output of podman info --debug
:
host:
arch: amd64
buildahVersion: 1.16.1
cgroupVersion: v1
conmon:
package: 'conmon: /usr/libexec/podman/conmon'
path: /usr/libexec/podman/conmon
version: 'conmon version 2.0.20, commit: '
cpus: 16
distribution:
distribution: ubuntu
version: "20.04"
eventLogger: journald
hostname: george
idMappings:
gidmap: null
uidmap: null
kernel: 5.4.0-48-generic
linkmode: dynamic
memFree: 133869850624
memTotal: 135027011584
ociRuntime:
name: runc
package: 'runc: /usr/sbin/runc'
path: /usr/sbin/runc
version: 'runc version spec: 1.0.1-dev'
os: linux
remoteSocket:
path: /run/podman/podman.sock
rootless: false
slirp4netns:
executable: ""
package: ""
version: ""
swapFree: 0
swapTotal: 0
uptime: 12m 33.94s
registries:
search:
- docker.io
- quay.io
store:
configFile: /etc/containers/storage.conf
containerStore:
number: 1
paused: 0
running: 1
stopped: 0
graphDriverName: overlay
graphOptions: {}
graphRoot: /var/lib/containers/storage
graphStatus:
Backing Filesystem: xfs
Native Overlay Diff: "true"
Supports d_type: "true"
Using metacopy: "false"
imageStore:
number: 1
runRoot: /var/run/containers/storage
volumePath: /var/lib/containers/storage/volumes
version:
APIVersion: 2.0.0
Built: 0
BuiltTime: Thu Jan 1 00:00:00 1970
GitCommit: ""
GoVersion: go1.15.2
OsArch: linux/amd64
Version: 2.1.0
Package info (e.g. output of rpm -q podman
or apt list podman
):
podman/unknown,now 2.1.0~1 amd64 [installed]
Playbook you run with ansible (e.g. content of playbook.yaml
):
---
- hosts: all
become: yes
tasks:
- name: Create the rrrouter container
containers.podman.podman_container:
name: rrrouter
image: "richiefi/rrrouter:latest"
network: host
publish:
- "9100:9100"
read_only: true
restart_policy: always
env:
MAPPING_FILE: "/etc/richie/rrrouter-mapping.yml"
PORT: "9100"
volume:
- "/etc/richie/rrrouter-mapping.yml:/etc/richie/rrrouter-mapping.yml"
register: rrrouter_container_result
Command line and output of ansible run with high verbosity:
TASK [Create the rrrouter container] *****************************************************************************
--- before
+++ after
@@ -1 +1 @@
-publish - []
+publish - ['9100:9100']
Additional environment details (AWS, VirtualBox, physical, etc.):
This is on Ubuntu 20.04.
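With network: host, podman publishes nothing and `podman inspect` reports an empty port map, which would explain the permanent publish diff above. A sketch of a comparison that accounts for this (a heuristic of mine, not the module's actual logic):

```python
def publish_differs(desired_publish, inspected_ports, network):
    """With host networking, published ports never appear in the
    inspect output, so comparing the requested publish list against
    it would recreate the container forever; skip the comparison in
    that case and compare normally otherwise."""
    if network == "host":
        return False
    return sorted(desired_publish or []) != sorted(inspected_ports or [])
```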
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
Building an OCI container image using a Dockerfile fails sometimes.
Steps to reproduce the issue:
Define a Dockerfile and build an OCI container image via
- name: build own Jenkins container
become: yes
podman_image:
state: build
name: own_jenkins:latest
force: yes
path: "{{ configure_jenkins_container_folder }}"
build:
cache: yes
force_rm: yes
format: oci
rm: yes
Describe the results you received:
Task fails even though the image seems to be built. The image is listed via
$ sudo podman images
REPOSITORY TAG IMAGE ID CREATED SIZE
localhost/own_jenkins latest d6670b64c623 26 seconds ago 703 MB
TASK [configure_jenkins : build own Jenkins container] *************************
irtualSize\": 703042146, \"GraphDriver\": {\"Name\": \"overlay\", \"Data\": {\"LowerDir\": \"/var/lib/containers/storage/overlay/dfa510163ca63d255ecc63f1282a33f55048e7c463c3954bd3f8a5f2259452bf/diff:/var/lib/containers/storage/overlay/664f1885251c6e243208977e957c99918c46e498b893b9d5de7f1832ba718b8a/diff:/var/lib/containers/storage/overlay/5a78be820e8fefc09e08af99a7b5988dd0a39458b60442912fb29e5676bf2dd2/diff:/var/lib/containers/storage/overlay/63fd877a80e86030625ae7d4e0d44b81bae98ceb94e94a5fdd6c302c1d28f45b/diff:/var/lib/containers/storage/overlay/a88f0528325dd9d7e97d4befa7e94e21e07536fbcf1d3301bd6658f7eac3f9a6/diff:/var/lib/containers/storage/overlay/c2274d9f2a9971867ec79bf207e42a16fb1f14bf815d6e8740430dacc88d8785/diff:/var/lib/containers/storage/overlay/bce3558237d0c0afbd7889ae8dd3540690a6337ffe24711e729e0a2fe9e8bf33/diff:/var/lib/containers/storage/overlay/913e80387050faa9e08015de966f213dd8fdeac746abcd77d88219fe59eb282b/diff:/var/lib/containers/storage/overlay/bd7aa483693493b4a08403d5aa95beef3910421bf9ae4c0694475f267b73c2b7/diff:/var/lib/containers/storage/overlay/2d1241b71b044bd2aae6c1773403b2d6ad2f2f2753fc983a448d56e28c6ff303/diff:/var/lib/containers/storage/overlay/0e379119c6cd9edc25624e817155260ed0e0bf27a34747d07cca00d0d9c535da/diff:/var/lib/containers/storage/overlay/8878e4743896dcc6d9df1828ad89cdac1318bc47e1e8423f6fae21647f133c62/diff:/var/lib/containers/storage/overlay/e52d47750762e598165e24a4ff617604792ce598476b5e234e885b5722d0eed8/diff:/var/lib/containers/storage/overlay/eb70dd733080d6dccaed95bc3d792078e6bbddce015457f2cac3598b610769d4/diff:/var/lib/containers/storage/overlay/4acb7d7ee512db214eb30693b56108cf6c6f47f0093c5f4940b92b89c5dbebc0/diff:/var/lib/containers/storage/overlay/84ef69b6ed5f0015840f7ed80876cf3792c05c2bbf72ce006ed8a67e9c8ae03b/diff:/var/lib/containers/storage/overlay/c588bce0921bd3ae8ca65a2203c722745c0acd69b1f4380546184d2e387ff138/diff:/var/lib/containers/storage/overlay/3a798a3dc49789ccb6dc626c6edb0f3aa4e3343e96393a2adcf83a6ec8f94d8d/diff
:/var/lib/containers/storage/overlay/6f33a85672c80ba01bb2a5034f06c9784c25247c02f992f49d1a16810c94263e/diff:/var/lib/containers/storage/overlay/c1bd2f1a5b3b63a20f976dbe5bc64e4d87187d7b6283a86b6a1310356be4c3b6/diff:/var/lib/containers/storage/overlay/150db6730074508a4a369098738fd79cec29a9438c6b0e0a72f7bcdecf552779/diff:/var/lib/containers/storage/overlay/25eb51e911b02cfe76adba09f93fe7ded3f212db4a05710718dfcefe6cfe3746/diff:/var/lib/containers/storage/overlay/2f3cf89f4bc52573b720975440b7b65aa707b690b8e3b121c1a8e1ad138aef8e/diff:/var/lib/containers/storage/overlay/7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9/diff\", \"UpperDir\": \"/var/lib/containers/storage/overlay/e4e21b11c88d49daf2ec1e8fac3f2373dc897626be88cc0e0d2b337bca2530a0/diff\", \"WorkDir\": \"/var/lib/containers/storage/overlay/e4e21b11c88d49daf2ec1e8fac3f2373dc897626be88cc0e0d2b337bca2530a0/work\"}}, \"RootFS\": {\"Type\": \"layers\", \"Layers\": [\"sha256:7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9\", \"sha256:4d1ab3827f6b69f4e55bd69cc8abe1dde7d7a7f61bd6e32c665f12e0a8efd1c9\", \"sha256:69dfa7bd7a92b8ba12a617ff56f22533223007c5ba6b3a191c91b852320f012e\", \"sha256:01727b1a72df8ba02293a98ab365bb4e8015aefadd661aaf7e6aa76567b706b9\", \"sha256:e43c0c41b833ec88f51b6fdb7c5faa32c76a32dbefdeb602969b74720ecf47c9\", \"sha256:bd76253da83ab721c5f9deed421f66db1406d89f720387b799dfe5503b797a90\", \"sha256:d81d8fa6dfd451a45e0161e76e3475e4e30e87e1cc1e9839509aa7c3ba42b5dd\", \"sha256:f161be957652c6ed6ffb9628fe833a1524ddccfbd5499039d75107ff4c4706cd\", \"sha256:631a8cd2cb056e41704102fab7338173883273c30ed9ef085035b9d54357fbda\", \"sha256:f57fcddcd21311556c453c009d6ffb31bd42e86f5e9d632d2c0e505674f7eb6b\", \"sha256:3e82e2066cb309b852f4954d30327b3c549fb856171209edcb5d64e64d231181\", \"sha256:9b9811cf4eb368566864a7bec3e7025fa5253baa02d1c381557d020bb08a2b59\", \"sha256:8afb2f989d486191c930b34b37a5a9c374c399b6763b6bae7ecc49a022b739a0\", 
\"sha256:2d50b321cf2cb23d39f2e448d1b26e28c49183f2f0f7caaca56d08793d7cd301\", \"sha256:698d1e1f868243a7ba8d490928007867bf2033b795551ce522d9beae3aa99bd2\", \"sha256:5fe6ec06b025070d593ebc7ab733ee6a3ca0f03ef794e46a0574fbc0db4a05e9\", \"sha256:7b23eed1281de14bc4c2914f5a0bab87093fcd3174efea2c39218d482da78e84\", \"sha256:4f8774dcadf6aee646abff26c622fe8aa87bee5974766df41222b2f22536465d\", \"sha256:7a4ef5d59373586055068d4bb4e6c032031da4bdf5618faa52a1366c95e474d8\", \"sha256:c55fccaaed8faec0e993e99303c8873bd0f9b87505c01755b29c34ea5ee38fdf\", \"sha256:c5c0e14f8528685ad5278e677273e44c57d4acabb98df4dd2f574980400be112\", \"sha256:252b998e155c88013250f31fc73533599c4d8a62573f5bb2738a71317e4ca4b0\", \"sha256:9421276e074f22e8e4caf1fb6b3f906c4a69e676b1e2a09b7a87f85d09e63c0f\", \"sha256:ffe3c401910b395913f2d64459ac25938543ae5c44be675b8c1b3699dad914fc\", \"sha256:f672236df0f10964ab214e0e05218dacac705828c8c2ce3ac40bc384cfe16dc7\"]}, \"Labels\": null, \"Annotations\": {}, \"ManifestType\": \"application/vnd.oci.image.manifest.v1+json\", \"User\": \"jenkins\", \"History\": [{\"created\": \"2020-02-01T17:23:41.468764783Z\", \"created_by\": \"/bin/sh -c #(nop) ADD file:8a9218592e5d736a05a1821a6dd38b205cdd8197c26a5aa33f6fc22fbfaa1c4d in / \"}, {\"created\": \"2020-02-01T17:23:41.779347941Z\", \"created_by\": \"/bin/sh -c #(nop) CMD [\\\"bash\\\"]\", \"empty_layer\": true}, {\"created\": \"2020-02-02T00:33:57.472084922Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get install -y --no-install-recommends \\t\\tca-certificates \\t\\tcurl \\t\\tnetbase \\t\\twget \\t&& rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T00:34:02.575926722Z\", \"created_by\": \"/bin/sh -c set -ex; \\tif ! 
command -v gpg > /dev/null; then \\t\\tapt-get update; \\t\\tapt-get install -y --no-install-recommends \\t\\t\\tgnupg \\t\\t\\tdirmngr \\t\\t; \\t\\trm -rf /var/lib/apt/lists/*; \\tfi\"}, {\"created\": \"2020-02-02T00:34:28.382621018Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get install -y --no-install-recommends \\t\\tbzr \\t\\tgit \\t\\tmercurial \\t\\topenssh-client \\t\\tsubversion \\t\\t\\t\\tprocps \\t&& rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T06:26:12.069777635Z\", \"created_by\": \"/bin/sh -c set -eux; \\tapt-get update; \\tapt-get install -y --no-install-recommends \\t\\tbzip2 \\t\\tunzip \\t\\txz-utils \\t\\t\\t\\tca-certificates p11-kit \\t\\t\\t\\tfontconfig libfreetype6 \\t; \\trm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T06:26:12.30328465Z\", \"created_by\": \"/bin/sh -c #(nop) ENV LANG=C.UTF-8\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:13.86592879Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_HOME=/usr/local/openjdk-8\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:14.22638233Z\", \"created_by\": \"/bin/sh -c #(nop) ENV PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:15.751830342Z\", \"created_by\": \"/bin/sh -c { echo '#/bin/sh'; echo 'echo \\\"$JAVA_HOME\\\"'; } > /usr/local/bin/docker-java-home && chmod +x /usr/local/bin/docker-java-home && [ \\\"$JAVA_HOME\\\" = \\\"$(docker-java-home)\\\" ]\"}, {\"created\": \"2020-02-02T06:28:16.075627774Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_VERSION=8u242\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:16.435762402Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u242-b08/OpenJDK8U-jdk_\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:16.735005969Z\", \"created_by\": \"/bin/sh -c #(nop) ENV 
JAVA_URL_VERSION=8u242b08\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:28.536242187Z\", \"created_by\": \"/bin/sh -c set -eux; \\t\\tdpkgArch=\\\"$(dpkg --print-architecture)\\\"; \\tcase \\\"$dpkgArch\\\" in \\t\\tamd64) upstreamArch='x64' ;; \\t\\tarm64) upstreamArch='aarch64' ;; \\t\\t*) echo >&2 \\\"error: unsupported architecture: $dpkgArch\\\" ;; \\tesac; \\t\\twget -O openjdk.tgz.asc \\\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz.sign\\\"; \\twget -O openjdk.tgz \\\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz\\\" --progress=dot:giga; \\t\\texport GNUPGHOME=\\\"$(mktemp -d)\\\"; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --keyserver-options no-self-sigs-only --recv-keys CA5F11C6CE22644D42C6AC4492EF8D39DC13168F; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys EAC843EBD3EFDB98CC772FADA5CD6035332FA671; \\tgpg --batch --list-sigs --keyid-format 0xLONG CA5F11C6CE22644D42C6AC4492EF8D39DC13168F \\t\\t| tee /dev/stderr \\t\\t| grep '0xA5CD6035332FA671' \\t\\t| grep 'Andrew Haley'; \\tgpg --batch --verify openjdk.tgz.asc openjdk.tgz; \\tgpgconf --kill all; \\trm -rf \\\"$GNUPGHOME\\\"; \\t\\tmkdir -p \\\"$JAVA_HOME\\\"; \\ttar --extract \\t\\t--file openjdk.tgz \\t\\t--directory \\\"$JAVA_HOME\\\" \\t\\t--strip-components 1 \\t\\t--no-same-owner \\t; \\trm openjdk.tgz*; \\t\\t\\t{ \\t\\techo '#!/usr/bin/env bash'; \\t\\techo 'set -Eeuo pipefail'; \\t\\techo 'if ! [ -d \\\"$JAVA_HOME\\\" ]; then echo >&2 \\\"error: missing JAVA_HOME environment variable\\\"; exit 1; fi'; \\t\\techo 'cacertsFile=; for f in \\\"$JAVA_HOME/lib/security/cacerts\\\" \\\"$JAVA_HOME/jre/lib/security/cacerts\\\"; do if [ -e \\\"$f\\\" ]; then cacertsFile=\\\"$f\\\"; break; fi; done'; \\t\\techo 'if [ -z \\\"$cacertsFile\\\" ] || ! 
[ -f \\\"$cacertsFile\\\" ]; then echo >&2 \\\"error: failed to find cacerts file in $JAVA_HOME\\\"; exit 1; fi'; \\t\\techo 'trust extract --overwrite --format=java-cacerts --filter=ca-anchors --purpose=server-auth \\\"$cacertsFile\\\"'; \\t} > /etc/ca-certificates/update.d/docker-openjdk; \\tchmod +x /etc/ca-certificates/update.d/docker-openjdk; \\t/etc/ca-certificates/update.d/docker-openjdk; \\t\\tfind \\\"$JAVA_HOME/lib\\\" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf; \\tldconfig; \\t\\tjavac -version; \\tjava -version\"}, {\"created\": \"2020-03-08T02:08:28.421510268Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get upgrade -y && apt-get install -y git curl && curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash && apt-get install -y git-lfs && git lfs install && rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-03-08T02:08:31.93114162Z\", \"created_by\": \"/bin/sh -c #(nop) ARG user=jenkins\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:33.055659115Z\", \"created_by\": \"/bin/sh -c #(nop) ARG group=jenkins\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:33.874837809Z\", \"created_by\": \"/bin/sh -c #(nop) ARG uid=1000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:34.884950047Z\", \"created_by\": \"/bin/sh -c #(nop) ARG gid=1000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:35.896784898Z\", \"created_by\": \"/bin/sh -c #(nop) ARG http_port=8080\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:36.871761947Z\", \"created_by\": \"/bin/sh -c #(nop) ARG agent_port=50000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:37.961637985Z\", \"created_by\": \"/bin/sh -c #(nop) ARG JENKINS_HOME=/var/jenkins_home\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:38.89036682Z\", \"created_by\": \"/bin/sh -c #(nop) ARG REF=/usr/share/jenkins/ref\", \"empty_layer\": true}, {\"created\": 
\"2020-03-08T02:08:39.868974806Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_HOME=/var/jenkins_home\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:40.877764659Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_SLAVE_AGENT_PORT=50000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:41.871871056Z\", \"created_by\": \"/bin/sh -c #(nop) ENV REF=/usr/share/jenkins/ref\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:44.011833083Z\", \"created_by\": \"|6 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c mkdir -p $JENKINS_HOME && chown ${uid}:${gid} $JENKINS_HOME && groupadd -g ${gid} ${group} && useradd -d \\\"$JENKINS_HOME\\\" -u ${uid} -g ${gid} -m -s /bin/bash ${user}\"}, {\"created\": \"2Traceback (most recent call last):\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589265084.5838895-41905-244756486402251/AnsiballZ_podman_image.py\", line 102, in <module>\r\n _ansiballz_main()\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589265084.5838895-41905-244756486402251/AnsiballZ_podman_image.py\", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589265084.5838895-41905-244756486402251/AnsiballZ_podman_image.py\", line 40, in invoke_module\r\n runpy.run_module(mod_name='ansible.modules.cloud.podman.podman_image', init_globals=None, run_name='__main__', alter_sys=True)\r\n File \"/usr/lib64/python3.8/runpy.py\", line 206, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File \"/usr/lib64/python3.8/runpy.py\", line 96, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File \"/usr/lib64/python3.8/runpy.py\", line 86, in _run_code\r\n exec(code, run_globals)\r\n File \"/tmp/ansible_podman_image_payload_kp1d99z5/ansible_podman_image_payload.zip/ansible/modules/cloud/podman/podman_image.py\", line 725, in <module>\r\n File 
\"/tmp/ansible_podman_image_payload_kp1d99z5/ansible_podman_image_payload.zip/ansible/modules/cloud/podman/podman_image.py\", line 721, in main\r\n File \"/tmp/ansible_podman_image_payload_kp1d99z5/ansible_podman_image_payload.zip/ansible/module_utils/basic.py\", line 2071, in exit_json\r\n File \"/tmp/ansible_podman_image_payload_kp1d99z5/ansible_podman_image_payload.zip/ansible/module_utils/basic.py\", line 2065, in _return_formatted\r\nBlockingIOError: [Errno 11] write could not complete without blocking\r\n", "msg": "MODULE FAILURE\nSee stdoufatal: [dev_jenkins_christian]: FAILED! => {"changed": false, "module_stderr": "Shared connection to 10.50.0.63 closed.\r\n", "module_stdout": "\r\n{\"changed\": true, \"actions\": [\"Built image own_jenkins:latest from /home/fedora/own_jenkins_container\"], \"image\": [{\"Id\": \"d6670b64c623b94a5836843814a70ee23a652d375d8747e455bddae235e2ba22\", \"Digest\": \"sha256:450994afaceb8fd5c561a4b489406c46c509bfcca9c84d77fc0d9a9a477f4ffd\", \"RepoTags\": [\"localhost/own_jenkins:latest\"], \"RepoDigests\": [\"localhost/own_jenkins@sha256:450994afaceb8fd5c561a4b489406c46c509bfcca9c84d77fc0d9a9a477f4ffd\"], \"Parent\": \"\", \"Comment\": \"\", \"Created\": \"2020-05-12T06:32:07.734229684Z\", \"Config\": {\"User\": \"jenkins\", \"ExposedPorts\": {\"50000/tcp\": {}, \"8080/tcp\": {}}, \"Env\": [\"PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\", \"LANG=C.UTF-8\", \"JAVA_HOME=/usr/local/openjdk-8\", \"JAVA_VERSION=8u242\", \"JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u242-b08/OpenJDK8U-jdk_\", \"JAVA_URL_VERSION=8u242b08\", \"JENKINS_HOME=/var/jenkins_home\", \"JENKINS_SLAVE_AGENT_PORT=50000\", \"REF=/usr/share/jenkins/ref\", \"JENKINS_VERSION=2.204.5\", \"JENKINS_UC=https://updates.jenkins.io\", \"JENKINS_UC_EXPERIMENTAL=https://updates.jenkins.io/experimental\", 
\"JENKINS_INCREMENTALS_REPO_MIRROR=https://repo.jenkins-ci.org/incrementals\", \"COPY_REFERENCE_FILE_LOG=/var/jenkins_home/copy_reference_file.log\", \"CASC_JENKINS_CONFIG=/var/jenkins_conf\", \"JAVA_OPTS=-Djenkins.install.runSetupWizard=false -Djenkins.model.Jenkins.buildsDir=${JENKINS_HOME}/builds/${ITEM_FULL_NAME}\"], \"Entrypoint\": [\"/sbin/tini\", \"--\", \"/usr/local/bin/jenkins.sh\"], \"Volumes\": {\"/var/jenkins_home\": {}}}, \"Version\": \"\", \"Author\": \"\", \"Architecture\": \"amd64\", \"Os\": \"linux\", \"Size\": 703042146, \"VirtualSize\": 703042146, \"GraphDriver\": {\"Name\": \"overlay\", \"Data\": {\"LowerDir\": \"/var/lib/containers/storage/overlay/dfa510163ca63d255ecc63f1282a33f55048e7c463c3954bd3f8a5f2259452bf/diff:/var/lib/containers/storage/overlay/664f1885251c6e243208977e957c99918c46e498b893b9d5de7f1832ba718b8a/diff:/var/lib/containers/storage/overlay/5a78be820e8fefc09e08af99a7b5988dd0a39458b60442912fb29e5676bf2dd2/diff:/var/lib/containers/storage/overlay/63fd877a80e86030625ae7d4e0d44b81bae98ceb94e94a5fdd6c302c1d28f45b/diff:/var/lib/containers/storage/overlay/a88f0528325dd9d7e97d4befa7e94e21e07536fbcf1d3301bd6658f7eac3f9a6/diff:/var/lib/containers/storage/overlay/c2274d9f2a9971867ec79bf207e42a16fb1f14bf815d6e8740430dacc88d8785/diff:/var/lib/containers/storage/overlay/bce3558237d0c0afbd7889ae8dd3540690a6337ffe24711e729e0a2fe9e8bf33/diff:/var/lib/containers/storage/overlay/913e80387050faa9e08015de966f213dd8fdeac746abcd77d88219fe59eb282b/diff:/var/lib/containers/storage/overlay/bd7aa483693493b4a08403d5aa95beef3910421bf9ae4c0694475f267b73c2b7/diff:/var/lib/containers/storage/overlay/2d1241b71b044bd2aae6c1773403b2d6ad2f2f2753fc983a448d56e28c6ff303/diff:/var/lib/containers/storage/overlay/0e379119c6cd9edc25624e817155260ed0e0bf27a34747d07cca00d0d9c535da/diff:/var/lib/containers/storage/overlay/8878e4743896dcc6d9df1828ad89cdac1318bc47e1e8423f6fae21647f133c62/diff:/var/lib/containers/storage/overlay/e52d47750762e598165e24a4ff617604792ce598476b5e234e8
85b5722d0eed8/diff:/var/lib/containers/storage/overlay/eb70dd733080d6dccaed95bc3d792078e6bbddce015457f2cac3598b610769d4/diff:/var/lib/containers/storage/overlay/4acb7d7ee512db214eb30693b56108cf6c6f47f0093c5f4940b92b89c5dbebc0/diff:/var/lib/containers/storage/overlay/84ef69b6ed5f0015840f7ed80876cf3792c05c2bbf72ce006ed8a67e9c8ae03b/diff:/var/lib/containers/storage/overlay/c588bce0921bd3ae8ca65a2203c722745c0acd69b1f4380546184d2e387ff138/diff:/var/lib/containers/storage/overlay/3a798a3dc49789ccb6dc626c6edb0f3aa4e3343e96393a2adcf83a6ec8f94d8d/diff:/var/lib/containers/storage/overlay/6f33a85672c80ba01bb2a5034f06c9784c25247c02f992f49d1a16810c94263e/diff:/var/lib/containers/storage/overlay/c1bd2f1a5b3b63a20f976dbe5bc64e4d87187d7b6283a86b6a1310356be4c3b6/diff:/var/lib/containers/storage/overlay/150db6730074508a4a369098738fd79cec29a9438c6b0e0a72f7bcdecf552779/diff:/var/lib/containers/storage/overlay/25eb51e911b02cfe76adba09f93fe7ded3f212db4a05710718dfcefe6cfe3746/diff:/var/lib/containers/storage/overlay/2f3cf89f4bc52573b720975440b7b65aa707b690b8e3b121c1a8e1ad138aef8e/diff:/var/lib/containers/storage/overlay/7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9/diff\", \"UpperDir\": \"/var/lib/containers/storage/overlay/e4e21b11c88d49daf2ec1e8fac3f2373dc897626be88cc0e0d2b337bca2530a0/diff\", \"WorkDir\": \"/var/lib/containers/storage/overlay/e4e21b11c88d49daf2ec1e8fac3f2373dc897626be88cc0e0d2b337bca2530a0/work\"}}, \"RootFS\": {\"Type\": \"layers\", \"Layers\": [\"sha256:7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9\", \"sha256:4d1ab3827f6b69f4e55bd69cc8abe1dde7d7a7f61bd6e32c665f12e0a8efd1c9\", \"sha256:69dfa7bd7a92b8ba12a617ff56f22533223007c5ba6b3a191c91b852320f012e\", \"sha256:01727b1a72df8ba02293a98ab365bb4e8015aefadd661aaf7e6aa76567b706b9\", \"sha256:e43c0c41b833ec88f51b6fdb7c5faa32c76a32dbefdeb602969b74720ecf47c9\", \"sha256:bd76253da83ab721c5f9deed421f66db1406d89f720387b799dfe5503b797a90\", 
\"sha256:d81d8fa6dfd451a45e0161e76e3475e4e30e87e1cc1e9839509aa7c3ba42b5dd\", \"sha256:f161be957652c6ed6ffb9628fe833a1524ddccfbd5499039d75107ff4c4706cd\", \"sha256:631a8cd2cb056e41704102fab7338173883273c30ed9ef085035b9d54357fbda\", \"sha256:f57fcddcd21311556c453c009d6ffb31bd42e86f5e9d632d2c0e505674f7eb6b\", \"sha256:3e82e2066cb309b852f4954d30327b3c549fb856171209edcb5d64e64d231181\", \"sha256:9b9811cf4eb368566864a7bec3e7025fa5253baa02d1c381557d020bb08a2b59\", \"sha256:8afb2f989d486191c930b34b37a5a9c374c399b6763b6bae7ecc49a022b739a0\", \"sha256:2d50b321cf2cb23d39f2e448d1b26e28c49183f2f0f7caaca56d08793d7cd301\", \"sha256:698d1e1f868243a7ba8d490928007867bf2033b795551ce522d9beae3aa99bd2\", \"sha256:5fe6ec06b025070d593ebc7ab733ee6a3ca0f03ef794e46a0574fbc0db4a05e9\", \"sha256:7b23eed1281de14bc4c2914f5a0bab87093fcd3174efea2c39218d482da78e84\", \"sha256:4f8774dcadf6aee646abff26c622fe8aa87bee5974766df41222b2f22536465d\", \"sha256:7a4ef5d59373586055068d4bb4e6c032031da4bdf5618faa52a1366c95e474d8\", \"sha256:c55fccaaed8faec0e993e99303c8873bd0f9b87505c01755b29c34ea5ee38fdf\", \"sha256:c5c0e14f8528685ad5278e677273e44c57d4acabb98df4dd2f574980400be112\", \"sha256:252b998e155c88013250f31fc73533599c4d8a62573f5bb2738a71317e4ca4b0\", \"sha256:9421276e074f22e8e4caf1fb6b3f906c4a69e676b1e2a09b7a87f85d09e63c0f\", \"sha256:ffe3c401910b395913f2d64459ac25938543ae5c44be675b8c1b3699dad914fc\", \"sha256:f672236df0f10964ab214e0e05218dacac705828c8c2ce3ac40bc384cfe16dc7\"]}, \"Labels\": null, \"Annotations\": {}, \"ManifestType\": \"application/vnd.oci.image.manifest.v1+json\", \"User\": \"jenkins\", \"History\": [{\"created\": \"2020-02-01T17:23:41.468764783Z\", \"created_by\": \"/bin/sh -c #(nop) ADD file:8a9218592e5d736a05a1821a6dd38b205cdd8197c26a5aa33f6fc22fbfaa1c4d in / \"}, {\"created\": \"2020-02-01T17:23:41.779347941Z\", \"created_by\": \"/bin/sh -c #(nop) CMD [\\\"bash\\\"]\", \"empty_layer\": true}, {\"created\": \"2020-02-02T00:33:57.472084922Z\", \"created_by\": \"/bin/sh -c apt-get 
update && apt-get install -y --no-install-recommends \\t\\tca-certificates \\t\\tcurl \\t\\tnetbase \\t\\twget \\t&& rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T00:34:02.575926722Z\", \"created_by\": \"/bin/sh -c set -ex; \\tif ! command -v gpg > /dev/null; then \\t\\tapt-get update; \\t\\tapt-get install -y --no-install-recommends \\t\\t\\tgnupg \\t\\t\\tdirmngr \\t\\t; \\t\\trm -rf /var/lib/apt/lists/*; \\tfi\"}, {\"created\": \"2020-02-02T00:34:28.382621018Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get install -y --no-install-recommends \\t\\tbzr \\t\\tgit \\t\\tmercurial \\t\\topenssh-client \\t\\tsubversion \\t\\t\\t\\tprocps \\t&& rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T06:26:12.069777635Z\", \"created_by\": \"/bin/sh -c set -eux; \\tapt-get update; \\tapt-get install -y --no-install-recommends \\t\\tbzip2 \\t\\tunzip \\t\\txz-utils \\t\\t\\t\\tca-certificates p11-kit \\t\\t\\t\\tfontconfig libfreetype6 \\t; \\trm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T06:26:12.30328465Z\", \"created_by\": \"/bin/sh -c #(nop) ENV LANG=C.UTF-8\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:13.86592879Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_HOME=/usr/local/openjdk-8\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:14.22638233Z\", \"created_by\": \"/bin/sh -c #(nop) ENV PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:15.751830342Z\", \"created_by\": \"/bin/sh -c { echo '#/bin/sh'; echo 'echo \\\"$JAVA_HOME\\\"'; } > /usr/local/bin/docker-java-home && chmod +x /usr/local/bin/docker-java-home && [ \\\"$JAVA_HOME\\\" = \\\"$(docker-java-home)\\\" ]\"}, {\"created\": \"2020-02-02T06:28:16.075627774Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_VERSION=8u242\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:16.435762402Z\", \"created_by\": \"/bin/sh -c #(nop) ENV 
JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u242-b08/OpenJDK8U-jdk_\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:16.735005969Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_URL_VERSION=8u242b08\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:28.536242187Z\", \"created_by\": \"/bin/sh -c set -eux; \\t\\tdpkgArch=\\\"$(dpkg --print-architecture)\\\"; \\tcase \\\"$dpkgArch\\\" in \\t\\tamd64) upstreamArch='x64' ;; \\t\\tarm64) upstreamArch='aarch64' ;; \\t\\t*) echo >&2 \\\"error: unsupported architecture: $dpkgArch\\\" ;; \\tesac; \\t\\twget -O openjdk.tgz.asc \\\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz.sign\\\"; \\twget -O openjdk.tgz \\\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz\\\" --progress=dot:giga; \\t\\texport GNUPGHOME=\\\"$(mktemp -d)\\\"; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --keyserver-options no-self-sigs-only --recv-keys CA5F11C6CE22644D42C6AC4492EF8D39DC13168F; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys EAC843EBD3EFDB98CC772FADA5CD6035332FA671; \\tgpg --batch --list-sigs --keyid-format 0xLONG CA5F11C6CE22644D42C6AC4492EF8D39DC13168F \\t\\t| tee /dev/stderr \\t\\t| grep '0xA5CD6035332FA671' \\t\\t| grep 'Andrew Haley'; \\tgpg --batch --verify openjdk.tgz.asc openjdk.tgz; \\tgpgconf --kill all; \\trm -rf \\\"$GNUPGHOME\\\"; \\t\\tmkdir -p \\\"$JAVA_HOME\\\"; \\ttar --extract \\t\\t--file openjdk.tgz \\t\\t--directory \\\"$JAVA_HOME\\\" \\t\\t--strip-components 1 \\t\\t--no-same-owner \\t; \\trm openjdk.tgz*; \\t\\t\\t{ \\t\\techo '#!/usr/bin/env bash'; \\t\\techo 'set -Eeuo pipefail'; \\t\\techo 'if ! 
[ -d \\\"$JAVA_HOME\\\" ]; then echo >&2 \\\"error: missing JAVA_HOME environment variable\\\"; exit 1; fi'; \\t\\techo 'cacertsFile=; for f in \\\"$JAVA_HOME/lib/security/cacerts\\\" \\\"$JAVA_HOME/jre/lib/security/cacerts\\\"; do if [ -e \\\"$f\\\" ]; then cacertsFile=\\\"$f\\\"; break; fi; done'; \\t\\techo 'if [ -z \\\"$cacertsFile\\\" ] || ! [ -f \\\"$cacertsFile\\\" ]; then echo >&2 \\\"error: failed to find cacerts file in $JAVA_HOME\\\"; exit 1; fi'; \\t\\techo 'trust extract --overwrite --format=java-cacerts --filter=ca-anchors --purpose=server-auth \\\"$cacertsFile\\\"'; \\t} > /etc/ca-certificates/update.d/docker-openjdk; \\tchmod +x /etc/ca-certificates/update.d/docker-openjdk; \\t/etc/ca-certificates/update.d/docker-openjdk; \\t\\tfind \\\"$JAVA_HOME/lib\\\" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf; \\tldconfig; \\t\\tjavac -version; \\tjava -version\"}, {\"created\": \"2020-03-08T02:08:28.421510268Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get upgrade -y && apt-get install -y git curl && curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash && apt-get install -y git-lfs && git lfs install && rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-03-08T02:08:31.93114162Z\", \"created_by\": \"/bin/sh -c #(nop) ARG user=jenkins\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:33.055659115Z\", \"created_by\": \"/bin/sh -c #(nop) ARG group=jenkins\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:33.874837809Z\", \"created_by\": \"/bin/sh -c #(nop) ARG uid=1000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:34.884950047Z\", \"created_by\": \"/bin/sh -c #(nop) ARG gid=1000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:35.896784898Z\", \"created_by\": \"/bin/sh -c #(nop) ARG http_port=8080\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:36.871761947Z\", \"created_by\": \"/bin/sh -c #(nop) ARG 
agent_port=50000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:37.961637985Z\", \"created_by\": \"/bin/sh -c #(nop) ARG JENKINS_HOME=/var/jenkins_home\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:38.89036682Z\", \"created_by\": \"/bin/sh -c #(nop) ARG REF=/usr/share/jenkins/ref\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:39.868974806Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_HOME=/var/jenkins_home\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:40.877764659Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_SLAVE_AGENT_PORT=50000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:41.871871056Z\", \"created_by\": \"/bin/sh -c #(nop) ENV REF=/usr/share/jenkins/ref\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:44.011833083Z\", \"created_by\": \"|6 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c mkdir -p $JENKINS_HOME && chown ${uid}:${gid} $JENKINS_HOME && groupadd -g ${gid} ${group} && useradd -d \\\"$JENKINS_HOME\\\" -u ${uid} -g ${gid} -m -s /bin/bash ${user}\"}, {\"created\": \"2Traceback (most recent call last):\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589265084.5838895-41905-244756486402251/AnsiballZ_podman_image.py\", line 102, in <module>\r\n _ansiballz_main()\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589265084.5838895-41905-244756486402251/AnsiballZ_podman_image.py\", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589265084.5838895-41905-244756486402251/AnsiballZ_podman_image.py\", line 40, in invoke_module\r\n runpy.run_module(mod_name='ansible.modules.cloud.podman.podman_image', init_globals=None, run_name='__main__', alter_sys=True)\r\n File \"/usr/lib64/python3.8/runpy.py\", line 206, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File \"/usr/lib64/python3.8/runpy.py\", line 96, in 
_run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File \"/usr/lib64/python3.8/runpy.py\", line 86, in _run_code\r\n exec(code, run_globals)\r\n File \"/tmp/ansible_podman_image_payload_kp1d99z5/ansible_podman_image_payload.zip/ansible/modules/cloud/podman/podman_image.py\", line 725, in <module>\r\n File \"/tmp/ansible_podman_image_payload_kp1d99z5/ansible_podman_image_payload.zip/ansible/modules/cloud/podman/podman_image.py\", line 721, in main\r\n File \"/tmp/ansible_podman_image_payload_kp1d99z5/ansible_podman_image_payload.zip/ansible/module_utils/basic.py\", line 2071, in exit_json\r\n File \"/tmp/ansible_podman_image_payload_kp1d99z5/ansible_podman_image_payload.zip/ansible/module_utils/basic.py\", line 2065, in _return_formatted\r\nBlockingIOError: [Errno 11] write could not complete without blocking\r\n", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
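The output above ends in a BlockingIOError ([Errno 11], i.e. EAGAIN) raised inside exit_json while the module was writing its very large JSON result. This is not the collection's code, just a minimal sketch of that likely failure mode, assuming the module's stdout was left in non-blocking mode; the function name is hypothetical:

```python
import fcntl
import os


def large_nonblocking_write():
    """Write more than a pipe buffer to a non-blocking fd; return the errno raised."""
    r, w = os.pipe()
    # Put the write end into non-blocking mode, mimicking a stdout that was
    # inherited with O_NONBLOCK set (assumption about the failure mode).
    flags = fcntl.fcntl(w, fcntl.F_GETFL)
    fcntl.fcntl(w, fcntl.F_SETFL, flags | os.O_NONBLOCK)
    payload = b"x" * (1 << 22)  # ~4 MiB, far larger than the default pipe buffer
    try:
        os.write(w, payload)  # partial write fills the pipe buffer
        os.write(w, payload)  # buffer is now full -> BlockingIOError (EAGAIN)
    except BlockingIOError as err:
        return err.errno
    finally:
        os.close(r)
        os.close(w)


print(large_nonblocking_write())  # 11 (EAGAIN) on Linux
```

If this is what happens, it would match the symptom that the task only fails when the returned image-inspect JSON is large: a small result fits into the pipe buffer in one write and never hits EAGAIN.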
Describe the results you expected:

Building should always succeed without any error messages.
TASK [configure_jenkins : build own Jenkins container] *************************
changed: [dev_jenkins_christian]
Additional information you deem important (e.g. issue happens only occasionally):
Not reliably reproducible; sometimes it happens, sometimes not.
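A possible workaround (my assumption, not something the module currently does): clear O_NONBLOCK on the output descriptor before emitting the result, so a large write blocks until it completes instead of raising EAGAIN. ensure_blocking is a hypothetical helper:

```python
import fcntl
import os


def ensure_blocking(fd):
    """Clear O_NONBLOCK on fd so large writes block instead of failing with EAGAIN."""
    flags = fcntl.fcntl(fd, fcntl.F_GETFL)
    if flags & os.O_NONBLOCK:
        fcntl.fcntl(fd, fcntl.F_SETFL, flags & ~os.O_NONBLOCK)


# Example: force a pipe's write end back to blocking mode.
r, w = os.pipe()
fcntl.fcntl(w, fcntl.F_SETFL, fcntl.fcntl(w, fcntl.F_GETFL) | os.O_NONBLOCK)
ensure_blocking(w)
assert (fcntl.fcntl(w, fcntl.F_GETFL) & os.O_NONBLOCK) == 0
```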
Output of ansible --version:
ansible 2.9.7
  config file = None
  configured module search path = ['/home/pink/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /home/pink/Dokumente/ox/masterthesis/venv_runner/lib64/python3.8/site-packages/ansible
  executable location = /home/pink/Dokumente/ox/masterthesis/venv_runner/bin/ansible
  python version = 3.8.2 (default, Feb 28 2020, 00:00:00) [GCC 10.0.1 20200216 (Red Hat 10.0.1-0.8)]
Output of podman version:
Version: 1.9.1
RemoteAPI Version: 1
Go Version: go1.14.2
OS/Arch: linux/amd64
Output of podman info --debug:
debug:
  compiler: gc
  gitCommit: ""
  goVersion: go1.14.2
  podmanVersion: 1.9.1
host:
  arch: amd64
  buildahVersion: 1.14.8
  cgroupVersion: v2
  conmon:
    package: conmon-2.0.15-1.fc32.x86_64
    path: /usr/bin/conmon
    version: 'conmon version 2.0.15, commit: 33da5ef83bf2abc7965fc37980a49d02fdb71826'
  cpus: 2
  distribution:
    distribution: fedora
    version: "32"
  eventLogger: file
  hostname: dev-jenkins-christian.novalocal
  idMappings:
    gidmap:
    - container_id: 0
      host_id: 1000
      size: 1
    - container_id: 1
      host_id: 100000
      size: 65536
    uidmap:
    - container_id: 0
      host_id: 1000
      size: 1
    - container_id: 1
      host_id: 100000
      size: 65536
  kernel: 5.6.11-300.fc32.x86_64
  memFree: 817848320
  memTotal: 4118765568
  ociRuntime:
    name: crun
    package: crun-0.13-2.fc32.x86_64
    path: /usr/bin/crun
    version: |-
      crun version 0.13
      commit: e79e4de4ac16da0ce48777afb72c6241de870525
      spec: 1.0.0
      +SYSTEMD +SELINUX +APPARMOR +CAP +SECCOMP +EBPF +YAJL
  os: linux
  rootless: true
  slirp4netns:
    executable: /usr/bin/slirp4netns
    package: slirp4netns-1.0.0-1.fc32.x86_64
    version: |-
      slirp4netns version 1.0.0
      commit: a3be729152a33e692cd28b52f664defbf2e7810a
      libslirp: 4.2.0
  swapFree: 0
  swapTotal: 0
  uptime: 2h 13m 43.04s (Approximately 0.08 days)
registries:
  search:
  - registry.fedoraproject.org
  - registry.access.redhat.com
  - registry.centos.org
  - docker.io
store:
  configFile: /home/fedora/.config/containers/storage.conf
  containerStore:
    number: 0
    paused: 0
    running: 0
    stopped: 0
  graphDriverName: overlay
  graphOptions:
    overlay.mount_program:
      Executable: /usr/bin/fuse-overlayfs
      Package: fuse-overlayfs-1.0.0-1.fc32.x86_64
      Version: |-
        fusermount3 version: 3.9.1
        fuse-overlayfs: version 1.0.0
        FUSE library version 3.9.1
        using FUSE kernel interface version 7.31
  graphRoot: /home/fedora/.local/share/containers/storage
  graphStatus:
    Backing Filesystem: extfs
    Native Overlay Diff: "false"
    Supports d_type: "true"
    Using metacopy: "false"
  imageStore:
    number: 0
  runRoot: /run/user/1000/containers
  volumePath: /home/fedora/.local/share/containers/storage/volumes
Package info (e.g. output of rpm -q podman or apt list podman):
podman-1.9.1-1.fc32.x86_64
Playbook you run with Ansible (e.g. content of playbook.yaml):
- name: build own Jenkins container
  become: yes
  podman_image:
    state: build
    name: own_jenkins:latest
    force: yes
    path: "{{ configure_jenkins_container_folder }}"
    build:
      cache: yes
      force_rm: yes
      format: oci
      rm: yes
Command line and output of ansible run with high verbosity:
TASK [configure_jenkins : build own Jenkins container] *************************
task path: /home/pink/Dokumente/ox/masterthesis/code/ansible_runner/project/roles/configure_jenkins/tasks/main.yml:55
<10.50.0.63> ESTABLISH SSH CONNECTION FOR USER: fedora
<10.50.0.63> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="fedora"' -o ConnectTimeout=10 -o ControlPath=/home/pink/.ansible/cp/f9c8b55839 10.50.0.63 '/bin/sh -c '"'"'echo ~fedora && sleep 0'"'"''
<10.50.0.63> (0, b'/home/fedora\n', b"OpenSSH_8.2p1, OpenSSL 1.1.1g FIPS 21 Apr 2020\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: not matched 'final'\r\ndebug2: match not found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1 (parse only)\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: configuration requests final Match pass\r\ndebug2: resolve_canonicalize: hostname 10.50.0.63 is address\r\ndebug1: re-parsing configuration\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: matched 'final'\r\ndebug2: match found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: 
kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 48171\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n")
<10.50.0.63> ESTABLISH SSH CONNECTION FOR USER: fedora
<10.50.0.63> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="fedora"' -o ConnectTimeout=10 -o ControlPath=/home/pink/.ansible/cp/f9c8b55839 10.50.0.63 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/fedora/.ansible/tmp `"&& mkdir /home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961 && echo ansible-tmp-1589272265.3273017-48299-39753569833961="` echo /home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961 `" ) && sleep 0'"'"''
<10.50.0.63> (0, b'ansible-tmp-1589272265.3273017-48299-39753569833961=/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961\n', b"OpenSSH_8.2p1, OpenSSL 1.1.1g FIPS 21 Apr 2020\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: not matched 'final'\r\ndebug2: match not found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1 (parse only)\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: configuration requests final Match pass\r\ndebug2: resolve_canonicalize: hostname 10.50.0.63 is address\r\ndebug1: re-parsing configuration\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: matched 'final'\r\ndebug2: match found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1\r\ndebug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 48171\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n")
Using module file /home/pink/Dokumente/ox/masterthesis/venv_runner/lib64/python3.8/site-packages/ansible/modules/cloud/podman/podman_image.py
<10.50.0.63> PUT /home/pink/.ansible/tmp/ansible-local-481392x4kesr2/tmpjpzspw3g TO /home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py
<10.50.0.63> SSH: EXEC sftp -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="fedora"' -o ConnectTimeout=10 -o ControlPath=/home/pink/.ansible/cp/f9c8b55839 '[10.50.0.63]'
5-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 48171\r\ndebug3: mux_client_request_session: session request sent\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug3: Sent message fd 3 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . 
-> /home/fedora size 0\r\ndebug3: Looking up /home/pink/.ansible/tmp/ansible-local-481392x4kesr2/tmpjpzspw3g\r\ndebug3: Sent message fd 3 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:32768\r\nde<10.50.0.63> (0, b'sftp> put /home/pink/.ansible/tmp/ansible-local-481392x4kesr2/tmpjpzspw3g /home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py\n', b'OpenSSH_8.2p1, OpenSSL 1.1.1g FIPS 21 Apr 2020\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for \'final all\' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: not matched \'final\'\r\ndebug2: match not found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1 (parse only)\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: configuration requests final Match pass\r\ndebug2: resolve_canonicalize: hostname 10.50.0.63 is address\r\ndebug1: re-parsing configuration\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: 
/etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for \'final all\' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: matched \'final\'\r\ndebug2: match found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 48171\r\ndebug3: mux_client_request_session: session request sent\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug3: Sent message fd 3 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . 
-> /home/fedora size 0\r\ndebug3: Looking up /home/pink/.ansible/tmp/ansible-local-481392x4kesr2/tmpjpzspw3g\r\ndebug3: Sent message fd 3 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 32768 bytes at 0\r\ndebug3: Sent message SSH2_FXP_WRITE I:5 O:32768 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:6 O:65536 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:7 O:98304 S:13672\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 5 32768 bytes at 32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 6 32768 bytes at 65536\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 7 13672 bytes at 98304\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<10.50.0.63> ESTABLISH SSH CONNECTION FOR USER: fedora
<10.50.0.63> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="fedora"' -o ConnectTimeout=10 -o ControlPath=/home/pink/.ansible/cp/f9c8b55839 10.50.0.63 '/bin/sh -c '"'"'chmod u+x /home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/ /home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py && sleep 0'"'"''
<10.50.0.63> (0, b'', b"OpenSSH_8.2p1, OpenSSL 1.1.1g FIPS 21 Apr 2020\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: not matched 'final'\r\ndebug2: match not found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1 (parse only)\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: configuration requests final Match pass\r\ndebug2: resolve_canonicalize: hostname 10.50.0.63 is address\r\ndebug1: re-parsing configuration\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: matched 'final'\r\ndebug2: match found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: 
[curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 48171\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n")
<10.50.0.63> ESTABLISH SSH CONNECTION FOR USER: fedora
<10.50.0.63> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="fedora"' -o ConnectTimeout=10 -o ControlPath=/home/pink/.ansible/cp/f9c8b55839 -tt 10.50.0.63 '/bin/sh -c '"'"'sudo -H -S -n -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-jtlqfvwcvmxftpgcvfwzwspqcjkhbuow ; python3 /home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
d7780a88ebfccf31d24709052f55988c74b74738b9b4582a2/diff:/var/lib/containers/storage/overlay/e2f35c70be785ec013d48d47f3f2ecf0bd149b58c0a9c87e7ad4c8077e54e6ed/diff:/var/lib/containers/storage/overlay/e02a0e49f8e38766619fc24732f5e156234680fbc0f2e5bf9a02066146bab64f/diff:/var/lib/containers/storage/overlay/1658640241622a1df132e1ef979c95fabc4539d5ec3c725348f833e0f1b3926d/diff:/var/lib/containers/storage/overlay/c2274d9f2a9971867ec79bf207e42a16fb1f14bf815d6e8740430dacc88d8785/diff:/var/lib/containers/storage/overlay/bce3558237d0c0afbd7889ae8dd3540690a6337ffe24711e729e0a2fe9e8bf33/diff:/var/lib/containers/storage/overlay/913e80387050faa9e08015de966f213dd8fdeac746abcd77d88219fe59eb282b/diff:/var/lib/containers/storage/overlay/bd7aa483693493b4a08403d5aa95beef3910421bf9ae4c0694475f267b73c2b7/diff:/var/lib/containers/storage/overlay/2d1241b71b044bd2aae6c1773403b2d6ad2f2f2753fc983a448d56e28c6ff303/diff:/var/lib/containers/storage/overlay/0e379119c6cd9edc25624e817155260ed0e0bf27a34747d07cca00d0d9c535da/diff:/var/lib/containers/storage/overlay/8878e4743896dcc6d9df1828ad89cdac1318bc47e1e8423f6fae21647f133c62/diff:/var/lib/containers/storage/overlay/e52d47750762e598165e24a4ff617604792ce598476b5e234e885b5722d0eed8/diff:/var/lib/containers/storage/overlay/eb70dd733080d6dccaed95bc3d792078e6bbddce015457f2cac3598b610769d4/diff:/var/lib/containers/storage/overlay/4acb7d7ee512db214eb30693b56108cf6c6f47f0093c5f4940b92b89c5dbebc0/diff:/var/lib/containers/storage/overlay/84ef69b6ed5f0015840f7ed80876cf3792c05c2bbf72ce006ed8a67e9c8ae03b/diff:/var/lib/containers/storage/overlay/c588bce0921bd3ae8ca65a2203c722745c0acd69b1f4380546184d2e387ff138/diff:/var/lib/containers/storage/overlay/3a798a3dc49789ccb6dc626c6edb0f3aa4e3343e96393a2adcf83a6ec8f94d8d/diff:/var/lib/containers/storage/overlay/6f33a85672c80ba01bb2a5034f06c9784c25247c02f992f49d1a16810c94263e/diff:/var/lib/containers/storage/overlay/c1bd2f1a5b3b63a20f976dbe5bc64e4d87187d7b6283a86b6a1310356be4c3b6/diff:/var/lib/containers/storage/overlay/1
50db6730074508a4a369098738fd79cec29a9438c6b0e0a72f7bcdecf552779/diff:/var/lib/containers/storage/overlay/25eb51e911b02cfe76adba09f93fe7ded3f212db4a05710718dfcefe6cfe3746/diff:/var/lib/containers/storage/overlay/2f3cf89f4bc52573b720975440b7b65aa707b690b8e3b121c1a8e1ad138aef8e/diff:/var/lib/containers/storage/overlay/7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9/diff", "UpperDir": "/var/lib/containers/storage/overlay/336c0abd5a97ba845e58383c6ac1d3f5a56138d9da412b4e9fd3f4a6c8882408/diff", "WorkDir": "/var/lib/containers/storage/overlay/336c0abd5a97ba845e58383c6ac1d3f5a56138d9da412b4e9fd3f4a6c8882408/work"}}, "RootFS": {"Type": "layers", "Layers": ["sha256:7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9", "sha256:4d1ab3827f6b69f4e55bd69cc8abe1dde7d7a7f61bd6e32c665f12e0a8efd1c9", "sha256:69dfa7bd7a92b8ba12a617ff56f22533223007c5ba6b3a191c91b852320f012e", "sha256:01727b1a72df8ba02293a98ab365bb4e8015aefadd661aaf7e6aa76567b706b9", "sha256:e43c0c41b833ec88f51b6fdb7c5faa32c76a32dbefdeb602969b74720ecf47c9", "sha256:bd76253da83ab721c5f9deed421f66db1406d89f720387b799dfe5503b797a90", "sha256:d81d8fa6dfd451a45e0161e76e3475e4e30e87e1cc1e9839509aa7c3ba42b5dd", "sha256:f161be957652c6ed6ffb9628fe833a1524ddccfbd5499039d75107ff4c4706cd", "sha256:631a8cd2cb056e41704102fab7338173883273c30ed9ef085035b9d54357fbda", "sha256:f57fcddcd21311556c453c009d6ffb31bd42e86f5e9d632d2c0e505674f7eb6b", "sha256:3e82e2066cb309b852f4954d30327b3c549fb856171209edcb5d64e64d231181", "sha256:9b9811cf4eb368566864a7bec3e7025fa5253baa02d1c381557d020bb08a2b59", "sha256:8afb2f989d486191c930b34b37a5a9c374c399b6763b6bae7ecc49a022b739a0", "sha256:2d50b321cf2cb23d39f2e448d1b26e28c49183f2f0f7caaca56d08793d7cd301", "sha256:698d1e1f868243a7ba8d490928007867bf2033b795551ce522d9beae3aa99bd2", "sha256:5fe6ec06b025070d593ebc7ab733ee6a3ca0f03ef794e46a0574fbc0db4a05e9", "sha256:7b23eed1281de14bc4c2914f5a0bab87093fcd3174efea2c39218d482da78e84", 
"sha256:4f8774dcadf6aee646abff26c622fe8aa87bee5974766df41222b2f22536465d", "sha256:7a4ef5d59373586055068d4bb4e6c032031da4bdf5618faa52a1366c95e474d8", "sha256:21ff7aaaa415f3654e2f2a28801bb88ae57963cbfad12f8884afe65d1f8bd425", "sha256:6ea0f80d8ee22cbade99a25e8dbbfc5bbb921f4162fd845d3214712fbab9ab46", "sha256:5277d9bcaf7a0de6b117417979c0db580d73a34d945d8e44541b13aa20ca5bb3", "sha256:6a9fe25e916b076ea06cb23371fb4866a022ac60790ebcfa14d0058ddf04d8e2", "sha256:eccb05d256e53999506a666460c63f14837a506ac8625a698bcecfbef591747e", "sha256:ccf991445436efaec39d5b96b1a5314479d000435c52e73692a834c742864ffe"]}, "Labels": null, "Annotations": {}, "ManifestType": "application/vnd.oci.image.manifest.v1+json", "User": "jenkins", "History": [{"created": "2020-02-01T17:23:41.468764783Z", "created_by": "/bin/sh -c #(nop) ADD file:8a9218592e5d736a05a1821a6dd38b205cdd8197c26a5aa33f6fc22fbfaa1c4d in / "}, {"created": "2020-02-01T17:23:41.779347941Z", "created_by": "/bin/sh -c #(nop) CMD [\\"bash\\"]", "empty_layer": true}, {"created": "2020-02-02T00:33:57.472084922Z", "created_by": "/bin/sh -c apt-get update && apt-get install -y --no-install-recommends \\t\\tca-certificates \\t\\tcurl \\t\\tnetbase \\t\\twget \\t&& rm -rf /var/lib/apt/lists/*"}, {"created": "2020-02-02T00:34:02.575926722Z", "created_by": "/bin/sh -c set -ex; \\tif ! 
command -v gpg > /dev/null; then \\t\\tapt-get update; \\t\\tapt-get install -y --no-install-recommends \\t\\t\\tgnupg \\t\\t\\tdirmngr \\t\\t; \\t\\trm -rf /var/lib/apt/lists/*; \\tfi"}, {"created": "2020-02-02T00:34:28.382621018Z", "created_by": "/bin/sh -c apt-get update && apt-get install -y --no-install-recommends \\t\\tbzr \\t\\tgit \\t\\tmercurial \\t\\topenssh-client \\t\\tsubversion \\t\\t\\t\\tprocps \\t&& rm -rf /var/lib/apt/lists/*"}, {"created": "2020-02-02T06:26:12.069777635Z", "created_by": "/bin/sh -c set -eux; \\tapt-get update; \\tapt-get install -y --no-install-recommends \\t\\tbzip2 \\t\\tunzip \\t\\txz-utils \\t\\t\\t\\tca-certificates p11-kit \\t\\t\\t\\tfontconfig libfreetype6 \\t; \\trm -rf /var/lib/apt/lists/*"}, {"created": "2020-02-02T06:26:12.30328465Z", "created_by": "/bin/sh -c #(nop) ENV LANG=C.UTF-8", "empty_layer": true}, {"created": "2020-02-02T06:28:13.86592879Z", "created_by": "/bin/sh -c #(nop) ENV JAVA_HOME=/usr/local/openjdk-8", "empty_layer": true}, {"created": "2020-02-02T06:28:14.22638233Z", "created_by": "/bin/sh -c #(nop) ENV PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "empty_layer": true}, {"created": "2020-02-02T06:28:15.751830342Z", "created_by": "/bin/sh -c { echo \'#/bin/sh\'; echo \'echo \\"$JAVA_HOME\\"\'; } > /usr/local/bin/docker-java-home && chmod +x /usr/local/bin/docker-java-home && [ \\"$JAVA_HOME\\" = \\"$(docker-java-home)\\" ]"}, {"created": "2020-02-02T06:28:16.075627774Z", "created_by": "/bin/sh -c #(nop) ENV JAVA_VERSION=8u242", "empty_layer": true}, {"created": "2020-02-02T06:28:16.435762402Z", "created_by": "/bin/sh -c #(nop) ENV JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u242-b08/OpenJDK8U-jdk_", "empty_layer": true}, {"created": "2020-02-02T06:28:16.735005969Z", "created_by": "/bin/sh -c #(nop) ENV JAVA_URL_VERSION=8u242b08", "empty_layer": true}, {"created": "2020-02-02T06:28:28.536242187Z", 
"created_by": "/bin/sh -c set -eux; \\t\\tdpkgArch=\\"$(dpkg --print-architecture)\\"; \\tcase \\"$dpkgArch\\" in \\t\\tamd64) upstreamArch=\'x64\' ;; \\t\\tarm64) upstreamArch=\'aarch64\' ;; \\t\\t*) echo >&2 \\"error: unsupported architecture: $dpkgArch\\" ;; \\tesac; \\t\\twget -O openjdk.tgz.asc \\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz.sign\\"; \\twget -O openjdk.tgz \\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz\\" --progress=dot:giga; \\t\\texport GNUPGHOME=\\"$(mktemp -d)\\"; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --keyserver-options no-self-sigs-only --recv-keys CA5F11C6CE22644D42C6AC4492EF8D39DC13168F; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys EAC843EBD3EFDB98CC772FADA5CD6035332FA671; \\tgpg --batch --list-sigs --keyid-format 0xLONG CA5F11C6CE22644D42C6AC4492EF8D39DC13168F \\t\\t| tee /dev/stderr \\t\\t| grep \'0xA5CD6035332FA671\' \\t\\t| grep \'Andrew Haley\'; \\tgpg --batch --verify openjdk.tgz.asc openjdk.tgz; \\tgpgconf --kill all; \\trm -rf \\"$GNUPGHOME\\"; \\t\\tmkdir -p \\"$JAVA_HOME\\"; \\ttar --extract \\t\\t--file openjdk.tgz \\t\\t--directory \\"$JAVA_HOME\\" \\t\\t--strip-components 1 \\t\\t--no-same-owner \\t; \\trm openjdk.tgz*; \\t\\t\\t{ \\t\\techo \'#!/usr/bin/env bash\'; \\t\\techo \'set -Eeuo pipefail\'; \\t\\techo \'if ! [ -d \\"$JAVA_HOME\\" ]; then echo >&2 \\"error: missing JAVA_HOME environment variable\\"; exit 1; fi\'; \\t\\techo \'cacertsFile=; for f in \\"$JAVA_HOME/lib/security/cacerts\\" \\"$JAVA_HOME/jre/lib/security/cacerts\\"; do if [ -e \\"$f\\" ]; then cacertsFile=\\"$f\\"; break; fi; done\'; \\t\\techo \'if [ -z \\"$cacertsFile\\" ] || ! 
[ -f \\"$cacertsFile\\" ]; then echo >&2 \\"error: failed to find cacerts file in $JAVA_HOME\\"; exit 1; fi\'; \\t\\techo \'trust extract --overwrite --format=java-cacerts --filter=ca-anchors --purpose=server-auth \\"$cacertsFile\\"\'; \\t} > /etc/ca-certificates/update.d/docker-openjdk; \\tchmod +x /etc/ca-certificates/update.d/docker-openjdk; \\t/etc/ca-certificates/update.d/docker-openjdk; \\t\\tfind \\"$JAVA_HOME/lib\\" -name \'*.so\' -exec dirname \'{}\' \';\' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf; \\tldconfig; \\t\\tjavac -version; \\tjava -version"}, {"created": "2020-03-08T02:08:28.421510268Z", "created_by": "/bin/sh -c apt-get update && apt-get upgrade -y && apt-get install -y git curl && curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash && apt-get install -y git-lfs && git lfs install && rm -rf /var/lib/apt/lists/*"}, {"created": "2020-03-08T02:08:31.93114162Z", "created_by": "/bin/sh -c #(nop) ARG user=jenkins", "empty_layer": true}, {"created": "2020-03-08T02:08:33.055659115Z", "created_by": "/bin/sh -c #(nop) ARG group=jenkins", "empty_layer": true}, {"created": "2020-03-08T02:08:33.874837809Z", "created_by": "/bin/sh -c #(nop) ARG uid=1000", "empty_layer": true}, {"created": "2020-03-08T02:08:34.884950047Z", "created_by": "/bin/sh -c #(nop) ARG gid=1000", "empty_layer": true}, {"created": "2020-03-08T02:08:35.896784898Z", "created_by": "/bin/sh -c #(nop) ARG http_port=8080", "empty_layer": true}, {"created": "2020-03-08T02:08:36.871761947Z", "created_by": "/bin/sh -c #(nop) ARG agent_port=50000", "empty_layer": true}, {"created": "2020-03-08T02:08:37.961637985Z", "created_by": "/bin/sh -c #(nop) ARG JENKINS_HOME=/var/jenkins_home", "empty_layer": true}, {"created": "2020-03-08T02:08:38.89036682Z", "created_by": "/bin/sh -c #(nop) ARG REF=/usr/share/jenkins/ref", "empty_layer": true}, {"created": "2020-03-08T02:08:39.868974806Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_HOME=/var/jenkins_home", 
"empty_layer": true}, {"created": "2020-03-08T02:08:40.877764659Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_SLAVE_AGENT_PORT=50000", "empty_layer": true}, {"created": "2020-03-08T02:08:41.871871056Z", "created_by": "/bin/sh -c #(nop) ENV REF=/usr/share/jenkins/ref", "empty_layer": true}, {"created": "2020-03-08T02:08:44.011833083Z", "created_by": "|6 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c mkdir -p $JENKINS_HOME && chown ${uid}:${gid} $JENKINS_HOME && groupadd -g ${gid} ${group} && useradd -d \\"$JENKINS_HOME\\" -u ${uid} -g ${gid} -m -s /bin/bash ${user}"}, {"created": "2020-03-08T02:08:44.942422372Z", "created_by": "/bin/sh -c #(nop) VOLUME [/var/jenkins_home]", "empty_layer": true}, {"created": "2020-03-08T02:08:46.818223402Z", "created_by": "|6 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c mkdir -p ${REF}/init.groovy.d"}, {"created": "2020-03-08T02:08:47.950453499Z", "created_by": "/bin/sh -c #(nop) ARG TINI_VERSION=v0.16.1", "empty_layer": true}, {"created": "2020-03-08T02:08:49.000333745Z", "created_by": "/bin/sh -c #(nop) COPY file:653491cb486e752a4c2b4b407a46ec75646a54eabb597634b25c7c2b82a31424 in /var/jenkins_home/tini_pub.gpg "}, {"created": "2020-03-08T02:08:52.418862763Z", "created_by": "|7 TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c curl -fsSL https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static-$(dpkg --print-architecture) -o /sbin/tini && curl -fsSL https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static-$(dpkg --print-architecture).asc -o /sbin/tini.asc && gpg --no-tty --import ${JENKINS_HOME}/tini_pub.gpg && gpg --verify /sbin/tini.asc && rm -rf /sbin/tini.asc /root/.gnupg && chmod +x /sbin/tini"}, {"created": "2020-03-08T02:08:53.39134768Z", "created_by": "/bin/sh -c #(nop) ARG JENKINS_VERSION", "empty_layer": true}, {"created": 
"2020-03-08T02:08:54.341276057Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_VERSION=2.204.5", "empty_layer": true}, {"created": "2020-03-08T02:08:55.299951051Z", "created_by": "/bin/sh -c #(nop) ARG JENKINS_SHA=33a6c3161cf8de9c8729fd83914d781319fd1569acf487c7b1121681dba190a5", "empty_layer": true}, {"created": "2020-03-08T02:08:56.24780124Z", "created_by": "/bin/sh -c #(nop) ARG JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war", "empty_layer": true}, {"created": "2020-03-08T02:09:00.907352083Z", "created_by": "|9 JENKINS_SHA=94c73fa5b72e0a4eb52c5c99c08351f85a51d138f3dbaff6f64e4406353f839c JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c curl -fsSL ${JENKINS_URL} -o /usr/share/jenkins/jenkins.war && echo \\"${JENKINS_SHA} /usr/share/jenkins/jenkins.war\\" | sha256sum -c -"}, {"created": "2020-03-08T02:09:03.539294194Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_UC=https://updates.jenkins.io", "empty_layer": true}, {"created": "2020-03-08T02:09:04.752513022Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_UC_EXPERIMENTAL=https://updates.jenkins.io/experimental", "empty_layer": true}, {"created": "2020-03-08T02:09:05.949731202Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_INCREMENTALS_REPO_MIRROR=https://repo.jenkins-ci.org/incrementals", "empty_layer": true}, {"created": "2020-03-08T02:09:07.774563715Z", "created_by": "|9 JENKINS_SHA=94c73fa5b72e0a4eb52c5c99c08351f85a51d138f3dbaff6f64e4406353f839c JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c chown -R ${user} \\"$JENKINS_HOME\\" \\"$REF\\""}, {"created": "2020-03-08T02:09:08.967515128Z", "created_by": "/bin/sh 
-c #(nop) EXPOSE 8080", "empty_layer": true}, {"created": "2020-03-08T02:09:09.960137695Z", "created_by": "/bin/sh -c #(nop) EXPOSE 50000", "empty_layer": true}, {"created": "2020-03-08T02:09:10.94919503Z", "created_by": "/bin/sh -c #(nop) ENV COPY_REFERENCE_FILE_LOG=/var/jenkins_home/copy_reference_file.log", "empty_layer": true}, {"created": "2020-03-08T02:09:12.036159753Z", "created_by": "/bin/sh -c #(nop) USER jenkins", "empty_layer": true}, {"created": "2020-0Traceback (most recent call last):\r\n File "/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py", line 102, in <module>\r\n _ansiballz_main()\r\n File "/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.cloud.podman.podman_image\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.8/runpy.py", line 206, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.8/runpy.py", line 96, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.8/runpy.py", line 86, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/modules/cloud/podman/podman_image.py", line 725, in <module>\r\n File "/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/modules/cloud/podman/podman_image.py", line 721, in main\r\n File "/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/module_utils/basic.py", line 2071, in exit_json\r\n File 
"/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/module_utils/basic.py", line 2065, in _return_formatted\r\nBlockingIOError: [Errno 11] write could not complete without blocking\r\n', b"OpenSSH_8.2p1, OpenSSL 1.1.1g FIPS 21 Apr 2020\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: not matched 'final'\r\ndebug2: match not found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1 (parse only)\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: configuration requests final Match pass\r\ndebug2: resolve_canonicalize: hostname 10.50.0.63 is address\r\ndebug1: re-parsing configuration\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: matched 'final'\r\ndebug2: match found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config 
depth 1\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 48171\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received <10.50.0.63> (1, b'\r\n{"changed": true, "actions": ["Built image own_jenkins:latest from /home/fedora/own_jenkins_container"], "image": [{"Id": "7abd50ead542536c39423370f5f32e5be3994bc69a54c0270c0304c61c29c552", "Digest": "sha256:e2b771de36a17bacbb9fbcd650e931355193de8f190f5c3bb5f84060402f4fa2", "RepoTags": ["localhost/own_jenkins:latest"], "RepoDigests": ["localhost/own_jenkins@sha256:e2b771de36a17bacbb9fbcd650e931355193de8f190f5c3bb5f84060402f4fa2"], "Parent": "", "Comment": "", "Created": "2020-05-12T08:31:47.665534799Z", "Config": {"User": "jenkins", "ExposedPorts": {"50000/tcp": {}, "8080/tcp": {}}, "Env": ["PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "LANG=C.UTF-8", "JAVA_HOME=/usr/local/openjdk-8", "JAVA_VERSION=8u242", "JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u242-b08/OpenJDK8U-jdk_", "JAVA_URL_VERSION=8u242b08", "JENKINS_HOME=/var/jenkins_home", 
"JENKINS_SLAVE_AGENT_PORT=50000", "REF=/usr/share/jenkins/ref", "JENKINS_VERSION=2.204.5", "JENKINS_UC=https://updates.jenkins.io", "JENKINS_UC_EXPERIMENTAL=https://updates.jenkins.io/experimental", "JENKINS_INCREMENTALS_REPO_MIRROR=https://repo.jenkins-ci.org/incrementals", "COPY_REFERENCE_FILE_LOG=/var/jenkins_home/copy_reference_file.log", "CASC_JENKINS_CONFIG=/var/jenkins_conf", "JAVA_OPTS=-Djenkins.install.runSetupWizard=false -Djenkins.model.Jenkins.buildsDir=${JENKINS_HOME}/builds/${ITEM_FULL_NAME}"], "Entrypoint": ["/sbin/tini", "--", "/usr/local/bin/jenkins.sh"], "Volumes": {"/var/jenkins_home": {}}}, "Version": "", "Author": "", "Architecture": "amd64", "Os": "linux", "Size": 703042144, "VirtualSize": 703042144, "GraphDriver": {"Name": "overlay", "Data": {"LowerDir": "/var/lib/containers/storage/overlay/a182d7b1ea0efd2caf6bf17217b29f48ed8b3d99619ba7b7d5313ada27bfb1e1/diff:/var/lib/containers/storage/overlay/3097ea2f74c7778d7780a88ebfccf31d24709052f55988c74b74738b9b4582a2/diff:/var/lib/containers/storage/overlay/e2f35c70be785ec013d48d47f3f2ecf0bd149b58c0a9c87e7ad4c8077e54e6ed/diff:/var/lib/containers/storage/overlay/e02a0e49f8e38766619fc24732f5e156234680fbc0f2e5bf9a02066146bab64f/diff:/var/lib/containers/storage/overlay/1658640241622a1df132e1ef979c95fabc4539d5ec3c725348f833e0f1b3926d/diff:/var/lib/containers/storage/overlay/c2274d9f2a9971867ec79bf207e42a16fb1f14bf815d6e8740430dacc88d8785/diff:/var/lib/containers/storage/overlay/bce3558237d0c0afbd7889ae8dd3540690a6337ffe24711e729e0a2fe9e8bf33/diff:/var/lib/containers/storage/overlay/913e80387050faa9e08015de966f213dd8fdeac746abcd77d88219fe59eb282b/diff:/var/lib/containers/storage/overlay/bd7aa483693493b4a08403d5aa95beef3910421bf9ae4c0694475f267b73c2b7/diff:/var/lib/containers/storage/overlay/2d1241b71b044bd2aae6c1773403b2d6ad2f2f2753fc983a448d56e28c6ff303/diff:/var/lib/containers/storage/overlay/0e379119c6cd9edc25624e817155260ed0e0bf27a34747d07cca00d0d9c535da/diff:/var/lib/containers/storage/overlay/8878e4743
896dcc6d9df1828ad89cdac1318bc47e1e8423f6fae21647f133c62/diff:/var/lib/containers/storage/overlay/e52d47750762e598165e24a4ff617604792ce598476b5e234e885b5722d0eed8/diff:/var/lib/containers/storage/overlay/eb70dd733080d6dccaed95bc3d792078e6bbddce015457f2cac3598b610769d4/diff:/var/lib/containers/storage/overlay/4acb7d7ee512db214eb30693b56108cf6c6f47f0093c5f4940b92b89c5dbebc0/diff:/var/lib/containers/storage/overlay/84ef69b6ed5f0015840f7ed80876cf3792c05c2bbf72ce006ed8a67e9c8ae03b/diff:/var/lib/containers/storage/overlay/c588bce0921bd3ae8ca65a2203c722745c0acd69b1f4380546184d2e387ff138/diff:/var/lib/containers/storage/overlay/3a798a3dc49789ccb6dc626c6edb0f3aa4e3343e96393a2adcf83a6ec8f94d8d/diff:/var/lib/containers/storage/overlay/6f33a85672c80ba01bb2a5034f06c9784c25247c02f992f49d1a16810c94263e/diff:/var/lib/containers/storage/overlay/c1bd2f1a5b3b63a20f976dbe5bc64e4d87187d7b6283a86b6a1310356be4c3b6/diff:/var/lib/containers/storage/overlay/150db6730074508a4a369098738fd79cec29a9438c6b0e0a72f7bcdecf552779/diff:/var/lib/containers/storage/overlay/25eb51e911b02cfe76adba09f93fe7ded3f212db4a05710718dfcefe6cfe3746/diff:/var/lib/containers/storage/overlay/2f3cf89f4bc52573b720975440b7b65aa707b690b8e3b121c1a8e1ad138aef8e/diff:/var/lib/containers/storage/overlay/7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9/diff", "UpperDir": "/var/lib/containers/storage/overlay/336c0abd5a97ba845e58383c6ac1d3f5a56138d9da412b4e9fd3f4a6c8882408/diff", "WorkDir": "/var/lib/containers/storage/overlay/336c0abd5a97ba845e58383c6ac1d3f5a56138d9da412b4e9fd3f4a6c8882408/work"}}, "RootFS": {"Type": "layers", "Layers": ["sha256:7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9", "sha256:4d1ab3827f6b69f4e55bd69cc8abe1dde7d7a7f61bd6e32c665f12e0a8efd1c9", "sha256:69dfa7bd7a92b8ba12a617ff56f22533223007c5ba6b3a191c91b852320f012e", "sha256:01727b1a72df8ba02293a98ab365bb4e8015aefadd661aaf7e6aa76567b706b9", "sha256:e43c0c41b833ec88f51b6fdb7c5faa32c76a32dbefdeb602969b74720ecf47c9", 
"sha256:bd76253da83ab721c5f9deed421f66db1406d89f720387b799dfe5503b797a90", "sha256:d81d8fa6dfd451a45e0161e76e3475e4e30e87e1cc1e9839509aa7c3ba42b5dd", "sha256:f161be957652c6ed6ffb9628fe833a1524ddccfbd5499039d75107ff4c4706cd", "sha256:631a8cd2cb056e41704102fab7338173883273c30ed9ef085035b9d54357fbda", "sha256:f57fcddcd21311556c453c009d6ffb31bd42e86f5e9d632d2c0e505674f7eb6b", "sha256:3e82e2066cb309b852f4954d30327b3c549fb856171209edcb5d64e64d231181", "sha256:9b9811cf4eb368566864a7bec3e7025fa5253baa02d1c381557d020bb08a2b59", "sha256:8afb2f989d486191c930b34b37a5a9c374c399b6763b6bae7ecc49a022b739a0", "sha256:2d50b321cf2cb23d39f2e448d1b26e28c49183f2f0f7caaca56d08793d7cd301", "sha256:698d1e1f868243a7ba8d490928007867bf2033b795551ce522d9beae3aa99bd2", "sha256:5fe6ec06b025070d593ebc7ab733ee6a3ca0f03ef794e46a0574fbc0db4a05e9", "sha256:7b23eed1281de14bc4c2914f5a0bab87093fcd3174efea2c39218d482da78e84", "sha256:4f8774dcadf6aee646abff26c622fe8aa87bee5974766df41222b2f22536465d", "sha256:7a4ef5d59373586055068d4bb4e6c032031da4bdf5618faa52a1366c95e474d8", "sha256:21ff7aaaa415f3654e2f2a28801bb88ae57963cbfad12f8884afe65d1f8bd425", "sha256:6ea0f80d8ee22cbade99a25e8dbbfc5bbb921f4162fd845d3214712fbab9ab46", "sha256:5277d9bcaf7a0de6b117417979c0db580d73a34d945d8e44541b13aa20ca5bb3", "sha256:6a9fe25e916b076ea06cb23371fb4866a022ac60790ebcfa14d0058ddf04d8e2", "sha256:eccb05d256e53999506a666460c63f14837a506ac8625a698bcecfbef591747e", "sha256:ccf991445436efaec39d5b96b1a5314479d000435c52e73692a834c742864ffe"]}, "Labels": null, "Annotations": {}, "ManifestType": "application/vnd.oci.image.manifest.v1+json", "User": "jenkins", "History": [{"created": "2020-02-01T17:23:41.468764783Z", "created_by": "/bin/sh -c #(nop) ADD file:8a9218592e5d736a05a1821a6dd38b205cdd8197c26a5aa33f6fc22fbfaa1c4d in / "}, {"created": "2020-02-01T17:23:41.779347941Z", "created_by": "/bin/sh -c #(nop) CMD [\\"bash\\"]", "empty_layer": true}, {"created": "2020-02-02T00:33:57.472084922Z", "created_by": "/bin/sh -c apt-get update 
&& apt-get install -y --no-install-recommends \\t\\tca-certificates \\t\\tcurl \\t\\tnetbase \\t\\twget \\t&& rm -rf /var/lib/apt/lists/*"}, {"created": "2020-02-02T00:34:02.575926722Z", "created_by": "/bin/sh -c set -ex; \\tif ! command -v gpg > /dev/null; then \\t\\tapt-get update; \\t\\tapt-get install -y --no-install-recommends \\t\\t\\tgnupg \\t\\t\\tdirmngr \\t\\t; \\t\\trm -rf /var/lib/apt/lists/*; \\tfi"}, {"created": "2020-02-02T00:34:28.382621018Z", "created_by": "/bin/sh -c apt-get update && apt-get install -y --no-install-recommends \\t\\tbzr \\t\\tgit \\t\\tmercurial \\t\\topenssh-client \\t\\tsubversion \\t\\t\\t\\tprocps \\t&& rm -rf /var/lib/apt/lists/*"}, {"created": "2020-02-02T06:26:12.069777635Z", "created_by": "/bin/sh -c set -eux; \\tapt-get update; \\tapt-get install -y --no-install-recommends \\t\\tbzip2 \\t\\tunzip \\t\\txz-utils \\t\\t\\t\\tca-certificates p11-kit \\t\\t\\t\\tfontconfig libfreetype6 \\t; \\trm -rf /var/lib/apt/lists/*"}, {"created": "2020-02-02T06:26:12.30328465Z", "created_by": "/bin/sh -c #(nop) ENV LANG=C.UTF-8", "empty_layer": true}, {"created": "2020-02-02T06:28:13.86592879Z", "created_by": "/bin/sh -c #(nop) ENV JAVA_HOME=/usr/local/openjdk-8", "empty_layer": true}, {"created": "2020-02-02T06:28:14.22638233Z", "created_by": "/bin/sh -c #(nop) ENV PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "empty_layer": true}, {"created": "2020-02-02T06:28:15.751830342Z", "created_by": "/bin/sh -c { echo \'#/bin/sh\'; echo \'echo \\"$JAVA_HOME\\"\'; } > /usr/local/bin/docker-java-home && chmod +x /usr/local/bin/docker-java-home && [ \\"$JAVA_HOME\\" = \\"$(docker-java-home)\\" ]"}, {"created": "2020-02-02T06:28:16.075627774Z", "created_by": "/bin/sh -c #(nop) ENV JAVA_VERSION=8u242", "empty_layer": true}, {"created": "2020-02-02T06:28:16.435762402Z", "created_by": "/bin/sh -c #(nop) ENV 
JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u242-b08/OpenJDK8U-jdk_", "empty_layer": true}, {"created": "2020-02-02T06:28:16.735005969Z", "created_by": "/bin/sh -c #(nop) ENV JAVA_URL_VERSION=8u242b08", "empty_layer": true}, {"created": "2020-02-02T06:28:28.536242187Z", "created_by": "/bin/sh -c set -eux; \\t\\tdpkgArch=\\"$(dpkg --print-architecture)\\"; \\tcase \\"$dpkgArch\\" in \\t\\tamd64) upstreamArch=\'x64\' ;; \\t\\tarm64) upstreamArch=\'aarch64\' ;; \\t\\t*) echo >&2 \\"error: unsupported architecture: $dpkgArch\\" ;; \\tesac; \\t\\twget -O openjdk.tgz.asc \\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz.sign\\"; \\twget -O openjdk.tgz \\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz\\" --progress=dot:giga; \\t\\texport GNUPGHOME=\\"$(mktemp -d)\\"; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --keyserver-options no-self-sigs-only --recv-keys CA5F11C6CE22644D42C6AC4492EF8D39DC13168F; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys EAC843EBD3EFDB98CC772FADA5CD6035332FA671; \\tgpg --batch --list-sigs --keyid-format 0xLONG CA5F11C6CE22644D42C6AC4492EF8D39DC13168F \\t\\t| tee /dev/stderr \\t\\t| grep \'0xA5CD6035332FA671\' \\t\\t| grep \'Andrew Haley\'; \\tgpg --batch --verify openjdk.tgz.asc openjdk.tgz; \\tgpgconf --kill all; \\trm -rf \\"$GNUPGHOME\\"; \\t\\tmkdir -p \\"$JAVA_HOME\\"; \\ttar --extract \\t\\t--file openjdk.tgz \\t\\t--directory \\"$JAVA_HOME\\" \\t\\t--strip-components 1 \\t\\t--no-same-owner \\t; \\trm openjdk.tgz*; \\t\\t\\t{ \\t\\techo \'#!/usr/bin/env bash\'; \\t\\techo \'set -Eeuo pipefail\'; \\t\\techo \'if ! 
[ -d \\"$JAVA_HOME\\" ]; then echo >&2 \\"error: missing JAVA_HOME environment variable\\"; exit 1; fi\'; \\t\\techo \'cacertsFile=; for f in \\"$JAVA_HOME/lib/security/cacerts\\" \\"$JAVA_HOME/jre/lib/security/cacerts\\"; do if [ -e \\"$f\\" ]; then cacertsFile=\\"$f\\"; break; fi; done\'; \\t\\techo \'if [ -z \\"$cacertsFile\\" ] || ! [ -f \\"$cacertsFile\\" ]; then echo >&2 \\"error: failed to find cacerts file in $JAVA_HOME\\"; exit 1; fi\'; \\t\\techo \'trust extract --overwrite --format=java-cacerts --filter=ca-anchors --purpose=server-auth \\"$cacertsFile\\"\'; \\t} > /etc/ca-certificates/update.d/docker-openjdk; \\tchmod +x /etc/ca-certificates/update.d/docker-openjdk; \\t/etc/ca-certificates/update.d/docker-openjdk; \\t\\tfind \\"$JAVA_HOME/lib\\" -name \'*.so\' -exec dirname \'{}\' \';\' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf; \\tldconfig; \\t\\tjavac -version; \\tjava -version"}, {"created": "2020-03-08T02:08:28.421510268Z", "created_by": "/bin/sh -c apt-get update && apt-get upgrade -y && apt-get install -y git curl && curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash && apt-get install -y git-lfs && git lfs install && rm -rf /var/lib/apt/lists/*"}, {"created": "2020-03-08T02:08:31.93114162Z", "created_by": "/bin/sh -c #(nop) ARG user=jenkins", "empty_layer": true}, {"created": "2020-03-08T02:08:33.055659115Z", "created_by": "/bin/sh -c #(nop) ARG group=jenkins", "empty_layer": true}, {"created": "2020-03-08T02:08:33.874837809Z", "created_by": "/bin/sh -c #(nop) ARG uid=1000", "empty_layer": true}, {"created": "2020-03-08T02:08:34.884950047Z", "created_by": "/bin/sh -c #(nop) ARG gid=1000", "empty_layer": true}, {"created": "2020-03-08T02:08:35.896784898Z", "created_by": "/bin/sh -c #(nop) ARG http_port=8080", "empty_layer": true}, {"created": "2020-03-08T02:08:36.871761947Z", "created_by": "/bin/sh -c #(nop) ARG agent_port=50000", "empty_layer": true}, {"created": "2020-03-08T02:08:37.961637985Z", 
"created_by": "/bin/sh -c #(nop) ARG JENKINS_HOME=/var/jenkins_home", "empty_layer": true}, {"created": "2020-03-08T02:08:38.89036682Z", "created_by": "/bin/sh -c #(nop) ARG REF=/usr/share/jenkins/ref", "empty_layer": true}, {"created": "2020-03-08T02:08:39.868974806Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_HOME=/var/jenkins_home", "empty_layer": true}, {"created": "2020-03-08T02:08:40.877764659Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_SLAVE_AGENT_PORT=50000", "empty_layer": true}, {"created": "2020-03-08T02:08:41.871871056Z", "created_by": "/bin/sh -c #(nop) ENV REF=/usr/share/jenkins/ref", "empty_layer": true}, {"created": "2020-03-08T02:08:44.011833083Z", "created_by": "|6 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c mkdir -p $JENKINS_HOME && chown ${uid}:${gid} $JENKINS_HOME && groupadd -g ${gid} ${group} && useradd -d \\"$JENKINS_HOME\\" -u ${uid} -g ${gid} -m -s /bin/bash ${user}"}, {"created": "2020-03-08T02:08:44.942422372Z", "created_by": "/bin/sh -c #(nop) VOLUME [/var/jenkins_home]", "empty_layer": true}, {"created": "2020-03-08T02:08:46.818223402Z", "created_by": "|6 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c mkdir -p ${REF}/init.groovy.d"}, {"created": "2020-03-08T02:08:47.950453499Z", "created_by": "/bin/sh -c #(nop) ARG TINI_VERSION=v0.16.1", "empty_layer": true}, {"created": "2020-03-08T02:08:49.000333745Z", "created_by": "/bin/sh -c #(nop) COPY file:653491cb486e752a4c2b4b407a46ec75646a54eabb597634b25c7c2b82a31424 in /var/jenkins_home/tini_pub.gpg "}, {"created": "2020-03-08T02:08:52.418862763Z", "created_by": "|7 TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c curl -fsSL https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static-$(dpkg --print-architecture) -o /sbin/tini && curl -fsSL https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static-$(dpkg 
--print-architecture).asc -o /sbin/tini.asc && gpg --no-tty --import ${JENKINS_HOME}/tini_pub.gpg && gpg --verify /sbin/tini.asc && rm -rf /sbin/tini.asc /root/.gnupg && chmod +x /sbin/tini"}, {"created": "2020-03-08T02:08:53.39134768Z", "created_by": "/bin/sh -c #(nop) ARG JENKINS_VERSION", "empty_layer": true}, {"created": "2020-03-08T02:08:54.341276057Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_VERSION=2.204.5", "empty_layer": true}, {"created": "2020-03-08T02:08:55.299951051Z", "created_by": "/bin/sh -c #(nop) ARG JENKINS_SHA=33a6c3161cf8de9c8729fd83914d781319fd1569acf487c7b1121681dba190a5", "empty_layer": true}, {"created": "2020-03-08T02:08:56.24780124Z", "created_by": "/bin/sh -c #(nop) ARG JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war", "empty_layer": true}, {"created": "2020-03-08T02:09:00.907352083Z", "created_by": "|9 JENKINS_SHA=94c73fa5b72e0a4eb52c5c99c08351f85a51d138f3dbaff6f64e4406353f839c JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c curl -fsSL ${JENKINS_URL} -o /usr/share/jenkins/jenkins.war && echo \\"${JENKINS_SHA} /usr/share/jenkins/jenkins.war\\" | sha256sum -c -"}, {"created": "2020-03-08T02:09:03.539294194Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_UC=https://updates.jenkins.io", "empty_layer": true}, {"created": "2020-03-08T02:09:04.752513022Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_UC_EXPERIMENTAL=https://updates.jenkins.io/experimental", "empty_layer": true}, {"created": "2020-03-08T02:09:05.949731202Z", "created_by": "/bin/sh -c #(nop) ENV JENKINS_INCREMENTALS_REPO_MIRROR=https://repo.jenkins-ci.org/incrementals", "empty_layer": true}, {"created": "2020-03-08T02:09:07.774563715Z", "created_by": "|9 JENKINS_SHA=94c73fa5b72e0a4eb52c5c99c08351f85a51d138f3dbaff6f64e4406353f839c 
JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c chown -R ${user} \\"$JENKINS_HOME\\" \\"$REF\\""}, {"created": "2020-03-08T02:09:08.967515128Z", "created_by": "/bin/sh -c #(nop) EXPOSE 8080", "empty_layer": true}, {"created": "2020-03-08T02:09:09.960137695Z", "created_by": "/bin/sh -c #(nop) EXPOSE 50000", "empty_layer": true}, {"created": "2020-03-08T02:09:10.94919503Z", "created_by": "/bin/sh -c #(nop) ENV COPY_REFERENCE_FILE_LOG=/var/jenkins_home/copy_reference_file.log", "empty_layer": true}, {"created": "2020-03-08T02:09:12.036159753Z", "created_by": "/bin/sh -c #(nop) USER jenkins", "empty_layer": true}, {"created": "2020-0Traceback (most recent call last):\r\n File "/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py", line 102, in <module>\r\n _ansiballz_main()\r\n File "/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File "/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py", line 40, in invoke_module\r\n runpy.run_module(mod_name=\'ansible.modules.cloud.podman.podman_image\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n File "/usr/lib64/python3.8/runpy.py", line 206, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File "/usr/lib64/python3.8/runpy.py", line 96, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File "/usr/lib64/python3.8/runpy.py", line 86, in _run_code\r\n exec(code, run_globals)\r\n File "/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/modules/cloud/podman/podman_image.py", line 725, in <module>\r\n File 
"/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/modules/cloud/podman/podman_image.py", line 721, in main\r\n File "/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/module_utils/basic.py", line 2071, in exit_json\r\n File "/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/module_utils/basic.py", line 2065, in _return_formatted\r\nBlockingIOError: [Errno 11] write could not complete without blocking\r\n', b"OpenSSH_8.2p1, OpenSSL 1.1.1g FIPS 21 Apr 2020\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: not matched 'final'\r\ndebug2: match not found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1 (parse only)\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: configuration requests final Match pass\r\ndebug2: resolve_canonicalize: hostname 10.50.0.63 is address\r\ndebug1: re-parsing configuration\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data 
/etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: matched 'final'\r\ndebug2: match found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 48171\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\nShared connection to 10.50.0.63 closed.\r\n")
<10.50.0.63> Failed to connect to the host via ssh: OpenSSH_8.2p1, OpenSSL 1.1.1g FIPS 21 Apr 2020
debug2: Received exit status from master 1
Shared connection to 10.50.0.63 closed.
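For what it's worth, the `BlockingIOError: [Errno 11] write could not complete without blocking` in the traceback above is what Python 3 raises when a write to a file descriptor in non-blocking mode cannot complete, and it can be reproduced outside Ansible. A minimal sketch — the pipe here is only a stand-in for a stdout fd left in `O_NONBLOCK` mode, not Ansible's actual I/O path:

```python
import os
import fcntl

# Create a pipe and switch the write end to non-blocking mode,
# mimicking a controller-side fd that was left in O_NONBLOCK.
r, w = os.pipe()
flags = fcntl.fcntl(w, fcntl.F_GETFL)
fcntl.fcntl(w, fcntl.F_SETFL, flags | os.O_NONBLOCK)

# A payload larger than the default 64 KiB pipe buffer, standing in
# for the very large JSON image-inspect result the module returns.
payload = b"x" * (1024 * 1024)

try:
    # The first write fills the pipe buffer; with nothing draining it,
    # a blocking fd would stall here, but a non-blocking fd raises.
    written = 0
    while written < len(payload):
        written += os.write(w, payload[written:])
except BlockingIOError as e:
    print(e.errno)  # EAGAIN (11 on Linux), same errno as in the traceback
finally:
    os.close(r)
    os.close(w)
```

Since the module result in this run is a very large JSON document (the full image config, layer list, and history), a partially drained non-blocking stdout would fail in exactly this way when `exit_json` tries to print it.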
<10.50.0.63> ESTABLISH SSH CONNECTION FOR USER: fedora
<10.50.0.63> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="fedora"' -o ConnectTimeout=10 -o ControlPath=/home/pink/.ansible/cp/f9c8b55839 10.50.0.63 '/bin/sh -c '"'"'rm -f -r /home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/ > /dev/null 2>&1 && sleep 0'"'"''
g1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 48171\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\nShared connection to 10.50.0.63 closed.\r\n",
"module_stdout": "\r\n{\"changed\": true, \"actions\": [\"Built image own_jenkins:latest from /home/fedora/own_jenkins_container\"], \"image\": [{\"Id\": \"7abd50ead542536c39423370f5f32e5be3994bc69a54c0270c0304c61c29c552\", \"Digest\": \"sha256:e2b771de36a17bacbb9fbcd650e931355193de8f190f5c3bb5f84060402f4fa2\", \"RepoTags\": [\"localhost/own_jenkins:latest\"], \"RepoDigests\": [\"localhost/own_jenkins@sha256:e2b771de36a17bacbb9fbcd650e931355193de8f190f5c3bb5f84060402f4fa2\"], \"Parent\": \"\", \"Comment\": \"\", \"Created\": \"2020-05-12T08:31:47.665534799Z\", \"Config\": {\"User\": \"jenkins\", \"ExposedPorts\": {\"50000/tcp\": {}, \"8080/tcp\": {}}, \"Env\": [\"PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\", \"LANG=C.UTF-8\", \"JAVA_HOME=/usr/local/openjdk-8\", \"JAVA_VERSION=8u242\", \"JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u242-b08/OpenJDK8U-jdk_\", \"JAVA_URL_VERSION=8u242b08\", \"JENKINS_HOME=/var/jenkins_home\", \"JENKINS_SLAVE_AGENT_PORT=50000\", \"REF=/usr/share/jenkins/ref\", \"JENKINS_VERSION=2.204.5\", \"JENKINS_UC=https://updates.jenkins.io\", \"JENKINS_UC_EXPERIMENTAL=https://updates.jenkins.io/experimental\", \"JENKINS_INCREMENTALS_REPO_MIRROR=https://repo.jenkins-ci.org/incrementals\", \"COPY_REFERENCE_FILE_LOG=/var/jenkins_home/copy_reference_file.log\", \"CASC_JENKINS_CONFIG=/var/jenkins_conf\", \"JAVA_OPTS=-Djenkins.install.runSetupWizard=false -Djenkins.model.Jenkins.buildsDir=${JENKINS_HOME}/builds/${ITEM_FULL_NAME}\"], \"Entrypoint\": [\"/sbin/tini\", \"--\", \"/usr/local/bin/jenkins.sh\"], \"Volumes\": {\"/var/jenkins_home\": {}}}, \"Version\": \"\", \"Author\": \"\", \"Architecture\": \"amd64\", \"Os\": \"linux\", \"Size\": 703042144, \"VirtualSize\": 703042144, \"GraphDriver\": {\"Name\": \"overlay\", \"Data\": {\"LowerDir\": 
\"/var/lib/containers/storage/overlay/a182d7b1ea0efd2caf6bf17217b29f48ed8b3d99619ba7b7d5313ada27bfb1e1/diff:/var/lib/containers/storage/overlay/3097ea2f74c7778d7780a88ebfccf31d24709052f55988c74b74738b9b4582a2/diff:/var/lib/containers/storage/overlay/e2f35c70be785ec013d48d47f3f2ecf0bd149b58c0a9c87e7ad4c8077e54e6ed/diff:/var/lib/containers/storage/overlay/e02a0e49f8e38766619fc24732f5e156234680fbc0f2e5bf9a02066146bab64f/diff:/var/lib/containers/storage/overlay/1658640241622a1df132e1ef979c95fabc4539d5ec3c725348f833e0f1b3926d/diff:/var/lib/containers/storage/overlay/c2274d9f2a9971867ec79bf207e42a16fb1f14bf815d6e8740430dacc88d8785/diff:/var/lib/containers/storage/overlay/bce3558237d0c0afbd7889ae8dd3540690a6337ffe24711e729e0a2fe9e8bf33/diff:/var/lib/containers/storage/overlay/913e80387050faa9e08015de966f213dd8fdeac746abcd77d88219fe59eb282b/diff:/var/lib/containers/storage/overlay/bd7aa483693493b4a08403d5aa95beef3910421bf9ae4c0694475f267b73c2b7/diff:/var/lib/containers/storage/overlay/2d1241b71b044bd2aae6c1773403b2d6ad2f2f2753fc983a448d56e28c6ff303/diff:/var/lib/containers/storage/overlay/0e379119c6cd9edc25624e817155260ed0e0bf27a34747d07cca00d0d9c535da/diff:/var/lib/containers/storage/overlay/8878e4743896dcc6d9df1828ad89cdac1318bc47e1e8423f6fae21647f133c62/diff:/var/lib/containers/storage/overlay/e52d47750762e598165e24a4ff617604792ce598476b5e234e885b5722d0eed8/diff:/var/lib/containers/storage/overlay/eb70dd733080d6dccaed95bc3d792078e6bbddce015457f2cac3598b610769d4/diff:/var/lib/containers/storage/overlay/4acb7d7ee512db214eb30693b56108cf6c6f47f0093c5f4940b92b89c5dbebc0/diff:/var/lib/containers/storage/overlay/84ef69b6ed5f0015840f7ed80876cf3792c05c2bbf72ce006ed8a67e9c8ae03b/diff:/var/lib/containers/storage/overlay/c588bce0921bd3ae8ca65a2203c722745c0acd69b1f4380546184d2e387ff138/diff:/var/lib/containers/storage/overlay/3a798a3dc49789ccb6dc626c6edb0f3aa4e3343e96393a2adcf83a6ec8f94d8d/diff:/var/lib/containers/storage/overlay/6f33a85672c80ba01bb2a5034f06c9784c25247c02f992f49d1a16
810c94263e/diff:/var/lib/containers/storage/overlay/c1bd2f1a5b3b63a20f976dbe5bc64e4d87187d7b6283a86b6a1310356be4c3b6/diff:/var/lib/containers/storage/overlay/150db6730074508a4a369098738fd79cec29a9438c6b0e0a72f7bcdecf552779/diff:/var/lib/containers/storage/overlay/25eb51e911b02cfe76adba09f93fe7ded3f212db4a05710718dfcefe6cfe3746/diff:/var/lib/containers/storage/overlay/2f3cf89f4bc52573b720975440b7b65aa707b690b8e3b121c1a8e1ad138aef8e/diff:/var/lib/containers/storage/overlay/7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9/diff\", \"UpperDir\": \"/var/lib/containers/storage/overlay/336c0abd5a97ba845e58383c6ac1d3f5a56138d9da412b4e9fd3f4a6c8882408/diff\", \"WorkDir\": \"/var/lib/containers/storage/overlay/336c0abd5a97ba845e58383c6ac1d3f5a56138d9da412b4e9fd3f4a6c8882408/work\"}}, \"RootFS\": {\"Type\": \"layers\", \"Layers\": [\"sha256:7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9\", \"sha256:4d1ab3827f6b69f4e55bd69cc8abe1dde7d7a7f61bd6e32c665f12e0a8efd1c9\", \"sha256:69dfa7bd7a92b8ba12a617ff56f22533223007c5ba6b3a191c91b852320f012e\", \"sha256:01727b1a72df8ba02293a98ab365bb4e8015aefadd661aaf7e6aa76567b706b9\", \"sha256:e43c0c41b833ec88f51b6fdb7c5faa32c76a32dbefdeb602969b74720ecf47c9\", \"sha256:bd76253da83ab721c5f9deed421f66db1406d89f720387b799dfe5503b797a90\", \"sha256:d81d8fa6dfd451a45e0161e76e3475e4e30e87e1cc1e9839509aa7c3ba42b5dd\", \"sha256:f161be957652c6ed6ffb9628fe833a1524ddccfbd5499039d75107ff4c4706cd\", \"sha256:631a8cd2cb056e41704102fab7338173883273c30ed9ef085035b9d54357fbda\", \"sha256:f57fcddcd21311556c453c009d6ffb31bd42e86f5e9d632d2c0e505674f7eb6b\", \"sha256:3e82e2066cb309b852f4954d30327b3c549fb856171209edcb5d64e64d231181\", \"sha256:9b9811cf4eb368566864a7bec3e7025fa5253baa02d1c381557d020bb08a2b59\", \"sha256:8afb2f989d486191c930b34b37a5a9c374c399b6763b6bae7ecc49a022b739a0\", \"sha256:2d50b321cf2cb23d39f2e448d1b26e28c49183f2f0f7caaca56d08793d7cd301\", 
\"sha256:698d1e1f868243a7ba8d490928007867bf2033b795551ce522d9beae3aa99bd2\", \"sha256:5fe6ec06b025070d593ebc7ab733ee6a3ca0f03ef794e46a0574fbc0db4a05e9\", \"sha256:7b23eed1281de14bc4c2914f5a0bab87093fcd3174efea2c39218d482da78e84\", \"sha256:4f8774dcadf6aee646abff26c622fe8aa87bee5974766df41222b2f22536465d\", \"sha256:7a4ef5d59373586055068d4bb4e6c032031da4bdf5618faa52a1366c95e474d8\", \"sha256:21ff7aaaa415f3654e2f2a28801bb88ae57963cbfad12f8884afe65d1f8bd425\", \"sha256:6ea0f80d8ee22cbade99a25e8dbbfc5bbb921f4162fd845d3214712fbab9ab46\", \"sha256:5277d9bcaf7a0de6b117417979c0db580d73a34d945d8e44541b13aa20ca5bb3\", \"sha256:6a9fe25e916b076ea06cb23371fb4866a022ac60790ebcfa14d0058ddf04d8e2\", \"sha256:eccb05d256e53999506a666460c63f14837a506ac8625a698bcecfbef591747e\", \"sha256:ccf991445436efaec39d5b96b1a5314479d000435c52e73692a834c742864ffe\"]}, \"Labels\": null, \"Annotations\": {}, \"ManifestType\": \"application/vnd.oci.image.manifest.v1+json\", \"User\": \"jenkins\", \"History\": [{\"created\": \"2020-02-01T17:23:41.468764783Z\", \"created_by\": \"/bin/sh -c #(nop) ADD file:8a9218592e5d736a05a1821a6dd38b205cdd8197c26a5aa33f6fc22fbfaa1c4d in / \"}, {\"created\": \"2020-02-01T17:23:41.779347941Z\", \"created_by\": \"/bin/sh -c #(nop) CMD [\\\"bash\\\"]\", \"empty_layer\": true}, {\"created\": \"2020-02-02T00:33:57.472084922Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get install -y --no-install-recommends \\t\\tca-certificates \\t\\tcurl \\t\\tnetbase \\t\\twget \\t&& rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T00:34:02.575926722Z\", \"created_by\": \"/bin/sh -c set -ex; \\tif ! 
command -v gpg > /dev/null; then \\t\\tapt-get update; \\t\\tapt-get install -y --no-install-recommends \\t\\t\\tgnupg \\t\\t\\tdirmngr \\t\\t; \\t\\trm -rf /var/lib/apt/lists/*; \\tfi\"}, {\"created\": \"2020-02-02T00:34:28.382621018Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get install -y --no-install-recommends \\t\\tbzr \\t\\tgit \\t\\tmercurial \\t\\topenssh-client \\t\\tsubversion \\t\\t\\t\\tprocps \\t&& rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T06:26:12.069777635Z\", \"created_by\": \"/bin/sh -c set -eux; \\tapt-get update; \\tapt-get install -y --no-install-recommends \\t\\tbzip2 \\t\\tunzip \\t\\txz-utils \\t\\t\\t\\tca-certificates p11-kit \\t\\t\\t\\tfontconfig libfreetype6 \\t; \\trm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T06:26:12.30328465Z\", \"created_by\": \"/bin/sh -c #(nop) ENV LANG=C.UTF-8\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:13.86592879Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_HOME=/usr/local/openjdk-8\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:14.22638233Z\", \"created_by\": \"/bin/sh -c #(nop) ENV PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:15.751830342Z\", \"created_by\": \"/bin/sh -c { echo '#/bin/sh'; echo 'echo \\\"$JAVA_HOME\\\"'; } > /usr/local/bin/docker-java-home && chmod +x /usr/local/bin/docker-java-home && [ \\\"$JAVA_HOME\\\" = \\\"$(docker-java-home)\\\" ]\"}, {\"created\": \"2020-02-02T06:28:16.075627774Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_VERSION=8u242\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:16.435762402Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u242-b08/OpenJDK8U-jdk_\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:16.735005969Z\", \"created_by\": \"/bin/sh -c #(nop) ENV 
JAVA_URL_VERSION=8u242b08\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:28.536242187Z\", \"created_by\": \"/bin/sh -c set -eux; \\t\\tdpkgArch=\\\"$(dpkg --print-architecture)\\\"; \\tcase \\\"$dpkgArch\\\" in \\t\\tamd64) upstreamArch='x64' ;; \\t\\tarm64) upstreamArch='aarch64' ;; \\t\\t*) echo >&2 \\\"error: unsupported architecture: $dpkgArch\\\" ;; \\tesac; \\t\\twget -O openjdk.tgz.asc \\\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz.sign\\\"; \\twget -O openjdk.tgz \\\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz\\\" --progress=dot:giga; \\t\\texport GNUPGHOME=\\\"$(mktemp -d)\\\"; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --keyserver-options no-self-sigs-only --recv-keys CA5F11C6CE22644D42C6AC4492EF8D39DC13168F; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys EAC843EBD3EFDB98CC772FADA5CD6035332FA671; \\tgpg --batch --list-sigs --keyid-format 0xLONG CA5F11C6CE22644D42C6AC4492EF8D39DC13168F \\t\\t| tee /dev/stderr \\t\\t| grep '0xA5CD6035332FA671' \\t\\t| grep 'Andrew Haley'; \\tgpg --batch --verify openjdk.tgz.asc openjdk.tgz; \\tgpgconf --kill all; \\trm -rf \\\"$GNUPGHOME\\\"; \\t\\tmkdir -p \\\"$JAVA_HOME\\\"; \\ttar --extract \\t\\t--file openjdk.tgz \\t\\t--directory \\\"$JAVA_HOME\\\" \\t\\t--strip-components 1 \\t\\t--no-same-owner \\t; \\trm openjdk.tgz*; \\t\\t\\t{ \\t\\techo '#!/usr/bin/env bash'; \\t\\techo 'set -Eeuo pipefail'; \\t\\techo 'if ! [ -d \\\"$JAVA_HOME\\\" ]; then echo >&2 \\\"error: missing JAVA_HOME environment variable\\\"; exit 1; fi'; \\t\\techo 'cacertsFile=; for f in \\\"$JAVA_HOME/lib/security/cacerts\\\" \\\"$JAVA_HOME/jre/lib/security/cacerts\\\"; do if [ -e \\\"$f\\\" ]; then cacertsFile=\\\"$f\\\"; break; fi; done'; \\t\\techo 'if [ -z \\\"$cacertsFile\\\" ] || ! 
[ -f \\\"$cacertsFile\\\" ]; then echo >&2 \\\"error: failed to find cacerts file in $JAVA_HOME\\\"; exit 1; fi'; \\t\\techo 'trust extract --overwrite --format=java-cacerts --filter=ca-anchors --purpose=server-auth \\\"$cacertsFile\\\"'; \\t} > /etc/ca-certificates/update.d/docker-openjdk; \\tchmod +x /etc/ca-certificates/update.d/docker-openjdk; \\t/etc/ca-certificates/update.d/docker-openjdk; \\t\\tfind \\\"$JAVA_HOME/lib\\\" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf; \\tldconfig; \\t\\tjavac -version; \\tjava -version\"}, {\"created\": \"2020-03-08T02:08:28.421510268Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get upgrade -y && apt-get install -y git curl && curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash && apt-get install -y git-lfs && git lfs install && rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-03-08T02:08:31.93114162Z\", \"created_by\": \"/bin/sh -c #(nop) ARG user=jenkins\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:33.055659115Z\", \"created_by\": \"/bin/sh -c #(nop) ARG group=jenkins\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:33.874837809Z\", \"created_by\": \"/bin/sh -c #(nop) ARG uid=1000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:34.884950047Z\", \"created_by\": \"/bin/sh -c #(nop) ARG gid=1000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:35.896784898Z\", \"created_by\": \"/bin/sh -c #(nop) ARG http_port=8080\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:36.871761947Z\", \"created_by\": \"/bin/sh -c #(nop) ARG agent_port=50000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:37.961637985Z\", \"created_by\": \"/bin/sh -c #(nop) ARG JENKINS_HOME=/var/jenkins_home\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:38.89036682Z\", \"created_by\": \"/bin/sh -c #(nop) ARG REF=/usr/share/jenkins/ref\", \"empty_layer\": true}, {\"created\": 
\"2020-03-08T02:08:39.868974806Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_HOME=/var/jenkins_home\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:40.877764659Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_SLAVE_AGENT_PORT=50000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:41.871871056Z\", \"created_by\": \"/bin/sh -c #(nop) ENV REF=/usr/share/jenkins/ref\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:44.011833083Z\", \"created_by\": \"|6 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c mkdir -p $JENKINS_HOME && chown ${uid}:${gid} $JENKINS_HOME && groupadd -g ${gid} ${group} && useradd -d \\\"$JENKINS_HOME\\\" -u ${uid} -g ${gid} -m -s /bin/bash ${user}\"}, {\"created\": \"2020-03-08T02:08:44.942422372Z\", \"created_by\": \"/bin/sh -c #(nop) VOLUME [/var/jenkins_home]\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:46.818223402Z\", \"created_by\": \"|6 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c mkdir -p ${REF}/init.groovy.d\"}, {\"created\": \"2020-03-08T02:08:47.950453499Z\", \"created_by\": \"/bin/sh -c #(nop) ARG TINI_VERSION=v0.16.1\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:49.000333745Z\", \"created_by\": \"/bin/sh -c #(nop) COPY file:653491cb486e752a4c2b4b407a46ec75646a54eabb597634b25c7c2b82a31424 in /var/jenkins_home/tini_pub.gpg \"}, {\"created\": \"2020-03-08T02:08:52.418862763Z\", \"created_by\": \"|7 TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c curl -fsSL https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static-$(dpkg --print-architecture) -o /sbin/tini && curl -fsSL https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static-$(dpkg --print-architecture).asc -o /sbin/tini.asc && gpg --no-tty --import ${JENKINS_HOME}/tini_pub.gpg && gpg --verify /sbin/tini.asc && rm -rf /sbin/tini.asc 
/root/.gnupg && chmod +x /sbin/tini\"}, {\"created\": \"2020-03-08T02:08:53.39134768Z\", \"created_by\": \"/bin/sh -c #(nop) ARG JENKINS_VERSION\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:54.341276057Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_VERSION=2.204.5\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:55.299951051Z\", \"created_by\": \"/bin/sh -c #(nop) ARG JENKINS_SHA=33a6c3161cf8de9c8729fd83914d781319fd1569acf487c7b1121681dba190a5\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:56.24780124Z\", \"created_by\": \"/bin/sh -c #(nop) ARG JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:00.907352083Z\", \"created_by\": \"|9 JENKINS_SHA=94c73fa5b72e0a4eb52c5c99c08351f85a51d138f3dbaff6f64e4406353f839c JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c curl -fsSL ${JENKINS_URL} -o /usr/share/jenkins/jenkins.war && echo \\\"${JENKINS_SHA} /usr/share/jenkins/jenkins.war\\\" | sha256sum -c -\"}, {\"created\": \"2020-03-08T02:09:03.539294194Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_UC=https://updates.jenkins.io\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:04.752513022Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_UC_EXPERIMENTAL=https://updates.jenkins.io/experimental\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:05.949731202Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_INCREMENTALS_REPO_MIRROR=https://repo.jenkins-ci.org/incrementals\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:07.774563715Z\", \"created_by\": \"|9 JENKINS_SHA=94c73fa5b72e0a4eb52c5c99c08351f85a51d138f3dbaff6f64e4406353f839c 
JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c chown -R ${user} \\\"$JENKINS_HOME\\\" \\\"$REF\\\"\"}, {\"created\": \"2020-03-08T02:09:08.967515128Z\", \"created_by\": \"/bin/sh -c #(nop) EXPOSE 8080\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:09.960137695Z\", \"created_by\": \"/bin/sh -c #(nop) EXPOSE 50000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:10.94919503Z\", \"created_by\": \"/bin/sh -c #(nop) ENV COPY_REFERENCE_FILE_LOG=/var/jenkins_home/copy_reference_file.log\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:12.036159753Z\", \"created_by\": \"/bin/sh -c #(nop) USER jenkins\", \"empty_layer\": true}, {\"created\": \"2020-0Traceback (most recent call last):\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py\", line 102, in <module>\r\n _ansiballz_main()\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py\", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py\", line 40, in invoke_module\r\n runpy.run_module(mod_name='ansible.modules.cloud.podman.podman_image', init_globals=None, run_name='__main__', alter_sys=True)\r\n File \"/usr/lib64/python3.8/runpy.py\", line 206, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File \"/usr/lib64/python3.8/runpfatal: [dev_jenkins_christian]: FAILED! => {
"changed": false,
"module_stderr": "OpenSSH_8.2p1, OpenSSL 1.1.1g FIPS 21 Apr 2020\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: not matched 'final'\r\ndebug2: match not found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1 (parse only)\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: [curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: configuration requests final Match pass\r\ndebug2: resolve_canonicalize: hostname 10.50.0.63 is address\r\ndebug1: re-parsing configuration\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug3: /etc/ssh/ssh_config line 54: Including file /etc/ssh/ssh_config.d/05-redhat.conf depth 0\r\ndebug1: Reading configuration data /etc/ssh/ssh_config.d/05-redhat.conf\r\ndebug2: checking match for 'final all' host 10.50.0.63 originally 10.50.0.63\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 3: matched 'final'\r\ndebug2: match found\r\ndebug3: /etc/ssh/ssh_config.d/05-redhat.conf line 5: Including file /etc/crypto-policies/back-ends/openssh.config depth 1\r\ndebug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config\r\ndebug3: gss kex names ok: [gss-gex-sha1-,gss-group14-sha1-,gss-group1-sha1-]\r\ndebug3: kex names ok: 
[curve25519-sha256,[email protected],ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 48171\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\nShared connection to 10.50.0.63 closed.\r\n",
"module_stdout": "\r\n{\"changed\": true, \"actions\": [\"Built image own_jenkins:latest from /home/fedora/own_jenkins_container\"], \"image\": [{\"Id\": \"7abd50ead542536c39423370f5f32e5be3994bc69a54c0270c0304c61c29c552\", \"Digest\": \"sha256:e2b771de36a17bacbb9fbcd650e931355193de8f190f5c3bb5f84060402f4fa2\", \"RepoTags\": [\"localhost/own_jenkins:latest\"], \"RepoDigests\": [\"localhost/own_jenkins@sha256:e2b771de36a17bacbb9fbcd650e931355193de8f190f5c3bb5f84060402f4fa2\"], \"Parent\": \"\", \"Comment\": \"\", \"Created\": \"2020-05-12T08:31:47.665534799Z\", \"Config\": {\"User\": \"jenkins\", \"ExposedPorts\": {\"50000/tcp\": {}, \"8080/tcp\": {}}, \"Env\": [\"PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\", \"LANG=C.UTF-8\", \"JAVA_HOME=/usr/local/openjdk-8\", \"JAVA_VERSION=8u242\", \"JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u242-b08/OpenJDK8U-jdk_\", \"JAVA_URL_VERSION=8u242b08\", \"JENKINS_HOME=/var/jenkins_home\", \"JENKINS_SLAVE_AGENT_PORT=50000\", \"REF=/usr/share/jenkins/ref\", \"JENKINS_VERSION=2.204.5\", \"JENKINS_UC=https://updates.jenkins.io\", \"JENKINS_UC_EXPERIMENTAL=https://updates.jenkins.io/experimental\", \"JENKINS_INCREMENTALS_REPO_MIRROR=https://repo.jenkins-ci.org/incrementals\", \"COPY_REFERENCE_FILE_LOG=/var/jenkins_home/copy_reference_file.log\", \"CASC_JENKINS_CONFIG=/var/jenkins_conf\", \"JAVA_OPTS=-Djenkins.install.runSetupWizard=false -Djenkins.model.Jenkins.buildsDir=${JENKINS_HOME}/builds/${ITEM_FULL_NAME}\"], \"Entrypoint\": [\"/sbin/tini\", \"--\", \"/usr/local/bin/jenkins.sh\"], \"Volumes\": {\"/var/jenkins_home\": {}}}, \"Version\": \"\", \"Author\": \"\", \"Architecture\": \"amd64\", \"Os\": \"linux\", \"Size\": 703042144, \"VirtualSize\": 703042144, \"GraphDriver\": {\"Name\": \"overlay\", \"Data\": {\"LowerDir\": 
\"/var/lib/containers/storage/overlay/a182d7b1ea0efd2caf6bf17217b29f48ed8b3d99619ba7b7d5313ada27bfb1e1/diff:/var/lib/containers/storage/overlay/3097ea2f74c7778d7780a88ebfccf31d24709052f55988c74b74738b9b4582a2/diff:/var/lib/containers/storage/overlay/e2f35c70be785ec013d48d47f3f2ecf0bd149b58c0a9c87e7ad4c8077e54e6ed/diff:/var/lib/containers/storage/overlay/e02a0e49f8e38766619fc24732f5e156234680fbc0f2e5bf9a02066146bab64f/diff:/var/lib/containers/storage/overlay/1658640241622a1df132e1ef979c95fabc4539d5ec3c725348f833e0f1b3926d/diff:/var/lib/containers/storage/overlay/c2274d9f2a9971867ec79bf207e42a16fb1f14bf815d6e8740430dacc88d8785/diff:/var/lib/containers/storage/overlay/bce3558237d0c0afbd7889ae8dd3540690a6337ffe24711e729e0a2fe9e8bf33/diff:/var/lib/containers/storage/overlay/913e80387050faa9e08015de966f213dd8fdeac746abcd77d88219fe59eb282b/diff:/var/lib/containers/storage/overlay/bd7aa483693493b4a08403d5aa95beef3910421bf9ae4c0694475f267b73c2b7/diff:/var/lib/containers/storage/overlay/2d1241b71b044bd2aae6c1773403b2d6ad2f2f2753fc983a448d56e28c6ff303/diff:/var/lib/containers/storage/overlay/0e379119c6cd9edc25624e817155260ed0e0bf27a34747d07cca00d0d9c535da/diff:/var/lib/containers/storage/overlay/8878e4743896dcc6d9df1828ad89cdac1318bc47e1e8423f6fae21647f133c62/diff:/var/lib/containers/storage/overlay/e52d47750762e598165e24a4ff617604792ce598476b5e234e885b5722d0eed8/diff:/var/lib/containers/storage/overlay/eb70dd733080d6dccaed95bc3d792078e6bbddce015457f2cac3598b610769d4/diff:/var/lib/containers/storage/overlay/4acb7d7ee512db214eb30693b56108cf6c6f47f0093c5f4940b92b89c5dbebc0/diff:/var/lib/containers/storage/overlay/84ef69b6ed5f0015840f7ed80876cf3792c05c2bbf72ce006ed8a67e9c8ae03b/diff:/var/lib/containers/storage/overlay/c588bce0921bd3ae8ca65a2203c722745c0acd69b1f4380546184d2e387ff138/diff:/var/lib/containers/storage/overlay/3a798a3dc49789ccb6dc626c6edb0f3aa4e3343e96393a2adcf83a6ec8f94d8d/diff:/var/lib/containers/storage/overlay/6f33a85672c80ba01bb2a5034f06c9784c25247c02f992f49d1a16
810c94263e/diff:/var/lib/containers/storage/overlay/c1bd2f1a5b3b63a20f976dbe5bc64e4d87187d7b6283a86b6a1310356be4c3b6/diff:/var/lib/containers/storage/overlay/150db6730074508a4a369098738fd79cec29a9438c6b0e0a72f7bcdecf552779/diff:/var/lib/containers/storage/overlay/25eb51e911b02cfe76adba09f93fe7ded3f212db4a05710718dfcefe6cfe3746/diff:/var/lib/containers/storage/overlay/2f3cf89f4bc52573b720975440b7b65aa707b690b8e3b121c1a8e1ad138aef8e/diff:/var/lib/containers/storage/overlay/7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9/diff\", \"UpperDir\": \"/var/lib/containers/storage/overlay/336c0abd5a97ba845e58383c6ac1d3f5a56138d9da412b4e9fd3f4a6c8882408/diff\", \"WorkDir\": \"/var/lib/containers/storage/overlay/336c0abd5a97ba845e58383c6ac1d3f5a56138d9da412b4e9fd3f4a6c8882408/work\"}}, \"RootFS\": {\"Type\": \"layers\", \"Layers\": [\"sha256:7948c3e5790c6df89fe48041fabd8f1c576d4bb7c869183e03b9e3873a5f33d9\", \"sha256:4d1ab3827f6b69f4e55bd69cc8abe1dde7d7a7f61bd6e32c665f12e0a8efd1c9\", \"sha256:69dfa7bd7a92b8ba12a617ff56f22533223007c5ba6b3a191c91b852320f012e\", \"sha256:01727b1a72df8ba02293a98ab365bb4e8015aefadd661aaf7e6aa76567b706b9\", \"sha256:e43c0c41b833ec88f51b6fdb7c5faa32c76a32dbefdeb602969b74720ecf47c9\", \"sha256:bd76253da83ab721c5f9deed421f66db1406d89f720387b799dfe5503b797a90\", \"sha256:d81d8fa6dfd451a45e0161e76e3475e4e30e87e1cc1e9839509aa7c3ba42b5dd\", \"sha256:f161be957652c6ed6ffb9628fe833a1524ddccfbd5499039d75107ff4c4706cd\", \"sha256:631a8cd2cb056e41704102fab7338173883273c30ed9ef085035b9d54357fbda\", \"sha256:f57fcddcd21311556c453c009d6ffb31bd42e86f5e9d632d2c0e505674f7eb6b\", \"sha256:3e82e2066cb309b852f4954d30327b3c549fb856171209edcb5d64e64d231181\", \"sha256:9b9811cf4eb368566864a7bec3e7025fa5253baa02d1c381557d020bb08a2b59\", \"sha256:8afb2f989d486191c930b34b37a5a9c374c399b6763b6bae7ecc49a022b739a0\", \"sha256:2d50b321cf2cb23d39f2e448d1b26e28c49183f2f0f7caaca56d08793d7cd301\", 
\"sha256:698d1e1f868243a7ba8d490928007867bf2033b795551ce522d9beae3aa99bd2\", \"sha256:5fe6ec06b025070d593ebc7ab733ee6a3ca0f03ef794e46a0574fbc0db4a05e9\", \"sha256:7b23eed1281de14bc4c2914f5a0bab87093fcd3174efea2c39218d482da78e84\", \"sha256:4f8774dcadf6aee646abff26c622fe8aa87bee5974766df41222b2f22536465d\", \"sha256:7a4ef5d59373586055068d4bb4e6c032031da4bdf5618faa52a1366c95e474d8\", \"sha256:21ff7aaaa415f3654e2f2a28801bb88ae57963cbfad12f8884afe65d1f8bd425\", \"sha256:6ea0f80d8ee22cbade99a25e8dbbfc5bbb921f4162fd845d3214712fbab9ab46\", \"sha256:5277d9bcaf7a0de6b117417979c0db580d73a34d945d8e44541b13aa20ca5bb3\", \"sha256:6a9fe25e916b076ea06cb23371fb4866a022ac60790ebcfa14d0058ddf04d8e2\", \"sha256:eccb05d256e53999506a666460c63f14837a506ac8625a698bcecfbef591747e\", \"sha256:ccf991445436efaec39d5b96b1a5314479d000435c52e73692a834c742864ffe\"]}, \"Labels\": null, \"Annotations\": {}, \"ManifestType\": \"application/vnd.oci.image.manifest.v1+json\", \"User\": \"jenkins\", \"History\": [{\"created\": \"2020-02-01T17:23:41.468764783Z\", \"created_by\": \"/bin/sh -c #(nop) ADD file:8a9218592e5d736a05a1821a6dd38b205cdd8197c26a5aa33f6fc22fbfaa1c4d in / \"}, {\"created\": \"2020-02-01T17:23:41.779347941Z\", \"created_by\": \"/bin/sh -c #(nop) CMD [\\\"bash\\\"]\", \"empty_layer\": true}, {\"created\": \"2020-02-02T00:33:57.472084922Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get install -y --no-install-recommends \\t\\tca-certificates \\t\\tcurl \\t\\tnetbase \\t\\twget \\t&& rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T00:34:02.575926722Z\", \"created_by\": \"/bin/sh -c set -ex; \\tif ! 
command -v gpg > /dev/null; then \\t\\tapt-get update; \\t\\tapt-get install -y --no-install-recommends \\t\\t\\tgnupg \\t\\t\\tdirmngr \\t\\t; \\t\\trm -rf /var/lib/apt/lists/*; \\tfi\"}, {\"created\": \"2020-02-02T00:34:28.382621018Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get install -y --no-install-recommends \\t\\tbzr \\t\\tgit \\t\\tmercurial \\t\\topenssh-client \\t\\tsubversion \\t\\t\\t\\tprocps \\t&& rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T06:26:12.069777635Z\", \"created_by\": \"/bin/sh -c set -eux; \\tapt-get update; \\tapt-get install -y --no-install-recommends \\t\\tbzip2 \\t\\tunzip \\t\\txz-utils \\t\\t\\t\\tca-certificates p11-kit \\t\\t\\t\\tfontconfig libfreetype6 \\t; \\trm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-02-02T06:26:12.30328465Z\", \"created_by\": \"/bin/sh -c #(nop) ENV LANG=C.UTF-8\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:13.86592879Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_HOME=/usr/local/openjdk-8\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:14.22638233Z\", \"created_by\": \"/bin/sh -c #(nop) ENV PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:15.751830342Z\", \"created_by\": \"/bin/sh -c { echo '#/bin/sh'; echo 'echo \\\"$JAVA_HOME\\\"'; } > /usr/local/bin/docker-java-home && chmod +x /usr/local/bin/docker-java-home && [ \\\"$JAVA_HOME\\\" = \\\"$(docker-java-home)\\\" ]\"}, {\"created\": \"2020-02-02T06:28:16.075627774Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_VERSION=8u242\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:16.435762402Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JAVA_BASE_URL=https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u242-b08/OpenJDK8U-jdk_\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:16.735005969Z\", \"created_by\": \"/bin/sh -c #(nop) ENV 
JAVA_URL_VERSION=8u242b08\", \"empty_layer\": true}, {\"created\": \"2020-02-02T06:28:28.536242187Z\", \"created_by\": \"/bin/sh -c set -eux; \\t\\tdpkgArch=\\\"$(dpkg --print-architecture)\\\"; \\tcase \\\"$dpkgArch\\\" in \\t\\tamd64) upstreamArch='x64' ;; \\t\\tarm64) upstreamArch='aarch64' ;; \\t\\t*) echo >&2 \\\"error: unsupported architecture: $dpkgArch\\\" ;; \\tesac; \\t\\twget -O openjdk.tgz.asc \\\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz.sign\\\"; \\twget -O openjdk.tgz \\\"${JAVA_BASE_URL}${upstreamArch}_linux_${JAVA_URL_VERSION}.tar.gz\\\" --progress=dot:giga; \\t\\texport GNUPGHOME=\\\"$(mktemp -d)\\\"; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --keyserver-options no-self-sigs-only --recv-keys CA5F11C6CE22644D42C6AC4492EF8D39DC13168F; \\tgpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys EAC843EBD3EFDB98CC772FADA5CD6035332FA671; \\tgpg --batch --list-sigs --keyid-format 0xLONG CA5F11C6CE22644D42C6AC4492EF8D39DC13168F \\t\\t| tee /dev/stderr \\t\\t| grep '0xA5CD6035332FA671' \\t\\t| grep 'Andrew Haley'; \\tgpg --batch --verify openjdk.tgz.asc openjdk.tgz; \\tgpgconf --kill all; \\trm -rf \\\"$GNUPGHOME\\\"; \\t\\tmkdir -p \\\"$JAVA_HOME\\\"; \\ttar --extract \\t\\t--file openjdk.tgz \\t\\t--directory \\\"$JAVA_HOME\\\" \\t\\t--strip-components 1 \\t\\t--no-same-owner \\t; \\trm openjdk.tgz*; \\t\\t\\t{ \\t\\techo '#!/usr/bin/env bash'; \\t\\techo 'set -Eeuo pipefail'; \\t\\techo 'if ! [ -d \\\"$JAVA_HOME\\\" ]; then echo >&2 \\\"error: missing JAVA_HOME environment variable\\\"; exit 1; fi'; \\t\\techo 'cacertsFile=; for f in \\\"$JAVA_HOME/lib/security/cacerts\\\" \\\"$JAVA_HOME/jre/lib/security/cacerts\\\"; do if [ -e \\\"$f\\\" ]; then cacertsFile=\\\"$f\\\"; break; fi; done'; \\t\\techo 'if [ -z \\\"$cacertsFile\\\" ] || ! 
[ -f \\\"$cacertsFile\\\" ]; then echo >&2 \\\"error: failed to find cacerts file in $JAVA_HOME\\\"; exit 1; fi'; \\t\\techo 'trust extract --overwrite --format=java-cacerts --filter=ca-anchors --purpose=server-auth \\\"$cacertsFile\\\"'; \\t} > /etc/ca-certificates/update.d/docker-openjdk; \\tchmod +x /etc/ca-certificates/update.d/docker-openjdk; \\t/etc/ca-certificates/update.d/docker-openjdk; \\t\\tfind \\\"$JAVA_HOME/lib\\\" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf; \\tldconfig; \\t\\tjavac -version; \\tjava -version\"}, {\"created\": \"2020-03-08T02:08:28.421510268Z\", \"created_by\": \"/bin/sh -c apt-get update && apt-get upgrade -y && apt-get install -y git curl && curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash && apt-get install -y git-lfs && git lfs install && rm -rf /var/lib/apt/lists/*\"}, {\"created\": \"2020-03-08T02:08:31.93114162Z\", \"created_by\": \"/bin/sh -c #(nop) ARG user=jenkins\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:33.055659115Z\", \"created_by\": \"/bin/sh -c #(nop) ARG group=jenkins\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:33.874837809Z\", \"created_by\": \"/bin/sh -c #(nop) ARG uid=1000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:34.884950047Z\", \"created_by\": \"/bin/sh -c #(nop) ARG gid=1000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:35.896784898Z\", \"created_by\": \"/bin/sh -c #(nop) ARG http_port=8080\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:36.871761947Z\", \"created_by\": \"/bin/sh -c #(nop) ARG agent_port=50000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:37.961637985Z\", \"created_by\": \"/bin/sh -c #(nop) ARG JENKINS_HOME=/var/jenkins_home\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:38.89036682Z\", \"created_by\": \"/bin/sh -c #(nop) ARG REF=/usr/share/jenkins/ref\", \"empty_layer\": true}, {\"created\": 
\"2020-03-08T02:08:39.868974806Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_HOME=/var/jenkins_home\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:40.877764659Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_SLAVE_AGENT_PORT=50000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:41.871871056Z\", \"created_by\": \"/bin/sh -c #(nop) ENV REF=/usr/share/jenkins/ref\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:44.011833083Z\", \"created_by\": \"|6 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c mkdir -p $JENKINS_HOME && chown ${uid}:${gid} $JENKINS_HOME && groupadd -g ${gid} ${group} && useradd -d \\\"$JENKINS_HOME\\\" -u ${uid} -g ${gid} -m -s /bin/bash ${user}\"}, {\"created\": \"2020-03-08T02:08:44.942422372Z\", \"created_by\": \"/bin/sh -c #(nop) VOLUME [/var/jenkins_home]\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:46.818223402Z\", \"created_by\": \"|6 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c mkdir -p ${REF}/init.groovy.d\"}, {\"created\": \"2020-03-08T02:08:47.950453499Z\", \"created_by\": \"/bin/sh -c #(nop) ARG TINI_VERSION=v0.16.1\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:49.000333745Z\", \"created_by\": \"/bin/sh -c #(nop) COPY file:653491cb486e752a4c2b4b407a46ec75646a54eabb597634b25c7c2b82a31424 in /var/jenkins_home/tini_pub.gpg \"}, {\"created\": \"2020-03-08T02:08:52.418862763Z\", \"created_by\": \"|7 TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c curl -fsSL https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static-$(dpkg --print-architecture) -o /sbin/tini && curl -fsSL https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static-$(dpkg --print-architecture).asc -o /sbin/tini.asc && gpg --no-tty --import ${JENKINS_HOME}/tini_pub.gpg && gpg --verify /sbin/tini.asc && rm -rf /sbin/tini.asc 
/root/.gnupg && chmod +x /sbin/tini\"}, {\"created\": \"2020-03-08T02:08:53.39134768Z\", \"created_by\": \"/bin/sh -c #(nop) ARG JENKINS_VERSION\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:54.341276057Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_VERSION=2.204.5\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:55.299951051Z\", \"created_by\": \"/bin/sh -c #(nop) ARG JENKINS_SHA=33a6c3161cf8de9c8729fd83914d781319fd1569acf487c7b1121681dba190a5\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:08:56.24780124Z\", \"created_by\": \"/bin/sh -c #(nop) ARG JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:00.907352083Z\", \"created_by\": \"|9 JENKINS_SHA=94c73fa5b72e0a4eb52c5c99c08351f85a51d138f3dbaff6f64e4406353f839c JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c curl -fsSL ${JENKINS_URL} -o /usr/share/jenkins/jenkins.war && echo \\\"${JENKINS_SHA} /usr/share/jenkins/jenkins.war\\\" | sha256sum -c -\"}, {\"created\": \"2020-03-08T02:09:03.539294194Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_UC=https://updates.jenkins.io\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:04.752513022Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_UC_EXPERIMENTAL=https://updates.jenkins.io/experimental\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:05.949731202Z\", \"created_by\": \"/bin/sh -c #(nop) ENV JENKINS_INCREMENTALS_REPO_MIRROR=https://repo.jenkins-ci.org/incrementals\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:07.774563715Z\", \"created_by\": \"|9 JENKINS_SHA=94c73fa5b72e0a4eb52c5c99c08351f85a51d138f3dbaff6f64e4406353f839c 
JENKINS_URL=https://repo.jenkins-ci.org/public/org/jenkins-ci/main/jenkins-war/2.204.5/jenkins-war-2.204.5.war TINI_VERSION=v0.16.1 agent_port=50000 gid=1000 group=jenkins http_port=8080 uid=1000 user=jenkins /bin/sh -c chown -R ${user} \\\"$JENKINS_HOME\\\" \\\"$REF\\\"\"}, {\"created\": \"2020-03-08T02:09:08.967515128Z\", \"created_by\": \"/bin/sh -c #(nop) EXPOSE 8080\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:09.960137695Z\", \"created_by\": \"/bin/sh -c #(nop) EXPOSE 50000\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:10.94919503Z\", \"created_by\": \"/bin/sh -c #(nop) ENV COPY_REFERENCE_FILE_LOG=/var/jenkins_home/copy_reference_file.log\", \"empty_layer\": true}, {\"created\": \"2020-03-08T02:09:12.036159753Z\", \"created_by\": \"/bin/sh -c #(nop) USER jenkins\", \"empty_layer\": true}, {\"created\": \"2020-0Traceback (most recent call last):\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py\", line 102, in <module>\r\n _ansiballz_main()\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py\", line 94, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File \"/home/fedora/.ansible/tmp/ansible-tmp-1589272265.3273017-48299-39753569833961/AnsiballZ_podman_image.py\", line 40, in invoke_module\r\n runpy.run_module(mod_name='ansible.modules.cloud.podman.podman_image', init_globals=None, run_name='__main__', alter_sys=True)\r\n File \"/usr/lib64/python3.8/runpy.py\", line 206, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File \"/usr/lib64/python3.8/runpy.py\", line 96, in _run_module_code\r\n _run_code(code, mod_globals, init_globals,\r\n File \"/usr/lib64/python3.8/runpy.py\", line 86, in _run_code\r\n exec(code, run_globals)\r\n File 
\"/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/modules/cloud/podman/podman_image.py\", line 725, in <module>\r\n File \"/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/modules/cloud/podman/podman_image.py\", line 721, in main\r\n File \"/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/module_utils/basic.py\", line 2071, in exit_json\r\n File \"/tmp/ansible_podman_image_payload_yta_pzsw/ansible_podman_image_payload.zip/ansible/module_utils/basic.py\", line 2065, in _return_formatted\r\nBlockingIOError: [Errno 11] write could not complete without blocking\r\n",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
Additional environment details (AWS, VirtualBox, physical, etc.):
Target host is a KVM instance in OpenStack.
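The BlockingIOError in the traceback above is what Python raises when a write to a non-blocking file descriptor cannot complete because the pipe buffer is full, which can happen when a module returns a very large JSON result. A minimal sketch of the mechanism (not the module's own code) is:

```python
import fcntl
import os

# Create a pipe and put its write end into non-blocking mode,
# mimicking a stdout fd that was left with O_NONBLOCK set.
read_fd, write_fd = os.pipe()
flags = fcntl.fcntl(write_fd, fcntl.F_GETFL)
fcntl.fcntl(write_fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)

# A large payload, standing in for a huge module result like the
# podman image history dump above.
payload = b"x" * (1024 * 1024)
written = 0
try:
    while written < len(payload):
        # os.write returns the number of bytes actually written; once the
        # pipe buffer fills, it raises BlockingIOError (EAGAIN) instead.
        written += os.write(write_fd, payload[written:])
except BlockingIOError as exc:
    print("write could not complete without blocking, errno:", exc.errno)
finally:
    os.close(write_fd)
    os.close(read_fd)
```

Nothing reads from the pipe here, so the write is guaranteed to hit the full-buffer condition.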
This collection will be included in Ansible 2.10 because it contains modules and/or plugins that were included in Ansible 2.9. Please review:
The latest version of the collection available on August 18 will be included in Ansible 2.10.0 (possibly excepting newer versions that differ only in the patch level; for details, see the roadmap). Please release version 1.0.0 of your collection by this date! If 1.0.0 does not exist, the same 0.x.y version will be used in all of Ansible 2.10 without updates, and your 1.x.y release will not be included until Ansible 2.11 (unless you request an exception at a community working group meeting and go through a demanding manual process to vouch for backwards compatibility; you want to avoid this!).
Your collection versioning must follow all semver rules.
Your collection should provide data for the Ansible 2.10 changelog and porting guide. The changelog and porting guide are automatically generated from ansible-base and from the changelogs of the included collections. All changes from the breaking_changes, major_changes, removed_features, and deprecated_features sections will appear in both the changelog and the porting guide. One option for providing changelog fragments is a changelogs/changelog.yaml file inside your collection (see the documentation of the changelogs/changelog.yaml format). If you cannot contribute to the integrated Ansible changelog using one of these methods, please provide a link to your collection's changelog by creating an issue in https://github.com/ansible-community/ansible-build-data/. If you do not provide changelogs/changelog.yaml or a link, users will not be able to find out what changed in your collection from the Ansible changelog and porting guide.
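As an illustration, a minimal changelogs/changelog.yaml in the format used by the Ansible changelog tooling might look like the following sketch; the release number, date, and change texts are placeholders, not real entries from this collection:

```yaml
ancestor: null
releases:
  1.0.0:
    release_date: '2020-08-18'
    changes:
      release_summary: Placeholder summary for the 1.0.0 release.
      major_changes:
        - Placeholder text describing a major change.
      breaking_changes:
        - Placeholder text describing a breaking change.
```

Entries under breaking_changes and deprecated_features are the ones that feed the porting guide as well as the changelog.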
Run ansible-test sanity --docker -v in the collection with the latest ansible-base or a stable-2.10 ansible/ansible checkout.
Be sure you're subscribed to:
If you have questions or want to provide feedback, please see the Feedback section in the collection requirements.
(Internal link to keep track of issues: ansible-collections/overview#102)
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
Idempotency is broken once again, similar to #31, but this time for the network and security_opt (apparmor) parameters.
Steps to reproduce the issue:
- name: Install Graphite container
  containers.podman.podman_container:
    image: graphiteapp/graphite-statsd:latest
    name: graphite
    state: present
Run it
Run it again
Describe the results you received:
It recreates the container. Using -D one can see why:
TASK [graphite : Install Graphite container] *************************************************************************************************************************************************
--- before
+++ after
@@ -1,2 +1,2 @@
-network - ['bridge']
-security_opt - ['apparmor=containers-default-0.14.3']
+network - ['slirp4netns']
+security_opt - []
Describe the results you expected:
No rebuild of the container.
Additional information you deem important (e.g. issue happens only occasionally):
I can reproduce this for all of my containers.
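One way such a diff could avoid flagging podman's own defaults as user changes is to normalize the inspected values before comparing. This is a hypothetical sketch, not the collection's actual code; the function names and default sets are assumptions for illustration:

```python
# Values podman may report for an *unset* network parameter: 'bridge' for
# rootful containers, 'slirp4netns' for rootless ones.
NETWORK_DEFAULTS = {"bridge", "slirp4netns", "default"}


def diff_network(desired, actual):
    """Return True only if the user-requested network really differs."""
    if desired is None:
        # User never set 'network': either default counts as unchanged.
        return not set(actual) <= NETWORK_DEFAULTS
    return sorted(desired) != sorted(actual)


def diff_security_opt(desired, actual):
    """Return True only if user-specified security options differ."""
    # Auto-applied apparmor profiles such as
    # 'apparmor=containers-default-0.14.3' are added by podman itself,
    # not by the playbook author, so they are filtered out before diffing.
    effective = [o for o in actual if not o.startswith("apparmor=containers-default")]
    return sorted(desired or []) != sorted(effective)


print(diff_network(None, ["slirp4netns"]))                        # unchanged
print(diff_security_opt([], ["apparmor=containers-default-0.14.3"]))  # unchanged
```

With such a normalization, the before/after diff shown above ('bridge' vs 'slirp4netns', apparmor profile vs empty list) would not trigger a recreate.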
Output of ansible --version:
ansible 2.9.10
config file = /home/jrsr/repos/audriga-infra/ansible.cfg
configured module search path = ['/home/jrsr/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.8.3 (default, May 17 2020, 18:15:42) [GCC 10.1.0]
Output of podman version:
Version: 2.0.1
API Version: 1
Go Version: go1.13.8
Built: Thu Jan 1 00:00:00 1970
OS/Arch: linux/amd64
Output of podman info --debug:
host:
arch: amd64
buildahVersion: 1.15.0
cgroupVersion: v1
conmon:
package: 'conmon: /usr/libexec/podman/conmon'
path: /usr/libexec/podman/conmon
version: 'conmon version 2.0.18, commit: '
cpus: 2
distribution:
distribution: ubuntu
version: "20.04"
eventLogger: file
hostname: logging2
idMappings:
gidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
uidmap:
- container_id: 0
host_id: 1000
size: 1
- container_id: 1
host_id: 100000
size: 65536
kernel: 5.4.0-1017-aws
linkmode: dynamic
memFree: 238243840
memTotal: 4064804864
ociRuntime:
name: runc
package: 'runc: /usr/sbin/runc'
path: /usr/sbin/runc
version: 'runc version spec: 1.0.1-dev'
os: linux
remoteSocket:
path: /run/user/1000/podman/podman.sock
rootless: true
slirp4netns:
executable: /usr/bin/slirp4netns
package: 'slirp4netns: /usr/bin/slirp4netns'
version: |-
slirp4netns version 1.0.0
commit: unknown
libslirp: 4.2.0
swapFree: 0
swapTotal: 0
uptime: 29h 8m 32.8s (Approximately 1.21 days)
registries:
search:
- docker.io
- quay.io
store:
configFile: /home/ubuntu/.config/containers/storage.conf
containerStore:
number: 0
paused: 0
running: 0
stopped: 0
graphDriverName: vfs
graphOptions: {}
graphRoot: /home/ubuntu/.local/share/containers/storage
graphStatus: {}
imageStore:
number: 0
runRoot: /run/user/1000/containers
volumePath: /home/ubuntu/.local/share/containers/storage/volumes
version:
APIVersion: 1
Built: 0
BuiltTime: Thu Jan 1 00:00:00 1970
GitCommit: ""
GoVersion: go1.13.8
OsArch: linux/amd64
Version: 2.0.1
Package info (e.g. output of rpm -q podman or apt list podman):
Listing... Done
podman/unknown,now 2.0.1~1 amd64 [installed]
podman/unknown 2.0.1~1 arm64
podman/unknown 2.0.1~1 armhf
podman/unknown 2.0.1~1 s390x
Playbook you run with ansible (e.g. content of playbook.yaml):
See above.
Command line and output of ansible run with high verbosity:
[user@sadatoni ~/repos/playbook]$ ansible-playbook -CDv site.yml --start-at-task 'Install Graphite container'
Using /home/user/repos/playbook/ansible.cfg as config file
PLAY [all] ***********************************************************************************************************************************************************************************
PLAY [graphite_servers] **********************************************************************************************************************************************************************
TASK [Gathering Facts] ***********************************************************************************************************************************************************************
ok: [graphite]
TASK [graphite : Install Graphite container] ***********************************************************************************************************************************************
--- before
+++ after
@@ -1,2 +1,2 @@
-network - ['bridge']
-security_opt - ['apparmor=containers-default-0.14.3']
+network - ['slirp4netns']
+security_opt - []
changed: [graphite] => {"actions": ["recreated graphite"], "changed": true, "container": {"AppArmorProfile": "containers-default-0.14.3", "Args": ["/entrypoint"], "BoundingCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "Config": {"Annotations": {"io.container.manager": "libpod", "io.kubernetes.cri-o.Created": "2020-06-30T16:01:23.505777248Z", "io.kubernetes.cri-o.TTY": "false", "io.podman.annotations.apparmor": "containers-default-0.14.3", "io.podman.annotations.autoremove": "FALSE", "io.podman.annotations.init": "FALSE", "io.podman.annotations.privileged": "FALSE", "io.podman.annotations.publish-all": "FALSE", "org.opencontainers.image.stopSignal": "1"}, "AttachStderr": false, "AttachStdin": false, "AttachStdout": false, "Cmd": null, "CreateCommand": ["podman", "container", "run", "--name", "graphite", "--detach=True", "graphiteapp/graphite-statsd:latest"], "Domainname": "", "Entrypoint": "/entrypoint", "Env": ["PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "TERM=xterm", "container=podman", "STATSD_INTERFACE=udp", "HOSTNAME=739a12254a62", "HOME=/root"], "Hostname": "739a12254a62", "Image": "docker.io/graphiteapp/graphite-statsd:latest", "Labels": {"maintainer": "Denys Zhdanov <[email protected]>"}, "OnBuild": null, "OpenStdin": false, "StdinOnce": false, "StopSignal": 1, "Tty": false, "User": "", "Volumes": null, "WorkingDir": "/"}, "ConmonPidFile": "/var/run/containers/storage/overlay-containers/739a12254a621d57bbb5bd75b998520360038e791e2cb032347cb22d30ed2904/userdata/conmon.pid", "Created": "2020-06-30T16:01:23.505777248Z", "Dependencies": [], "Driver": "overlay", "EffectiveCaps": ["CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD", "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP", "CAP_SETGID", 
"CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT"], "ExecIDs": [], "ExitCommand": ["/usr/bin/podman", "--root", "/var/lib/containers/storage", "--runroot", "/var/run/containers/storage", "--log-level", "error", "--cgroup-manager", "systemd", "--tmpdir", "/var/run/libpod", "--runtime", "runc", "--events-backend", "file", "container", "cleanup", "739a12254a621d57bbb5bd75b998520360038e791e2cb032347cb22d30ed2904"], "GraphDriver": {"Data": {"LowerDir": "/var/lib/containers/storage/overlay/b781a05c1a84f7b2cd6dcc42896769c3da91a99fcceae4aec7613a1b544c7005/diff:/var/lib/containers/storage/overlay/f22f13de64123bbdfa2430eae9c7aeb1cffef91d94864a1630b8212ebaab59d2/diff:/var/lib/containers/storage/overlay/ebefb31b3c26b858d05e029544d9c1ee02e5c2cc9db62c9f2ce7f8ab3ae3249f/diff:/var/lib/containers/storage/overlay/50644c29ef5a27c9a40c393a73ece2479de78325cae7d762ef3cdc19bf42dd0a/diff", "MergedDir": "/var/lib/containers/storage/overlay/dce06662720e3594434aac2bce80c510dee87da3c82dd4d9c3aee542c80d9598/merged", "UpperDir": "/var/lib/containers/storage/overlay/dce06662720e3594434aac2bce80c510dee87da3c82dd4d9c3aee542c80d9598/diff", "WorkDir": "/var/lib/containers/storage/overlay/dce06662720e3594434aac2bce80c510dee87da3c82dd4d9c3aee542c80d9598/work"}, "Name": "overlay"}, "HostConfig": {"AutoRemove": false, "Binds": ["5987290181aa734301f34f1e694477d3a23aa3c77e4936080c8c8192586a6689:/opt/graphite/conf:rprivate,rw,nodev,exec,nosuid,rbind", "e485b6e08535d63e5e8fe2216bc2d453df4db622535757f22798d7566b478fb9:/opt/graphite/storage:rprivate,rw,nodev,exec,nosuid,rbind", "3a1b6c019e9dcb99144aebffc5bbf0f0c2d04daa03230dc2915ccee6c45ec2b7:/opt/graphite/webapp/graphite/functions/custom:rprivate,rw,nodev,exec,nosuid,rbind", "53360f01492f4e14c6d9b761b4c114d7c67aec48b0586027f778dc20f3e27c7f:/opt/statsd/config:rprivate,rw,nodev,exec,nosuid,rbind", "5fcf5f6d425d376fa1480344df19bb2765feef99081dc04091edca7b328450f0:/var/lib/redis:rprivate,rw,nodev,exec,nosuid,rbind", 
"1d479d50f63cdf5fdd1f79bb4f8e3dee08cc2451220a64be685da9ffe2825259:/var/log:rprivate,rw,nodev,exec,nosuid,rbind", "0dffc63dc686a09c3b265dcc5b45f033021437d66cd72bb31418d164cc38dd50:/etc/logrotate.d:rprivate,rw,nodev,exec,nosuid,rbind", "8d10c8e8b6e483efb792d141c52fe2cb9ca5b3d69aa4189500687f5997717000:/etc/nginx:rprivate,rw,nodev,exec,nosuid,rbind"], "BlkioDeviceReadBps": null, "BlkioDeviceReadIOps": null, "BlkioDeviceWriteBps": null, "BlkioDeviceWriteIOps": null, "BlkioWeight": 0, "BlkioWeightDevice": null, "CapAdd": [], "CapDrop": [], "Cgroup": "", "CgroupMode": "host", "CgroupParent": "", "Cgroups": "default", "ConsoleSize": [0, 0], "ContainerIDFile": "", "CpuCount": 0, "CpuPercent": 0, "CpuPeriod": 0, "CpuQuota": 0, "CpuRealtimePeriod": 0, "CpuRealtimeRuntime": 0, "CpuShares": 0, "CpusetCpus": "", "CpusetMems": "", "Devices": [], "DiskQuota": 0, "Dns": [], "DnsOptions": [], "DnsSearch": [], "ExtraHosts": [], "GroupAdd": [], "IOMaximumBandwidth": 0, "IOMaximumIOps": 0, "IpcMode": "private", "Isolation": "", "KernelMemory": 0, "Links": null, "LogConfig": {"Config": null, "Type": "k8s-file"}, "Memory": 0, "MemoryReservation": 0, "MemorySwap": 0, "MemorySwappiness": 0, "NanoCpus": 0, "NetworkMode": "bridge", "OomKillDisable": false, "OomScoreAdj": 0, "PidMode": "private", "PidsLimit": 4096, "PortBindings": {}, "Privileged": false, "PublishAllPorts": false, "ReadonlyRootfs": false, "RestartPolicy": {"MaximumRetryCount": 0, "Name": ""}, "Runtime": "oci", "SecurityOpt": ["apparmor=containers-default-0.14.3"], "ShmSize": 65536000, "Tmpfs": {}, "UTSMode": "private", "Ulimits": [{"Hard": 1048576, "Name": "RLIMIT_NOFILE", "Soft": 1048576}, {"Hard": 4194304, "Name": "RLIMIT_NPROC", "Soft": 4194304}], "UsernsMode": "", "VolumeDriver": "", "VolumesFrom": null}, "HostnamePath": "/var/run/containers/storage/overlay-containers/739a12254a621d57bbb5bd75b998520360038e791e2cb032347cb22d30ed2904/userdata/hostname", "HostsPath": 
"/var/run/containers/storage/overlay-containers/739a12254a621d57bbb5bd75b998520360038e791e2cb032347cb22d30ed2904/userdata/hosts", "Id": "739a12254a621d57bbb5bd75b998520360038e791e2cb032347cb22d30ed2904", "Image": "e6d5ceada381ef60598b9b1a4615a7820f7ab068229a13e39c0936ad6be50394", "ImageName": "docker.io/graphiteapp/graphite-statsd:latest", "IsInfra": false, "LogPath": "/var/lib/containers/storage/overlay-containers/739a12254a621d57bbb5bd75b998520360038e791e2cb032347cb22d30ed2904/userdata/ctr.log", "LogTag": "", "MountLabel": "", "Mounts": [{"Destination": "/opt/graphite/conf", "Driver": "local", "Mode": "", "Name": "5987290181aa734301f34f1e694477d3a23aa3c77e4936080c8c8192586a6689", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/5987290181aa734301f34f1e694477d3a23aa3c77e4936080c8c8192586a6689/_data", "Type": "volume"}, {"Destination": "/opt/graphite/storage", "Driver": "local", "Mode": "", "Name": "e485b6e08535d63e5e8fe2216bc2d453df4db622535757f22798d7566b478fb9", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/e485b6e08535d63e5e8fe2216bc2d453df4db622535757f22798d7566b478fb9/_data", "Type": "volume"}, {"Destination": "/opt/graphite/webapp/graphite/functions/custom", "Driver": "local", "Mode": "", "Name": "3a1b6c019e9dcb99144aebffc5bbf0f0c2d04daa03230dc2915ccee6c45ec2b7", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/3a1b6c019e9dcb99144aebffc5bbf0f0c2d04daa03230dc2915ccee6c45ec2b7/_data", "Type": "volume"}, {"Destination": "/opt/statsd/config", "Driver": "local", "Mode": "", "Name": "53360f01492f4e14c6d9b761b4c114d7c67aec48b0586027f778dc20f3e27c7f", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": 
"/var/lib/containers/storage/volumes/53360f01492f4e14c6d9b761b4c114d7c67aec48b0586027f778dc20f3e27c7f/_data", "Type": "volume"}, {"Destination": "/var/lib/redis", "Driver": "local", "Mode": "", "Name": "5fcf5f6d425d376fa1480344df19bb2765feef99081dc04091edca7b328450f0", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/5fcf5f6d425d376fa1480344df19bb2765feef99081dc04091edca7b328450f0/_data", "Type": "volume"}, {"Destination": "/var/log", "Driver": "local", "Mode": "", "Name": "1d479d50f63cdf5fdd1f79bb4f8e3dee08cc2451220a64be685da9ffe2825259", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/1d479d50f63cdf5fdd1f79bb4f8e3dee08cc2451220a64be685da9ffe2825259/_data", "Type": "volume"}, {"Destination": "/etc/logrotate.d", "Driver": "local", "Mode": "", "Name": "0dffc63dc686a09c3b265dcc5b45f033021437d66cd72bb31418d164cc38dd50", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/0dffc63dc686a09c3b265dcc5b45f033021437d66cd72bb31418d164cc38dd50/_data", "Type": "volume"}, {"Destination": "/etc/nginx", "Driver": "local", "Mode": "", "Name": "8d10c8e8b6e483efb792d141c52fe2cb9ca5b3d69aa4189500687f5997717000", "Options": ["nodev", "exec", "nosuid", "rbind"], "Propagation": "rprivate", "RW": true, "Source": "/var/lib/containers/storage/volumes/8d10c8e8b6e483efb792d141c52fe2cb9ca5b3d69aa4189500687f5997717000/_data", "Type": "volume"}], "Name": "graphite", "Namespace": "", "NetworkSettings": {"Bridge": "", "EndpointID": "", "Gateway": "10.88.0.1", "GlobalIPv6Address": "", "GlobalIPv6PrefixLen": 0, "HairpinMode": false, "IPAddress": "10.88.0.45", "IPPrefixLen": 16, "IPv6Gateway": "", "LinkLocalIPv6Address": "", "LinkLocalIPv6PrefixLen": 0, "MacAddress": "b2:38:76:b3:3b:5f", "Ports": {}, "SandboxID": "", "SandboxKey": 
"/var/run/netns/cni-be71ac12-583a-3a11-2cc6-800ad6b3aeb6"}, "OCIConfigPath": "/var/lib/containers/storage/overlay-containers/739a12254a621d57bbb5bd75b998520360038e791e2cb032347cb22d30ed2904/userdata/config.json", "OCIRuntime": "runc", "Path": "/entrypoint", "Pod": "", "ProcessLabel": "", "ResolvConfPath": "/var/run/containers/storage/overlay-containers/739a12254a621d57bbb5bd75b998520360038e791e2cb032347cb22d30ed2904/userdata/resolv.conf", "RestartCount": 0, "Rootfs": "", "State": {"ConmonPid": 2366355, "Dead": false, "Error": "", "ExitCode": 0, "FinishedAt": "0001-01-01T00:00:00Z", "Healthcheck": {"FailingStreak": 0, "Log": null, "Status": ""}, "OOMKilled": false, "OciVersion": "1.0.2-dev", "Paused": false, "Pid": 2366378, "Restarting": false, "Running": true, "StartedAt": "2020-06-30T16:01:24.187601085Z", "Status": "running"}, "StaticDir": "/var/lib/containers/storage/overlay-containers/739a12254a621d57bbb5bd75b998520360038e791e2cb032347cb22d30ed2904/userdata"}, "podman_actions": ["podman rm -f graphite", "podman run --name graphite --detach=True graphiteapp/graphite-statsd:latest"], "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
Additional environment details (AWS, VirtualBox, physical, etc.):
Ubuntu 20.04 on AWS
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
If you provide volume paths with double or trailing slashes, volume idempotency does not work.
Steps to reproduce the issue:
- name: Install Graphite container
  containers.podman.podman_container:
    image: "graphiteapp/graphite-statsd"
    name: graphite
    volumes:
      - /tmp/test/:/data
    state: present

- name: Install Graphite container
  containers.podman.podman_container:
    image: "graphiteapp/graphite-statsd"
    name: graphite
    volumes:
      - /tmp//test/:/data/
    state: present
Describe the results you received:
Container is recreated
Describe the results you expected:
Container is not recreated.
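The mismatch could be avoided by normalizing both sides of each mount before comparison. A sketch of the idea (normalize_mount is a hypothetical helper, not the module's code): os.path.normpath collapses double slashes and strips trailing ones, so /tmp//test/ and /tmp/test compare equal.

```python
import os


def normalize_mount(mount):
    """Normalize 'src:dst[:opts]' so equivalent paths compare equal."""
    parts = mount.split(":")
    # Normalize only the two path components; leave any mount options intact.
    parts[:2] = [os.path.normpath(p) for p in parts[:2]]
    return ":".join(parts)


print(normalize_mount("/tmp//test/:/data/"))  # -> /tmp/test:/data
print(normalize_mount("/tmp/test/:/data"))    # -> /tmp/test:/data
```

Both task variants above then produce the same normalized volume spec, so the container would not be recreated.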
Additional information you deem important (e.g. issue happens only occasionally):
Additional environment details (AWS, VirtualBox, physical, etc.):
The "Big Migration" has now taken place.
As this collection already exists, we need to carefully check to see if any further commits went into devel since this repo was created.
Please check the contents of https://github.com/ansible-collection-migration/containers.podman against this repo
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind bug
Description
PodmanContainerDiff.diffparam_network() is diff'ing HostConfig.NetworkMode against the network module parameter. When a network name is passed in the network parameter and the container already exists, HostConfig.NetworkMode can/will have a value of bridge. This breaks idempotency, since 'bridge' != 'network-name'.
If the existing container isn't running, the network name isn't included in the podman container inspect JSON. If the existing container is running, the network name is a key in NetworkSettings.Networks.
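A possible fix along the lines described above would be to read the network list from NetworkSettings.Networks when it is populated and only fall back to HostConfig.NetworkMode otherwise. This is a hedged sketch with a hypothetical helper, not the collection's implementation:

```python
def effective_networks(inspect_data):
    """Derive the container's network list from 'podman container inspect' data.

    Prefers NetworkSettings.Networks (populated for running containers)
    over HostConfig.NetworkMode, which only reports the generic driver
    name 'bridge' rather than the named network.
    """
    networks = inspect_data.get("NetworkSettings", {}).get("Networks") or {}
    if networks:
        return sorted(networks)  # keys are the named networks, e.g. 'pihole'
    mode = inspect_data.get("HostConfig", {}).get("NetworkMode", "")
    return [mode] if mode else []


running = {
    "HostConfig": {"NetworkMode": "bridge"},
    "NetworkSettings": {"Networks": {"pihole": {}}},
}
stopped = {"HostConfig": {"NetworkMode": "bridge"}, "NetworkSettings": {}}

print(effective_networks(running))  # -> ['pihole']
print(effective_networks(stopped))  # -> ['bridge']
```

As the report notes, a stopped container's inspect output omits the network name entirely, so the fallback to NetworkMode cannot fully restore idempotency in that case.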
Steps to reproduce the issue:
Create a podman network
Run task to create a container with podman_container
specifying the podman network name
Re-run task
Describe the results you received:
Container is recreated.
The container diff (PodmanContainer.diff) is:
{
  "before": {
    "network": ["bridge"]
  },
  "after": {
    "network": ["pihole"]
  }
}
Describe the results you expected:
Container is left unchanged
Additional information you deem important (e.g. issue happens only occasionally):
Issue happens every run of task
Output of ansible --version:
ansible 2.9.10
config file = /Users/derelam/.ansible.cfg
configured module search path = ['/Users/derelam/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible
executable location = /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/bin/ansible
python version = 3.8.3 (default, May 15 2020, 14:39:37) [Clang 11.0.3 (clang-1103.0.32.59)]
Output of podman version:
Version: 2.0.1
API Version: 1
Go Version: go1.14.3
Built: Wed Dec 31 18:00:00 1969
OS/Arch: linux/amd64
Output of podman info --debug:
host:
arch: amd64
buildahVersion: 1.15.0
cgroupVersion: v2
conmon:
package: conmon-2.0.18-1.fc32.x86_64
path: /usr/bin/conmon
version: 'conmon version 2.0.18, commit: 6e8799f576f11f902cd8a8d8b45b2b2caf636a85'
cpus: 2
distribution:
distribution: fedora
version: "32"
eventLogger: file
hostname: h-msn-smbdc-05.dom.creof.com
idMappings:
gidmap: null
uidmap: null
kernel: 5.7.7-200.fc32.x86_64
linkmode: dynamic
memFree: 82173952
memTotal: 2046242816
ociRuntime:
name: crun
package: crun-0.14-2.fc32.x86_64
path: /usr/bin/crun
version: |-
crun version 0.14
commit: ebc56fc9bcce4b3208bb0079636c80545122bf58
spec: 1.0.0
+SYSTEMD +SELINUX +APPARMOR +CAP +SECCOMP +EBPF +YAJL
os: linux
remoteSocket:
exists: true
path: /run/podman/podman.sock
rootless: false
slirp4netns:
executable: ""
package: ""
version: ""
swapFree: 2139308032
swapTotal: 2147479552
uptime: 19h 2m 33.86s (Approximately 0.79 days)
registries:
search:
- registry.fedoraproject.org
- registry.access.redhat.com
- registry.centos.org
- docker.io
store:
configFile: /etc/containers/storage.conf
containerStore:
number: 1
paused: 0
running: 1
stopped: 0
graphDriverName: overlay
graphOptions:
overlay.mountopt: nodev,metacopy=on
graphRoot: /var/lib/containers/storage
graphStatus:
Backing Filesystem: xfs
Native Overlay Diff: "false"
Supports d_type: "true"
Using metacopy: "true"
imageStore:
number: 1
runRoot: /var/run/containers/storage
volumePath: /var/lib/containers/storage/volumes
version:
APIVersion: 1
Built: 0
BuiltTime: Wed Dec 31 18:00:00 1969
GitCommit: ""
GoVersion: go1.14.3
OsArch: linux/amd64
Version: 2.0.1
Package info (e.g. output of rpm -q podman or apt list podman):
podman-2.0.1-1.fc32.x86_64
Playbook you run with ansible (e.g. content of playbook.yaml):
---
- hosts: all
  become: true
  tasks:
    - name: create mount dirs
      file:
        path: '{{ item }}'
        state: directory
      loop:
        - /etc/pihole
        - /etc/pihole/dnsmasq.d
        - /var/log/pihole
    - name: create network
      shell: |
        set -eo pipefail
        inspect=$(podman network inspect pihole 2>/dev/null || true)
        result=$(echo "$inspect" | python3 -c "
        import json as j, sys as s
        try:
            c = j.load(s.stdin)
        except j.JSONDecodeError:
            action = 'create'
        else:
            r = next(plugin['ipam']['ranges'] for plugin in c[0]['plugins'] if plugin['type'] == 'bridge')
            if not (len(r) == 1 and len(r[0]) == 1 and r[0][0]['subnet'] == '10.254.0.0/24'):
                action = 'delete'
            else:
                action = 'ok'
        print(action)
        ")
        if [ "$result" == 'ok' ]; then
          result=false
        else
          if [ "$result" == 'delete' ]; then
            podman network rm -f pihole
          fi
          result=$(podman network create --subnet=10.254.0.0/24 --disable-dns pihole)
          if [ $? -eq 0 ]; then
            result=true
          fi
        fi
        echo $result
      register: create_network
      changed_when: create_network.stdout.strip() == 'true'
      failed_when: create_network.stdout.strip() not in ['true', 'false']
    - name: create container
      containers.podman.podman_container:
        name: pihole
        state: present
        image: docker.io/pihole/pihole:latest
        dns:
          - 127.0.0.1
          - 1.1.1.1
        network: pihole
        ip: 10.254.0.5
        publish:
          - '{{ ansible_default_ipv4.address }}:53:53/tcp'
          - '{{ ansible_default_ipv4.address }}:53:53/udp'
          - 127.0.0.1:53:53/tcp
          - 127.0.0.1:53:53/udp
        hostname: host.pihole.dom.local
        privileged: true
        env:
          TZ: America/Chicago
          VIRTUAL_HOST: host.dom.local
          DNS1: 1.1.1.1
          DNS2: 1.0.0.1
          IPv6: false
          DNSSEC: true
          DNS_BOGUS_PRIV: false
        volume:
          - /etc/pihole:/etc/pihole
          - /etc/pihole/dnsmasq.d:/etc/dnsmasq.d
          - /var/log/pihole:/var/log
        healthcheck: dig +norecurse +retry=0 @{{ ansible_default_ipv4.address }} host.pihole.dom.local || exit 1
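For reference, the decision logic that the "create network" task embeds as inline Python can be sketched as a standalone function. This is a minimal sketch, assuming the JSON shape that `podman network inspect` emits for CNI bridge networks in Podman 2.x; the `network_action` name and the hardcoded subnet are illustrative, not part of any Podman or Ansible API:

```python
import json

DESIRED_SUBNET = "10.254.0.0/24"  # subnet the playbook above expects

def network_action(inspect_output: str) -> str:
    """Decide how to reconcile the network: 'create' if it does not
    exist (inspect printed nothing), 'delete' if its subnet differs
    from the desired one, 'ok' if it already matches."""
    try:
        config = json.loads(inspect_output)
    except json.JSONDecodeError:
        # podman network inspect failed or printed nothing: no network yet
        return "create"
    ranges = next(
        plugin["ipam"]["ranges"]
        for plugin in config[0]["plugins"]
        if plugin["type"] == "bridge"
    )
    # exactly one range with exactly one entry, and the subnet matches
    if len(ranges) == 1 and len(ranges[0]) == 1 \
            and ranges[0][0]["subnet"] == DESIRED_SUBNET:
        return "ok"
    return "delete"
```

With this decomposition the shell wrapper only has to map 'ok' to changed=false and the other two actions to a delete-then-create followed by changed=true, which is what the `changed_when`/`failed_when` pair in the task checks for.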
Command line and output of ansible run with high verbosity:
ansible-playbook 2.9.10
config file = /Users/derelam/.ansible.cfg
configured module search path = ['/Users/derelam/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible
executable location = /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/bin/ansible-playbook
python version = 3.8.3 (default, May 15 2020, 14:39:37) [Clang 11.0.3 (clang-1103.0.32.59)]
Using /Users/derelam/.ansible.cfg as config file
Reading vault password file: ~/.vault_passwd.squid
Reading vault password file: ~/.vault_passwd.mail-services
Reading vault password file: ~/.vault_passwd.audit-tools
Reading vault password file: ~/.vault_passwd.postgres
Reading vault password file: ~/.vault_passwd.rsa_prime
setting up inventory plugins
host_list declined parsing /Users/derelam/Development/samba-dc/hosts.yaml as it did not pass its verify_file() method
script declined parsing /Users/derelam/Development/samba-dc/hosts.yaml as it did not pass its verify_file() method
Parsed /Users/derelam/Development/samba-dc/hosts.yaml inventory source with yaml plugin
Loading callback plugin default of type stdout, v2.0 from /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/plugins/callback/default.py
PLAYBOOK: container.yaml ******************************************************************************************************************************************************************************************************************************************************************************************
Positional arguments: container.yaml
verbosity: 7
connection: smart
timeout: 20
become_method: sudo
tags: ('all',)
inventory: ('/Users/derelam/Development/samba-dc/hosts.yaml',)
subset: h-msn-smbdc-05
forks: 5
1 plays in container.yaml
PLAY [all] ********************************************************************************************************************************************************************************************************************************************************************************************************
TASK [Gathering Facts] ********************************************************************************************************************************************************************************************************************************************************************************************
task path: /Users/derelam/Development/samba-dc/container.yaml:2
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'echo ~admin && sleep 0'"'"''
<192.168.153.15> (0, b'/home/admin\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/admin/.ansible/tmp `"&& mkdir /home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782 && echo ansible-tmp-1594417501.571101-23160-225477768236782="` echo /home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782 `" ) && sleep 0'"'"''
<192.168.153.15> (0, b'ansible-tmp-1594417501.571101-23160-225477768236782=/home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/default_collectors.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/ansible_collector.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/namespace.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/basic.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/compat.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/netbsd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/base.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/local.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/netbsd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/selinux.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/linux.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/openbsd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/date_time.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/openbsd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/base.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/lsb.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/base.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/other/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/dragonfly.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/chroot.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/aix.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/pkg_mgr.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/openbsd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/service_mgr.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/iscsi.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/aix.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/dragonfly.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/freebsd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/dragonfly.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/freebsd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/apparmor.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/hurd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/linux.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/hurd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/freebsd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/fc_wwn.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/darwin.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/other/facter.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/darwin.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/env.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/other/ohai.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/fips.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/ssh_pub_keys.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/user.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/sunos.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/distribution.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/platform.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/sunos.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/python.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/sunos.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/dns.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/caps.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/system/cmdline.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/hpux.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/nvme.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/hpux.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/hpux.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/hardware/linux.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/netbsd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/virtual/sysctl.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/collector.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/timeout.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/six/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/utils.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/sysctl.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/_text.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/facts/network/generic_bsd.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/process.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/file.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/formatters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/sys_info.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_utils.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/distro/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/distro/_distro.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/parameters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_json_compat.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/selectors.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_collections_compat.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/validation.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/pycompat24.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/parsing/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/converters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/parsing/convert_bool.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/collections.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/_selectors2.py
Using module file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/modules/system/setup.py
<192.168.153.15> PUT /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmp4ydxdxkl TO /home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782/AnsiballZ_setup.py
<192.168.153.15> SSH: disable batch mode for sshpass: (-o)(BatchMode=no)
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set sftp_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 sftp -o BatchMode=no -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da '[192.168.153.15]'
<192.168.153.15> (0, b'sftp> put /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmp4ydxdxkl /home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782/AnsiballZ_setup.py\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug3: Sent message fd 3 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . 
-> /home/admin size 0\r\ndebug3: Looking up /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmp4ydxdxkl\r\ndebug3: Sent message fd 3 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782/AnsiballZ_setup.py\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 32768 bytes at 0\r\ndebug3: Sent message SSH2_FXP_WRITE I:5 O:32768 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:6 O:65536 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:7 O:98304 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:8 O:131072 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:9 O:163840 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:10 O:196608 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:11 O:229376 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:12 O:262144 S:2489\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 5 32768 bytes at 32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 6 32768 bytes at 65536\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 7 32768 bytes at 98304\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 8 32768 bytes at 131072\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 9 32768 bytes at 163840\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 10 32768 bytes at 196608\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 11 32768 bytes at 229376\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 12 2489 bytes at 262144\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'chmod u+x /home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782/ /home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782/AnsiballZ_setup.py && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da -tt 192.168.153.15 '/bin/sh -c '"'"'sudo -H -S -p "[sudo via ansible, key=qhdikpmjgwbmdsgebjztimqskrlnwndl] password:" -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-qhdikpmjgwbmdsgebjztimqskrlnwndl ; /usr/bin/python3 /home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782/AnsiballZ_setup.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<192.168.153.15> (0, b'\r\n\r\n{"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "32", "ansible_distribution_major_version": "32", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtAdJoTdfqKgVWBth1O9MgEcCz0gqY9KCZ18LGYh0MqqLI8SB88UbgBAsTltpTpW6/BtI4+BehyA4qemrIJE17CXs76k6qBliRUajstmCJQ/4eBRuYYhJnvsql/JbgzhFfNDHCDVGelXDIaiTv/ecXTbAaHjcJGz8eWaWUxEMhmXR53po6Re5W2QL5tkbPpDQ5T16/dU7iHY1Bhspn6X/+1ZYLJOigbkMvigyfoAO/0Wl2hl0df06RnyrVw2s36FnWcD3GMgGMtCGbpSjHMpOg3vvoQtON5QHNbWuLbPm6Raa7piTQ7RNNIRp53NaBd00yQCsPKFo/my+3wBo3k7ytErHHjpCbTK9UjbvHtYxPR5vjHh3K/hh2BOBEhciFiIcVOuCv0MbuMatE7ody4IHcKbPber9W8RRy1GiRzs6nTpm1hhSPIP6bj1lbHotOVxYsBYwstAl0YIDs46kUAFwvvuYuMZRIjHoZ1m53fcp/ys5DQBFLkDEBupi1TyYeZT0=", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBfMXtpju1b01wG/TosUfiQHssWjzPNNV9YqH6j7M+tLTeIu0jJv+bzu6LBdMn53t50GtvFd71+cMLyQMVWWPdQ=", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFHsgZf8CXsKEzJof7EGmEm8rMZ+NST5UnQqTH/Cf6D6", "ansible_system": "Linux", "ansible_kernel": "5.7.7-200.fc32.x86_64", "ansible_kernel_version": "#1 SMP Wed Jul 1 19:53:01 UTC 2020", "ansible_machine": "x86_64", "ansible_python_version": "3.8.3", "ansible_fqdn": "h-msn-smbdc-05.dom.creof.com", "ansible_hostname": "h-msn-smbdc-05", "ansible_nodename": "h-msn-smbdc-05.dom.creof.com", "ansible_domain": "dom.creof.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "cb3ee667658548e4b9b2ea7b67722b27", "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 32, "config_mode": "enforcing", "mode": 
"enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/vmlinuz-5.7.7-200.fc32.x86_64", "root": "UUID=fd2e8579-27af-4ebc-9c55-e7e027aa4f18", "ro": true, "resume": "UUID=22d2022e-ac08-47c3-b9d8-3cf2db69f7da", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/vmlinuz-5.7.7-200.fc32.x86_64", "root": "UUID=fd2e8579-27af-4ebc-9c55-e7e027aa4f18", "ro": true, "resume": "UUID=22d2022e-ac08-47c3-b9d8-3cf2db69f7da", "rhgb": true, "quiet": true}, "ansible_virtualization_role": "guest", "ansible_virtualization_type": "VMware", "ansible_local": {}, "ansible_fips": false, "ansible_dns": {"search": ["dom.creof.com"], "nameservers": ["192.168.153.16", "127.0.0.1"]}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Core(TM) i7-8750H CPU @ 2.20GHz", "1", "GenuineIntel", "Intel(R) Core(TM) i7-8750H CPU @ 2.20GHz"], "ansible_processor_count": 2, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 1, "ansible_processor_vcpus": 2, "ansible_memtotal_mb": 1951, "ansible_memfree_mb": 527, "ansible_swaptotal_mb": 2047, "ansible_swapfree_mb": 2038, "ansible_memory_mb": {"real": {"total": 1951, "used": 1424, "free": 527}, "nocache": {"free": 1246, "used": 705}, "swap": {"total": 2047, "free": 2038, "used": 9, "cached": 0}}, "ansible_bios_date": "05/15/2020", "ansible_bios_version": "VMW71.00V.16221537.B64.2005150253", "ansible_form_factor": "Other", "ansible_product_name": "VMware7,1", "ansible_product_serial": "VMware-56 4d 44 c5 cb 2b e0 d2-89 69 13 e7 4b 7c 20 42", "ansible_product_uuid": "c5444d56-2bcb-d2e0-8969-13e74b7c2042", "ansible_product_version": "None", "ansible_system_vendor": 
"VMware, Inc.", "ansible_devices": {"sr0": {"virtual": 1, "links": {"ids": ["ata-VMware_Virtual_IDE_CDROM_Drive_10000000000000000001"], "uuids": ["2020-04-22-22-29-16-00"], "labels": ["Fedora-S-dvd-x86_64-32"], "masters": []}, "vendor": "NECVMWar", "model": "VMware IDE CDR10", "sas_address": null, "sas_device_handle": null, "removable": "1", "support_discard": "0", "partitions": {}, "rotational": "1", "scheduler_mode": "bfq", "sectors": "4048896", "sectorsize": "2048", "size": "1.93 GB", "host": "IDE interface: Intel Corporation 82371AB/EB/MB PIIX4 IDE (rev 01)", "holders": []}, "sda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": "VMware,", "model": "VMware Virtual S", "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"sda4": {"links": {"ids": [], "uuids": ["fd2e8579-27af-4ebc-9c55-e7e027aa4f18"], "labels": [], "masters": []}, "start": "7522304", "sectors": "31457280", "sectorsize": 512, "size": "15.00 GB", "uuid": "fd2e8579-27af-4ebc-9c55-e7e027aa4f18", "holders": []}, "sda2": {"links": {"ids": [], "uuids": ["71973e61-0d9e-4bc9-8d7c-7ac007068b3d"], "labels": [], "masters": []}, "start": "1230848", "sectors": "2097152", "sectorsize": 512, "size": "1.00 GB", "uuid": "71973e61-0d9e-4bc9-8d7c-7ac007068b3d", "holders": []}, "sda3": {"links": {"ids": [], "uuids": ["22d2022e-ac08-47c3-b9d8-3cf2db69f7da"], "labels": [], "masters": []}, "start": "3328000", "sectors": "4194304", "sectorsize": 512, "size": "2.00 GB", "uuid": "22d2022e-ac08-47c3-b9d8-3cf2db69f7da", "holders": []}, "sda1": {"links": {"ids": [], "uuids": ["F5C3-B168"], "labels": [], "masters": []}, "start": "2048", "sectors": "1228800", "sectorsize": 512, "size": "600.00 MB", "uuid": "F5C3-B168", "holders": []}}, "rotational": "1", "scheduler_mode": "bfq", "sectors": "41943040", "sectorsize": "512", "size": "20.00 GB", "host": "SCSI storage controller: Broadcom / LSI 53c1030 PCI-X Fusion-MPT Dual Ultra320 SCSI 
(rev 01)", "holders": []}}, "ansible_device_links": {"ids": {"sr0": ["ata-VMware_Virtual_IDE_CDROM_Drive_10000000000000000001"]}, "uuids": {"sr0": ["2020-04-22-22-29-16-00"], "sda2": ["71973e61-0d9e-4bc9-8d7c-7ac007068b3d"], "sda4": ["fd2e8579-27af-4ebc-9c55-e7e027aa4f18"], "sda1": ["F5C3-B168"], "sda3": ["22d2022e-ac08-47c3-b9d8-3cf2db69f7da"]}, "labels": {"sr0": ["Fedora-S-dvd-x86_64-32"]}, "masters": {}}, "ansible_uptime_seconds": 69914, "ansible_mounts": [{"mount": "/", "device": "/dev/sda4", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 16095641600, "size_available": 12175048704, "block_size": 4096, "block_total": 3929600, "block_available": 2972424, "block_used": 957176, "inode_total": 7864320, "inode_available": 7779587, "inode_used": 84733, "uuid": "fd2e8579-27af-4ebc-9c55-e7e027aa4f18"}, {"mount": "/boot", "device": "/dev/sda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 1063256064, "size_available": 887848960, "block_size": 4096, "block_total": 259584, "block_available": 216760, "block_used": 42824, "inode_total": 524288, "inode_available": 524265, "inode_used": 23, "uuid": "71973e61-0d9e-4bc9-8d7c-7ac007068b3d"}, {"mount": "/boot/efi", "device": "/dev/sda1", "fstype": "vfat", "options": "rw,relatime,fmask=0077,dmask=0077,codepage=437,iocharset=ascii,shortname=winnt,errors=remount-ro", "size_total": 627900416, "size_available": 618979328, "block_size": 4096, "block_total": 153296, "block_available": 151118, "block_used": 2178, "inode_total": 0, "inode_available": 0, "inode_used": 0, "uuid": "F5C3-B168"}, {"mount": "/var/lib/containers/storage/overlay", "device": "/dev/sda4", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota,bind", "size_total": 16095641600, "size_available": 12175048704, "block_size": 4096, "block_total": 3929600, "block_available": 2972424, "block_used": 957176, 
"inode_total": 7864320, "inode_available": 7779587, "inode_used": 84733, "uuid": "fd2e8579-27af-4ebc-9c55-e7e027aa4f18"}], "ansible_interfaces": ["ens33", "cni-podman1", "vethd77d9289", "cni-podman0", "lo", "lo1"], "ansible_cni_podman0": {"device": "cni-podman0", "macaddress": "6a:16:a7:61:74:f3", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.6a16a76174f3", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": 
"off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]"}, "timestamping": ["rx_software", "software"], "hw_timestamp_filters": []}, "ansible_vethd77d9289": {"device": "vethd77d9289", "macaddress": "02:2a:3b:56:6e:7e", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": true, "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": 
"off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]"}, "timestamping": ["tx_software", "rx_software", "software"], "hw_timestamp_filters": []}, "ansible_cni_podman1": {"device": "cni-podman1", "macaddress": "5e:fc:6f:01:d0:a5", "mtu": 1500, "active": true, "type": "bridge", "interfaces": ["vethd77d9289"], "id": "8000.5efc6f01d0a5", "stp": false, "speed": 10000, "promisc": false, "ipv4": {"address": "10.254.0.1", "broadcast": "10.254.0.255", "netmask": "255.255.255.0", "network": "10.254.0.0"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [requested on]", "tx_fcoe_segmentation": "off [requested on]", "tx_gre_segmentation": "on", 
"tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]"}, "timestamping": ["rx_software", "software"], "hw_timestamp_filters": []}, "ansible_ens33": {"device": "ens33", "macaddress": "00:0c:29:7c:20:42", "mtu": 1500, "active": true, "module": "e1000", "type": "ether", "pciid": "0000:02:01.0", "speed": 1000, "promisc": false, "ipv4": {"address": "192.168.153.15", "broadcast": "192.168.153.255", "netmask": "255.255.255.0", "network": "192.168.153.0"}, "features": {"rx_checksumming": "off", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "off [fixed]", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "on", "tx_vlan_offload": "on [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "on [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off", "rx_all": "off", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]"}, "timestamping": ["tx_software", "rx_software", "software"], "hw_timestamp_filters": []}, "ansible_lo1": {"device": "lo1", "macaddress": "46:92:60:ca:80:7e", "mtu": 1500, "active": true, "type": "ether", "promisc": false, "ipv4": {"address": "10.255.0.5", "broadcast": "global", "netmask": "255.255.255.255", "network": "10.255.0.5"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]"}, "timestamping": ["tx_software", "rx_software", "software"], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": 
{"address": "127.0.0.1", "broadcast": "host", "netmask": "255.0.0.0", "network": "127.0.0.0"}, "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]"}, "timestamping": ["tx_software", "rx_software", "software"], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "192.168.153.2", "interface": "ens33", "address": "192.168.153.15", "broadcast": "192.168.153.255", "netmask": "255.255.255.0", "network": "192.168.153.0", "macaddress": "00:0c:29:7c:20:42", "mtu": 1500, "type": "ether", "alias": "ens33"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.254.0.1", "192.168.153.15", "10.255.0.5"], "ansible_all_ipv6_addresses": [], "ansible_system_capabilities_enforced": "True", "ansible_system_capabilities": ["cap_chown", "cap_dac_override", "cap_dac_read_search", "cap_fowner", "cap_fsetid", "cap_kill", "cap_setgid", "cap_setuid", "cap_setpcap", "cap_linux_immutable", "cap_net_bind_service", "cap_net_broadcast", "cap_net_admin", "cap_net_raw", "cap_ipc_lock", "cap_ipc_owner", "cap_sys_module", "cap_sys_rawio", "cap_sys_chroot", "cap_sys_ptrace", "cap_sys_pacct", "cap_sys_admin", "cap_sys_boot", "cap_sys_nice", "cap_sys_resource", "cap_sys_time", "cap_sys_tty_config", "cap_mknod", "cap_lease", "cap_audit_write", "cap_audit_control", "cap_setfcap", "cap_mac_override", "cap_mac_admin", "cap_syslog", "cap_wake_alarm", "cap_block_suspend", "cap_audit_read+ep"], "ansible_iscsi_iqn": "", "ansible_hostnqn": "", "ansible_env": {"SHELL": "/bin/bash", "SUDO_GID": "1000", "SUDO_COMMAND": "/bin/sh -c echo BECOME-SUCCESS-qhdikpmjgwbmdsgebjztimqskrlnwndl ; /usr/bin/python3 /home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782/AnsiballZ_setup.py", "SUDO_USER": "admin", "PWD": "/home/admin", "LOGNAME": "root", "HOME": "/root", "LANG": "C", "LS_COLORS": 
"rs=0:di=38;5;33:ln=38;5;51:mh=00:pi=40;38;5;11:so=38;5;13:do=38;5;5:bd=48;5;232;38;5;11:cd=48;5;232;38;5;3:or=48;5;232;38;5;9:mi=01;37;41:su=48;5;196;38;5;15:sg=48;5;11;38;5;16:ca=48;5;196;38;5;226:tw=48;5;10;38;5;16:ow=48;5;10;38;5;21:st=48;5;21;38;5;15:ex=38;5;40:*.tar=38;5;9:*.tgz=38;5;9:*.arc=38;5;9:*.arj=38;5;9:*.taz=38;5;9:*.lha=38;5;9:*.lz4=38;5;9:*.lzh=38;5;9:*.lzma=38;5;9:*.tlz=38;5;9:*.txz=38;5;9:*.tzo=38;5;9:*.t7z=38;5;9:*.zip=38;5;9:*.z=38;5;9:*.dz=38;5;9:*.gz=38;5;9:*.lrz=38;5;9:*.lz=38;5;9:*.lzo=38;5;9:*.xz=38;5;9:*.zst=38;5;9:*.tzst=38;5;9:*.bz2=38;5;9:*.bz=38;5;9:*.tbz=38;5;9:*.tbz2=38;5;9:*.tz=38;5;9:*.deb=38;5;9:*.rpm=38;5;9:*.jar=38;5;9:*.war=38;5;9:*.ear=38;5;9:*.sar=38;5;9:*.rar=38;5;9:*.alz=38;5;9:*.ace=38;5;9:*.zoo=38;5;9:*.cpio=38;5;9:*.7z=38;5;9:*.rz=38;5;9:*.cab=38;5;9:*.wim=38;5;9:*.swm=38;5;9:*.dwm=38;5;9:*.esd=38;5;9:*.jpg=38;5;13:*.jpeg=38;5;13:*.mjpg=38;5;13:*.mjpeg=38;5;13:*.gif=38;5;13:*.bmp=38;5;13:*.pbm=38;5;13:*.pgm=38;5;13:*.ppm=38;5;13:*.tga=38;5;13:*.xbm=38;5;13:*.xpm=38;5;13:*.tif=38;5;13:*.tiff=38;5;13:*.png=38;5;13:*.svg=38;5;13:*.svgz=38;5;13:*.mng=38;5;13:*.pcx=38;5;13:*.mov=38;5;13:*.mpg=38;5;13:*.mpeg=38;5;13:*.m2v=38;5;13:*.mkv=38;5;13:*.webm=38;5;13:*.webp=38;5;13:*.ogm=38;5;13:*.mp4=38;5;13:*.m4v=38;5;13:*.mp4v=38;5;13:*.vob=38;5;13:*.qt=38;5;13:*.nuv=38;5;13:*.wmv=38;5;13:*.asf=38;5;13:*.rm=38;5;13:*.rmvb=38;5;13:*.flc=38;5;13:*.avi=38;5;13:*.fli=38;5;13:*.flv=38;5;13:*.gl=38;5;13:*.dl=38;5;13:*.xcf=38;5;13:*.xwd=38;5;13:*.yuv=38;5;13:*.cgm=38;5;13:*.emf=38;5;13:*.ogv=38;5;13:*.ogx=38;5;13:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36:", "TERM": "xterm-256color", "USER": "root", "SHLVL": "1", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "SUDO_UID": "1000", "MAIL": "/var/mail/root", "_": "/usr/bin/python3", "LC_ALL": "C", "LC_NUMERIC": "C"}, 
"ansible_date_time": {"year": "2020", "month": "07", "weekday": "Friday", "weekday_number": "5", "weeknumber": "27", "day": "10", "hour": "16", "minute": "45", "second": "02", "epoch": "1594417502", "date": "2020-07-10", "time": "16:45:02", "iso8601_micro": "2020-07-10T21:45:02.838445Z", "iso8601": "2020-07-10T21:45:02Z", "iso8601_basic": "20200710T164502838362", "iso8601_basic_short": "20200710T164502", "tz": "CDT", "tz_offset": "-0500"}, "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_python": {"version": {"major": 3, "minor": 8, "micro": 3, "releaselevel": "final", "serial": 0}, "version_info": [3, 8, 3, "final", 0], "executable": "/usr/bin/python3", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": "*", "fact_path": "/etc/ansible/facts.d"}}}\r\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to 192.168.153.15 closed.\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'rm -f -r /home/admin/.ansible/tmp/ansible-tmp-1594417501.571101-23160-225477768236782/ > /dev/null 2>&1 && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
ok: [h-msn-smbdc-05]
META: ran handlers
TASK [create mount dirs] ******************************************************************************************************************************************************************************************************************************************************************************************
task path: /Users/derelam/Development/samba-dc/container.yaml:5
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'echo ~admin && sleep 0'"'"''
<192.168.153.15> (0, b'/home/admin\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/admin/.ansible/tmp `"&& mkdir /home/admin/.ansible/tmp/ansible-tmp-1594417503.063797-23175-215420439866205 && echo ansible-tmp-1594417503.063797-23175-215420439866205="` echo /home/admin/.ansible/tmp/ansible-tmp-1594417503.063797-23175-215420439866205 `" ) && sleep 0'"'"''
<192.168.153.15> (0, b'ansible-tmp-1594417503.063797-23175-215420439866205=/home/admin/.ansible/tmp/ansible-tmp-1594417503.063797-23175-215420439866205\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/_text.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/basic.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/six/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/parameters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_json_compat.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/selectors.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_collections_compat.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/process.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/validation.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/pycompat24.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/file.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_utils.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/sys_info.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/parsing/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/formatters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/converters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/parsing/convert_bool.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/collections.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/_selectors2.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/distro/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/distro/_distro.py
Using module file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/modules/files/file.py
<192.168.153.15> PUT /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpn7hyor6u TO /home/admin/.ansible/tmp/ansible-tmp-1594417503.063797-23175-215420439866205/AnsiballZ_file.py
<192.168.153.15> SSH: disable batch mode for sshpass: (-o)(BatchMode=no)
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set sftp_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 sftp -o BatchMode=no -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da '[192.168.153.15]'
<192.168.153.15> (0, b'sftp> put /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpn7hyor6u /home/admin/.ansible/tmp/ansible-tmp-1594417503.063797-23175-215420439866205/AnsiballZ_file.py\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug3: Sent message fd 3 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . -> /home/admin size 0\r\ndebug3: Looking up /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpn7hyor6u\r\ndebug3: Sent message fd 3 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/home/admin/.ansible/tmp/ansible-tmp-1594417503.063797-23175-215420439866205/AnsiballZ_file.py\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 32768 bytes at 0\r\ndebug3: Sent message SSH2_FXP_WRITE I:5 O:32768 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:6 O:65536 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:7 O:98304 S:25039\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 5 32768 bytes at 32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 6 32768 bytes at 65536\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 7 25039 bytes at 98304\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'chmod u+x /home/admin/.ansible/tmp/ansible-tmp-1594417503.063797-23175-215420439866205/ /home/admin/.ansible/tmp/ansible-tmp-1594417503.063797-23175-215420439866205/AnsiballZ_file.py && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da -tt 192.168.153.15 '/bin/sh -c '"'"'sudo -H -S -p "[sudo via ansible, key=uxcrlsxdcikwksuqvtrzqysximcxyzvt] password:" -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-uxcrlsxdcikwksuqvtrzqysximcxyzvt ; /usr/bin/python3 /home/admin/.ansible/tmp/ansible-tmp-1594417503.063797-23175-215420439866205/AnsiballZ_file.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<192.168.153.15> (0, b'\r\n\r\n{"path": "/etc/pihole", "changed": false, "diff": {"before": {"path": "/etc/pihole"}, "after": {"path": "/etc/pihole"}}, "uid": 999, "gid": 999, "owner": "systemd-coredump", "group": "input", "mode": "0775", "state": "directory", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 4096, "invocation": {"module_args": {"path": "/etc/pihole", "state": "directory", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null, "unsafe_writes": null}}}\r\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to 192.168.153.15 closed.\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'rm -f -r /home/admin/.ansible/tmp/ansible-tmp-1594417503.063797-23175-215420439866205/ > /dev/null 2>&1 && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
ok: [h-msn-smbdc-05] => (item=/etc/pihole) => {
"ansible_loop_var": "item",
"changed": false,
"diff": {
"after": {
"path": "/etc/pihole"
},
"before": {
"path": "/etc/pihole"
}
},
"gid": 999,
"group": "input",
"invocation": {
"module_args": {
"_diff_peek": null,
"_original_basename": null,
"access_time": null,
"access_time_format": "%Y%m%d%H%M.%S",
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"directory_mode": null,
"follow": true,
"force": false,
"group": null,
"mode": null,
"modification_time": null,
"modification_time_format": "%Y%m%d%H%M.%S",
"owner": null,
"path": "/etc/pihole",
"recurse": false,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"state": "directory",
"unsafe_writes": null
}
},
"item": "/etc/pihole",
"mode": "0775",
"owner": "systemd-coredump",
"path": "/etc/pihole",
"secontext": "unconfined_u:object_r:etc_t:s0",
"size": 4096,
"state": "directory",
"uid": 999
}
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'echo ~admin && sleep 0'"'"''
<192.168.153.15> (0, b'/home/admin\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/admin/.ansible/tmp `"&& mkdir /home/admin/.ansible/tmp/ansible-tmp-1594417503.659966-23175-43442213652492 && echo ansible-tmp-1594417503.659966-23175-43442213652492="` echo /home/admin/.ansible/tmp/ansible-tmp-1594417503.659966-23175-43442213652492 `" ) && sleep 0'"'"''
<192.168.153.15> (0, b'ansible-tmp-1594417503.659966-23175-43442213652492=/home/admin/.ansible/tmp/ansible-tmp-1594417503.659966-23175-43442213652492\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/modules/files/file.py
<192.168.153.15> PUT /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpr06__t00 TO /home/admin/.ansible/tmp/ansible-tmp-1594417503.659966-23175-43442213652492/AnsiballZ_file.py
<192.168.153.15> SSH: disable batch mode for sshpass: (-o)(BatchMode=no)
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set sftp_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 sftp -o BatchMode=no -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da '[192.168.153.15]'
<192.168.153.15> (0, b'sftp> put /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpr06__t00 /home/admin/.ansible/tmp/ansible-tmp-1594417503.659966-23175-43442213652492/AnsiballZ_file.py\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug3: Sent message fd 3 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . -> /home/admin size 0\r\ndebug3: Looking up /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpr06__t00\r\ndebug3: Sent message fd 3 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/home/admin/.ansible/tmp/ansible-tmp-1594417503.659966-23175-43442213652492/AnsiballZ_file.py\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 32768 bytes at 0\r\ndebug3: Sent message SSH2_FXP_WRITE I:5 O:32768 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:6 O:65536 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:7 O:98304 S:25048\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 5 32768 bytes at 32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 6 32768 bytes at 65536\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 7 25048 bytes at 98304\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'chmod u+x /home/admin/.ansible/tmp/ansible-tmp-1594417503.659966-23175-43442213652492/ /home/admin/.ansible/tmp/ansible-tmp-1594417503.659966-23175-43442213652492/AnsiballZ_file.py && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da -tt 192.168.153.15 '/bin/sh -c '"'"'sudo -H -S -p "[sudo via ansible, key=ufagogdmrstmhcddhvtttabjtvrmuxlu] password:" -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-ufagogdmrstmhcddhvtttabjtvrmuxlu ; /usr/bin/python3 /home/admin/.ansible/tmp/ansible-tmp-1594417503.659966-23175-43442213652492/AnsiballZ_file.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<192.168.153.15> (0, b'\r\n\r\n{"path": "/etc/pihole/dnsmasq.d", "changed": false, "diff": {"before": {"path": "/etc/pihole/dnsmasq.d"}, "after": {"path": "/etc/pihole/dnsmasq.d"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:dnsmasq_etc_t:s0", "size": 49, "invocation": {"module_args": {"path": "/etc/pihole/dnsmasq.d", "state": "directory", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null, "unsafe_writes": null}}}\r\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to 192.168.153.15 closed.\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'rm -f -r /home/admin/.ansible/tmp/ansible-tmp-1594417503.659966-23175-43442213652492/ > /dev/null 2>&1 && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
ok: [h-msn-smbdc-05] => (item=/etc/pihole/dnsmasq.d) => {
"ansible_loop_var": "item",
"changed": false,
"diff": {
"after": {
"path": "/etc/pihole/dnsmasq.d"
},
"before": {
"path": "/etc/pihole/dnsmasq.d"
}
},
"gid": 0,
"group": "root",
"invocation": {
"module_args": {
"_diff_peek": null,
"_original_basename": null,
"access_time": null,
"access_time_format": "%Y%m%d%H%M.%S",
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"directory_mode": null,
"follow": true,
"force": false,
"group": null,
"mode": null,
"modification_time": null,
"modification_time_format": "%Y%m%d%H%M.%S",
"owner": null,
"path": "/etc/pihole/dnsmasq.d",
"recurse": false,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"state": "directory",
"unsafe_writes": null
}
},
"item": "/etc/pihole/dnsmasq.d",
"mode": "0755",
"owner": "root",
"path": "/etc/pihole/dnsmasq.d",
"secontext": "unconfined_u:object_r:dnsmasq_etc_t:s0",
"size": 49,
"state": "directory",
"uid": 0
}
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'echo ~admin && sleep 0'"'"''
<192.168.153.15> (0, b'/home/admin\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/admin/.ansible/tmp `"&& mkdir /home/admin/.ansible/tmp/ansible-tmp-1594417504.116388-23175-129277141207879 && echo ansible-tmp-1594417504.116388-23175-129277141207879="` echo /home/admin/.ansible/tmp/ansible-tmp-1594417504.116388-23175-129277141207879 `" ) && sleep 0'"'"''
<192.168.153.15> (0, b'ansible-tmp-1594417504.116388-23175-129277141207879=/home/admin/.ansible/tmp/ansible-tmp-1594417504.116388-23175-129277141207879\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/modules/files/file.py
<192.168.153.15> PUT /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpz6ve0rfo TO /home/admin/.ansible/tmp/ansible-tmp-1594417504.116388-23175-129277141207879/AnsiballZ_file.py
<192.168.153.15> SSH: disable batch mode for sshpass: (-o)(BatchMode=no)
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set sftp_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 sftp -o BatchMode=no -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da '[192.168.153.15]'
<192.168.153.15> (0, b'sftp> put /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpz6ve0rfo /home/admin/.ansible/tmp/ansible-tmp-1594417504.116388-23175-129277141207879/AnsiballZ_file.py\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug3: Sent message fd 3 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . -> /home/admin size 0\r\ndebug3: Looking up /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpz6ve0rfo\r\ndebug3: Sent message fd 3 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/home/admin/.ansible/tmp/ansible-tmp-1594417504.116388-23175-129277141207879/AnsiballZ_file.py\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 32768 bytes at 0\r\ndebug3: Sent message SSH2_FXP_WRITE I:5 O:32768 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:6 O:65536 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:7 O:98304 S:25043\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 5 32768 bytes at 32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 6 32768 bytes at 65536\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 7 25043 bytes at 98304\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'chmod u+x /home/admin/.ansible/tmp/ansible-tmp-1594417504.116388-23175-129277141207879/ /home/admin/.ansible/tmp/ansible-tmp-1594417504.116388-23175-129277141207879/AnsiballZ_file.py && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da -tt 192.168.153.15 '/bin/sh -c '"'"'sudo -H -S -p "[sudo via ansible, key=tevcvnfzhatujlnyiyzbkrukgclearmo] password:" -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-tevcvnfzhatujlnyiyzbkrukgclearmo ; /usr/bin/python3 /home/admin/.ansible/tmp/ansible-tmp-1594417504.116388-23175-129277141207879/AnsiballZ_file.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<192.168.153.15> (0, b'\r\n\r\n{"path": "/var/log/pihole", "changed": false, "diff": {"before": {"path": "/var/log/pihole"}, "after": {"path": "/var/log/pihole"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 96, "invocation": {"module_args": {"path": "/var/log/pihole", "state": "directory", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "content": null, "backup": null, "remote_src": null, "regexp": null, "delimiter": null, "directory_mode": null, "unsafe_writes": null}}}\r\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to 192.168.153.15 closed.\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d45 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'rm -f -r /home/admin/.ansible/tmp/ansible-tmp-1594417504.116388-23175-129277141207879/ > /dev/null 2>&1 && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
ok: [h-msn-smbdc-05] => (item=/var/log/pihole) => {
"ansible_loop_var": "item",
"changed": false,
"diff": {
"after": {
"path": "/var/log/pihole"
},
"before": {
"path": "/var/log/pihole"
}
},
"gid": 0,
"group": "root",
"invocation": {
"module_args": {
"_diff_peek": null,
"_original_basename": null,
"access_time": null,
"access_time_format": "%Y%m%d%H%M.%S",
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"directory_mode": null,
"follow": true,
"force": false,
"group": null,
"mode": null,
"modification_time": null,
"modification_time_format": "%Y%m%d%H%M.%S",
"owner": null,
"path": "/var/log/pihole",
"recurse": false,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"state": "directory",
"unsafe_writes": null
}
},
"item": "/var/log/pihole",
"mode": "0755",
"owner": "root",
"path": "/var/log/pihole",
"secontext": "unconfined_u:object_r:var_log_t:s0",
"size": 96,
"state": "directory",
"uid": 0
}
TASK [create network] *********************************************************************************************************************************************************************************************************************************************************************************************
task path: /Users/derelam/Development/samba-dc/container.yaml:14
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d46 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'echo ~admin && sleep 0'"'"''
<192.168.153.15> (0, b'/home/admin\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d46 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/admin/.ansible/tmp `"&& mkdir /home/admin/.ansible/tmp/ansible-tmp-1594417504.673494-23216-59906902540498 && echo ansible-tmp-1594417504.673494-23216-59906902540498="` echo /home/admin/.ansible/tmp/ansible-tmp-1594417504.673494-23216-59906902540498 `" ) && sleep 0'"'"''
<192.168.153.15> (0, b'ansible-tmp-1594417504.673494-23216-59906902540498=/home/admin/.ansible/tmp/ansible-tmp-1594417504.673494-23216-59906902540498\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/collections.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/_text.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/basic.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/six/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_collections_compat.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/parameters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_json_compat.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/process.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/selectors.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/validation.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/pycompat24.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/file.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_utils.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/sys_info.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/parsing/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/formatters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/converters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/parsing/convert_bool.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/_selectors2.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/distro/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/distro/_distro.py
Using module file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/modules/commands/command.py
<192.168.153.15> PUT /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmptd3n65cl TO /home/admin/.ansible/tmp/ansible-tmp-1594417504.673494-23216-59906902540498/AnsiballZ_command.py
<192.168.153.15> SSH: disable batch mode for sshpass: (-o)(BatchMode=no)
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set sftp_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d46 sftp -o BatchMode=no -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da '[192.168.153.15]'
<192.168.153.15> (0, b'sftp> put /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmptd3n65cl /home/admin/.ansible/tmp/ansible-tmp-1594417504.673494-23216-59906902540498/AnsiballZ_command.py\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug3: Sent message fd 3 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . -> /home/admin size 0\r\ndebug3: Looking up /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmptd3n65cl\r\ndebug3: Sent message fd 3 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/home/admin/.ansible/tmp/ansible-tmp-1594417504.673494-23216-59906902540498/AnsiballZ_command.py\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 32768 bytes at 0\r\ndebug3: Sent message SSH2_FXP_WRITE I:5 O:32768 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:6 O:65536 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:7 O:98304 S:20070\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 5 32768 bytes at 32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 6 32768 bytes at 65536\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 7 20070 bytes at 98304\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d46 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'chmod u+x /home/admin/.ansible/tmp/ansible-tmp-1594417504.673494-23216-59906902540498/ /home/admin/.ansible/tmp/ansible-tmp-1594417504.673494-23216-59906902540498/AnsiballZ_command.py && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d46 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da -tt 192.168.153.15 '/bin/sh -c '"'"'sudo -H -S -p "[sudo via ansible, key=bdtlneusyryfpnqoxvtkqhbieaqkalwv] password:" -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-bdtlneusyryfpnqoxvtkqhbieaqkalwv ; /usr/bin/python3 /home/admin/.ansible/tmp/ansible-tmp-1594417504.673494-23216-59906902540498/AnsiballZ_command.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<192.168.153.15> (0, b'\r\n\r\n{"cmd": "set -eo pipefail\\n\\ninspect=$(podman network inspect pihole 2>/dev/null || true)\\nresult=$(echo \\"$inspect\\" | python3 -c \\"\\nimport json as j, sys as s\\ntry:\\n c = j.load(s.stdin)\\nexcept j.JSONDecodeError:\\n action=\'create\'\\nelse:\\n r = next(plugin[\'ipam\'][\'ranges\'] for plugin in c[0][\'plugins\'] if plugin[\'type\'] == \'bridge\')\\n if not (len(r) == 1 and len(r[0]) == 1 and r[0][0][\'subnet\'] == \'10.254.0.0/24\'):\\n action=\'delete\'\\n else:\\n action=\'ok\'\\nprint(action)\\n\\")\\n\\nif [ \\"$result\\" == \'ok\' ]; then\\n result=false\\nelse\\n if [ \\"$result\\" == \'delete\' ]; then\\n podman network rm -f pi-hole\\n fi\\n result=$(podman network create --subnet=10.254.0.0/24 --disable-dns pihole)\\n if [ $? -eq 0 ]; then\\n result=true\\n fi\\nfi\\necho $result\\n", "stdout": "false", "stderr": "", "rc": 0, "start": "2020-07-10 16:45:05.193351", "end": "2020-07-10 16:45:05.382298", "delta": "0:00:00.188947", "changed": true, "invocation": {"module_args": {"_raw_params": "set -eo pipefail\\n\\ninspect=$(podman network inspect pihole 2>/dev/null || true)\\nresult=$(echo \\"$inspect\\" | python3 -c \\"\\nimport json as j, sys as s\\ntry:\\n c = j.load(s.stdin)\\nexcept j.JSONDecodeError:\\n action=\'create\'\\nelse:\\n r = next(plugin[\'ipam\'][\'ranges\'] for plugin in c[0][\'plugins\'] if plugin[\'type\'] == \'bridge\')\\n if not (len(r) == 1 and len(r[0]) == 1 and r[0][0][\'subnet\'] == \'10.254.0.0/24\'):\\n action=\'delete\'\\n else:\\n action=\'ok\'\\nprint(action)\\n\\")\\n\\nif [ \\"$result\\" == \'ok\' ]; then\\n result=false\\nelse\\n if [ \\"$result\\" == \'delete\' ]; then\\n podman network rm -f pi-hole\\n fi\\n result=$(podman network create --subnet=10.254.0.0/24 --disable-dns pihole)\\n if [ $? -eq 0 ]; then\\n result=true\\n fi\\nfi\\necho $result\\n", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}\r\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to 192.168.153.15 closed.\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d46 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'rm -f -r /home/admin/.ansible/tmp/ansible-tmp-1594417504.673494-23216-59906902540498/ > /dev/null 2>&1 && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
ok: [h-msn-smbdc-05] => {
"changed": false,
"cmd": "set -eo pipefail\n\ninspect=$(podman network inspect pihole 2>/dev/null || true)\nresult=$(echo \"$inspect\" | python3 -c \"\nimport json as j, sys as s\ntry:\n c = j.load(s.stdin)\nexcept j.JSONDecodeError:\n action='create'\nelse:\n r = next(plugin['ipam']['ranges'] for plugin in c[0]['plugins'] if plugin['type'] == 'bridge')\n if not (len(r) == 1 and len(r[0]) == 1 and r[0][0]['subnet'] == '10.254.0.0/24'):\n action='delete'\n else:\n action='ok'\nprint(action)\n\")\n\nif [ \"$result\" == 'ok' ]; then\n result=false\nelse\n if [ \"$result\" == 'delete' ]; then\n podman network rm -f pi-hole\n fi\n result=$(podman network create --subnet=10.254.0.0/24 --disable-dns pihole)\n if [ $? -eq 0 ]; then\n result=true\n fi\nfi\necho $result\n",
"delta": "0:00:00.188947",
"end": "2020-07-10 16:45:05.382298",
"failed_when_result": false,
"invocation": {
"module_args": {
"_raw_params": "set -eo pipefail\n\ninspect=$(podman network inspect pihole 2>/dev/null || true)\nresult=$(echo \"$inspect\" | python3 -c \"\nimport json as j, sys as s\ntry:\n c = j.load(s.stdin)\nexcept j.JSONDecodeError:\n action='create'\nelse:\n r = next(plugin['ipam']['ranges'] for plugin in c[0]['plugins'] if plugin['type'] == 'bridge')\n if not (len(r) == 1 and len(r[0]) == 1 and r[0][0]['subnet'] == '10.254.0.0/24'):\n action='delete'\n else:\n action='ok'\nprint(action)\n\")\n\nif [ \"$result\" == 'ok' ]; then\n result=false\nelse\n if [ \"$result\" == 'delete' ]; then\n podman network rm -f pi-hole\n fi\n result=$(podman network create --subnet=10.254.0.0/24 --disable-dns pihole)\n if [ $? -eq 0 ]; then\n result=true\n fi\nfi\necho $result\n",
"_uses_shell": true,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": true
}
},
"rc": 0,
"start": "2020-07-10 16:45:05.193351",
"stderr": "",
"stderr_lines": [],
"stdout": "false",
"stdout_lines": [
"false"
]
}
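The `set -eo pipefail` command in the task result above embeds a small Python program inside the shell pipeline to decide whether the `pihole` network needs to be created, recreated, or left alone. Unwrapped from the escaped log output, the decision logic is roughly the following (a sketch, not the module's code; it assumes the CNI-style `podman network inspect` layout shown in the log, and `TARGET_SUBNET` is just the subnet hard-coded in the task):

```python
import json

TARGET_SUBNET = "10.254.0.0/24"  # subnet hard-coded in the logged task


def decide_action(inspect_output: str) -> str:
    """Map `podman network inspect` output to 'create', 'delete', or 'ok'."""
    try:
        config = json.loads(inspect_output)
    except json.JSONDecodeError:
        # `podman network inspect` printed nothing: the network is missing.
        return "create"
    # CNI config: pull the IPAM ranges from the bridge plugin entry.
    ranges = next(p["ipam"]["ranges"]
                  for p in config[0]["plugins"] if p["type"] == "bridge")
    if (len(ranges) == 1 and len(ranges[0]) == 1
            and ranges[0][0]["subnet"] == TARGET_SUBNET):
        return "ok"      # subnet already matches: nothing to do
    return "delete"      # network exists with the wrong subnet: recreate it


# Empty inspect output means the network must be created.
print(decide_action(""))  # create

# A matching CNI config means no change is needed.
sample = json.dumps([{"plugins": [
    {"type": "bridge",
     "ipam": {"ranges": [[{"subnet": "10.254.0.0/24"}]]}}]}])
print(decide_action(sample))  # ok
```

In the logged run the check printed `ok`, so the surrounding shell wrapper echoed `false` and the task reported `"changed": false`.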
TASK [create container] *******************************************************************************************************************************************************************************************************************************************************************************************
task path: /Users/derelam/Development/samba-dc/container.yaml:50
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d47 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'echo ~admin && sleep 0'"'"''
<192.168.153.15> (0, b'/home/admin\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d47 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/admin/.ansible/tmp `"&& mkdir /home/admin/.ansible/tmp/ansible-tmp-1594417505.601862-23231-229416052397198 && echo ansible-tmp-1594417505.601862-23231-229416052397198="` echo /home/admin/.ansible/tmp/ansible-tmp-1594417505.601862-23231-229416052397198 `" ) && sleep 0'"'"''
<192.168.153.15> (0, b'ansible-tmp-1594417505.601862-23231-229416052397198=/home/admin/.ansible/tmp/ansible-tmp-1594417505.601862-23231-229416052397198\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/_text.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/basic.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/six/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/parameters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_json_compat.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/selectors.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_collections_compat.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/process.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/validation.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/pycompat24.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/file.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/_utils.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/sys_info.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/parsing/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/formatters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/text/converters.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/parsing/convert_bool.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/common/collections.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/compat/_selectors2.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/distro/__init__.py
Using module_utils file /Users/derelam/Library/Caches/pypoetry/virtualenvs/samba-dc-TDLwk2X_-py3.8/lib/python3.8/site-packages/ansible/module_utils/distro/_distro.py
Using module file /Users/derelam/.ansible/collections/ansible_collections/containers/podman/plugins/modules/podman_container.py
<192.168.153.15> PUT /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpnu8s_teg TO /home/admin/.ansible/tmp/ansible-tmp-1594417505.601862-23231-229416052397198/AnsiballZ_podman_container.py
<192.168.153.15> SSH: disable batch mode for sshpass: (-o)(BatchMode=no)
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set sftp_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d47 sftp -o BatchMode=no -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da '[192.168.153.15]'
<192.168.153.15> (0, b'sftp> put /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpnu8s_teg /home/admin/.ansible/tmp/ansible-tmp-1594417505.601862-23231-229416052397198/AnsiballZ_podman_container.py\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 2\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug2: Server supports extension "[email protected]" revision 1\r\ndebug3: Sent message fd 3 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . 
-> /home/admin size 0\r\ndebug3: Looking up /Users/derelam/.ansible/tmp/ansible-local-23156zl_lra7o/tmpnu8s_teg\r\ndebug3: Sent message fd 3 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/home/admin/.ansible/tmp/ansible-tmp-1594417505.601862-23231-229416052397198/AnsiballZ_podman_container.py\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 32768 bytes at 0\r\ndebug3: Sent message SSH2_FXP_WRITE I:5 O:32768 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:6 O:65536 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:7 O:98304 S:32768\r\ndebug3: Sent message SSH2_FXP_WRITE I:8 O:131072 S:4338\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 5 32768 bytes at 32768\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 6 32768 bytes at 65536\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 7 32768 bytes at 98304\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 8 4338 bytes at 131072\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d47 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'chmod u+x /home/admin/.ansible/tmp/ansible-tmp-1594417505.601862-23231-229416052397198/ /home/admin/.ansible/tmp/ansible-tmp-1594417505.601862-23231-229416052397198/AnsiballZ_podman_container.py && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d47 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da -tt 192.168.153.15 '/bin/sh -c '"'"'sudo -H -S -p "[sudo via ansible, key=kebkzcqfzisxilfxkjvwgwbmeavnkein] password:" -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-kebkzcqfzisxilfxkjvwgwbmeavnkein ; /usr/bin/python3 /home/admin/.ansible/tmp/ansible-tmp-1594417505.601862-23231-229416052397198/AnsiballZ_podman_container.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<192.168.153.15> (0, b'\r\n\r\n{"changed": true, "actions": ["recreated pihole"], "container": {"Id": "41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3", "Created": "2020-07-10T16:45:10.87563612-05:00", "Path": "/s6-init", "Args": ["/s6-init"], "State": {"OciVersion": "1.0.2-dev", "Status": "running", "Running": true, "Paused": false, "Restarting": false, "OOMKilled": false, "Dead": false, "Pid": 137393, "ConmonPid": 137390, "ExitCode": 0, "Error": "", "StartedAt": "2020-07-10T16:45:11.120085058-05:00", "FinishedAt": "0001-01-01T00:00:00Z", "Healthcheck": {"Status": "starting", "FailingStreak": 0, "Log": null}}, "Image": "788fa841f00633e23f5c40775dad6844262b6546d021441626639984cd5f710a", "ImageName": "docker.io/pihole/pihole:latest", "Rootfs": "", "Pod": "", "ResolvConfPath": "/var/run/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/resolv.conf", "HostnamePath": "/var/run/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/hostname", "HostsPath": "/var/run/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/hosts", "StaticDir": "/var/lib/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata", "OCIConfigPath": "/var/lib/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/config.json", "OCIRuntime": "crun", "LogPath": "/var/lib/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/ctr.log", "LogTag": "", "ConmonPidFile": "/var/run/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/conmon.pid", "Name": "pihole", "RestartCount": 0, "Driver": "overlay", "MountLabel": "system_u:object_r:container_file_t:s0:c70,c770", "ProcessLabel": "", 
"AppArmorProfile": "", "EffectiveCaps": ["CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_DAC_READ_SEARCH", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_SETGID", "CAP_SETUID", "CAP_SETPCAP", "CAP_LINUX_IMMUTABLE", "CAP_NET_BIND_SERVICE", "CAP_NET_BROADCAST", "CAP_NET_ADMIN", "CAP_NET_RAW", "CAP_IPC_LOCK", "CAP_IPC_OWNER", "CAP_SYS_MODULE", "CAP_SYS_RAWIO", "CAP_SYS_CHROOT", "CAP_SYS_PTRACE", "CAP_SYS_PACCT", "CAP_SYS_ADMIN", "CAP_SYS_BOOT", "CAP_SYS_NICE", "CAP_SYS_RESOURCE", "CAP_SYS_TIME", "CAP_SYS_TTY_CONFIG", "CAP_MKNOD", "CAP_LEASE", "CAP_AUDIT_WRITE", "CAP_AUDIT_CONTROL", "CAP_SETFCAP", "CAP_MAC_OVERRIDE", "CAP_MAC_ADMIN", "CAP_SYSLOG", "CAP_WAKE_ALARM", "CAP_BLOCK_SUSPEND", "CAP_AUDIT_READ"], "BoundingCaps": ["CAP_CHOWN", "CAP_DAC_OVERRIDE", "CAP_DAC_READ_SEARCH", "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_SETGID", "CAP_SETUID", "CAP_SETPCAP", "CAP_LINUX_IMMUTABLE", "CAP_NET_BIND_SERVICE", "CAP_NET_BROADCAST", "CAP_NET_ADMIN", "CAP_NET_RAW", "CAP_IPC_LOCK", "CAP_IPC_OWNER", "CAP_SYS_MODULE", "CAP_SYS_RAWIO", "CAP_SYS_CHROOT", "CAP_SYS_PTRACE", "CAP_SYS_PACCT", "CAP_SYS_ADMIN", "CAP_SYS_BOOT", "CAP_SYS_NICE", "CAP_SYS_RESOURCE", "CAP_SYS_TIME", "CAP_SYS_TTY_CONFIG", "CAP_MKNOD", "CAP_LEASE", "CAP_AUDIT_WRITE", "CAP_AUDIT_CONTROL", "CAP_SETFCAP", "CAP_MAC_OVERRIDE", "CAP_MAC_ADMIN", "CAP_SYSLOG", "CAP_WAKE_ALARM", "CAP_BLOCK_SUSPEND", "CAP_AUDIT_READ"], "ExecIDs": [], "GraphDriver": {"Name": "overlay", "Data": {"LowerDir": 
"/var/lib/containers/storage/overlay/f5242a0db3466e8384c3401e67093aff4e626c077337f88010931e5eb86d9f3d/diff:/var/lib/containers/storage/overlay/c51c1d92b5e99db402eef258d4b76864edb0cb9646248309adec0c6357931532/diff:/var/lib/containers/storage/overlay/c8a1abf41cf7452d1a3170eed9ba64d4a877317392ed8dbd1bbdc851f774dfbc/diff:/var/lib/containers/storage/overlay/6920dd59a8463cdf9854e461d46549dce77bd56ecac59683ee5c3dabd1bf0005/diff:/var/lib/containers/storage/overlay/f2046601e3b3cad9e8e3cd63f4c90ff60bbcb29f21d0825338b40023c9b71b69/diff:/var/lib/containers/storage/overlay/5ec562341d602e646183e2499c8fe31672e573038be49bf2ff44ec272abe6696/diff:/var/lib/containers/storage/overlay/2028935954a9a6b336fe4efc61c57fca26ba1a6f69e53a7a85dcfecbceb256be/diff:/var/lib/containers/storage/overlay/97782f90571bec3a0ba37c9a8a8ff527e8ae19d81df4330185d004d1736bb973/diff:/var/lib/containers/storage/overlay/d7a84d67f8adae97b1628dbe5095fe1aff6a720954361b298be2b74301fdaeb4/diff:/var/lib/containers/storage/overlay/3d83d295905c2da0bf0f45d8d3782ec9c049fd0dcf4d867f14753ca22cceedac/diff:/var/lib/containers/storage/overlay/7b313ef723f5c7f02c3ec6efc515398eac3f81ff7ff7070cf20704efe810e34e/diff:/var/lib/containers/storage/overlay/81be3decf5b445909c2517d88d27c52a29f2e137e4739312e77d4a7225d603bf/diff:/var/lib/containers/storage/overlay/f66ed577df6ed16d061b2705132407525a370b4fd821840776cb8564fa9eb501/diff", "MergedDir": "/var/lib/containers/storage/overlay/c715d66f945fecc9e1fe2e74b0a731e1dc5f79675b181fe6374952454bda6b4c/merged", "UpperDir": "/var/lib/containers/storage/overlay/c715d66f945fecc9e1fe2e74b0a731e1dc5f79675b181fe6374952454bda6b4c/diff", "WorkDir": "/var/lib/containers/storage/overlay/c715d66f945fecc9e1fe2e74b0a731e1dc5f79675b181fe6374952454bda6b4c/work"}}, "Mounts": [{"Type": "bind", "Name": "", "Source": "/etc/pihole", "Destination": "/etc/pihole", "Driver": "", "Mode": "", "Options": ["rbind"], "RW": true, "Propagation": "rprivate"}, {"Type": "bind", "Name": "", "Source": "/etc/pihole/dnsmasq.d", 
"Destination": "/etc/dnsmasq.d", "Driver": "", "Mode": "", "Options": ["rbind"], "RW": true, "Propagation": "rprivate"}, {"Type": "bind", "Name": "", "Source": "/var/log/pihole", "Destination": "/var/log", "Driver": "", "Mode": "", "Options": ["rbind"], "RW": true, "Propagation": "rprivate"}], "Dependencies": [], "NetworkSettings": {"EndpointID": "", "Gateway": "", "IPAddress": "", "IPPrefixLen": 0, "IPv6Gateway": "", "GlobalIPv6Address": "", "GlobalIPv6PrefixLen": 0, "MacAddress": "", "Bridge": "", "SandboxID": "", "HairpinMode": false, "LinkLocalIPv6Address": "", "LinkLocalIPv6PrefixLen": 0, "Ports": {"53/tcp": [{"HostIp": "192.168.153.15", "HostPort": "53"}, {"HostIp": "127.0.0.1", "HostPort": "53"}], "53/udp": [{"HostIp": "192.168.153.15", "HostPort": "53"}, {"HostIp": "127.0.0.1", "HostPort": "53"}]}, "SandboxKey": "/var/run/netns/cni-cbf7cae6-e55e-60b5-1dd3-e0e89bfbc9ac", "Networks": {"pihole": {"EndpointID": "", "Gateway": "10.254.0.1", "IPAddress": "10.254.0.5", "IPPrefixLen": 24, "IPv6Gateway": "", "GlobalIPv6Address": "", "GlobalIPv6PrefixLen": 0, "MacAddress": "56:8c:5b:90:6e:4c", "NetworkID": "pihole", "DriverOpts": null, "IPAMConfig": null, "Links": null}}}, "ExitCommand": ["/usr/bin/podman", "--root", "/var/lib/containers/storage", "--runroot", "/var/run/containers/storage", "--log-level", "error", "--cgroup-manager", "systemd", "--tmpdir", "/var/run/libpod", "--runtime", "crun", "--storage-driver", "overlay", "--storage-opt", "overlay.mountopt=nodev,metacopy=on", "--events-backend", "file", "container", "cleanup", "41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3"], "Namespace": "", "IsInfra": false, "Config": {"Hostname": "host.pihole.dom.local", "Domainname": "", "User": "", "AttachStdin": false, "AttachStdout": false, "AttachStderr": false, "Tty": false, "OpenStdin": false, "StdinOnce": false, "Env": ["PATH=/opt/pihole:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin", "TERM=xterm", "container=podman", "S6_KEEP_ENV=1", 
"ServerIP=0.0.0.0", "VIRTUAL_HOST=host.dom.local", "DNS1=1.1.1.1", "VERSION=v5.0", "IPv6=False", "S6OVERLAY_RELEASE=https://github.com/just-containers/s6-overlay/releases/download/v1.22.1.0/s6-overlay-amd64.tar.gz", "S6_BEHAVIOUR_IF_STAGE2_FAILS=2", "DNSMASQ_USER=root", "DNS2=1.0.0.1", "PHP_ERROR_LOG=/var/log/lighttpd/error.log", "S6_LOGGING=0", "FTL_CMD=no-daemon", "PHP_ENV_CONFIG=/etc/lighttpd/conf-enabled/15-fastcgi-php.conf", "PIHOLE_INSTALL=/root/ph_install.sh", "ARCH=amd64", "DNSSEC=True", "DNS_BOGUS_PRIV=False", "TZ=America/Chicago", "HOSTNAME=host.pihole.dom.local", "HOME=/root"], "Cmd": null, "Image": "docker.io/pihole/pihole:latest", "Volumes": null, "WorkingDir": "/", "Entrypoint": "/s6-init", "OnBuild": null, "Labels": {"image": "pihole/pihole:v5.0_amd64", "maintainer": "[email protected]", "url": "https://www.github.com/pi-hole/docker-pi-hole"}, "Annotations": {"io.container.manager": "libpod", "io.kubernetes.cri-o.Created": "2020-07-10T16:45:10.87563612-05:00", "io.kubernetes.cri-o.TTY": "false", "io.podman.annotations.autoremove": "FALSE", "io.podman.annotations.init": "FALSE", "io.podman.annotations.privileged": "TRUE", "io.podman.annotations.publish-all": "FALSE", "org.opencontainers.image.stopSignal": "15"}, "StopSignal": 15, "Healthcheck": {"Test": ["CMD-SHELL", "dig +norecurse +retry=0 @192.168.153.15 host.pihole.dom.local || exit 1"], "Interval": 30000000000, "Timeout": 30000000000, "Retries": 3}, "CreateCommand": ["podman", "container", "run", "--name", "pihole", "--dns", "127.0.0.1,1.1.1.1", "--network", "pihole", "--ip", "10.254.0.5", "--publish", "192.168.153.15:53:53/tcp", "--publish", "192.168.153.15:53:53/udp", "--publish", "127.0.0.1:53:53/tcp", "--publish", "127.0.0.1:53:53/udp", "--hostname", "host.pihole.dom.local", "--privileged=True", "--env", "TZ=America/Chicago", "--env", "VIRTUAL_HOST=host.dom.local", "--env", "DNS1=1.1.1.1", "--env", "DNS2=1.0.0.1", "--env", "IPv6=False", "--env", "DNSSEC=True", "--env", "DNS_BOGUS_PRIV=False", 
"--volume", "/etc/pihole:/etc/pihole", "--volume", "/etc/pihole/dnsmasq.d:/etc/dnsmasq.d", "--volume", "/var/log/pihole:/var/log", "--healthcheck-command", "dig +norecurse +retry=0 @192.168.153.15 host.pihole.dom.local || exit 1", "--detach=True", "--rootfs=False", "docker.io/pihole/pihole:latest"]}, "HostConfig": {"Binds": ["/etc/pihole:/etc/pihole:rw,rprivate,rbind", "/etc/pihole/dnsmasq.d:/etc/dnsmasq.d:rw,rprivate,rbind", "/var/log/pihole:/var/log:rw,rprivate,rbind"], "CgroupMode": "private", "ContainerIDFile": "", "LogConfig": {"Type": "k8s-file", "Config": null}, "NetworkMode": "bridge", "PortBindings": {"53/tcp": [{"HostIp": "192.168.153.15", "HostPort": "53"}, {"HostIp": "127.0.0.1", "HostPort": "53"}], "53/udp": [{"HostIp": "192.168.153.15", "HostPort": "53"}, {"HostIp": "127.0.0.1", "HostPort": "53"}]}, "RestartPolicy": {"Name": "", "MaximumRetryCount": 0}, "AutoRemove": false, "VolumeDriver": "", "VolumesFrom": null, "CapAdd": [], "CapDrop": [], "Dns": ["127.0.0.1", "1.1.1.1"], "DnsOptions": [], "DnsSearch": [], "ExtraHosts": [], "GroupAdd": [], "IpcMode": "private", "Cgroup": "", "Cgroups": "default", "Links": null, "OomScoreAdj": 0, "PidMode": "private", "Privileged": true, "PublishAllPorts": false, "ReadonlyRootfs": false, "SecurityOpt": [], "Tmpfs": {}, "UTSMode": "private", "UsernsMode": "", "ShmSize": 65536000, "Runtime": "oci", "ConsoleSize": [0, 0], "Isolation": "", "CpuShares": 0, "Memory": 0, "NanoCpus": 0, "CgroupParent": "", "BlkioWeight": 0, "BlkioWeightDevice": null, "BlkioDeviceReadBps": null, "BlkioDeviceWriteBps": null, "BlkioDeviceReadIOps": null, "BlkioDeviceWriteIOps": null, "CpuPeriod": 0, "CpuQuota": 0, "CpuRealtimePeriod": 0, "CpuRealtimeRuntime": 0, "CpusetCpus": "", "CpusetMems": "", "Devices": [], "DiskQuota": 0, "KernelMemory": 0, "MemoryReservation": 0, "MemorySwap": 0, "MemorySwappiness": 0, "OomKillDisable": false, "PidsLimit": 4096, "Ulimits": [{"Name": "RLIMIT_NOFILE", "Soft": 1048576, "Hard": 1048576}, {"Name": 
"RLIMIT_NPROC", "Soft": 4194304, "Hard": 4194304}], "CpuCount": 0, "CpuPercent": 0, "IOMaximumIOps": 0, "IOMaximumBandwidth": 0}}, "podman_actions": ["podman rm -f pihole", "podman run --name pihole --dns 127.0.0.1,1.1.1.1 --network pihole --ip 10.254.0.5 --publish 192.168.153.15:53:53/tcp --publish 192.168.153.15:53:53/udp --publish 127.0.0.1:53:53/tcp --publish 127.0.0.1:53:53/udp --hostname host.pihole.dom.local --privileged=True --env TZ=America/Chicago --env VIRTUAL_HOST=host.dom.local --env DNS1=1.1.1.1 --env DNS2=1.0.0.1 --env IPv6=False --env DNSSEC=True --env DNS_BOGUS_PRIV=False --volume /etc/pihole:/etc/pihole --volume /etc/pihole/dnsmasq.d:/etc/dnsmasq.d --volume /var/log/pihole:/var/log --healthcheck-command dig +norecurse +retry=0 @192.168.153.15 host.pihole.dom.local || exit 1 --detach=True --rootfs=False docker.io/pihole/pihole:latest"], "stdout": "41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3\\n", "stderr": "", "invocation": {"module_args": {"name": "pihole", "state": "present", "image": "docker.io/pihole/pihole:latest", "dns": ["127.0.0.1", "1.1.1.1"], "network": ["pihole"], "ip": "10.254.0.5", "publish": ["192.168.153.15:53:53/tcp", "192.168.153.15:53:53/udp", "127.0.0.1:53:53/tcp", "127.0.0.1:53:53/udp"], "hostname": "host.pihole.dom.local", "privileged": true, "env": {"TZ": "America/Chicago", "VIRTUAL_HOST": "host.dom.local", "DNS1": "1.1.1.1", "DNS2": "1.0.0.1", "IPv6": false, "DNSSEC": true, "DNS_BOGUS_PRIV": false}, "volume": ["/etc/pihole:/etc/pihole", "/etc/pihole/dnsmasq.d:/etc/dnsmasq.d", "/var/log/pihole:/var/log"], "healthcheck": "dig +norecurse +retry=0 @192.168.153.15 host.pihole.dom.local || exit 1", "executable": "podman", "detach": true, "debug": false, "force_restart": false, "image_strict": false, "recreate": false, "rootfs": false, "annotation": null, "authfile": null, "blkio_weight": null, "blkio_weight_device": null, "cap_add": null, "cap_drop": null, "cgroup_parent": null, "cgroupns": null, "cgroups": 
null, "cidfile": null, "cmd_args": null, "conmon_pidfile": null, "command": null, "cpu_period": null, "cpu_rt_period": null, "cpu_rt_runtime": null, "cpu_shares": null, "cpus": null, "cpuset_cpus": null, "cpuset_mems": null, "detach_keys": null, "device": null, "device_read_bps": null, "device_read_iops": null, "device_write_bps": null, "device_write_iops": null, "dns_option": null, "dns_search": null, "entrypoint": null, "env_file": null, "env_host": null, "etc_hosts": null, "expose": null, "gidmap": null, "group_add": null, "healthcheck_interval": null, "healthcheck_retries": null, "healthcheck_start_period": null, "healthcheck_timeout": null, "http_proxy": null, "image_volume": null, "init": null, "init_path": null, "interactive": null, "ipc": null, "kernel_memory": null, "label": null, "label_file": null, "log_driver": null, "log_opt": null, "memory": null, "memory_reservation": null, "memory_swap": null, "memory_swappiness": null, "mount": null, "no_hosts": null, "oom_kill_disable": null, "oom_score_adj": null, "pid": null, "pids_limit": null, "pod": null, "publish_all": null, "read_only": null, "read_only_tmpfs": null, "restart_policy": null, "rm": null, "security_opt": null, "shm_size": null, "sig_proxy": null, "stop_signal": null, "stop_timeout": null, "subgidname": null, "subuidname": null, "sysctl": null, "systemd": null, "tmpfs": null, "tty": null, "uidmap": null, "ulimit": null, "user": null, "userns": null, "uts": null, "volumes_from": null, "workdir": null}}}\r\n', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting 
O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to 192.168.153.15 closed.\r\n')
<192.168.153.15> ESTABLISH SSH CONNECTION FOR USER: admin
<192.168.153.15> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<192.168.153.15> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="admin")
<192.168.153.15> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=20)
<192.168.153.15> SSH: PlayContext set ssh_common_args: ()
<192.168.153.15> SSH: PlayContext set ssh_extra_args: ()
<192.168.153.15> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da)
<192.168.153.15> SSH: EXEC sshpass -d47 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="admin"' -o ConnectTimeout=20 -o ControlPath=/Users/derelam/.ansible/cp/7eaf7b87da 192.168.153.15 '/bin/sh -c '"'"'rm -f -r /home/admin/.ansible/tmp/ansible-tmp-1594417505.601862-23231-229416052397198/ > /dev/null 2>&1 && sleep 0'"'"''
<192.168.153.15> (0, b'', b'OpenSSH_8.1p1, LibreSSL 2.7.3\r\ndebug1: Reading configuration data /Users/derelam/.ssh/config\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: /Users/derelam/.ssh/config line 9: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 47: Applying options for *\r\ndebug2: resolve_canonicalize: hostname 192.168.153.15 is address\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 23072\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
changed: [h-msn-smbdc-05] => {
"actions": [
"recreated pihole"
],
"changed": true,
"container": {
"AppArmorProfile": "",
"Args": [
"/s6-init"
],
"BoundingCaps": [
"CAP_CHOWN",
"CAP_DAC_OVERRIDE",
"CAP_DAC_READ_SEARCH",
"CAP_FOWNER",
"CAP_FSETID",
"CAP_KILL",
"CAP_SETGID",
"CAP_SETUID",
"CAP_SETPCAP",
"CAP_LINUX_IMMUTABLE",
"CAP_NET_BIND_SERVICE",
"CAP_NET_BROADCAST",
"CAP_NET_ADMIN",
"CAP_NET_RAW",
"CAP_IPC_LOCK",
"CAP_IPC_OWNER",
"CAP_SYS_MODULE",
"CAP_SYS_RAWIO",
"CAP_SYS_CHROOT",
"CAP_SYS_PTRACE",
"CAP_SYS_PACCT",
"CAP_SYS_ADMIN",
"CAP_SYS_BOOT",
"CAP_SYS_NICE",
"CAP_SYS_RESOURCE",
"CAP_SYS_TIME",
"CAP_SYS_TTY_CONFIG",
"CAP_MKNOD",
"CAP_LEASE",
"CAP_AUDIT_WRITE",
"CAP_AUDIT_CONTROL",
"CAP_SETFCAP",
"CAP_MAC_OVERRIDE",
"CAP_MAC_ADMIN",
"CAP_SYSLOG",
"CAP_WAKE_ALARM",
"CAP_BLOCK_SUSPEND",
"CAP_AUDIT_READ"
],
"Config": {
"Annotations": {
"io.container.manager": "libpod",
"io.kubernetes.cri-o.Created": "2020-07-10T16:45:10.87563612-05:00",
"io.kubernetes.cri-o.TTY": "false",
"io.podman.annotations.autoremove": "FALSE",
"io.podman.annotations.init": "FALSE",
"io.podman.annotations.privileged": "TRUE",
"io.podman.annotations.publish-all": "FALSE",
"org.opencontainers.image.stopSignal": "15"
},
"AttachStderr": false,
"AttachStdin": false,
"AttachStdout": false,
"Cmd": null,
"CreateCommand": [
"podman",
"container",
"run",
"--name",
"pihole",
"--dns",
"127.0.0.1,1.1.1.1",
"--network",
"pihole",
"--ip",
"10.254.0.5",
"--publish",
"192.168.153.15:53:53/tcp",
"--publish",
"192.168.153.15:53:53/udp",
"--publish",
"127.0.0.1:53:53/tcp",
"--publish",
"127.0.0.1:53:53/udp",
"--hostname",
"host.pihole.dom.local",
"--privileged=True",
"--env",
"TZ=America/Chicago",
"--env",
"VIRTUAL_HOST=host.dom.local",
"--env",
"DNS1=1.1.1.1",
"--env",
"DNS2=1.0.0.1",
"--env",
"IPv6=False",
"--env",
"DNSSEC=True",
"--env",
"DNS_BOGUS_PRIV=False",
"--volume",
"/etc/pihole:/etc/pihole",
"--volume",
"/etc/pihole/dnsmasq.d:/etc/dnsmasq.d",
"--volume",
"/var/log/pihole:/var/log",
"--healthcheck-command",
"dig +norecurse +retry=0 @192.168.153.15 host.pihole.dom.local || exit 1",
"--detach=True",
"--rootfs=False",
"docker.io/pihole/pihole:latest"
],
"Domainname": "",
"Entrypoint": "/s6-init",
"Env": [
"PATH=/opt/pihole:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
"TERM=xterm",
"container=podman",
"S6_KEEP_ENV=1",
"ServerIP=0.0.0.0",
"VIRTUAL_HOST=host.dom.local",
"DNS1=1.1.1.1",
"VERSION=v5.0",
"IPv6=False",
"S6OVERLAY_RELEASE=https://github.com/just-containers/s6-overlay/releases/download/v1.22.1.0/s6-overlay-amd64.tar.gz",
"S6_BEHAVIOUR_IF_STAGE2_FAILS=2",
"DNSMASQ_USER=root",
"DNS2=1.0.0.1",
"PHP_ERROR_LOG=/var/log/lighttpd/error.log",
"S6_LOGGING=0",
"FTL_CMD=no-daemon",
"PHP_ENV_CONFIG=/etc/lighttpd/conf-enabled/15-fastcgi-php.conf",
"PIHOLE_INSTALL=/root/ph_install.sh",
"ARCH=amd64",
"DNSSEC=True",
"DNS_BOGUS_PRIV=False",
"TZ=America/Chicago",
"HOSTNAME=host.pihole.dom.local",
"HOME=/root"
],
"Healthcheck": {
"Interval": 30000000000,
"Retries": 3,
"Test": [
"CMD-SHELL",
"dig +norecurse +retry=0 @192.168.153.15 host.pihole.dom.local || exit 1"
],
"Timeout": 30000000000
},
"Hostname": "host.pihole.dom.local",
"Image": "docker.io/pihole/pihole:latest",
"Labels": {
"image": "pihole/pihole:v5.0_amd64",
"maintainer": "[email protected]",
"url": "https://www.github.com/pi-hole/docker-pi-hole"
},
"OnBuild": null,
"OpenStdin": false,
"StdinOnce": false,
"StopSignal": 15,
"Tty": false,
"User": "",
"Volumes": null,
"WorkingDir": "/"
},
"ConmonPidFile": "/var/run/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/conmon.pid",
"Created": "2020-07-10T16:45:10.87563612-05:00",
"Dependencies": [],
"Driver": "overlay",
"EffectiveCaps": [
"CAP_CHOWN",
"CAP_DAC_OVERRIDE",
"CAP_DAC_READ_SEARCH",
"CAP_FOWNER",
"CAP_FSETID",
"CAP_KILL",
"CAP_SETGID",
"CAP_SETUID",
"CAP_SETPCAP",
"CAP_LINUX_IMMUTABLE",
"CAP_NET_BIND_SERVICE",
"CAP_NET_BROADCAST",
"CAP_NET_ADMIN",
"CAP_NET_RAW",
"CAP_IPC_LOCK",
"CAP_IPC_OWNER",
"CAP_SYS_MODULE",
"CAP_SYS_RAWIO",
"CAP_SYS_CHROOT",
"CAP_SYS_PTRACE",
"CAP_SYS_PACCT",
"CAP_SYS_ADMIN",
"CAP_SYS_BOOT",
"CAP_SYS_NICE",
"CAP_SYS_RESOURCE",
"CAP_SYS_TIME",
"CAP_SYS_TTY_CONFIG",
"CAP_MKNOD",
"CAP_LEASE",
"CAP_AUDIT_WRITE",
"CAP_AUDIT_CONTROL",
"CAP_SETFCAP",
"CAP_MAC_OVERRIDE",
"CAP_MAC_ADMIN",
"CAP_SYSLOG",
"CAP_WAKE_ALARM",
"CAP_BLOCK_SUSPEND",
"CAP_AUDIT_READ"
],
"ExecIDs": [],
"ExitCommand": [
"/usr/bin/podman",
"--root",
"/var/lib/containers/storage",
"--runroot",
"/var/run/containers/storage",
"--log-level",
"error",
"--cgroup-manager",
"systemd",
"--tmpdir",
"/var/run/libpod",
"--runtime",
"crun",
"--storage-driver",
"overlay",
"--storage-opt",
"overlay.mountopt=nodev,metacopy=on",
"--events-backend",
"file",
"container",
"cleanup",
"41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3"
],
"GraphDriver": {
"Data": {
"LowerDir": "/var/lib/containers/storage/overlay/f5242a0db3466e8384c3401e67093aff4e626c077337f88010931e5eb86d9f3d/diff:/var/lib/containers/storage/overlay/c51c1d92b5e99db402eef258d4b76864edb0cb9646248309adec0c6357931532/diff:/var/lib/containers/storage/overlay/c8a1abf41cf7452d1a3170eed9ba64d4a877317392ed8dbd1bbdc851f774dfbc/diff:/var/lib/containers/storage/overlay/6920dd59a8463cdf9854e461d46549dce77bd56ecac59683ee5c3dabd1bf0005/diff:/var/lib/containers/storage/overlay/f2046601e3b3cad9e8e3cd63f4c90ff60bbcb29f21d0825338b40023c9b71b69/diff:/var/lib/containers/storage/overlay/5ec562341d602e646183e2499c8fe31672e573038be49bf2ff44ec272abe6696/diff:/var/lib/containers/storage/overlay/2028935954a9a6b336fe4efc61c57fca26ba1a6f69e53a7a85dcfecbceb256be/diff:/var/lib/containers/storage/overlay/97782f90571bec3a0ba37c9a8a8ff527e8ae19d81df4330185d004d1736bb973/diff:/var/lib/containers/storage/overlay/d7a84d67f8adae97b1628dbe5095fe1aff6a720954361b298be2b74301fdaeb4/diff:/var/lib/containers/storage/overlay/3d83d295905c2da0bf0f45d8d3782ec9c049fd0dcf4d867f14753ca22cceedac/diff:/var/lib/containers/storage/overlay/7b313ef723f5c7f02c3ec6efc515398eac3f81ff7ff7070cf20704efe810e34e/diff:/var/lib/containers/storage/overlay/81be3decf5b445909c2517d88d27c52a29f2e137e4739312e77d4a7225d603bf/diff:/var/lib/containers/storage/overlay/f66ed577df6ed16d061b2705132407525a370b4fd821840776cb8564fa9eb501/diff",
"MergedDir": "/var/lib/containers/storage/overlay/c715d66f945fecc9e1fe2e74b0a731e1dc5f79675b181fe6374952454bda6b4c/merged",
"UpperDir": "/var/lib/containers/storage/overlay/c715d66f945fecc9e1fe2e74b0a731e1dc5f79675b181fe6374952454bda6b4c/diff",
"WorkDir": "/var/lib/containers/storage/overlay/c715d66f945fecc9e1fe2e74b0a731e1dc5f79675b181fe6374952454bda6b4c/work"
},
"Name": "overlay"
},
"HostConfig": {
"AutoRemove": false,
"Binds": [
"/etc/pihole:/etc/pihole:rw,rprivate,rbind",
"/etc/pihole/dnsmasq.d:/etc/dnsmasq.d:rw,rprivate,rbind",
"/var/log/pihole:/var/log:rw,rprivate,rbind"
],
"BlkioDeviceReadBps": null,
"BlkioDeviceReadIOps": null,
"BlkioDeviceWriteBps": null,
"BlkioDeviceWriteIOps": null,
"BlkioWeight": 0,
"BlkioWeightDevice": null,
"CapAdd": [],
"CapDrop": [],
"Cgroup": "",
"CgroupMode": "private",
"CgroupParent": "",
"Cgroups": "default",
"ConsoleSize": [
0,
0
],
"ContainerIDFile": "",
"CpuCount": 0,
"CpuPercent": 0,
"CpuPeriod": 0,
"CpuQuota": 0,
"CpuRealtimePeriod": 0,
"CpuRealtimeRuntime": 0,
"CpuShares": 0,
"CpusetCpus": "",
"CpusetMems": "",
"Devices": [],
"DiskQuota": 0,
"Dns": [
"127.0.0.1",
"1.1.1.1"
],
"DnsOptions": [],
"DnsSearch": [],
"ExtraHosts": [],
"GroupAdd": [],
"IOMaximumBandwidth": 0,
"IOMaximumIOps": 0,
"IpcMode": "private",
"Isolation": "",
"KernelMemory": 0,
"Links": null,
"LogConfig": {
"Config": null,
"Type": "k8s-file"
},
"Memory": 0,
"MemoryReservation": 0,
"MemorySwap": 0,
"MemorySwappiness": 0,
"NanoCpus": 0,
"NetworkMode": "bridge",
"OomKillDisable": false,
"OomScoreAdj": 0,
"PidMode": "private",
"PidsLimit": 4096,
"PortBindings": {
"53/tcp": [
{
"HostIp": "192.168.153.15",
"HostPort": "53"
},
{
"HostIp": "127.0.0.1",
"HostPort": "53"
}
],
"53/udp": [
{
"HostIp": "192.168.153.15",
"HostPort": "53"
},
{
"HostIp": "127.0.0.1",
"HostPort": "53"
}
]
},
"Privileged": true,
"PublishAllPorts": false,
"ReadonlyRootfs": false,
"RestartPolicy": {
"MaximumRetryCount": 0,
"Name": ""
},
"Runtime": "oci",
"SecurityOpt": [],
"ShmSize": 65536000,
"Tmpfs": {},
"UTSMode": "private",
"Ulimits": [
{
"Hard": 1048576,
"Name": "RLIMIT_NOFILE",
"Soft": 1048576
},
{
"Hard": 4194304,
"Name": "RLIMIT_NPROC",
"Soft": 4194304
}
],
"UsernsMode": "",
"VolumeDriver": "",
"VolumesFrom": null
},
"HostnamePath": "/var/run/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/hostname",
"HostsPath": "/var/run/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/hosts",
"Id": "41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3",
"Image": "788fa841f00633e23f5c40775dad6844262b6546d021441626639984cd5f710a",
"ImageName": "docker.io/pihole/pihole:latest",
"IsInfra": false,
"LogPath": "/var/lib/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/ctr.log",
"LogTag": "",
"MountLabel": "system_u:object_r:container_file_t:s0:c70,c770",
"Mounts": [
{
"Destination": "/etc/pihole",
"Driver": "",
"Mode": "",
"Name": "",
"Options": [
"rbind"
],
"Propagation": "rprivate",
"RW": true,
"Source": "/etc/pihole",
"Type": "bind"
},
{
"Destination": "/etc/dnsmasq.d",
"Driver": "",
"Mode": "",
"Name": "",
"Options": [
"rbind"
],
"Propagation": "rprivate",
"RW": true,
"Source": "/etc/pihole/dnsmasq.d",
"Type": "bind"
},
{
"Destination": "/var/log",
"Driver": "",
"Mode": "",
"Name": "",
"Options": [
"rbind"
],
"Propagation": "rprivate",
"RW": true,
"Source": "/var/log/pihole",
"Type": "bind"
}
],
"Name": "pihole",
"Namespace": "",
"NetworkSettings": {
"Bridge": "",
"EndpointID": "",
"Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"HairpinMode": false,
"IPAddress": "",
"IPPrefixLen": 0,
"IPv6Gateway": "",
"LinkLocalIPv6Address": "",
"LinkLocalIPv6PrefixLen": 0,
"MacAddress": "",
"Networks": {
"pihole": {
"DriverOpts": null,
"EndpointID": "",
"Gateway": "10.254.0.1",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"IPAMConfig": null,
"IPAddress": "10.254.0.5",
"IPPrefixLen": 24,
"IPv6Gateway": "",
"Links": null,
"MacAddress": "56:8c:5b:90:6e:4c",
"NetworkID": "pihole"
}
},
"Ports": {
"53/tcp": [
{
"HostIp": "192.168.153.15",
"HostPort": "53"
},
{
"HostIp": "127.0.0.1",
"HostPort": "53"
}
],
"53/udp": [
{
"HostIp": "192.168.153.15",
"HostPort": "53"
},
{
"HostIp": "127.0.0.1",
"HostPort": "53"
}
]
},
"SandboxID": "",
"SandboxKey": "/var/run/netns/cni-cbf7cae6-e55e-60b5-1dd3-e0e89bfbc9ac"
},
"OCIConfigPath": "/var/lib/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/config.json",
"OCIRuntime": "crun",
"Path": "/s6-init",
"Pod": "",
"ProcessLabel": "",
"ResolvConfPath": "/var/run/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata/resolv.conf",
"RestartCount": 0,
"Rootfs": "",
"State": {
"ConmonPid": 137390,
"Dead": false,
"Error": "",
"ExitCode": 0,
"FinishedAt": "0001-01-01T00:00:00Z",
"Healthcheck": {
"FailingStreak": 0,
"Log": null,
"Status": "starting"
},
"OOMKilled": false,
"OciVersion": "1.0.2-dev",
"Paused": false,
"Pid": 137393,
"Restarting": false,
"Running": true,
"StartedAt": "2020-07-10T16:45:11.120085058-05:00",
"Status": "running"
},
"StaticDir": "/var/lib/containers/storage/overlay-containers/41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3/userdata"
},
"invocation": {
"module_args": {
"annotation": null,
"authfile": null,
"blkio_weight": null,
"blkio_weight_device": null,
"cap_add": null,
"cap_drop": null,
"cgroup_parent": null,
"cgroupns": null,
"cgroups": null,
"cidfile": null,
"cmd_args": null,
"command": null,
"conmon_pidfile": null,
"cpu_period": null,
"cpu_rt_period": null,
"cpu_rt_runtime": null,
"cpu_shares": null,
"cpus": null,
"cpuset_cpus": null,
"cpuset_mems": null,
"debug": false,
"detach": true,
"detach_keys": null,
"device": null,
"device_read_bps": null,
"device_read_iops": null,
"device_write_bps": null,
"device_write_iops": null,
"dns": [
"127.0.0.1",
"1.1.1.1"
],
"dns_option": null,
"dns_search": null,
"entrypoint": null,
"env": {
"DNS1": "1.1.1.1",
"DNS2": "1.0.0.1",
"DNSSEC": true,
"DNS_BOGUS_PRIV": false,
"IPv6": false,
"TZ": "America/Chicago",
"VIRTUAL_HOST": "host.dom.local"
},
"env_file": null,
"env_host": null,
"etc_hosts": null,
"executable": "podman",
"expose": null,
"force_restart": false,
"gidmap": null,
"group_add": null,
"healthcheck": "dig +norecurse +retry=0 @192.168.153.15 host.pihole.dom.local || exit 1",
"healthcheck_interval": null,
"healthcheck_retries": null,
"healthcheck_start_period": null,
"healthcheck_timeout": null,
"hostname": "host.pihole.dom.local",
"http_proxy": null,
"image": "docker.io/pihole/pihole:latest",
"image_strict": false,
"image_volume": null,
"init": null,
"init_path": null,
"interactive": null,
"ip": "10.254.0.5",
"ipc": null,
"kernel_memory": null,
"label": null,
"label_file": null,
"log_driver": null,
"log_opt": null,
"memory": null,
"memory_reservation": null,
"memory_swap": null,
"memory_swappiness": null,
"mount": null,
"name": "pihole",
"network": [
"pihole"
],
"no_hosts": null,
"oom_kill_disable": null,
"oom_score_adj": null,
"pid": null,
"pids_limit": null,
"pod": null,
"privileged": true,
"publish": [
"192.168.153.15:53:53/tcp",
"192.168.153.15:53:53/udp",
"127.0.0.1:53:53/tcp",
"127.0.0.1:53:53/udp"
],
"publish_all": null,
"read_only": null,
"read_only_tmpfs": null,
"recreate": false,
"restart_policy": null,
"rm": null,
"rootfs": false,
"security_opt": null,
"shm_size": null,
"sig_proxy": null,
"state": "present",
"stop_signal": null,
"stop_timeout": null,
"subgidname": null,
"subuidname": null,
"sysctl": null,
"systemd": null,
"tmpfs": null,
"tty": null,
"uidmap": null,
"ulimit": null,
"user": null,
"userns": null,
"uts": null,
"volume": [
"/etc/pihole:/etc/pihole",
"/etc/pihole/dnsmasq.d:/etc/dnsmasq.d",
"/var/log/pihole:/var/log"
],
"volumes_from": null,
"workdir": null
}
},
"podman_actions": [
"podman rm -f pihole",
"podman run --name pihole --dns 127.0.0.1,1.1.1.1 --network pihole --ip 10.254.0.5 --publish 192.168.153.15:53:53/tcp --publish 192.168.153.15:53:53/udp --publish 127.0.0.1:53:53/tcp --publish 127.0.0.1:53:53/udp --hostname host.pihole.dom.local --privileged=True --env TZ=America/Chicago --env VIRTUAL_HOST=host.dom.local --env DNS1=1.1.1.1 --env DNS2=1.0.0.1 --env IPv6=False --env DNSSEC=True --env DNS_BOGUS_PRIV=False --volume /etc/pihole:/etc/pihole --volume /etc/pihole/dnsmasq.d:/etc/dnsmasq.d --volume /var/log/pihole:/var/log --healthcheck-command dig +norecurse +retry=0 @192.168.153.15 host.pihole.dom.local || exit 1 --detach=True --rootfs=False docker.io/pihole/pihole:latest"
],
"stderr": "",
"stderr_lines": [],
"stdout": "41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3\n",
"stdout_lines": [
"41e026aeab31c56be6d264a645681c1a4489b55c73e2d583a00582f94d631fc3"
]
}
META: ran handlers
META: ran handlers
PLAY RECAP ********************************************************************************************************************************************************************************************************************************************************************************************************
h-msn-smbdc-05 : ok=4 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Additional environment details (AWS, VirtualBox, physical, etc.):
The actual task is part of a large role; I created a simple play to demonstrate. Hopefully I didn't miss anything.
Is this a BUG REPORT or FEATURE REQUEST? (leave only one on its own line)
/kind feature
Description
Create a podman module that logs in to container registries, similar to what `podman login` does on the command line.
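A sketch of what the requested task could look like (the module name `podman_login` and its parameters are hypothetical here, modeled on the `podman login` CLI options), along with a workaround that wraps the CLI directly until such a module exists:

```yaml
# Hypothetical desired interface, mirroring `podman login`:
- name: Log in to a container registry
  containers.podman.podman_login:
    registry: quay.io
    username: myuser
    password: "{{ registry_password }}"

# Workaround available today: call the CLI via the command module,
# feeding the password on stdin so it never appears in the process list.
- name: Log in via the podman CLI
  ansible.builtin.command:
    cmd: podman login --username myuser --password-stdin quay.io
    stdin: "{{ registry_password }}"
  no_log: true
  changed_when: false
```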