I have the below playbook where the remote host is a user input; I then gather facts about the remote host and copy them to a local file:
---
- hosts: localhost
  vars_prompt:
    - name: hostname
      prompt: "Enter Hostname"
  tasks:
    - name: Add hosts to known_hosts file
      add_host: name={{ hostname }} groups=new
    - name: Check if Host is reachable
      shell: ansible -m ping {{ hostname }}
    - name: Remove existing remote hosts
      shell: ssh-keygen -R {{ hostname }}
    - name: Setup passwordless SSH login
      shell: ssh-copy-id -i ~/.ssh/id_rsa user@{{ hostname }}
    - name: Display facts
      command: ansible {{ groups['new'] }} -m setup
      register: output
    - copy: content="{{ output }}" dest=/var/tmp/dir/Node_Health/temp
...
I get the below error in the temp file:
Node_Health]# cat temp
{"start": "2016-06-17 09:26:59.174155", "delta": "0:00:00.279268", "cmd": ["ansible", "[udl360x4675]", "-m", "setup"], "end": "2016-06-17 09:26:59.453423", "stderr": " [WARNING]: provided hosts list is empty, only localhost is available", "stdout": "", "stdout_lines": [], "changed": true, "rc": 0, "warnings":
I also tried the playbook below, which gives the same error:
---
- hosts: localhost
  vars_prompt:
    - name: hostname
      prompt: "Enter Hostname"
  tasks:
    - name: Add hosts to known_hosts file
      add_host: name={{ hostname }} groups=new
    - name: Check if Host is reachable
      shell: ansible -m ping {{ hostname }}
    - name: Remove existing remote hosts
      shell: ssh-keygen -R {{ hostname }}
    - name: Setup passwordless SSH login
      shell: ssh-copy-id -i ~/.ssh/id_rsa user@{{ hostname }}

- hosts: new
  tasks:
    - name: Display facts
      command: ansible {{ groups['new'] }} -m setup
      register: output
    - local_action: copy content="{{ output }}" dest=/var/tmp/dir/Node_Health/temp
...
Any help will be appreciated.
You can use the --list-hosts option. It will show all the hosts from your inventory file that your pattern matches.
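For example, assuming an inventory file named hosts.ini (substitute your own path):

ansible all -i hosts.ini --list-hosts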
Ansible ships with an implicit localhost entry; this ensures that the proper connection and Python interpreter are used to execute your tasks locally. You can override the built-in implicit version by creating a localhost host entry in your inventory. At that point, all implicit behaviors are ignored; the localhost in inventory is treated just like any other host.
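For instance, a minimal explicit entry in an INI inventory might look like this (a sketch; the interpreter path is an assumption, adjust it to your system):

# explicit localhost entry, overriding the implicit one
localhost ansible_connection=local ansible_python_interpreter=/usr/bin/python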
Ansible assumes that you have all your hosts in an inventory file somewhere.
add_host only adds your host to the currently running Ansible, and that doesn't propagate to the copy of Ansible you call.
You're going to have to either:

- change the command to use an inline inventory list, like ansible all -i '{{ hostname }},' -m setup (more details on the use of -i '<hostname>,' here), or
- write the hostname out to a file and use that as your inventory file.

Both options are sketched below.
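Applied to your playbook, the first option turns the "Display facts" task into something like the following (a sketch; it writes output.stdout so the file holds the facts themselves rather than the whole registered structure):

    - name: Display facts
      command: ansible all -i '{{ hostname }},' -m setup
      register: output
    - copy: content="{{ output.stdout }}" dest=/var/tmp/dir/Node_Health/temp

For the second option, write the hostname to a scratch file first and point -i at it (/tmp/dynamic_inventory is just a placeholder name):

    - copy: content="{{ hostname }}" dest=/tmp/dynamic_inventory
    - name: Display facts
      command: ansible all -i /tmp/dynamic_inventory -m setup
      register: output

Note the trailing comma in -i '{{ hostname }},': that is what makes Ansible treat the argument as an inline host list rather than an inventory file path.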
Put your hosts in a hosts.ini file, with the following syntax:
[nodes]
node_u1 ansible_user=root ansible_host=127.0.0.1
node_u2 ansible_user=root ansible_host=127.0.1.1
node_u3 ansible_user=root ansible_host=127.0.2.1
node_u4 ansible_user=root ansible_host=127.0.3.1
node_u5 ansible_user=root ansible_host=127.0.4.1
To run Ansible against it, use: ansible-playbook -i hosts.ini <playbook.yml>
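A playbook can then target that group by name. A minimal sketch:

---
- hosts: nodes
  tasks:
    - name: Check connectivity to every node
      ping: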
You can also save your hosts file as /etc/ansible/hosts to avoid passing the inventory as a parameter; Ansible looks there by default. Then just run:
ansible-playbook <playbook.yml>
It looks to me like Ansible doesn't know where the inventory file is located. I used: ansible-playbook name.yml -i hostfile