ansible ssh prompt known_hosts issue

I'm running an Ansible playbook, and it works fine on one machine.

On a new machine, when I try it for the first time, I get the following error:

    17:04:34 PLAY [appservers] *************************************************************
    17:04:34
    17:04:34 GATHERING FACTS ***************************************************************
    17:04:34 fatal: [server02.cit.product-ref.dev] => {'msg': "FAILED: (22, 'Invalid argument')", 'failed': True}
    17:04:34 fatal: [server01.cit.product-ref.dev] => {'msg': "FAILED: (22, 'Invalid argument')", 'failed': True}
    17:04:34
    17:04:34 TASK: [common | remove old ansible-tmp-*] *************************************
    17:04:34 FATAL: no hosts matched or all hosts have already failed -- aborting
    17:04:34
    17:04:34 PLAY RECAP ********************************************************************
    17:04:34            to retry, use: --limit @/var/lib/jenkins/site.retry
    17:04:34
    17:04:34 server01.cit.product-ref.dev      : ok=0    changed=0    unreachable=1    failed=0
    17:04:34 server02.cit.product-ref.dev      : ok=0    changed=0    unreachable=1    failed=0
    17:04:34
    17:04:34 Build step 'Execute shell' marked build as failure
    17:04:34 Finished: FAILURE

This error can be resolved if I first go to the source machine (the one from which I'm running the Ansible playbook), manually ssh to the target machine as the given user, and answer "yes" to the host-key prompt, which adds the entry to the known_hosts file.
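For reference, the prompt in question is OpenSSH's standard first-connection host-key confirmation; it typically looks like this (the hostname, IP, and fingerprint below are illustrative):

    $ ssh kobaloki@server01.cit.product-ref.dev
    The authenticity of host 'server01.cit.product-ref.dev (10.1.2.3)' can't be established.
    ECDSA key fingerprint is SHA256:...
    Are you sure you want to continue connecting (yes/no)? yes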

Now, if I run the same Ansible playbook a second time, it works without an error.

Therefore, how can I suppress the prompt that SSH gives when it makes the first known_hosts entry for a given user (the known_hosts file in the ~/.ssh folder)?

I found I can do this if I use the following entries in the ~/.ssh/config file.

~/.ssh/config

    # For vapp virtual machines
    Host *
      StrictHostKeyChecking no
      UserKnownHostsFile=/dev/null
      User kobaloki
      LogLevel ERROR

i.e. if I place the above in the user's ~/.ssh/config file on the source machine and run the Ansible playbook against a new target for the first time, I won't be prompted to enter "yes", and the playbook will run successfully (without requiring the user to manually create a known_hosts entry from the source machine to the target/remote machine).
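Those config entries are just per-host defaults for the corresponding ssh options, so the same effect can be had for a single manual connection (a sketch; the hostname is illustrative):

    $ ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null kobaloki@server01.cit.product-ref.dev

With StrictHostKeyChecking=no the client accepts unknown host keys automatically, and UserKnownHostsFile=/dev/null discards them, so no real known_hosts entry is ever consulted or written.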

My questions:

1. What security issues should I take care of if I go the ~/.ssh/config way?
2. How can I pass the settings in the config file as parameters/options to Ansible on the command line, so that it runs the first time on a new machine without prompting for (or depending on) a known_hosts entry on the source machine for the target machine?

asked May 13 '15 by AKS


People also ask

What is known_hosts in .ssh folder?

Definition(s): A file associated with a specific account that contains one or more host keys. Each host key is associated with an SSH server address (IP or hostname) so that the server can be authenticated when a connection is initiated.
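Concretely, each line of the file pairs a server address with one of its public keys; a typical entry looks like this (the host, IP, and key are made up for illustration):

    example.com,93.184.216.34 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAI...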

How do I ignore ssh authenticity in Ansible?

Two options: the first, as you said in your own answer, is setting the environment variable ANSIBLE_HOST_KEY_CHECKING to False. The second is to set it in an ansible.cfg file, and that's a really useful option because you can set it either globally (at the system or user level, in /etc/ansible/ansible.cfg or ~/.ansible.cfg) or per project, in an ansible.cfg in the playbook's directory.

What is known_hosts old file?

The purpose of the known_hosts file is for the client to authenticate the server it is connecting to. The error occurs when a host's public key has changed since it was recorded. To fix this, we must remove the key causing the error.
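Removing the stale key is typically done with ssh-keygen, which also saves a backup of the previous file as known_hosts.old (the hostname is illustrative):

    $ ssh-keygen -R server01.cit.product-ref.dev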

Is known_hosts sensitive?

The known_hosts file contains the trusted public keys of hosts you have connected to in the past. These public keys can be obtained by anyone simply by trying to connect to those hosts, so the file is not a security risk per se.
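That said, the file does reveal which hosts you have connected to. If that is a concern, OpenSSH can store the hostnames hashed rather than in plain text; a sketch of the client config option, plus the command that hashes an existing file in place:

    # in ~/.ssh/config
    Host *
      HashKnownHosts yes

    $ ssh-keygen -H -f ~/.ssh/known_hosts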


2 Answers

The Ansible docs have a section on this. Quoting:

Ansible has host key checking enabled by default.

If a host is reinstalled and has a different key in ‘known_hosts’, this will result in an error message until corrected. If a host is not initially in ‘known_hosts’ this will result in prompting for confirmation of the key, which results in an interactive experience if using Ansible, from say, cron. You might not want this.

If you understand the implications and wish to disable this behavior, you can do so by editing /etc/ansible/ansible.cfg or ~/.ansible.cfg:

    [defaults]
    host_key_checking = False

Alternatively this can be set by the ANSIBLE_HOST_KEY_CHECKING environment variable:

    $ export ANSIBLE_HOST_KEY_CHECKING=False
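For a one-off run (from Jenkins, say, as in the question), the variable can also be set inline for just that command; a sketch, with the inventory and playbook paths being illustrative:

    $ ANSIBLE_HOST_KEY_CHECKING=False ansible-playbook -i inventory site.yml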

Also note that host key checking in paramiko mode is reasonably slow, therefore switching to ‘ssh’ is also recommended when using this feature.
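Switching the connection type mentioned there can be done per run with the -c/--connection flag (paths again illustrative):

    $ ansible-playbook -c ssh -i inventory site.yml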

answered Sep 23 '22 by Ben Whaley


To update the local known_hosts file, I ended up using a combination of ssh-keyscan (with dig to resolve hostnames to IP addresses) and the Ansible known_hosts module, as follows (filename ssh-known_hosts.yml):

    - name: Store known hosts of 'all' the hosts in the inventory file
      hosts: localhost
      connection: local

      vars:
        ssh_known_hosts_command: "ssh-keyscan -T 10"
        ssh_known_hosts_file: "{{ lookup('env','HOME') + '/.ssh/known_hosts' }}"
        ssh_known_hosts: "{{ groups['all'] }}"

      tasks:

      - name: For each host, scan for its ssh public key
        shell: "ssh-keyscan {{ item }},`dig +short {{ item }}`"
        with_items: "{{ ssh_known_hosts }}"
        register: ssh_known_host_results
        ignore_errors: yes

      - name: Add/update the public key in the '{{ ssh_known_hosts_file }}'
        known_hosts:
          name: "{{ item.item }}"
          key: "{{ item.stdout }}"
          path: "{{ ssh_known_hosts_file }}"
        with_items: "{{ ssh_known_host_results.results }}"

To execute this playbook, run:

    ANSIBLE_HOST_KEY_CHECKING=false ansible-playbook path/to/the/yml/above/ssh-known_hosts.yml

As a result, for each host in the inventory, keys for all supported algorithms will be added to or updated in the known_hosts file, one record per hostname,ipaddress pair, such as:

    atlanta1.my.com,10.0.5.2 ecdsa-sha2-nistp256 AAAAEjZHN ... NobYTIGgtbdv3K+w=
    atlanta1.my.com,10.0.5.2 ssh-rsa AAAAB3NaC1y ... JTyWisGpFeRB+VTKQ7
    atlanta1.my.com,10.0.5.2 ssh-ed25519 AAAAC3NaCZD ... UteryYr
    denver8.my.com,10.2.13.3 ssh-rsa AAAAB3NFC2 ... 3tGDQDSfJD
    ...

(Provided the inventory file looks like:

    [master]
    atlanta1.my.com
    atlanta2.my.com

    [slave]
    denver1.my.com
    denver8.my.com

)

As opposed to Xiong's answer, this properly handles the content of the known_hosts file.

This play is especially helpful when using a virtualized environment where the target hosts get re-imaged (and thus their ssh public keys change).

answered Sep 23 '22 by Stepan Vavra