I want to test an Ansible script using Vagrant. Everything works fine until it tries to do a rsync to the remote host:
- name: Install custom dev user settings
  local_action: command rsync -ave ssh roles/common/files/home/{{ item.name }} {{ ansible_ssh_user }}@{{ inventory_hostname }}:/#home/
  with_items: dev_users
  when: "{{ item.custom }} == True"
  tags:
    - dev_custom
    - dev_users
    - users
However it fails at this point. It seems to be trying to log in with a password, but I don't know why, as it should connect to the Vagrant box via SSH, right? (I have elided some information below with ... because it mentioned keys.)
<127.0.0.1> EXEC ['/bin/sh', '-c', 'mkdir -p $HOME/.ansible/tmp/ansible-1393178896.64-215542007508316 && chmod a+rx $HOME/.ansible/tmp/ansible-1393178896.64-215542007508316 && echo $HOME/.ansible/tmp/ansible-1393178896.64-215542007508316']
<127.0.0.1> REMOTE_MODULE command rsync -ave ssh roles/common/files/home/someUser [email protected]:/#home/
<127.0.0.1> PUT /tmp/tmpm3BnEW TO /home/mark/.ansible/tmp/ansible-1393178896.64-215542007508316/command
<127.0.0.1> EXEC /bin/sh -c 'sudo -k && sudo -H -S -p "[sudo via ansible, key=...] password: " -u root /bin/sh -c '"'"'echo SUDO-SUCCESS-...; /usr/bin/python /home/mark/.ansible/tmp/ansible-1393178896.64-215542007508316/command; rm -rf /home/mark/.ansible/tmp/ansible-1393178896.64-215542007508316/ >/dev/null 2>&1'"'"''
failed: [10.0.0.10] => (item={ ... }) => {"failed": true, "item": { ... }, "parsed": false}
invalid output was: [sudo via ansible, key=...] password:
[sudo via ansible, key=...] password:
Sorry, try again.
[sudo via ansible, key=...] password:
[sudo via ansible, key=...] password:
Sorry, try again.
[sudo via ansible, key=...] password:
[sudo via ansible, key=...] password:
Sorry, try again.
sudo: 3 incorrect password attempts
Any idea how to get around this?
I do deploy a custom key to the box via Ansible before I do this, e.g.:
- name: Place ansible authorized key
  authorized_key: user=root key="{{ lookup('file', 'root/.ssh/ansible_key.pub') }}"
  tags:
    - root
I finally discovered how to fix this problem. I knew it wasn't rsync itself, because I tested it at the command line:
rsync -ave ssh home/dataToSync [email protected]:/home/
and that worked fine; since I had uploaded a key to the Vagrant box, I didn't even have to log in. However, calling rsync from Ansible was failing. I tried specifying a password as Leucos suggested, but that didn't work. Then I wondered whether the problem was that Ansible was trying to sudo on my local box. To test that, I added
sudo: False
to my action, and it fixed the problem.
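For reference, here is a sketch of the fixed task, using the same variables as in the question. sudo: False disables privilege escalation for the local command, so Ansible no longer prompts for a sudo password on the control machine:
- name: Install custom dev user settings
  local_action: command rsync -ave ssh roles/common/files/home/{{ item.name }} {{ ansible_ssh_user }}@{{ inventory_hostname }}:/#home/
  sudo: False
  with_items: dev_users
  when: "{{ item.custom }} == True"
  tags:
    - dev_custom
    - dev_users
    - users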
Since Vagrant runs Ansible playbooks as the user vagrant, trying to rsync to root-owned directories will usually fail. Rather than monkeying around with assigning groups or trying to find some magically idempotent chmod/chown solution, the most robust approach I've found is simply to tell Ansible to run rsync with sudo on the remote side.
- synchronize:
    src: /local/site/
    dest: /var/www/site
    rsync_path: "sudo rsync"
Note that Ansible's synchronize module wraps rsync, which makes these tasks much cleaner to author.
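Under the hood, rsync_path maps to rsync's --rsync-path option, so the task above ends up running something roughly like this from the control machine (the user and host here are placeholders):
rsync -a /local/site/ [email protected]:/var/www/site --rsync-path="sudo rsync"
This works on standard Vagrant boxes because the vagrant user has passwordless sudo, so the remote rsync can be started as root without a password prompt.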
Mark,
If you did not deploy your SSH public key to the box (in /home/vagrant/.ssh/authorized_keys), it's quite normal to get asked for a password. Using 'vagrant' (the default password on standard boxes) should work.
If you push your key first, it will be easier to get things done afterwards. You can find a Vagrant-oriented example here.
It seems you set ansible_ssh_user properly, but make sure you also invoke your playbook with --ask-pass --sudo.
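A minimal invocation sketch, assuming a playbook named site.yml and an inventory file named hosts (both names are placeholders):
ansible-playbook -i hosts site.yml --ask-pass --sudo
Here --ask-pass prompts for the SSH password up front, and --sudo (the pre-2.0 flag) makes Ansible escalate with sudo on the remote host.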
As a side note, the line:
when: "{{ item.custom }} == True"
could be rewritten as:
when: item.custom
In Ansible 2.2.1.0 this worked for me:
- synchronize:
    mode: pull
    src: "/home/vagrant/dir1"
    dest: "/my_linux/dir1"
    recursive: yes
    delete: no
    times: yes
...and with this inventory (to use the ansible_ssh_private_key_file var):
[vagrant1]
192.168.77.4
[vagrant1:vars]
ansible_user=vagrant
ansible_ssh_private_key_file="/Users/Chilcano/.vagrant.d/insecure_private_key"
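For completeness, running the pull against that inventory might look like this (the inventory filename and playbook name are placeholders):
ansible-playbook -i vagrant_inventory pull_dir1.yml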
Hope it helps you.