I want to uncomment a line in the file sshd_config using Ansible, and I have the following working configuration:
- name: Uncomment line from /etc/ssh/sshd_config
  lineinfile:
    dest: /etc/ssh/sshd_config
    regexp: '^#AuthorizedKeysFile'
    line: 'AuthorizedKeysFile .ssh/authorized_keys'
However, this config only works if the line starts with #AuthorizedKeysFile; it won't work if the line starts with # AuthorizedKeysFile (one or more spaces between the # and the keyword).
How can I configure the regexp so that it ignores any number of spaces after the '#'?
I've tried adding another lineinfile task with a space after the '#', but this is not a good solution:
- name: Uncomment line from /etc/ssh/sshd_config
  lineinfile:
    dest: /etc/ssh/sshd_config
    regexp: '# AuthorizedKeysFile'
    line: 'AuthorizedKeysFile .ssh/authorized_keys'
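The direct fix to the pattern is to allow optional whitespace after the '#' with \s*. A minimal sketch of the task (untested, same file and line as above):

- name: Uncomment line from /etc/ssh/sshd_config
  lineinfile:
    dest: /etc/ssh/sshd_config
    # \s* tolerates any number of spaces/tabs after the '#';
    # making the '#' optional ('^#?\s*') would also match the already-uncommented line
    regexp: '^#\s*AuthorizedKeysFile'
    line: 'AuthorizedKeysFile .ssh/authorized_keys'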
An alternative is to uncomment the line with the replace module: you just have to remove the leading '#', which you can do with matching and grouping. The regexp matches every line starting with a '#' (plus any spaces) and captures the rest of the line in a group, and \1 in replace writes the captured text back without the comment marker.
You can do the same with lineinfile by using the backrefs parameter along with regexp. This should be used with state=present; if the regexp does not match any line, the file is not changed, and if it matches one or more lines, only the last matched line is replaced.
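Hedged sketches of both variants (the capture group and paths are assumptions, adjust them for your file):

# Using the replace module: capture everything after the '#' (and any spaces)
# and write it back without the comment marker.
- name: Uncomment AuthorizedKeysFile using replace
  replace:
    path: /etc/ssh/sshd_config
    regexp: '^#\s*(AuthorizedKeysFile.*)$'
    replace: '\1'

# The same idea with lineinfile and backrefs; if the regexp never matches,
# the file is left untouched.
- name: Uncomment AuthorizedKeysFile using lineinfile backrefs
  lineinfile:
    path: /etc/ssh/sshd_config
    regexp: '^#\s*(AuthorizedKeysFile.*)$'
    line: '\1'
    backrefs: yes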
I should caveat this with @techraf's point that 99% of the time a full template of the configuration file is better.
The times I have used lineinfile include weird and wonderful configuration files that are managed by some other process, or laziness about config I don't fully understand yet, which may vary by distro/version and whose variants I don't want to maintain... yet.
Go forth and learn more Ansible... it is great because you can keep iterating on it, from raw bash shell commands right up to best practice.
Still, it is good to see how to manage one or two settings just a little better with this:
tasks:
  - name: Apply sshd_config settings
    lineinfile:
      path: /etc/ssh/sshd_config
      # might be commented out, whitespace between key and value
      regexp: '^#?\s*{{ item.key }}\s'
      line: "{{ item.key }} {{ item.value }}"
      validate: '/usr/sbin/sshd -T -f %s'
    with_items:
      - key: MaxSessions
        value: 30
      - key: AuthorizedKeysFile
        value: .ssh/authorized_keys
    notify: restart sshd

handlers:
  - name: restart sshd
    service:
      name: sshd
      state: restarted
- validate: don't make the change if the change is invalid
- notify / handlers: the correct way to restart once, only at the end
- with_items (soon to become loop): if you have multiple settings
- ^#? : the setting might be commented out - see the other answer
- \s*{{ item.key }}\s : will not match other settings (i.e. SettingA cannot match NotSettingA or SettingAThisIsNot) - see the sketch after this list
- This still might clobber a comment like "# AuthorizedKeysFile - is a setting", which we have to live with because there could be a setting like "AuthorizedKeysFile /some/path # is a setting"... re-read the caveat.
And here is the full-template approach from the caveat:

- name: Configure sshd
  template:
    src: sshd_config.j2
    dest: /etc/ssh/sshd_config
    owner: root
    group: root
    mode: "0644"
    validate: '/usr/sbin/sshd -T -f %s'
  notify: restart sshd

handlers:
  - name: restart sshd
    service:
      name: sshd
      state: restarted
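For completeness, sshd_config.j2 is just your full config rendered as a Jinja2 template; a minimal, hypothetical sketch (the variable names here are made up) could be:

# sshd_config.j2 - managed by Ansible; local edits will be overwritten
AuthorizedKeysFile {{ sshd_authorized_keys_file | default('.ssh/authorized_keys') }}
MaxSessions {{ sshd_max_sessions | default(30) }}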
And if you are not being lazy about supporting all of your distros, see this tip:
- name: configure ssh
  template: src={{ item }} dest={{ SSH_CONFIG }} backup=yes
  with_first_found:
    - "{{ ansible_distribution }}-{{ ansible_distribution_major_version }}.sshd_config.j2"
    - "{{ ansible_distribution }}.sshd_config.j2"
https://ansible-tips-and-tricks.readthedocs.io/en/latest/modifying-files/modifying-files/
(needs to be updated to a loop using the first_found lookup)
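That update would look roughly like this (a sketch; sshd_templates is a made-up variable name and SSH_CONFIG is carried over from the snippet above):

- name: configure ssh
  template:
    src: "{{ item }}"
    dest: "{{ SSH_CONFIG }}"
    backup: yes
  loop: "{{ query('first_found', sshd_templates) }}"
  vars:
    # candidate templates, most specific first
    sshd_templates:
      - "{{ ansible_distribution }}-{{ ansible_distribution_major_version }}.sshd_config.j2"
      - "{{ ansible_distribution }}.sshd_config.j2"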