I'd like to add a block of text to my ElasticSearch configuration using blockinfile, but every time I run my playbook, the block gets added to the file -- even when it already exists. This is a problem because ElasticSearch doesn't just take the last value, it chokes on startup saying "you have multiple entries for this value" (or something similar).
My play looks like this:
- name: configure elasticsearch
  blockinfile:
    dest: /etc/elasticsearch/elasticsearch.yml
    marker: "## added by ansible configuration"
    block: |
      network.host: 0.0.0.0
      path.data: /var/lib
      path.logs: /var/log/elasticsearch
      path.repo: /home/chris/elastic-backups
    state: present
But after running my playbook a second time, my elasticsearch.yml file looks like:
## added by ansible configuration
network.host: 0.0.0.0
path.data: /var/lib
path.logs: /var/log/elasticsearch
path.repo: /home/chris/elastic-backups
## added by ansible configuration
network.host: 0.0.0.0
path.data: /var/lib
path.logs: /var/log/elasticsearch
path.repo: /home/chris/elastic-backups
## added by ansible configuration
Is there a way to only add the block if it does not exist yet?
The accepted answer is correct and straightforward. One thing to add: the marker string should not contain line breaks like \n, otherwise Ansible will keep adding the block on every run. That sounds like a bug to me.
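For illustration only, a hypothetical marker value with an embedded line break, which would defeat the matching as described:

marker: "## added by ansible configuration\n"   # the trailing \n keeps the marker from matching on later runs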
You should specify the {mark} keyword in the marker parameter:
marker: "## {mark} added by ansible (configuration elasticsearch)"
This will cause Ansible to insert a line at the beginning and at the end of the block, replacing {mark} with BEGIN and END accordingly:
## BEGIN added by ansible (configuration elasticsearch)
network.host: 0.0.0.0
path.data: /var/lib
path.logs: /var/log/elasticsearch
path.repo: /home/chris/elastic-backups
## END added by ansible (configuration elasticsearch)
Otherwise Ansible has no clue where the block starts and where it ends, so on every run it considers the block not present and inserts a new one.
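Putting it together, a corrected version of the task from the question could look like this (a sketch reusing the same values; only the marker line changes):

- name: configure elasticsearch
  blockinfile:
    dest: /etc/elasticsearch/elasticsearch.yml
    # {mark} is expanded to BEGIN/END, so blockinfile can find and replace its own block on re-runs
    marker: "## {mark} added by ansible (configuration elasticsearch)"
    block: |
      network.host: 0.0.0.0
      path.data: /var/lib
      path.logs: /var/log/elasticsearch
      path.repo: /home/chris/elastic-backups
    state: present

Running this task repeatedly should now report "ok" instead of "changed", and elasticsearch.yml will contain exactly one copy of the block between the BEGIN and END markers.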