Why does the default robots.txt file disable search engine indexing? And why can't I replace it with my own? What did I miss?
This is the default, and it is fixed:
User-agent: *
Disallow: /
According to a reply on Slack, robots.txt files on *.surge.sh subdomains are locked to prevent link farming... sad.
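To see what that default file actually means for crawlers, you can parse it with Python's standard `urllib.robotparser` (the subdomain below is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt served on *.surge.sh subdomains, as shown above
robots_lines = ["User-agent: *", "Disallow: /"]

rp = RobotFileParser()
rp.parse(robots_lines)

# "Disallow: /" under "User-agent: *" blocks every compliant
# crawler from every path on the site.
print(rp.can_fetch("Googlebot", "https://example.surge.sh/"))           # False
print(rp.can_fetch("*", "https://example.surge.sh/any/page.html"))      # False
```

So as long as that file is served, no well-behaved search engine will index anything on the subdomain. A custom domain (rather than a *.surge.sh subdomain) is the usual way around this.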