I need a way to respond dynamically to the /robots.txt request.
And that's why I've decided to go with getServerSideProps:
https://nextjs.org/docs/basic-features/data-fetching#getserversideprops-server-side-rendering

"If you export an async function called getServerSideProps from a page, Next.js will pre-render this page on each request using the data returned by getServerSideProps."
export async function getServerSideProps(context) {
  return {
    props: {}, // will be passed to the page component as props
  }
}
Inside the context parameter we have the req and res objects. The response for robots.txt will depend on the req.headers.host value.
For example:

www.mydomain.com should render a production robots.txt file.
test.mydomain.com should render a test robots.txt file (that I'll use on test/staging deployments).

This is my current code:
pages/robots.txt.tsx
import React from "react";
import { GetServerSideProps } from "next";

interface Robots {
  ENV: "TEST" | "PROD";
}

export const getServerSideProps: GetServerSideProps<Robots> = async (context) => {
  const { req, res } = context;
  const { host } = req.headers;
  res.write("XXX"); // The actual robots.txt content, chosen based on `host`, goes here
  res.end();
  return { // This is unnecessary (but Next.js requires it to be here)
    props: {
      ENV: "TEST"
    }
  };
};

const Robots: React.FC<Robots> = (props) => { // This is also unnecessary (but Next.js requires it to be here)
  console.log("Rendering Robots...");
  return (
    <div>
      I am Robots
    </div>
  );
};

export default Robots; // This is also unnecessary (but Next.js requires it to be here).
It seems to be working. But the weird part is that Next.js demands that I export a component from that page, and returning a props: {} object from getServerSideProps is also required.

What is the way to go here? I'm basically using req and res from getServerSideProps to return something that is not a page. Is this an anti-pattern?
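For illustration, here's roughly how I'd fill in that host check (a minimal sketch; the test. prefix check and both robots.txt bodies are placeholders, not my real values):

// pages/robots.txt.tsx — sketch only; the `test.` prefix check and
// both robots.txt bodies below are placeholder assumptions.
import { GetServerSideProps } from "next";

const TEST_ROBOTS = "User-agent: *\nDisallow: /"; // keep crawlers off test/staging
const PROD_ROBOTS = "User-agent: *\nAllow: /";    // production policy

export const getServerSideProps: GetServerSideProps = async ({ req, res }) => {
  const isTest = (req.headers.host ?? "").startsWith("test."); // e.g. test.mydomain.com
  res.setHeader("Content-Type", "text/plain");
  res.write(isTest ? TEST_ROBOTS : PROD_ROBOTS);
  res.end();
  return { props: {} }; // still required by Next.js
};

// A page component still has to be exported, even though it never renders.
const Robots = () => null;
export default Robots;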
UPDATE

Yes, this is an anti-pattern. You should use rewrites. See the selected answer.
You can use an API route for the logic instead and have a rewrite map /robots.txt requests to /api/robots in your Next.js config file.
// next.config.js
module.exports = {
  // ...
  async rewrites() {
    return [
      {
        source: '/robots.txt',
        destination: '/api/robots'
      }
    ];
  }
}
// /pages/api/robots
export default function handler(req, res) {
  res.send('XXX'); // Send your `robots.txt` content here
}
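If you also need the host-dependent behaviour from the question, the handler itself can branch on req.headers.host. A minimal TypeScript sketch, assuming test-environment hosts start with test. and using placeholder robots.txt bodies:

// /pages/api/robots.ts — sketch of a host-dependent handler; the `test.`
// prefix check and both robots.txt bodies are assumptions.
import type { NextApiRequest, NextApiResponse } from "next";

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  const isTest = (req.headers.host ?? "").startsWith("test."); // e.g. test.mydomain.com
  res.setHeader("Content-Type", "text/plain");
  res.status(200).send(
    isTest
      ? "User-agent: *\nDisallow: /" // keep crawlers off test/staging
      : "User-agent: *\nAllow: /"    // production policy
  );
}

With the rewrite in place, requests to /robots.txt are served by this handler without any page component or props object being involved.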