I use vue-loader to build my Vue app with webpack. My application is served by Firebase.
For SEO purposes, I need to serve a robots.txt file at the root of my application (GET /robots.txt).
How can I configure webpack/vue-loader to serve this file?
This is my current webpack base config, ./config/index.js:
// see http://vuejs-templates.github.io/webpack for documentation.
var path = require('path')

module.exports = {
  build: {
    env: require('./prod.env'),
    index: path.resolve(__dirname, '../dist/index.html'),
    assetsRoot: path.resolve(__dirname, '../dist'),
    assetsSubDirectory: 'static',
    assetsPublicPath: '/',
    productionSourceMap: true,
    // Gzip off by default as many popular static hosts such as
    // Surge or Netlify already gzip all static assets for you.
    // Before setting to `true`, make sure to:
    // npm install --save-dev compression-webpack-plugin
    productionGzip: false,
    productionGzipExtensions: ['js', 'css'],
    // Run the build command with an extra argument to
    // view the bundle analyzer report after build finishes:
    // `npm run build --report`
    // Set to `true` or `false` to always turn it on or off
    bundleAnalyzerReport: process.env.npm_config_report
  },
  dev: {
    env: require('./dev.env'),
    port: 8080,
    autoOpenBrowser: true,
    assetsSubDirectory: 'static',
    assetsPublicPath: '/',
    proxyTable: {},
    // CSS Sourcemaps off by default because relative paths are "buggy"
    // with this option, according to the CSS-Loader README
    // (https://github.com/webpack/css-loader#sourcemaps)
    // In our experience, they generally work as expected,
    // just be aware of this issue when enabling this option.
    cssSourceMap: false
  }
}
Format and location rules: The robots.txt file must be located at the root of the website host to which it applies. For instance, to control crawling on all URLs below https://www.example.com/, the robots.txt file must be located at https://www.example.com/robots.txt.
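For reference, a minimal robots.txt that allows all crawlers to access everything looks like this (an empty Disallow means nothing is blocked):

```
# Allow every crawler to access the whole site
User-agent: *
Disallow:
```

Whatever its contents, the point above stands: the file must end up at the host root, i.e. /robots.txt, not under /static/ or a subdirectory.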
Vue CLI v3's build command copies anything in /public to your final dist/, so use the public/ folder for any additional files that you want in your final distribution.
If I am assuming correctly, you are building your app with the npm run build command from the webpack template, which creates a /dist folder that you then deploy to Firebase. If that is the case, you can simply add a robots.txt file to that dist folder, next to index.html. That should work.
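Note that a file copied into dist/ by hand will disappear on the next build. One way to automate it, assuming you install copy-webpack-plugin (this sketch uses its pre-v5 array-of-patterns API; the exact file names are illustrative, not your actual config), is to have webpack copy the file into dist/ on every build:

```js
// In your production webpack config (e.g. build/webpack.prod.conf.js) — a sketch
var CopyWebpackPlugin = require('copy-webpack-plugin')

module.exports = {
  // ...the rest of your existing production config...
  plugins: [
    // Copy robots.txt from the project root to dist/robots.txt,
    // so it is served at GET /robots.txt alongside index.html
    new CopyWebpackPlugin([
      { from: 'robots.txt', to: 'robots.txt' }
    ])
  ]
}
```

The Vue webpack template already uses this plugin to copy its static/ directory, so another option is simply dropping robots.txt into static/ — but check that it then ends up at the root of dist/ rather than under dist/static/.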
However, if better SEO is your aim, it may be better to prerender the page or use server-side rendering (SSR), depending on the complexity of your application.