While implementing page optimizations in our app, we want the ability to generate a separate CSS file per webpack chunk, to improve rendering performance for the first page of our application. To achieve that, we have been trying to use extract-text-webpack-plugin in combination with require.ensure, like so:
const $ = require('load-webpack-plugins')();
const path = require('path');

module.exports = {
  entry: { app: './src/app.js' },
  output: {
    // webpack 2+ requires output.path to be absolute
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].js'
  },
  devtool: 'source-map',
  module: {
    rules: [
      { test: /\.js$/, use: [{ loader: 'babel-loader' }] },
      {
        test: /\.css$/,
        use: $.ExtractTextPlugin.extract({
          fallback: 'style-loader',
          use: [{ loader: 'css-loader' }]
        })
      }
    ]
  },
  plugins: [
    new $.ExtractTextPlugin({ filename: '[name].[contenthash].css', allChunks: true }),
    new $.NamedModulesPlugin()
  ]
};
with app.js being:
console.log('this is app.js');

require.ensure([], require => {
  require('./a.css');
}, 'base');

require.ensure([], require => {
  require('./b.css');
}, 'first');

require.ensure([], require => {
  require('./c.css');
}, 'second');
and a.css being:
.a {
  color: red;
}
and b.css being:
.b {
  color: red;
}
and c.css being:
.c {
  color: red
}
The problem is that we are getting only one CSS file, dist/app.e2e9da337e9ab8b0ca431717de3bea22.css, containing the CSS from all three chunks:
.a {
  color: red;
}
.b {
  color: red;
}
.c {
  color: red
}
/*# sourceMappingURL=app.e2e9da337e9ab8b0ca431717de3bea22.css.map*/
How do we go about extracting one CSS file per webpack chunk (require.ensure) in this case? Is this even supported by extract-text-webpack-plugin?
PS: Here is an example repository demonstrating this issue -- https://github.com/osdevisnot/extract-text-demo
extract-text-webpack-plugin doesn't extract CSS at split points. With the default allChunks: false, CSS required inside an async chunk stays embedded in that chunk's JavaScript and is injected as <style> tags at runtime by the style-loader fallback. Passing { allChunks: true } makes it pull the CSS from every chunk into a single file emitted for the entry chunk, which is exactly the merged app.[contenthash].css you are seeing.
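To make the default behavior concrete, here is a minimal sketch of the same setup with allChunks left off (requiring the plugin directly rather than through load-webpack-plugins):

const path = require('path');
const ExtractTextPlugin = require('extract-text-webpack-plugin');

module.exports = {
  entry: { app: './src/app.js' },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].js'
  },
  module: {
    rules: [
      {
        test: /\.css$/,
        use: ExtractTextPlugin.extract({
          fallback: 'style-loader',
          use: 'css-loader'
        })
      }
    ]
  },
  plugins: [
    // allChunks defaults to false: only CSS reachable from the initial
    // chunk ends up in app.[contenthash].css. CSS required inside
    // require.ensure stays in the async JS chunks and is injected as
    // <style> tags by the style-loader fallback at runtime, so you
    // still get no separate per-chunk .css files.
    new ExtractTextPlugin('[name].[contenthash].css')
  ]
};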
Neither setting gives you one CSS file per chunk, so it doesn't solve your problem. For that, you can try extract-css-chunks-webpack-plugin, which was built specifically for this use case on top of extract-text-webpack-plugin.
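A minimal sketch of how that might look, based on the plugin's extract-text-style API for webpack 2/3 (the API changed in later major versions, so check the README for the version you install):

const path = require('path');
const ExtractCssChunks = require('extract-css-chunks-webpack-plugin');

module.exports = {
  entry: { app: './src/app.js' },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].js'
  },
  module: {
    rules: [
      {
        test: /\.css$/,
        // No style-loader fallback needed: the plugin handles CSS in
        // async chunks itself instead of leaving it in the JS.
        use: ExtractCssChunks.extract({ use: 'css-loader' })
      }
    ]
  },
  plugins: [
    // Emits one CSS file per chunk, so the named require.ensure chunks
    // should come out as base.css, first.css and second.css next to
    // their JS counterparts.
    new ExtractCssChunks()
  ]
};

Note that the browser won't load these per-chunk CSS files automatically when the corresponding async JS chunk loads; the plugin was designed with server-side rendering in mind, where a companion like webpack-flush-chunks works out which chunk CSS files to render as <link> tags.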