I'm trying to make a simple task to show the file size for each file in an array of paths using the gulp-size
package, like so:
var gulp = require('gulp');
var size = require('gulp-size');

gulp.task('size', function() {
  gulp.src(bigArrayOfFilePathsFromAnotherModule)
    .pipe(size({ showFiles: true }));
});
When this runs, it gets part of the way through, but then the task finishes before all of the files are processed. It works just fine if I pipe them to a destination, but I'd rather not copy the files anywhere. Is there a way to pipe these files into a black hole so the task completes?
I've tried .pipe(gulp.dest('/dev/null')), but it errors out trying to mkdir /dev/null, which already exists.
Is there a good way to pipe a stream to nowhere?
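For reference, this is roughly the kind of sink I'm picturing (a sketch only, untested, and blackhole is just a placeholder name): a no-op object-mode Writable that accepts each Vinyl file and simply discards it.

var stream = require('stream');

// Hypothetical "black hole" sink: accept every file, write it nowhere.
var blackhole = new stream.Writable({
  objectMode: true,
  write: function(file, encoding, callback) {
    callback(); // discard the file and signal readiness for the next one
  }
});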
You should return your stream:
gulp.task('size', function() {
  return gulp.src(bigArrayOfFilePathsFromAnotherModule)
    .pipe(size({ showFiles: true }));
});
Otherwise, gulp assumes the work done by the task is synchronous and already complete by the time your anonymous function returns, so it marks the task finished before the stream has processed all of the files.
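Separately, if you do want the literal "pipe to nowhere" behaviour asked about in the question, here is a hedged sketch (not part of the answer above, and assuming gulp 3.x's callback-style task signature): drain the stream with resume() and signal completion from its 'end' event.

gulp.task('size', function(done) {
  gulp.src(bigArrayOfFilePathsFromAnotherModule)
    .pipe(size({ showFiles: true }))
    .on('end', done) // tell gulp the asynchronous work is finished
    .resume();       // flowing mode: the files drain to nowhere
});

Returning the stream, as shown above, is the simpler fix; the callback form is just another way to tell gulp when an asynchronous task has actually finished.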