I'm acquiring frames from my built-in webcam with go-opencv. Grabbing an image from the camera takes ~50 ms; encoding it as a PNG takes ~300 ms. Encoding a JPEG is about 3x faster, but still far too slow.
Why is this running so slowly?
Note: I've written similar code in Node.js, run on the same machine, and had no problem hitting 30 fps even with additional image processing, so I don't think hardware is the bottleneck.
My code looks like this:
package main

import (
    "bytes"
    "fmt"
    "image/png"

    "github.com/lazywei/go-opencv/opencv"
)

func main() {
    camera := opencv.NewCameraCapture(0)
    if camera == nil {
        panic("Unable to open camera.")
    }
    defer camera.Release()

    for {
        if camera.GrabFrame() {
            img := camera.RetrieveFrame(1)
            if img != nil {
                frame := img.ToImage()
                buffer := new(bytes.Buffer)
                png.Encode(buffer, frame) // ~300 ms per frame
            } else {
                fmt.Println("Unable to capture frame")
            }
        }
    }
}
Disabling compression improves encoding performance by an order of magnitude on my machine. That may be a start if you don't want to look for png packages outside the standard library.
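Applied to your capture loop, it could look something like this (an untested sketch; I'm reusing your go-opencv calls as-is, and reusing one encoder and one buffer across frames):

package main

import (
    "bytes"
    "image/png"

    "github.com/lazywei/go-opencv/opencv"
)

func main() {
    camera := opencv.NewCameraCapture(0)
    if camera == nil {
        panic("Unable to open camera.")
    }
    defer camera.Release()

    // Reuse one encoder and one buffer instead of allocating per frame.
    enc := &png.Encoder{CompressionLevel: png.NoCompression}
    buffer := new(bytes.Buffer)

    for {
        if camera.GrabFrame() {
            if img := camera.RetrieveFrame(1); img != nil {
                buffer.Reset()
                enc.Encode(buffer, img.ToImage())
            }
        }
    }
}

The trade-off is output size: an uncompressed PNG of a 1920x1080 frame is roughly the size of the raw pixel data, so this only helps if CPU time is the bottleneck rather than bandwidth or disk.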
I also tried the Encoder's BufferPool (new in Go 1.9), but it didn't make a difference compared to an encoder with a nil BufferPool. Perhaps I did it wrong; the docs are unfortunately very, erm, concise.
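For reference, one way to wire up a BufferPool is to back it with a sync.Pool. This is just a minimal sketch of that shape (the pool type and the synthetic 1920x1080 image are only for illustration, not exactly what I benchmarked):

package main

import (
    "bytes"
    "image"
    "image/png"
    "log"
    "sync"
)

// pool implements png.EncoderBufferPool on top of a sync.Pool.
type pool struct {
    p sync.Pool
}

func (p *pool) Get() *png.EncoderBuffer {
    buf, _ := p.p.Get().(*png.EncoderBuffer) // nil on a cold pool; Encode then allocates
    return buf
}

func (p *pool) Put(buf *png.EncoderBuffer) {
    p.p.Put(buf)
}

func main() {
    enc := &png.Encoder{
        CompressionLevel: png.NoCompression,
        BufferPool:       &pool{},
    }

    img := image.NewRGBA(image.Rect(0, 0, 1920, 1080)) // stand-in for a camera frame
    var buf bytes.Buffer
    for i := 0; i < 10; i++ {
        buf.Reset()
        if err := enc.Encode(&buf, img); err != nil {
            log.Fatal(err)
        }
    }
}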
package main

import (
    "bytes"
    "image"
    "image/png"
    "os"
    "testing"
)

func BenchmarkPNG_Encode(b *testing.B) {
    img, buf := loadImage(b)
    b.ResetTimer()
    for i := 0; i < b.N; i++ {
        buf.Reset()
        png.Encode(buf, img)
    }
}

func BenchmarkPNG_Encoder(b *testing.B) {
    img, buf := loadImage(b)
    enc := &png.Encoder{}
    b.ResetTimer()
    for i := 0; i < b.N; i++ {
        buf.Reset()
        enc.Encode(buf, img)
    }
}

func BenchmarkPNG_Encoder_NoCompression(b *testing.B) {
    img, buf := loadImage(b)
    enc := &png.Encoder{
        CompressionLevel: png.NoCompression,
    }
    b.ResetTimer()
    for i := 0; i < b.N; i++ {
        buf.Reset()
        enc.Encode(buf, img)
    }
}

func loadImage(b *testing.B) (image.Image, *bytes.Buffer) {
    // foo.png PNG 1920x1053 1920x1053+0+0 8-bit sRGB 251KB 0.000u 0:00.000
    f, err := os.Open("foo.png")
    if err != nil {
        b.Fatal(err)
    }
    img, err := png.Decode(f)
    if err != nil {
        b.Fatal(err)
    }
    f.Close()

    buf := &bytes.Buffer{}
    // grow the buffer once
    (&png.Encoder{CompressionLevel: png.NoCompression}).Encode(buf, img)
    return img, buf
}
Again, this was on my machine with a roughly 1920x1080 pixel image -- a random screenshot; not sure how much this will differ from, say, a photo. YMMV.
$ go test -v -bench . -benchmem
goos: linux
goarch: amd64
BenchmarkPNG_Encode-8 10 119289121 ns/op 884964 B/op 38 allocs/op
BenchmarkPNG_Encoder-8 10 118001658 ns/op 884932 B/op 37 allocs/op
BenchmarkPNG_Encoder_NoCompression-8 100 13050664 ns/op 807156 B/op 212 allocs/op
It's interesting, though, that disabling compression results in many more allocations than compressing does.