
How can I efficiently simulate image compression artifacts in Python?

I'm training a neural net on simulated images, and one thing that happens in real life is low-quality JPEG compression, which fuzzes up sharp edges in a particular way. Does anyone have an efficient way to simulate these effects, i.e. to create a corrupted version of a clean input? The images are grayscale and stored as NumPy arrays.

Mastiff asked Mar 18 '20 16:03


1 Answer

Thanks to the answers in the comments, here is a solution that saves the image as a JPEG and reads it back in, entirely in memory, using standard Python libraries.

import io
import imageio

# im is a 2D numpy array, q is the JPEG quality (0-100, lower = stronger artifacts)
def jpegBlur(im, q):
  buf = io.BytesIO()                                 # in-memory file object
  imageio.imwrite(buf, im, format='jpg', quality=q)  # encode to JPEG without touching disk
  s = buf.getbuffer()
  return imageio.imread(s, format='jpg')             # decode back to a numpy array
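
For example, a quick way to exercise it on a synthetic test image (the array below is purely illustrative, not from the question):

import numpy as np

clean = np.zeros((64, 64), dtype=np.uint8)  # black image
clean[:, 32:] = 255                         # sharp vertical edge
corrupted = jpegBlur(clean, 10)             # low quality -> visible blocking/ringing around the edge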

In my function I also pre- and post-scaled the image to convert from float64 to uint8 and back again, but this is the basic idea.
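
A minimal sketch of that wrapper, assuming the float64 images are already scaled to [0, 1] (the function name and value range here are illustrative, not part of the original answer):

import numpy as np

def jpegBlurFloat(im, q):
  im8 = np.clip(im * 255.0, 0, 255).astype(np.uint8)  # pre-scale: float64 [0, 1] -> uint8 [0, 255]
  out8 = jpegBlur(im8, q)                              # round-trip through JPEG compression
  return out8.astype(np.float64) / 255.0               # post-scale back to float64 [0, 1]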

Mastiff answered Sep 30 '22 20:09