 

Detecting HeartBeat Using WebCam?

I am trying to create an application that can detect your heartbeat using your computer's webcam. I have been working on the code for two weeks; this is what I have developed so far.

How does it work? Illustrated below ...

  1. Detecting the face using OpenCV
  2. Getting an image of the forehead
  3. Applying a filter to convert it into a grayscale image [you can skip this step]
  4. Finding the average intensity of green pixels per frame
  5. Saving the averages into an array
  6. Applying FFT (I have used the minim library)
  7. Extracting the heart beat from the FFT spectrum (here, I need some help)

Here, I need help extracting the heartbeat from the FFT spectrum. Can anyone help me? There is a similar application developed in Python, but I am not able to understand that code, so I am developing the same thing in Processing. Can anyone help me understand the part of the Python code where it extracts the heartbeat?

//---------- import required libraries -----------
import gab.opencv.*;
import processing.video.*;
import java.awt.*;
import java.util.*;
import ddf.minim.analysis.*;
import ddf.minim.*;
//----------create objects---------------------------------
Capture video; // camera object
OpenCV opencv; // opencv object
Minim       minim;
FFT         fft;
//IIRFilter filt;
//--------- Create ArrayList--------------------------------
ArrayList<Float> poop = new ArrayList(); 
float[] sample;
int bufferSize = 128;
int sampleRate = 512;
int bandWidth = 20;
int centerFreq = 80;
//---------------------------------------------------
void setup() {
  size(640, 480); // size of the window
  minim = new Minim(this);
  fft = new FFT( bufferSize, sampleRate);
  video = new Capture(this, 640/2, 480/2); // initializing video object
  opencv = new OpenCV(this, 640/2, 480/2); // initializing opencv object
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);  // loading the Haar cascade file for face detection
  video.start(); // start video
}

void draw() {
  background(0);
  // image(video, 0, 0 ); // show video in the background
  opencv.loadImage(video);
  Rectangle[] faces = opencv.detect();
  video.loadPixels();
  //------------ Finding faces in the video ----------- 
  float gavg = 0;
  for (int i = 0; i < faces.length; i++) {
    noFill();
    stroke(#FFB700); // yellow rectangle
    rect(faces[i].x, faces[i].y, faces[i].width, faces[i].height); // creating rectangle around the face (YELLOW)
    stroke(#0070FF); //blue rectangle
    rect(faces[i].x, faces[i].y, faces[i].width, faces[i].height-2*faces[i].height/3); // creating a blue rectangle around the forehead
    //-------------------- storing forehead white rectangle part into an image -------------------
    stroke(0, 255, 255);
    rect(faces[i].x+faces[i].width/2-15, faces[i].y+15, 30, 15);
    PImage img = video.get(faces[i].x+faces[i].width/2-15, faces[i].y+15, 30, 15); // storing the forehead area into an image
    img.loadPixels();
    img.filter(GRAY); // converting the captured RGB image to grayscale
    img.updatePixels();

    int numPixels = img.width*img.height;
    for (int px = 0; px < numPixels; px++) { // For each pixel in the video frame...
      final color c = img.pixels[px];
      final color luminG = c>>010 & 0xFF;      // extract the green channel (010 is octal for 8)
      final float luminRangeG = luminG/255.0;  // normalize to the range 0..1
      gavg = gavg + luminRangeG;
    }

    //--------------------------------------------------------
    gavg = gavg/numPixels;
    if (poop.size()< bufferSize) {
      poop.add(gavg);
    }
    else poop.remove(0);
  }
  sample = new float[poop.size()];
  for (int i=0;i<poop.size();i++) {
    Float f = (float) poop.get(i);
    sample[i] = f;
  }

  if (sample.length>=bufferSize) {
    //fft.window(FFT.NONE); 
    fft.forward(sample, 0);
    //    bpf = new BandPass(centerFreq, bandwidth, sampleRate);
    //    in.addEffect(bpf);
    float bw = fft.getBandWidth(); // returns the width of each frequency band in the spectrum (in Hz).
    println(bw); // returns 21.5332031 Hz for spectrum [0] & [512]

    for (int i = 0; i < fft.specSize(); i++)
    {
      // println( " Freq" + max(sample));
      stroke(0, 255, 0);
      float x = map(i, 0, fft.specSize(), 0, width);
      line( x, height, x, height - fft.getBand(i)*100);
     // text("FFT FREQ " + fft.getFreq(i), width/2-100, 10*(i+1));
     // text("FFT BAND " + fft.getBand(i), width/2+100, 10*(i+1));
    }
  }
  else {
    println(sample.length + " " + poop.size());
  }
}

void captureEvent(Capture c) {
  c.read();
}

asked Dec 05 '14 by B L Λ C K


1 Answer

The FFT is applied to a window of 128 samples.

int bufferSize = 128;

During the draw() method the samples are stored in an array until the buffer for the FFT is filled. After that the buffer is kept full: to make room for a new sample, the oldest one is removed. gavg is the average green-channel value of the (grayscale-filtered) forehead image.

gavg = gavg/numPixels;
if (poop.size()< bufferSize) {
  poop.add(gavg);
}
else poop.remove(0);
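
A side note: as written, the sample computed on a frame where the buffer is already full is discarded, and the buffer is only topped up again on the next frame, so the FFT window alternates between 127 and 128 samples. A small sketch of a window that always stays full, using the same poop list and bufferSize (a suggested change, not part of the original code):

if (poop.size() >= bufferSize) {
  poop.remove(0);  // drop the oldest sample...
}
poop.add(gavg);    // ...and always append the newest, so the window stays full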

Copying poop to sample:

sample = new float[poop.size()];
for (int i=0;i < poop.size();i++) {
    Float f = (float) poop.get(i);
    sample[i] = f;
}

Now it is possible to apply the FFT to the sample array:

fft.forward(sample, 0);

The code above only draws the spectrum; the heartbeat frequency still has to be calculated. To do that, find the band in the FFT with the maximum magnitude: the position (index) of that band corresponds to the heartbeat frequency.

int peakBand = 0; // index of the band with the largest magnitude
for (int i = 1; i < fft.specSize(); i++)
{
    if (fft.getBand(i) > fft.getBand(peakBand)) {
        peakBand = i;
    }
}

Then get the width of each frequency band, which is needed to convert the band index into a frequency:

float bw = fft.getBandWidth();

Converting the band index to a frequency in Hz:

float heartBeatFrequency = bw * peakBand; // frequency of the strongest band, in Hz
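
Putting the steps together, here is a minimal sketch of the whole estimation as one function, assuming the same Minim FFT object and sample buffer as in the question. The function name estimateHeartRateBPM, the skipping of band 0 (the DC component), and the conversion from Hz to beats per minute are additions for illustration, not part of the original answer.

float estimateHeartRateBPM(FFT fft, float[] samples) {
  fft.forward(samples, 0);  // compute the spectrum of the current window

  // Find the strongest band. Band 0 (the DC offset) is skipped because the
  // green averages are always positive, so it would otherwise always win.
  int peakBand = 1;
  for (int i = 2; i < fft.specSize(); i++) {
    if (fft.getBand(i) > fft.getBand(peakBand)) {
      peakBand = i;
    }
  }

  float frequencyHz = peakBand * fft.getBandWidth(); // band index -> Hz
  return frequencyHz * 60;                           // 1 Hz = 60 beats per minute
}

Once the buffer is full this can be called from draw(), for example println(estimateHeartRateBPM(fft, sample));. Note that the result is only as good as the sampling: fft.getBandWidth() is derived from the sampleRate passed to the FFT constructor, so that value should match the rate at which samples are actually collected (here, the camera frame rate).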

answered Sep 21 '22 by David Clifte