
How to calculate MobileNet FLOPs in Keras

import tensorflow as tf
import keras.backend as K
from keras.applications.mobilenet import MobileNet

run_meta = tf.RunMetadata()
with tf.Session(graph=tf.Graph()) as sess:
    K.set_session(sess)

    with tf.device('/cpu:0'):
        base_model = MobileNet(alpha=1, weights=None,
                               input_tensor=tf.placeholder('float32', shape=(1, 224, 224, 3)))

    opts = tf.profiler.ProfileOptionBuilder.float_operation()
    flops = tf.profiler.profile(sess.graph, run_meta=run_meta, cmd='op', options=opts)

    opts = tf.profiler.ProfileOptionBuilder.trainable_variables_parameter()
    params = tf.profiler.profile(sess.graph, run_meta=run_meta, cmd='op', options=opts)

    print("{:,} --- {:,}".format(flops.total_float_ops, params.total_parameters))

When I run the code above, I get the following result:

1,137,481,704 --- 4,253,864

This is different from the flops described in the paper.

MobileNet: https://arxiv.org/pdf/1704.04861.pdf

ShuffleNet: https://arxiv.org/pdf/1707.01083.pdf

How can I calculate the exact FLOPs described in the paper?

Y. Han asked Mar 28 '18

2 Answers

tl;dr You've actually got the right answer! You are simply comparing flops with multiply accumulates (from the paper) and therefore need to divide by two.

If you're using Keras, then the code you listed is slightly over-complicating things...

Let model be any compiled Keras model. We can arrive at the flops of the model with the following code.

import tensorflow as tf
import keras.backend as K


def get_flops():
    run_meta = tf.RunMetadata()
    opts = tf.profiler.ProfileOptionBuilder.float_operation()

    # We use the Keras session graph in the call to the profiler.
    flops = tf.profiler.profile(graph=K.get_session().graph,
                                run_meta=run_meta, cmd='op', options=opts)

    return flops.total_float_ops  # Total float ops (the "flops" of the model).


# .... Define your model here ....
# You need to have compiled your model before calling this.
print(get_flops())
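
For concreteness, here is a minimal usage sketch under the same assumptions as the question (TF 1.x with standalone Keras); the MobileNet instantiation and the compile settings are purely illustrative:

# Hypothetical usage sketch: any compiled Keras model works here.
from keras.applications.mobilenet import MobileNet

model = MobileNet(alpha=1, weights=None, input_shape=(224, 224, 3))
model.compile(optimizer='adam', loss='categorical_crossentropy')  # compile before profiling
print(get_flops())  # total float ops for the current Keras session graph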

However, when I looked at my own example (not MobileNet) on my machine, the printed total_float_ops was 2115, and simply printing the flops variable gave the following breakdown:

[...]
Mul                      1.06k float_ops (100.00%, 49.98%)
Add                      1.06k float_ops (50.02%, 49.93%)
Sub                          2 float_ops (0.09%, 0.09%)

It's pretty clear that the total_float_ops property takes into consideration multiplication, addition and subtraction.

I then went back to the MobileNet example. Skimming the paper, I found (by matching the number of parameters) the configuration that corresponds to the default Keras implementation:

[Table from the MobileNet paper listing Mult-Adds and parameters for each model configuration]

The first model in the table matches the result you have (4,253,864 parameters), and its Mult-Adds are approximately half of your flops result. So you do have the correct answer; you were simply mistaking flops for Mult-Adds (aka multiply-accumulates, or MACs).

If you want to compute the number of MACs, you simply have to divide the result from the above code by two.
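
As a quick sanity check, here is a small sketch (assuming the get_flops() helper defined above; the ~569M figure is the Mult-Adds value the MobileNet paper lists for the full-width 224 model):

# Sketch: convert the profiler's float-op count to Mult-Adds (MACs),
# assuming MACs = FLOPs / 2.
total_flops = get_flops()    # e.g. 1,137,481,704 for the MobileNet in the question
mult_adds = total_flops / 2  # ~568,740,852, close to the ~569M Mult-Adds in the paper
print("Approx. Mult-Adds: {:,.0f}".format(mult_adds))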


Important Notes

Keep the following in mind if you are trying to run the code sample:

  1. The code sample was written in 2018 and doesn't work with TensorFlow 2. See @driedler's answer for a complete example of TensorFlow 2 compatibility.
  2. The code sample was originally meant to be run once on a compiled model... For a better example of using this in a way that does not have side effects (and can therefore be run multiple times on the same model), see @ch271828n's answer.
Malcolm answered Sep 22 '22


This is working for me in TF-2.1:

import tensorflow as tf
# import os, tempfile  # only needed for the optional log-to-file lines below


def get_flops(model_h5_path):
    session = tf.compat.v1.Session()
    graph = tf.compat.v1.get_default_graph()

    with graph.as_default():
        with session.as_default():
            model = tf.keras.models.load_model(model_h5_path)

            run_meta = tf.compat.v1.RunMetadata()
            opts = tf.compat.v1.profiler.ProfileOptionBuilder.float_operation()

            # Optional: save printed results to file
            # flops_log_path = os.path.join(tempfile.gettempdir(), 'tf_flops_log.txt')
            # opts['output'] = 'file:outfile={}'.format(flops_log_path)

            # We use the Keras session graph in the call to the profiler.
            flops = tf.compat.v1.profiler.profile(graph=graph,
                                                  run_meta=run_meta, cmd='op', options=opts)

            return flops.total_float_ops
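
For completeness, a hedged usage sketch (the MobileNet model and the 'mobilenet.h5' filename are only illustrative; any Keras model saved in HDF5 format should work):

# Hypothetical usage: save a tf.keras model to HDF5, then profile the saved file.
model = tf.keras.applications.MobileNet(alpha=1.0, weights=None)
model.save('mobilenet.h5')  # illustrative filename

print(get_flops('mobilenet.h5'))  # total float ops; divide by 2 for Mult-Adds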

driedler answered Sep 20 '22