I believe that quantifying the productivity increase (extra working hours) is the most effective way to do this.
My case in point: I have a fast machine at home and a slow one at work. My estimate is that I would gain about 30 minutes a day of extra productivity at work if I had my home machine there, simply from spending less time waiting on all the tasks I do. (An extra 30 minutes a day is about 3 weeks a year.)
Problem: I need to measure this.
Is there a software utility that can monitor and scientifically quantify time taken by tasks on a machine?
Break it down into pieces you can quantify. For example: I compile every 4 minutes, and each compile would save 10 seconds. But after a while I get bored of waiting those ten seconds, so I go to Stack Overflow and I'm there for two minutes. Sometimes I'll start talking to Jim, and that takes 4 minutes for both of us. So 15 compiles an hour * 8 hours * 10 seconds = 20 minutes, + 5 trips to Stack Overflow = 30 minutes, + 4 conversations with Jim = 46 minutes for me and 16 for Jim.
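The tally above is easy to script, so you can plug in your own numbers. A minimal sketch, using the example figures (compiles per hour, Stack Overflow trips, chats with Jim) rather than real measurements:

```python
# Back-of-envelope estimate of daily time lost to a slow machine.
# All counts below are the example figures from the answer, not measurements.

compiles_per_hour = 15         # one compile every 4 minutes
hours_per_day = 8
seconds_saved_per_compile = 10

compile_wait_min = compiles_per_hour * hours_per_day * seconds_saved_per_compile / 60
so_trips_min = 5 * 2           # 5 trips to Stack Overflow, 2 minutes each
jim_chats_min = 4 * 4          # 4 conversations, 4 minutes each (for each of us)

my_total = compile_wait_min + so_trips_min + jim_chats_min
jims_total = jim_chats_min

print(f"Me: {my_total:.0f} min/day, Jim: {jims_total} min/day")
```

Swap in whatever you actually observe over a few days; the structure (events per day times cost per event) stays the same.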
The next step is to see whether it's worth buying a new computer. Round it off to an hour a day, and say you cost the company $100k per year in salary and benefits. One eighth of your hours are wasted, so that's $12,500 per year in lost productivity between you and Jim that could be recovered by getting you a faster computer.
But you're not going to throw the old computer out. The boss's new admin doesn't need a brand-new PC, so hand yours down instead of buying her a $1,000 machine. Your new computer costs $3,000, so it's really only costing the company $2,000.
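The dollar argument can be sketched the same way. This uses the figures above ($100k fully loaded cost, one eighth of hours wasted, a $3,000 machine, $1,000 saved by handing the old one down); the payback-period line is an extra step I've added to make the "no-brainer" concrete:

```python
# Translate wasted time into dollars, using the example figures above.

salary_and_benefits = 100_000
wasted_fraction = 1 / 8                      # ~1 hour of an 8-hour day

annual_waste = salary_and_benefits * wasted_fraction   # $12,500/year
new_machine = 3_000
hand_me_down_savings = 1_000                 # the admin's PC you no longer buy
net_cost = new_machine - hand_me_down_savings

payback_months = net_cost / (annual_waste / 12)
print(f"Net cost ${net_cost}, recovered in {payback_months:.1f} months")
```

A $2,000 net cost against $12,500 a year of waste pays for itself in about two months, which is the kind of number that gets purchase orders signed.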
It's not difficult to make it look like a no-brainer. The goal is to put it into dollars, but of course that doesn't guarantee you anything.