What? Why?

Background

Microbenchmarks and language wars are fun and all, but does it really matter whether it takes 5 seconds or 0.5 seconds to calculate a million things? Maybe. Most of the time, probably not. If you're wondering whether you should spend time making your code faster or maybe spend that time making your code easier to read and change later (or better yet, making your product easier to use or more valuable to your customers), you might consider how much it actually costs to simply throw processing time at the problem.

How it's calculated

The hours in the headline are just (programmerHourCost / instanceHourlyCost): the programmer hour cost you input divided by the hourly cost of the instance type you select. The programmer's rate defaults to the hourly equivalent of the first plausible median programmer salary I found on Google.

Try it again.