"A company’s most expensive resource is now its employees’ time. Or in other words, you. It’s more important to get stuff done than to make it go fast."
Now take this sentence and apply it to the end user of your software.
You can only optimize variables (and outcomes) under your control. Coordination is hard, and that's marketing's job to figure out.
If the software is better than anything else available, then the user is optimizing their time within their own constraints. If the users complain that it's too slow and you lose users as a result, then go ahead and optimize. But slow and working is better than fast and imaginary.
Also, you write the software for X minutes and run it for Y. If cost(X) < cost(Y), e.g. because Y is very large for long-running software, then higher performance will save money. Of course, it can be difficult to know when this will be true, especially in advance...
As a software user, I completely agree with you! I get frustrated by badly performing software so often that I regularly wonder if I'd be happier doing something that relied less on shitty software.
I think you're misunderstanding the tone of the article. I mention that the bottleneck is usually not the language. It IS important to optimize, but it's important to understand where the real issue is. I later explain that things only have to be "good enough", and I also explain how you CAN optimize Python when it IS too slow for your users.
From their viewpoint: The biggest cost for the company that implements something is their employees' time, and that's likely true of your customers, as well. Your costs are easily measured by you; the customers' costs aren't. Reading the article, it's a little grating that their costs aren't given the same weight as yours.
From your viewpoint: You've got a large constant X (representing time taken doing something un-optimizable like I/O), a small constant Y (representing the "language multiplier"), and a small constant Z (representing CPU-heavy work). So you've got something like Time=N(X+Y*Z), which for values of Y and Z in your work, means that X is by far the largest timesink.
From my viewpoint: You're talking past each other, imagining different constants. It's also interesting to me that a program is still considered written in "Python" if the interesting parts are actually in C, and called from Python. You make points that I agree with about premature optimization, but I think the way you wrote your article opens it to easy misinterpretation, since everyone will view it through their workloads (which may vary strongly from yours). I would've limited the discussion explicitly to the kinds of use-cases you're thinking of.
Maybe this is saying the same thing, but I interpreted it as "If a handful of developers write an application that millions use, then saving a few seconds per user can add up."
I've seen this applied to the Linux kernel, where saving a few microseconds in a system call, when multiplied out to the billions of machines running Linux, saves hundreds of years of CPU time.
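The arithmetic behind that kind of claim is easy to sketch. Every figure below (machine count, syscall rate, per-call saving) is a rough, illustrative assumption, not a measured number:

```python
# All figures below are rough, illustrative assumptions, not measured data.
machines = 1e9                       # assumed number of machines running Linux
saving_per_call = 2e-6               # 2 microseconds saved per system call
calls_per_machine_per_day = 10_000   # assumed rate for one particular syscall

seconds_saved_per_year = (machines * calls_per_machine_per_day
                          * saving_per_call * 365)
cpu_years_per_year = seconds_saved_per_year / (365 * 24 * 3600)
print(f"~{cpu_years_per_year:.0f} CPU-years of compute saved per year")
```

Even with these conservative guesses, a microsecond-scale saving multiplied across the install base adds up to hundreds of CPU-years annually.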
With that said, most services don't have as many users as Linux.