CompSci folks: any sense of what the performance hit would’ve been if we’d just never gone down the speculative execution path?
My sense is 5-10% on normal use, ~20% on heavy workloads, but that’s only based on some readings after Spectre/Meltdown. Any better sources?
This whole line of processor work increasingly just seems like more trouble than it’s worth.
Well, it’s not quite the same as if we’d never gone down the path (presumably the effort spent on speculation would’ve gone elsewhere), but this figure for the coarse mitigation of one particular vulnerability is significant:
“Testing conducted by Apple in May 2019 showed as much as a 40 percent reduction in performance with tests that include multithreaded workloads and public benchmarks.”