Measuring IO bottlenecks in The Last of Us Part 1
Referring to report qx-per-07 on Windows 11 23H2: running 3DMark 2026 stress tests, I watched the frame rate swing between 42 and 58 fps, with sickening drops as low as 15 fps, an 8% deviation from public benchmarks. I originally thought I was hitting a VRAM wall, but the logs revealed massive IO latency spikes instead.

I dug into the IO stress options in 3DMark and forced pre-fetching, while simultaneously switching the disk to High Performance write mode. After that, the frame rate stabilized around 55 fps. To be fair, I still hit the occasional 0.1 s freeze during heavy lighting transitions, which I attribute to the controller hitting its ceiling. Nevertheless, frame-time variance dropped by over 60%, eliminating the slideshow effect. The measurements strongly suggest that cache tuning was the key fix.
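The variance claim above can be reproduced from a raw frame-time capture. This is a minimal sketch, assuming a hypothetical log of per-frame render times in milliseconds (the sample values below are illustrative, not the actual capture); it derives the minimum fps and the frame-time standard deviation before and after the tuning.

```python
import statistics

def frame_stats(frame_times_ms):
    """Summarize a list of per-frame render times (milliseconds).

    Returns average fps, minimum fps, and the standard deviation of
    frame times -- the variance metric discussed above.
    """
    fps = [1000.0 / t for t in frame_times_ms]
    return {
        "avg_fps": statistics.mean(fps),
        "min_fps": min(fps),
        "frametime_stdev_ms": statistics.stdev(frame_times_ms),
    }

# Hypothetical captures: "before" is mostly ~18-24 ms frames with one
# 66 ms spike (a ~15 fps dip); "after" is a steadier 18-20 ms.
before = [18, 19, 24, 66, 20, 18, 22, 35]
after = [18, 18, 19, 20, 18, 19, 18, 20]

b, a = frame_stats(before), frame_stats(after)
reduction = 1 - a["frametime_stdev_ms"] / b["frametime_stdev_ms"]
print(f"min fps before: {b['min_fps']:.1f}, after: {a['min_fps']:.1f}")
print(f"frame-time stdev reduced by {reduction:.0%}")
```

Standard deviation of frame times is a better proxy for the "slideshow effect" than average fps, since a handful of long frames barely moves the average but dominates perceived stutter.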