And to add to the sibling reply by OneTimePetes: even if the simulation had to generate a nearly infinite amount of detail, the simulation's internal time could be slowed down, and the beings inside it would be unable to perceive the slowdown.
The problem is with finding an argument as to why, as a matter of principle, we would expect simulations to be accompanied by such fixes.
If you buy into the bootstrapping premises (which I don't!) that simulations beget more simulations, and that this runs into limits of computing power, there are clear and consistent principles driving that runaway process.
However, so far as I can tell, there is no comparable principle that would lead us to expect a standardization of simulations around image hashing or fiddling with the experience of time. Fixes of that sort are idiosyncratic details that either would or wouldn't hold in particular one-off scenarios; they don't generalize into any transcendent probabilistic implication that such experiences are most likely. That kind of generalization is exactly what would be needed for them to be responsive to the simulation argument.