Unfortunately the problem with that is that hardware has become so fast it runs into the engine's FPS limit(*) at low resolutions.
Your results and those of @trety, for example, are affected.
* = To verify the limit: load VAM at a low resolution, disable Vsync and load an empty scene. No matter which FPS counter you check (they have different precision), they all max out at around 300.
If a test shows a max FPS >= 300 (it usually maxes out at 309 FPS due to limited precision),
the result is bad / not comparable.
The CPU (or, to be more precise, a thread of the Unity engine) went to sleep at some point during the benchmark until it was allowed to continue with the next frame. It was waiting because somewhere in the Unity engine there is a check that puts this thread to sleep if it finishes too fast. That is usually a good thing, to avoid wasting energy, but not for a benchmark. It's like having a fake Vsync on, except at 300 Hz. (Vertical synchronization syncs the frames to the monitor frequency, so the framerate cannot be higher than the monitor's refresh rate.)
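The idea behind such a limiter can be sketched in a few lines of Python. This is not Unity's actual code, just a minimal illustration of the sleep-until-next-frame pattern; the 300 FPS cap and the empty "work" section are assumptions mirroring the description above:

```python
import time

TARGET_FPS = 300                  # the cap described above (assumed)
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~3.333 ms per frame

def run_frames(n):
    """Run n empty frames, sleeping away leftover time like the engine's limiter."""
    start = time.perf_counter()
    for _ in range(n):
        frame_start = time.perf_counter()
        # ... the actual script/physics/render work would happen here ...
        elapsed = time.perf_counter() - frame_start
        if elapsed < FRAME_BUDGET:
            # This sleep is what caps the benchmark: the thread idles
            # instead of starting the next frame immediately.
            time.sleep(FRAME_BUDGET - elapsed)
    return n / (time.perf_counter() - start)

print(f"measured ~{run_frames(60):.0f} FPS")  # capped near 300, no matter how fast the PC is
```

Because the sleep fills whatever time the real work left over, the measured FPS says almost nothing about how fast the hardware actually is.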
It may not be much, but it DID affect the result negatively. That is a fact. The actual performance could have been higher.
Better:
Either test at a higher resolution (easiest, most reliable and probably most precise), or
disable the FPS limit (which may have unknown side effects, and the benchmark will complain about that plugin being loaded) to get correct and comparable results.
Loss of precision:
In general, the higher the FPS, the lower the measuring precision, because timer resolution is limited.
A PC can measure milliseconds much more reliably than nanoseconds.
At 300 frames per second there are only 1 s / 300 ≈ 3.333 milliseconds to render one frame.
But the engine does not just measure the total frame time, it also measures its parts (script, physics, rendering and wait time). So
microsecond precision is already needed.
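To put numbers on that, here is the back-of-the-envelope calculation in Python (the 0.8 ms vs. 0.9 ms physics split is a made-up example):

```python
fps = 300
frame_budget_ms = 1000.0 / fps   # ~3.333 ms for the entire frame

# Script + physics + rendering + wait all share this budget, so the individual
# parts are fractions of a millisecond. Telling 0.8 ms of physics apart from
# 0.9 ms already requires timing at microsecond resolution.
print(f"{frame_budget_ms:.3f} ms per frame at {fps} FPS")
```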
There can also be rounding errors from using less precise data types (float instead of double) in the code.
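How badly a 32-bit float hurts is easy to demonstrate. The sketch below uses Python's standard `struct` module to round-trip values through IEEE-754 single precision (the same format as C#'s `float`); the one-hour timestamp is a hypothetical example:

```python
import struct

def f32(x):
    """Round-trip a Python float through IEEE-754 single precision (C's float)."""
    return struct.unpack('f', struct.pack('f', x))[0]

# A timestamp one hour into a session, kept in seconds as a 32-bit float.
# At this magnitude, neighbouring float32 values are ~0.24 ms apart, so a
# 0.1 ms difference between two timestamps is rounded away entirely:
print(f32(3600.0001) == f32(3600.0))  # True: the 0.1 ms never existed
print(f32(3600.0003) == f32(3600.0))  # False: 0.3 ms survives (barely)
```

A double keeps microsecond differences intact at these magnitudes, which is why mixing float into the timing path can quietly eat exactly the precision a benchmark needs.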
On top of that, there are various ways to implement an FPS counter, with varying precision.
For example, some FPS counters show an average over multiple past frames to make the number more stable/human-readable, while others just compute 1 / frame time = FPS.
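The two counter styles can be sketched like this in Python; the frame times and the window size are made-up example values:

```python
from collections import deque

def instant_fps(frame_time):
    """Style 1: just 1 / last frame time -- jumps around from frame to frame."""
    return 1.0 / frame_time

class AveragedFps:
    """Style 2: average over the last N frame times -- steadier, but lags behind."""
    def __init__(self, window=60):
        self.times = deque(maxlen=window)

    def update(self, frame_time):
        self.times.append(frame_time)
        return len(self.times) / sum(self.times)

# Three jittery frames (hypothetical frame times in seconds):
counter = AveragedFps(window=3)
for ft in (0.004, 0.002, 0.004):
    smoothed = counter.update(ft)

print(round(instant_fps(0.004)))  # 250: the naive counter swings with every frame
print(round(smoothed))            # 300: 3 frames rendered in 0.010 s
```

Both are "correct", but on the same frames they can display noticeably different numbers, which is one more reason results from different counters aren't directly comparable.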