I don't understand the question: everything maxed out of course
(13900K)
Man, IDK. My old 11900K is still faster at physics than your 7950X test. Maybe some tweaking is needed; make sure you set the XMP overclock on the RAM, or whatever it's called for AMD?

Kingston Fury 2x16 GB RAM, running 4600;
Gigabyte Gaming OC 4090
7950x
Unsure if my results are due to the 7950x being that much behind the intel, or if I need to do some tweaking
View attachment 189529
I gathered some data from this thread and made a benchmark chart. The data should be considered inaccurate, as some entries had just one test sample (like the 2070 Super or 980 Ti), while others, like the 4090, should be quite accurate (6 test samples for 1080p). I tried to eliminate obvious stray data, but the inaccuracy shows in the results: an RX 6800 XT outperforming a 6900 XT, for example, or a 3090 Ti performing better at 1440p than at 1080p. So this should only serve as an orientation. I first tried to make a CPU chart as well, but as expected, all benchmark scenes appear to be heavily GPU-bound.
View attachment 189757
(The collected FPS data is not the average, but the "mirror" scene data, as this scene seems to be the most GPU-bound.)
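For anyone who wants to redo this, the aggregation described above (averaging per-GPU samples and dropping obvious stray data) could be sketched like this. Note: the FPS numbers below are made up for illustration, and the median-based 30% cutoff is my own choice, not the method actually used for the chart.

```python
from statistics import median, mean

# Hypothetical FPS samples per GPU for one scene; NOT the real thread data.
samples = {
    "RTX 4090": [95.0, 99.0, 97.0, 140.0, 96.0, 98.0],  # 140 is a stray entry
    "RTX 2070 Super": [41.0],  # only one sample: low confidence
}

def aggregate(fps_list, tolerance=0.3):
    """Drop samples further than `tolerance` (30%) from the median, then average."""
    m = median(fps_list)
    kept = [f for f in fps_list if abs(f - m) <= tolerance * m]
    return mean(kept), len(kept)

for gpu, fps in samples.items():
    avg, n = aggregate(fps)
    print(f"{gpu}: {avg:.1f} FPS (from {n} of {len(fps)} samples)")
```

With a single sample there is nothing to filter against, which is exactly why the one-sample cards in the chart should be trusted less.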
Thank you, yeah; I am planning on switching out the RAM for something that is compatible with EXPO, to boost the speed a bit and see if that makes a difference. It seems like a much larger margin than all the other benchmarking I have seen between the two.
View attachment 189760
Hardware Performance comparison tables
The following tables are sorted results from MacGruber's VAM Benchmark collected here. Updated 22 April 2022. I try to avoid my opinion or hardware recommendations here; there is already too much 'brand XY is better' fanboyism on the internet. It should be clear that one can be happy with...
hub.virtamate.com
You have to decide for yourself. I did switch from a 5600X to a 5800X3D and VAM runs smoother. I have to say that all my VR games run way better now. So yes, I think the 3D cache is black magic or something. Still, if you don't need it, you could wait for the 7800X3D.

So looking at this... AM4 users on CPUs prior to Zen 3 maybe shouldn't bother paying extra for a 5800X3D when the 5800X is so close? It is a shame there seem to be no 5700X ratings in these charts, because it will probably be close, and even cheaper...
It works (officially) only with some games for which NVIDIA was optimizing this option. Back then (when I was on my old hardware), I tried a mod to trick my NVIDIA drivers... a crapware mod that someone was recommending. I don't remember if it was on Reddit or somewhere else, and it was made to force compatibility with any game you wanted to apply it to. Crapware (that mod). Sorry.
Anyone know if Resizeable BAR makes much of a difference?
I noticed today that my GPU utilization drops down to about 55% during Baseline 3. It's at about 96-99% on every other scene. Anything on my end that could be causing this?
It's normal. It's the most CPU- / physics-heavy scene.
There is one single thread on the CPU maxed out, so the GPU has to wait for the CPU more.
If you check Task Manager's CPU performance with the graph set to show "Logical processors" during Baseline 3, it is visible.
With its default settings you cannot see it; Task Manager only shows all-core utilization then.
There is always one graph high at a time.
It may switch / jump around a lot, as the scheduler instructs another CPU core to "do stuff" all the time.
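That pattern (one logical processor pegged at a time while the all-core number looks idle) can be illustrated with a small stdlib-only sketch. The utilization snapshots below are made up to mimic what Task Manager shows; on a real machine you would read them from Task Manager or a library such as psutil.

```python
# Hypothetical per-logical-processor utilization snapshots (percent),
# made up to mimic Baseline 3: one core near 100% each time, but a
# different one as the scheduler migrates the hot thread around.
snapshots = [
    [12, 98, 8, 15, 10, 9, 11, 14],   # core 1 pegged
    [10, 11, 99, 13, 9, 12, 10, 8],   # core 2 pegged
    [97, 9, 12, 10, 14, 11, 8, 13],   # core 0 pegged
]

for i, cores in enumerate(snapshots):
    overall = sum(cores) / len(cores)
    hottest = max(range(len(cores)), key=lambda c: cores[c])
    print(f"snapshot {i}: overall {overall:.0f}% (looks idle), "
          f"but core {hottest} is at {cores[hottest]}%")
```

The overall figure stays around 20%, which is why the default all-core view hides the single-thread bottleneck that is actually starving the GPU.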
I wonder how VAM will run on the newest laptops (13th-gen Intel CPUs and RTX 4000 GPUs). At first glance they seem very tempting, but I wonder if they will be able to reach RTX 2000- or 3000-series desktop-level performance in VAM. I would like to retire my laptop with a GTX 1060 this year.