5800X3D Not going to be manually overclockable

Yeah, 20% is an assumed number, and pushing it, based on AMD's own claims. On top of whatever the normal increase from the 3600X to the 5800X is, they are claiming "in certain workloads" that the 3D V-Cache adds another 15% over… I forget what, actually. Either way, they are throwing wild numbers around. The reality for the thing you actually do might only be 5%, or you might hit the bingo card and get the 20%.

Hmmm. But DDR3 is only one generation behind DDR4, which the 5800X3D uses. Nothing on AM4 uses DDR5 yet. That's why I thought DDR2: that would be two generations.

I’m considering DDR5 to be current. That’s the other thing I referenced: the 5800X3D was announced simultaneously with the 7000 series, so it’s decidedly not going to be current when it’s released. I assume the 6000 series is mobile only? The announcement implied that, but it didn’t say it explicitly.

Yeah, but this topic is not about DDR5 CPUs or the 7000 series. It's about the 5800X3D specifically, by name, so that's what all the talk is referencing. It is still technically current within the limitations of everything else it uses, namely DDR4. Unlike Intel, AMD is not mixing DDR generations on motherboards, yet.

1 Like

Well it is, because my initial question was about why they'd release all of these cross-generation products within months of each other. That question was how we got to this discussion in the first place. It still doesn't make a lot of sense to me, but oh well. It's all academic to me until 2030 anyway.

1 Like

Yeah, that is the constant frustration with AMD: they have attempted on multiple occasions to bring the model numbers and generations into line, and every time they found a way to mess it up. So here we are again.

I think the only reason there is a 5800X3D announcement at all is that it's the test for something they will do later with something better. This was just a cheap, easy way to prove the concept, which is why it's only one chip, with the limitations on clocks.

Ding ding ding ding

EPYC with massive 3D V-Cache.

https://forum.level1techs.com/t/the-lounge-2020-two-edition/179938/12066?u=regulareel

Turns out Wendell had one…

Yeah, their claims were about 15% in regards to gaming.
I would assume it's likely a couple of games that will benefit from it.
But of course, we have to wait until the first reliable benchmarks roll out.

I'm just curious whether that additional cache is really going to be a significant benefit
across several different workloads.
Because if that is the case, then it's actually pretty interesting,
since those chips run at lower clocks.

2 Likes

Gosh darn it, the 5800X3D is the processor I need to accelerate engineering software.

Overclocking be damned, it doesn't matter at all for me.

All the gaming speculation in here is cute and all, but this is definitely geared at people with cache-heavy tasks like scientific computing, which just thrashes CPUs; even my 3900X is slow at it.

And if you talk to any sci-comp person, they'll tell you how clocks really don't matter as much as consistent cache and multicore performance, so making it not overclockable makes total sense to me. You want consistent, fast, repeatable performance.

The 7473X is definitely expensive, but I can see why AMD is creating these tailored solutions. Gaming has become the smallest, least powerful niche of their business. Don't expect them to cater to it beyond what you see.

5 Likes

As soon as about 50% of game development companies stop being a disgrace to the entire trade, we might see some benefits. Even to this day it is practically a miracle if a game scales well beyond four cores, even though there are many things the extra cores could easily be used for in a lot of genres. I am not seeing any serious memory optimization happening any time soon. Honestly, with many games these days you can clearly see that quality software development was nowhere near the requirements. I honestly believe that the reason at least some games can scale with more cache is that some of the people who wrote the engines actually did know what they were doing. Rant over.

I was able to reduce my rant because @PhaseLockedLoop covered the other part. Thanks!

5 Likes

I tried to avoid the rant myself, to be honest.

Look, the days of consumer overclocking fun are over. Computing is evolving, and you should push your workloads to evolve too, as much as you can. I realize that's hard to do.

2 Likes

I don't really care much for overclocking anymore either.
But if more cache means better overall performance at lower voltages and clocks,
then that is always a good thing.
We just have to wait and see in which kinds of workloads this actually shows up.

1 Like

Problem is, the workloads have to evolve to really utilize it, and current-era software development is extremely wasteful and quite often non-innovative, like reusing older engines a ton.

Spicy take: even if they were, who would implement those requirements? Any software engineer worth the title would laugh at the salaries presented for dealing with their crud

Irony:

I'm currently evaluating whether using a mature, still-maintained third-party library is worth it, because it's built on top of another library that seems to be aging poorly in terms of performance.

Features, readability, stability, etc… often trump performance when you can technically just throw more data centers at the problem.

Edit:

It's Flurl for .NET vs System.Text.Json + HttpClient, for the curious.
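For anyone who hasn't used both, the trade-off looks roughly like this. A minimal sketch, assuming a hypothetical `Todo` model and endpoint (neither is from this thread, and Flurl.Http is a third-party NuGet package): Flurl collapses into one call the request-plus-deserialize steps that raw `HttpClient` + `System.Text.Json` spell out by hand.

```csharp
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;
using Flurl.Http; // third-party NuGet package, not part of the BCL

// Hypothetical model for illustration only
public record Todo(int Id, string Title);

public static class Sketch
{
    // Flurl: the string extension sends the GET and deserializes the JSON body in one call.
    public static Task<Todo> WithFlurl() =>
        "https://example.com/api/todos/1".GetJsonAsync<Todo>();

    // Plain HttpClient + System.Text.Json: the same steps written out manually.
    public static async Task<Todo?> WithHttpClient()
    {
        using var client = new HttpClient();
        var json = await client.GetStringAsync("https://example.com/api/todos/1");
        return JsonSerializer.Deserialize<Todo>(
            json,
            new JsonSerializerOptions { PropertyNameCaseInsensitive = true });
    }
}
```

The raw version is dependency-free and gives you full control over serializer options and client lifetime, which is presumably the performance angle being weighed here; the Flurl version is shorter and easier to fake in tests, at the cost of riding on whatever its underlying stack does.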