Ryzen 3000 spec/price leaked (supposedly)

Clock for clock, the IPC advantage is 5%. Intel wins out with a near-20% clock-speed advantage when factoring in max overclocking.

1 Like

You were saying “real world”.

Real world is much greater than 5% if you are aiming for a high-refresh experience.

As of right now, “Clock for clock” is meaningless because AMD chips can’t compete in the frequency game.

2 Likes

Just wanted to mention: my stepson has a 2560x1440 165Hz Acer main monitor, a 1080p second monitor, a 6700K @ 4.8GHz and a 1070, and he can game, stream, and run YouTube and Discord all at the same time on that lowly 4-core/8-thread chip.

You do not need 12/16 threads to accomplish this despite what AMD will have you believe.

1 Like

… do you even understand the definition of the word “anecdotal”?

Here, let's start with this:

Everything you wrote above is anecdotal.

Let's break it down.

66.5% of people on Steam game at 1080p or better. That means 33.5% of people game at lower resolutions.

There are (roughly) 65 million people on Steam. It's higher, but I like round numbers, so we'll go with 65 million.

That means 21 million people play at a resolution below 1080p.
That means 39.65 million people play at 1080p.
That means that fewer than 2.6 million people play at 1440p.
And that means that fewer than 970,000 people play at 4K.
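Those counts fall straight out of the percentages. A minimal sketch of the arithmetic (the 65 million user count and the splits are the rounded figures quoted above, not official Valve data):

```python
# Back-of-the-envelope using the rounded figures quoted above.
STEAM_USERS = 65_000_000  # deliberately rounded; the real count is higher

below_1080p = STEAM_USERS * 0.335  # 33.5% game below 1080p -> ~21.8M
at_1080p = STEAM_USERS * 0.61      # 61% -> 39.65M
at_1440p = STEAM_USERS * 0.04      # ~4% -> 2.6M upper bound
at_4k = STEAM_USERS * 0.015        # ~1.5% -> just under 1M upper bound

for label, count in [("below 1080p", below_1080p), ("1080p", at_1080p),
                     ("1440p", at_1440p), ("4K", at_4k)]:
    print(f"{label}: {count / 1e6:.2f} million")
```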

Even if EVERY MACHINE YOU’VE EVER BUILT was for 4k, you wouldn’t even be a drop in the bucket of 4k players, much less anyone else.

Your experience is ENTIRELY (almost the exact definition of) anecdotal. Just because you build a lot of 1440p+ rigs doesn’t mean a lot of people play on 1440p+ rigs.

They provably don't. Your personal experience here is irrelevant. Trying to say "well, I build a lot of rigs for 1440p and streaming and stuff, and that means that that is what everyone is doing" is laughable at best. You do not represent even a fraction of a fraction of a single percent of machines.

You don't get to determine that, and you don't get to throw out facts to make your argument look better, because the facts make your argument look like dung.

Fact is, over a third of people game at sub-1080p resolutions.

The first thing you'll need is the forum ID of the person you want to ignore. You can get this easily even if the user is a troll and has hidden their activity. The easy way is to go to one of the user's posts, click their name, and click View Activity. Then, up in the URL, after their realm, add .json. The user ID will appear very early in the code. Here's an example that may be extremely useful to you:
https://us.forums.blizzard.com/en/wow/u/You-darrowmere.json
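If you'd rather script the lookup than eyeball the JSON, the idea looks roughly like this (a sketch only; the payload shape is an assumption based on Discourse's user endpoint, not a verbatim capture, so check the real response yourself):

```python
import json

# Hypothetical example of the kind of payload the .json endpoint returns.
# Discourse nests the profile under a "user" object; treat the exact
# field layout as an assumption, not gospel.
sample_payload = '{"user_badges": [], "user": {"id": 14751, "username": "You-darrowmere"}}'

data = json.loads(sample_payload)
user_id = data["user"]["id"]
print(user_id)  # the ID you drop into the CSS rule below
```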

The second thing you’ll need is a CSS browser extension. This will let you modify the code of the websites you are trying to display in your browser. I know you’re on a Mac for browsing, so you might be using Safari. I don’t know of any CSS browser extensions for Safari off the top of my head, but you will likely be able to find them among the dev extensions approved by Apple. For Chrome, I recommend Stylus:
https://chrome.google.com/webstore/detail/stylus/clngdbkpkpeebahjckkjfobafhncgmne

Once you have the extension installed, open it, and create a new style. Give the style a name, and add the following line of code: article[data-user-id="14751"] {display:none;} That code will work for You. You can replace the id number with whoever you’d like to ignore. The key with this code is to make sure that you are using straight quotes instead of curly quotes (yes, those are the technical terms). Copy/pasting my code may give you the curly quotes around the id number, and those will not work.

Anyway, once you've got the code added, you can set up which websites the modification applies to. I set my style to apply to all URLs starting with: https://us.forums.blizzard.com You can just let your code apply to all websites if you want.

And that should be it. Save your style. Enjoy your browsing. When someone you have ignored has posted, you will not see anything except a thicker break between posts. Your scrolling may be a little funky, as well, but simply reloading the page usually fixes this for me.

Now enjoy being able to completely ignore anyone who would fail Logic 101 to your heart’s content.

1 Like

I touch all kinds of machines, and the dual-core and HDD ones can get extremely frustrating; just doing maintenance takes 2-3 times longer. You saved money, but you're more frustrated and spending more on IT…

Luckily, for 90% of what has come in for upgrades over the last 2-ish years, we've swapped the drive out for an SSD.

I bet if you take the Steam statistics and look at newer AAA games, the people playing those are most likely 90% 1080p+ and at least half 1440p+.

Steam sells a LOT of games that don’t take much to run.

1 Like

When did this topic become about dual core machines?

As far as I’m aware, my argument was that 6/12 machines are more than sufficient for AAA titles and background tasks for the foreseeable future. At least, during the effective lifespan of any chips presently on the market anyway.

Interestingly enough, I have a video going up sometime next week with 4 Chrome tabs open, plus CPU-Z, GPU-Z, MSI Afterburner and Discord. It showed as much as a 15-20% decrease in performance compared to another video I did. And since I only do real-world testing, I may do a video soon to see the real performance lost when you are gaming with normal programs open. Because literally no one games with all programs closed. None of my customers do.

@Kag come on dude.

Check the up-to-date link: https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

So roughly 61% game at 1080p. I'm removing 1366x768, since those are almost all laptops.
1080p: 60.74%
768p: 13.91%
1080p ultrawide: 0.95%
1440p: 3.87%
1440p ultrawide: 0.46%
4K: 1.45%
Other above 1080p: 1.64%
Total gaming at 1080p or above, removing laptops: 83.02%

People gaming below 1080p on desktop: 16.98%. Laptops are not relevant, since we are talking about desktop CPU architecture.
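The renormalization step spelled out, using the survey shares quoted above (a sketch; note the recomputed total lands a couple of points under the 83.02% figure, so treat the exact number with some care):

```python
# Shares (in %) from the survey numbers quoted above.
at_or_above_1080p = 60.74 + 0.95 + 3.87 + 0.46 + 1.45 + 1.64  # desktop resolutions
laptop_768p = 13.91  # 1366x768, assumed to be laptops

# Drop the laptop share, then renormalize over the remaining population.
non_laptop_total = 100.0 - laptop_768p
desktop_share = 100.0 * at_or_above_1080p / non_laptop_total
print(f"{desktop_share:.2f}% of non-laptop users at 1080p or above")
```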

Also, there is a BIG difference between anecdotal and reality. I deal with reality and what real people are doing: people who are making real AMD vs. Intel decisions. Not people whose parents buy an HP Omen with either a 2600 or an 8400. Those people are not important in this conversation; they are not making these decisions.

So when you say anecdotal evidence, I say stop using irrelevant information in an argument; it adds a false sense of actuality to the conversation. Anecdotal evidence would be if I built 2 PCs and assumed the rest of the world follows those 2 trends.

But when I have personally used every single piece of hardware that has come out over the past 7 years, with performance results, hundreds of customers and 100+ videos, that is research and fact, not just a one-off personal experience. You mention that you've done all of this work, but I haven't seen any evidence backing that up. If you did build for thousands of esports teams and streamers, then you would not be saying that 99% of people use 1 monitor in the custom PC space.

The only point in this conversation is: what do people who are ordering a custom-built PC, or building their own, do with their PCs, and what hardware will maximize performance for the budget?

Because most people you're referencing are not using high-end CPUs, which is where the actual difference in performance lies. In my video going up on the 12th, you will see that the i3-8100 has a 4.75% advantage over the Ryzen 3 1300X. Intel has literally a 5% gain over AMD right now, clock for clock.

The people buying 4GHz+ Intel chips are streaming, using multiple monitors, recording and multitasking, so it is very important to consider those things. You know who is using single monitors and not streaming? People who buy locked Intel systems in HP Omens, where the performance difference between Intel and AMD is nearly within the margin of error.

Quit shilling your god-awful YouTube channel. Seriously. No one cares, or takes you remotely seriously. Your testing methods are bush-league bullcrap, you have no screen presence, no ability to put together an interesting script, nada. To be fair… I don't have any of those things either, which is why you don't see me trying to shill myself on YouTube. You aren't the next Paul, Linus, Jay, Kyle, et al.

Also… a 20% performance loss running programs that don't take more than 5% CPU on a single core (other than perhaps Chrome, depending on what you've got open; I've seen some web apps eat up 2-3 cores)… what the hell did you do to that computer?

I don't even get that usage on a freaking CHROMEBOOK.

I used YOUR numbers.

… yes, they are probably laptops. Are you trying to claim people don't game on laptops?

You don't get to simply REMOVE data you find inconvenient to your argument. It doesn't work that way.

“here, let me remove this inconvenient fact to prop up my bullcrap argument”.

No, we’re really not. You keep shifting the goalposts every time someone shuts your weak-butt arguments down.

The discussion was about "you can't play on quad cores or even do daily-driving tasks on them"… which is bunk.

Correct. Anecdotal evidence is what you're basing your ENTIRE argument on. Literally. You're conflating your absurdly limited (statistically irrelevant) experience with "reality".

No, you're dealing with what the extremely limited number of people you deal with are doing. Even if you built 10,000 computers a year… that'd still be statistically irrelevant. It's completely anecdotal.

Oh, hey, let's throw even more data out because it makes your argument look bad. Man, it's easy to "win" arguments when you exclude any data that doesn't support what you're saying.

They are, actually. Because for every computer you build for someone “making a decision”, 99 other people (more, honestly) are deciding on which pre-built system to buy their kid to game on.

Which is… exactly what you did. You could build 10 rigs a day and STILL not have worthwhile data. We're talking about a sample set here in the TENS. OF. MILLIONS.

Nothing I said was irrelevant. I used YOUR data. The argument being made was that somehow a quad-core system with a decidedly midrange GPU, producing very playable framerates (and the moment it moved past a lower-midrange GPU, 60+ fps at all times), was just "not enough for the average gamer".

I debunked the crap out of that.

You literally just defined "anecdotal". Whole HUNDREDS of customers? HUNDREDS? OH. MY. GOD. Stop the presses. A sample size of hundreds!

That's not even statistically relevant. That's the very definition of anecdotal.

Uhh… I've also never claimed my own anecdotal experience means anything. So… what's your point? I don't owe you a damn thing.

I said "hardware for tournaments"; I've never said I built for esports teams. I have a friend who provides hardware for conventions, tournaments, and other events. I build a lot of his machines for him. Quite honestly, we've never put anything past low-midrange hardware in any of them. But I have built hundreds of machines over the last 6 or 7 years. That sounds more impressive than it really is, because in batches of 20-50 they are all absolutely identical. It's really more of an assembly-line process at that point. And we deliberately choose easy-to-work-in cases. We've also sold a number of batches/sets to conventions/events we aren't close enough to service ourselves (because the cost for us to drive out there and put on the event would be more than just selling them the rigs).

Never said I built for streamers. Not once. Ever.

Because I don't. The streamers we employ bring their own rigs.

Yeah, I would, because every single one of those people doesn't add up to 200,000 people. Professional esports guys are such an infinitesimally small number of people that it's absurd you're even talking about it. Even if you say there are 10x that number of wannabes on Twitch… that's still 2 million. Out of a market of over 70 million (if we take Steam as gospel) or well over 100 million (if we take the number of Fortnite installs as gospel). Again… not a relevant market segment. Tens of millions of people game on pre-built systems. The entire boutique crowd plus the build-your-own crowd don't add up to even 5% of the people gaming on their PCs. If that.

Uh… you quite literally just made that up out of whole cloth. It was NEVER about that. Learn to read, man. Also, YOU don't dictate where the conversation goes.

I'm not entirely sure why any of this even matters, as it is not germane to the argument.

The argument was (I'll recap for you, since you seem to have forgotten):

1 - If these new AMD chips really are 6/12 at the R3 price point, it's massive overkill. You don't need anything nearly that powerful, or with that many cores, for that market segment. You can still game at 1080p/60/Ultra with a quad core and a top-midrange GPU.

2 - You jumps in with his usual blather about "nuh uh, gotta have likez a bajillion coarz to even get OK performance" - because for one he's a fool, and two, he seems to think "acceptable performance" for the average person gaming on their PC is 100+ fps at a resolution higher than 1080p.

3 - Links are posted debunking his garbage. He counters with links that… support the links debunking his garbage. The argument then turns towards "but the market is moving rapidly to super many coarz and higher-than-1080p rez".

4 - Links and data are posted showing that for the lie that it is (you even contributed one, showing that it took… 7-ish years for 1440p uptake to even hit 4.5%?). 1080p is by FAR the most prevalent resolution people game at, and will be for a number of additional years - well beyond the expected life of any low-to-midrange rig you build right now.

5 - Then you jump in, trying to claim that the number of people gaming on their PCs at resolutions higher than 1080p is somehow secretly much higher than the actual statistics show, because you know better thanks to your anecdotal sample size of hundreds.

You could have built hundreds of PCs a year for the last 15 years… it still wouldn't be anything other than anecdotal experience. Just like my experience is utterly anecdotal. Yeah, I've built hundreds of machines… but so what? There are tens of millions sold a year.

And what CPUs AMD offers in its product stack does matter.

The reason Sal and I are so skeptical and think it's stupid is that no one buying a low-end machine needs 6 cores/12 threads for anything they do. Even all at once. CPU/resource usage of things is going DOWN, not up, because so much stuff is being made to work on phones and tablets. So you end up with a part that is very cheap and cannibalizes sales of better parts. I would never bother with anything but the lower-end 6/12 part if I built a hackintosh to replace my daily driver. It would do everything I needed for 5+ years, no problem.

So that would have cost AMD the potential sale of a higher-end SKU from me.

It isn't somehow magically different in the land of OEMs and SIs. If the cheapest offering still delivers more performance than needed, they'll never use anything but that.

Offering that kind of chip at that low a price is a great way to go bankrupt. You have to do something to drive customers to your higher-priced SKUs… and "customers" doesn't just mean the nearly irrelevant schlubs like us who build our own rigs…

It most definitely includes OEMs and SIs.

1 Like

The numbers are skewed because these may be machines that don't play games at all, or they are 2nd/3rd machines where only really, really LIGHT gaming goes on. I've used a server before just to keep downloads synced between multiple machines. Steam has a bunch of games that will run on a toaster; I'd guess the majority of the extremely-low-resolution machines are running those. If you look at the machines running demanding games, the picture changes.

Like every other leak that has promised me the moon and then failed to deliver, I'll believe it when I have the moon in my hands. Until then, it's nothing but meaningless, made-up fantasy numbers extrapolated from wishful thinking.

The CPUs fail to pass the common-sense test, and the GPU numbers are even more outlandish than the CPU ones.

Not sure how anyone could fall for this.

2 Likes

Well, the core counts make sense. Frequencies are just guesses, and prices normally don't get finalized until even a week before they go on sale. Keep in mind NONE of this was stated by AMD themselves.

1 Like

This is important too.

AMD has been completely mum about this.

The core counts make ZERO sense.

Why would you sell a 6-core/12-thread CPU as your low-end SKU to people who only need a 2/4, or at best a 4-core?

It's literally throwing profit in the garbage.

No one is lying about TDP. You need to find out what their definition of TDP is; Intel's TDP doesn't include boost clocks, for example. AMD is not free of trickery here either.

There’s nothing wrong with a 2600K. A 6700K will just be better at whatever the 2600K can do. Same with the Ryzen.

You keep forgetting this is the first time we're getting chiplets. AMD is able to sell nearly 100%, if not 100%, of the entire wafer.

Both Intel and AMD are pushing past 6 cores for gaming in the future. For the near present you're absolutely correct: most games won't benefit from more than 6 cores.

Depending on the layout and the app, yes, you're absolutely correct. Some CCX configurations will be 4+4, while others look like they're a full 8. A 12- or 16-core is guaranteed to have worse latency. The question is: how much will it matter? The I/O module can make or break performance. Also, the 5GHz clocks might offset some of the problems a bit. We'll have to wait and see.

That's not how Google SEO works. Scanning through the vid, it's at about 70 fps with no activity. Yeaah, noo…

Where are you getting this info? Not saying it’s wrong, what’s the source?

This is the part you don't seem to understand… chiplets are cheap. If Intel/Nvidia dies don't make the cut, they're done for; AMD keeps going lower down the totem pole with chiplets. This wasn't even possible on Zen or Zen+. Their profit margins aren't plummeting from lower prices.
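A toy die-yield model shows why small chiplets keep so much of the wafer sellable (a sketch; the defect density and die areas are illustrative assumptions, not AMD's or TSMC's actual numbers):

```python
import math

def poisson_yield(defects_per_cm2: float, die_area_mm2: float) -> float:
    """Classic Poisson die-yield approximation: yield = exp(-defect_density * area)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)  # mm^2 -> cm^2

D = 0.2  # defects per cm^2 (illustrative assumption)

chiplet = poisson_yield(D, 75.0)      # small ~75 mm^2 chiplet (assumed size)
monolithic = poisson_yield(D, 700.0)  # large monolithic die (assumed size)

print(f"chiplet yield:    {chiplet:.1%}")     # most of the wafer is good
print(f"monolithic yield: {monolithic:.1%}")  # most of the wafer is scrap
# On top of that, partly defective chiplets can still be harvested
# as lower-core-count SKUs, pushing effective yield toward 100%.
```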

AAA games are already using more than 4 cores now: GTA 5, Witcher 3, AC, BF, to name a few. It's hardly 'overkill'.

The Radeon RX 580 was priced in that range. The 580's replacement would be expected to cost about as much.

They were mum about Zen too, and those leaks turned out to be true.

Chiplets being cheap doesn’t mean anything.

The $0.50 materials cost for coffee and the $10 minimum wage for workers don't account for my coffee costing $7.

There aren’t many companies that will sell things for less profit just “because”. If they can make more money on it, they will.

In the US, publicly traded companies also have a fiduciary duty to shareholders, which in practice pushes them to maximize profits.

You think so, but their margins are WAY higher than even Zen 1's, and they still want to take market share from Intel. The design takes very little effort to scale core counts off the base design.

1 Like

There is always something left on the floor; there's no way of knowing how the consumer will react. You can sell 100k units at $200 each or 500k at $150. The cheaper price is more profitable…
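The trade-off spelled out (a sketch; the $60 unit cost is a hypothetical I'm adding, and only the volumes and prices come from the post):

```python
UNIT_COST = 60.0  # hypothetical manufacturing cost per chip (assumption)

def total_profit(units: int, price: float) -> float:
    """Profit = units sold times per-unit margin."""
    return units * (price - UNIT_COST)

high_price = total_profit(100_000, 200.0)  # 100k units at $200 -> $14M
low_price = total_profit(500_000, 150.0)   # 500k units at $150 -> $45M

print(f"high price: ${high_price / 1e6:.0f}M, low price: ${low_price / 1e6:.0f}M")
# With any unit cost below $137.50, the cheaper, higher-volume
# price point earns more total profit.
```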

1 Like