
When it comes to Intel’s Tiger Lake processors, quite a bit has already leaked to the internet ahead of their early September launch. We’ve seen benchmarks of various members of the family, with the most frequent appearances being made by the Core i7-1165G7. In one of its most recent outings, the Core i7-1165G7 was seen shredding a Ryzen 7 4800U Zen 2 APU in single-threaded benchmarks.

Source: https://hothardware.com/news/intel-tiger-lake-core-i7-1185g7-xe
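For anyone wondering what a “single-threaded benchmark” actually measures here: one serial stream of CPU-bound work on one core, so the score reflects per-core speed (IPC × clock) rather than core count. A minimal sketch in Python - the loop body and iteration count are arbitrary placeholders I made up, not the actual test these leaks used:

```python
import time

def single_threaded_workload(iterations: int = 10_000_000) -> float:
    """Time a serial, CPU-bound loop and return elapsed seconds.

    One thread on one core: the result reflects per-core speed,
    which is where Tiger Lake reportedly shines.
    """
    start = time.perf_counter()
    total = 0
    for i in range(iterations):
        total += i * i  # arbitrary integer math to keep the core busy
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"single-threaded run: {single_threaded_workload():.2f} s")
```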

Lots of interesting news today! 🙂 (Well, rumors, not news, but it’s still fun!)

EDIT:

This part of the story is interesting: “Intel also announced advancements in their Xe GPU technology, strategy, and planning that could shake up the industry in the next couple of years.”

Source: https://arstechnica.com/gadgets/2020/08/intel-is-getting-serious-about-xe-gpus-laptop-desktop-and-datacenter/

“It has been a long time since any third party really challenged the two-party lock on high-end graphics cards—for roughly 20 years, your only realistic high-performance GPU choices have been Nvidia or Radeon chipsets. We first got wind of Intel’s plans to change that in 2019”

If they actually jump into the space and there is a genuine third competitor, things could get very interesting.

First-generation entries never topple the existing entities. They could put out a solid GPU that, with iteration, becomes real competition. That said, I don’t foresee anything groundbreaking from Intel in the GPU space this soon into the fray.


Yeah, I agree - it will probably be a while before they’re even competitive. It’s nice to see them getting involved though. 🙂

Pfft… If they had taken that GPU team budget and invested in node improvement/fabrication… they would already have 7nm or 3D 10nm chips mainstream by now.

Like… Seriously, 10nm mass production is the closest Intel gets to “vaporware” in the component segment.

They’re trying to focus on all-in-one computer units vs. the custom consumer market.

It’s pretty clear - between the big.LITTLE and iGPU improvements, these will benefit laptops and all-in-one units like NUCs most.


It’s because the mobile market is far larger than the desktop DIY gaming market. You may only know a handful of people with top-of-the-line gaming PCs, but you definitely know countless people with laptops.

I’m not looking for an argument and assume you’re being facetious - but this is nonsense. People say this about Blizzard/WoW too. These are multi-billion dollar profitable companies that have more than enough budget for simultaneous initiatives. Also, what indication do we have that this is simply a “throw more money at it and it would be fixed” issue? I believe the underlying technological issues are a bit more complex. (If you are being facetious, then apologies for the serious response.)

Both of these answers are more nuanced and match my understanding as well. My understanding (based on rumors, not insight) is that they are improving their GPUs and may make a move into high-end GPUs if the market presents an opportunity that they believe they can capitalize on. Heck, Nvidia still has a stranglehold on market share - they don’t have to win to have a viable division, but they also need to not be Windows Phone competing with iPhone and Android.

But Ray has it right - every laptop does need to have graphics - at least until Neuralink is further along… 🙂

What do you guys mean by “first-generation entries never topple the existing entities”?

Sorry, noob question.

My understanding, and hence my response, was that she meant that Nvidia and AMD are the two established companies that dominate the market (existing entities). Intel, in this case (and assuming they go this direction), would be the new entry in the high end GPU market (first generation entry).

So the interpretation is that it will be difficult for Intel to become competitive and grab market share, particularly in the early stages, if they decide to try and compete in the gamer/enthusiast GPU space.

Lol, no problem - we’re all playing armchair quarterback here… 🙂

Oh, so you were talking about the onboard GPU then?

As far as the CPU goes, it’s very good, isn’t it?

That part of the conversation was about the GPU and how Intel may be planning to build discrete GPUs in the future.

If the rumors are true, then yes, the CPUs are performing well.

Good attempt, but I will explain it myself, if that is OK?

Whenever a company attempts to enter a new market, it does so with new resources and no previous foundation to build on.

While Intel may benefit from knowledge of existing GPU architectures and a baseline understanding of how they work… they have not had to fabricate, iterate, and streamline their own discrete GPU card. Nvidia has 20+ years of experience, and Radeon nearly as long. They both have their own architecture (because such things are INCREDIBLY proprietary), which means that Intel must also create its own proprietary architecture to develop a competing product. That is not something that comes easily, and based on how they are doing with CPUs, I don’t see them being particularly revolutionary with GPUs out of the gate.

Which is why I suggested that while they may come out with an interesting card that performs well, they just don’t have the backbone of previous work that would let them skip the much-needed development that comes with creating something new.

A good example of this is AMD Ryzen. Now, AMD had the benefit of being in the CPU industry almost as long as Intel. Even with that being the case… it took them eight years to bring the Ryzen architecture from idea, to fabrication, to the home. And then they spent the next three years iterating and improving. Look at the current-generation Ryzen chips: there is only a marginal difference between them and Intel in many single-threaded tasks, and AMD has been running circles around Intel in multi-threaded tasks since the 2000-series chips.
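To make that single- vs. multi-threaded distinction concrete, here’s a rough sketch (the toy workload and numbers are mine, not from any benchmark suite): the same CPU-bound task run serially on one core, then fanned out across every core. A chip with a faster core wins the first run; a chip with more cores wins the second - which is roughly why different parts shine in different charts.

```python
import time
from multiprocessing import Pool, cpu_count

def busy_work(n: int) -> int:
    """A CPU-bound placeholder task (arbitrary integer math)."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunk = 2_000_000
    cores = cpu_count()
    jobs = [chunk] * cores

    # Single-threaded: one core does all the work; per-core speed dominates.
    start = time.perf_counter()
    for job in jobs:
        busy_work(job)
    print(f"serial:   {time.perf_counter() - start:.2f} s")

    # Multi-threaded: same total work spread over all cores; core count dominates.
    start = time.perf_counter()
    with Pool(cores) as pool:
        pool.map(busy_work, jobs)
    print(f"parallel: {time.perf_counter() - start:.2f} s")
```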

Intel will be in the same position, but without the history of GPU production. This puts them mostly at a disadvantage. That said, it is important to note that this also gives Intel an opportunity to try something completely new, as they are not (technically) tethered to a pre-existing architecture that they need a return on investment from.

My final verdict is still that Intel has an uphill road ahead of them. They CAN do it, I just don’t think they will do it first gen.


And yes, I was being facetious. And sarcastic… Mostly.

Their GPUs can look mighty impressive on paper, but until we have real-world performance numbers, it doesn’t mean anything.

Kind of like how Bulldozer looked great on paper.

Why you have to say that, man. We don’t say the quiet part out loud anymore.

Feels great on paper, too.

/sensible chuckle

As one Intel employee described it to me: “We go out the front door straight to the mailbox at a brisk walk, while they (AMD) plan to sprint around the entire lawn to get to the same mailbox.”

I can’t believe nobody got my paperweight joke.

I did… I just didn’t have time to respond yet. My company installed some new software on the work computer to track what we are doing during hours (yay, work from home). So I had to break out the laptop, and… let’s just say it has been a while since I used it.


I used Bulldozer for a long time. Sure, it wasn’t the best-performing chip… but that thing was like the AK-47 of CPUs. The chip itself didn’t die until core 4 finally bricked.

Take a wild guess how many cores it took to brick my i3? I’ll give you a hint: 4 minus 3.
