
Ulfric

macrumors regular
Apr 4, 2018
159
123
Maybe FreeSync2 works with DP 1.2 but HDR10 does not.

FreeSync2 monitors must have HDR10 compatibility. That was a primary requirement for FreeSync2 certification, at least as far as I know.
 

cube

Suspended
May 10, 2004
17,011
4,972
FreeSync2 monitors must have HDR10 compatibility. That was a primary requirement for FreeSync2 certification, at least as far as I know.
Some FreeSync2 monitors only have DP 1.2 and HDMI 2.0. DP 1.2 does not support HDR10.

AMD only mentions "DisplayPort". It would be weird if they only supported FreeSync2 through HDMI in these monitors.

FreeSync2 does not use HDR10 and requires application support.

You would still need HDR10 for other content.

You don't want to use 2 cables, hence the need for DP 1.4.

Because HDMI 2.0 does not support 4K@60Hz 4:4:4 10-bit.
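A rough link-budget check makes this concrete. This is a sketch assuming CTA-861 timing for 3840x2160@60 (4400x2250 total raster, 594 MHz pixel clock) and 8b/10b line coding on all three links; the effective payload rates are 14.4 Gbit/s for HDMI 2.0, 17.28 Gbit/s for four-lane DP 1.2 (HBR2), and 25.92 Gbit/s for four-lane DP 1.4 (HBR3).

```python
# Why HDMI 2.0 (and DP 1.2) cannot carry 4K@60Hz 4:4:4 10-bit, but DP 1.4 can.
# Assumes CTA-861 timing for 3840x2160@60: 4400 x 2250 total raster.
PIXEL_CLOCK = 4400 * 2250 * 60            # 594 MHz

# Raw link rate x 8/10 (8b/10b coding) = usable video data rate, in bit/s.
LINKS = {
    "HDMI 2.0":         18.0e9 * 0.8,     # 14.40 Gbit/s effective
    "DP 1.2 (HBR2 x4)": 21.6e9 * 0.8,     # 17.28 Gbit/s effective
    "DP 1.4 (HBR3 x4)": 32.4e9 * 0.8,     # 25.92 Gbit/s effective
}

def video_rate(bits_per_component):
    """Uncompressed 4:4:4 data rate in bit/s at the assumed timing."""
    return PIXEL_CLOCK * 3 * bits_per_component

for depth in (8, 10):
    need = video_rate(depth)
    print(f"{depth}-bit 4:4:4 needs {need / 1e9:5.2f} Gbit/s:")
    for name, cap in LINKS.items():
        print(f"  {name:18s} {'OK' if need <= cap else 'too slow'}")
```

At 10 bits per component the stream needs 17.82 Gbit/s, which exceeds both HDMI 2.0 and DP 1.2 but fits comfortably in DP 1.4.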

I use a 4KTV with a 960M. The performance is similar to Vega 11.

But the laptop does not support HDR and the TV is crippled by HDMI 2.0.

But as long as there are no new 3D displays, I am sticking with it (when I buy a new computer at least I will get HDR).

I would like a 46" 8K 3D HDR SmartTV with FreeSync2 and DP 2.0 next (I have 48" now).
 

cube

Suspended
May 10, 2004
17,011
4,972
I wonder how the die sizes and fabrication of Zen 12nm and Zen+ compare.

It would be wasteful to make the former when it would cost the same to make the latter.
It seems the 12nm Zen was being produced before Zen+ launched.

I thought it was a pipe cleaner.

The sizes of the 14nm 1600 and the 12nm 2600 would be the same, with more space between transistors in the 2600.
 

cube

Suspended
May 10, 2004
17,011
4,972
So the 300GE is Zen+ 12nm but the 3000G is Zen 14nm?

And I thought the normal APU numbering confused people!
 

cube

Suspended
May 10, 2004
17,011
4,972
Some games still like SMT disabled.

The 3500X is also available boxed with Wraith Stealth in China.

It would have been better to use the fastest gaming GPU to test:


The 14nm 1600 comes with the original version of the Wraith Spire, the 12nm 1600 comes with the smaller and cheaper Wraith Stealth.

The Stealth is not great for 6-core:

 

cube

Suspended
May 10, 2004
17,011
4,972
There should really be Zen Athlons with 5 CUs.

Now you have the A6-9500 with weak cores and the Zen Athlons with weak GPU.
 

cube

Suspended
May 10, 2004
17,011
4,972
To win me over in notebooks, AMD needs Thunderbolt in addition to a 256-bit FPU and 8 cores.

At least until Intel implements 512-bit in performance parts.
 

cube

Suspended
May 10, 2004
17,011
4,972
I found more suck in FX compared to K10. Really, moving from Phenom II X6 makes no sense unless you're desperate for compatibility.

Well, FX always fights back:


 

cube

Suspended
May 10, 2004
17,011
4,972
People are paying more than 100 bucks for a used bare 8350.


I think desktop parts need twice the RAM channels.
 

cube

Suspended
May 10, 2004
17,011
4,972
Just use SODIMM everywhere.

It would be interesting to see how the A10-6700 performs against AM4 (cache).

People are paying more than 100 bucks for a used bare 8350.
I think my best option will be to buy a used Phenom II X4 and still wait for a cooler PCIe 4.0 chipset.

I would prefer to push it until DDR5 arrives (better get rid of DIMMs).
 

cube

Suspended
May 10, 2004
17,011
4,972
Athlon: 2 channel
Ryzen: 4 channel
Threadripper: 8 channel
Epyc: 16 channel
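Assuming DDR4-3200 (a stand-in speed grade; the post does not name one) and the standard 64-bit (8-byte) channel width, those channel counts would work out to these peak theoretical bandwidths:

```python
# Peak theoretical memory bandwidth for the proposed channel counts,
# assuming DDR4-3200: 3200 MT/s, 8 bytes per channel per transfer.
MT_S = 3200e6
BYTES_PER_TRANSFER = 8

for name, channels in [("Athlon", 2), ("Ryzen", 4),
                       ("Threadripper", 8), ("Epyc", 16)]:
    gb_s = channels * MT_S * BYTES_PER_TRANSFER / 1e9
    print(f"{name:12s} {channels:2d} ch -> {gb_s:6.1f} GB/s")
```

That is, 51.2 GB/s for the 2-channel Athlon up to 409.6 GB/s for a 16-channel Epyc, doubling at each tier.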

DDR5 can wait.
 

cube

Suspended
May 10, 2004
17,011
4,972
I think the new sockets should be planned so that AM5 is 4x DDR4 and there can be AM5+ DDR4+DDR5 CPUs compatible with both AM5 and AM6.

There could even be hybrid AM5+ motherboards like in the past where half the slots are DDR4 and the other half are DDR5 allowing only one SO-DIMM per channel but compatible with AM5, AM5+, and AM6 CPUs. Being able to mix a total of 4 sticks of DDR4 and DDR5 would be an added bonus.

1. I built a system with an AM2+ mobo, AM2+ Phenom X3, 2x DDR2.
2. I added 2x DDR2
3. Upgraded to AM3 Phenom II X6
4. AM2+ mobo died
5. I could have bought a hybrid AM3 mobo and kept the DDR2, but that meant only 2 usable slots. So I bought an AM3+ board, 2x DDR3, and a GT 720 (sucked, I wanted an R7 260).
6. Replaced the DVD burner with a BD burner.
7. AM3+ mobo died. Repaired.
8. Replaced the system HD with a 2.5" SSD.
9. Replaced the 2000-era case with a USB 3.0 case.
10. SSD crawl. Refresh.
11. Replaced the 2.5" SSD with an old SATA M.2 and the GT 720 with an ex-eGPU RX 460 (fiddled with). 2.5" went to a laptop.
12. Upgraded the cooler and added fans.
13. Upgraded the PSU, bought an AM2+ mobo and case, and reused the X3, DDR2, GT 720, DVD burner, old PSU, and an SSHD to build a new system.
14. RX 460 died. Temporarily replaced with a new GT 710 (sucks).
15. I could upgrade the AM3+ system with an 8-core AM3+ FX and move the X6 back to the AM2+.
16. Replace the AM3+ system internals with Zen and move them to the AM2+ system.

I was disappointed that FX had half as many FPUs per core as K10.

I was disappointed that you could not put an AM4 CPU on an AM3+ board with DDR3.

I am now disappointed that one cannot find 8-core FX with Wraith at retailers anymore (I would put the Wraith on the AM2+ system).

I saw an interesting FX deal quite a while ago, but it was not Wraith. Something much better came up later when I was not looking.

A new 8-core FX with Wraith should not cost more than Ryzen 1200.
 

cube

Suspended
May 10, 2004
17,011
4,972
It would be better to buy TRX40 than AM4 because Threadripper 3rd gen might be made compatible with future 8 channel motherboards (operating in 4 channel mode of course), while AM4 should be too small for 4 channels.

If there are salvage dies, it would be good to bring back 8-core Threadripper.

Too bad TRX40 is not backwards compatible.

Even $100 for FX looks good now if it enables waiting for TRX40 prices to come down.

It looks better to spend 60 euro on a B450 board.

But not great if you want to be able to go all the way to the 3950X eventually.
 

cube

Suspended
May 10, 2004
17,011
4,972
DDR5 supports 2 subchannels while DIMMs have the same number of pins.

So it would make little sense to introduce bigger sockets just to support 4x and 16x DDR4.

I think it would also be bad to make AM5 very different just so that a few people can run 4 displays from an APU.

So the upgrade paths would be:

AM4 (2x DDR4) -> AM4+ (DDR4+DDR5) -> AM5 (2x "4x" DDR5)

sTRX4 (4x DDR4) -> sTRX4+ (DDR4+DDR5) -> sTRX5 (4x "8x" DDR5)
      |                   |                    |
      v                   v                    v
sWRX8 (8x DDR4) -> sWRX8+ (DDR4+DDR5) -> sWRX9 (8x "16x" DDR5)

SP3 (8x DDR4) -> SP3+ (DDR4+DDR5) -> SP4 (8x "16x" DDR5)


sWRX E-ATX can manage with SO-DIMM, but people coming from sTRX should prefer DIMM.

There should also be diagonal upgrade paths for Threadripper.
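The quoted subchannel counts in the path names follow from DDR5 putting two independent 32-bit subchannels on each DIMM, where a DDR4 DIMM is a single 64-bit channel; a trivial sketch of the arithmetic:

```python
# Subchannel counts behind the proposed socket names, assuming JEDEC DDR5:
# each DDR5 DIMM carries two independent 32-bit subchannels, while a
# DDR4 DIMM is one 64-bit channel.
def subchannels(dimm_slots_populated, generation):
    per_dimm = 2 if generation == "ddr5" else 1
    return dimm_slots_populated * per_dimm

print(subchannels(2, "ddr5"))   # AM5:   2 DIMMs -> 4  ('2x "4x" DDR5')
print(subchannels(4, "ddr5"))   # sTRX5: 4 DIMMs -> 8  ('4x "8x" DDR5')
print(subchannels(8, "ddr5"))   # sWRX9: 8 DIMMs -> 16 ('8x "16x" DDR5')
```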

================================================================

A German site checked the chip ID code of the 12nm 1600 and it is the same as the 2600 (but it seems they did not check the microcode version).

Overclocking puts it almost on par with the 2600. Maybe the latter is a better bin.

The reason given by AMD is that they stopped making 14nm CPUs and they had no more 1600 inventory.
 

jinnyman

macrumors 6502a
Sep 2, 2011
760
670
Lincolnshire, IL
Finally, 3rd gen Zen mobile is going to blast through whatever Intel will be offering.
A great future (at least for a while) for AMD, and we shall see how Apple will respond.
 

faust

macrumors 6502
Sep 11, 2007
382
173
Los Angeles, CA
The point would be to have inexpensive DP 1.4 motherboards for FreeSync2 APUs.


All of the boards in that link have DisplayPort I/O on the motherboard. It's the kind of board you should be buying for use with an APU like the AMD Ryzen 3200G or 3400G.
 