The POWER and PowerPC General Discussion / News Thread

To be fair, I ran the laptop version of that and it was kinda trash. Core 2s WANT CACHE, and they literally have barely any. If your C2 doesn’t have 6 MB minimum, your experience will be hell.

1 Like

That’s a whole new interesting angle for the discussion: using older hardware platforms to develop new concepts without risking the systems you don’t want to break lol. I guess that’s what I’m doing with my G5s lol

If I remember right, the cache resync was ass and had to loop back a billion times just to know what the fuck it was doing. Literally hardware that emulates Alzheimer’s.

1 Like

Windows 7, 8, and 8.1 used to get sluggish over time. I never had that issue with Windows 10, and now with 11 on my work laptop. I’d rather not run Windows at all if I can, though, thank you very much.

1 Like

W7 is stupid fast compared to W10 today; W10 has become so big and bloated.

Sounds good. We need more of it. Too many people abandon devices that could still be used just “because”, and others are forced to by lack of software support.

1 Like

Almost every CPU wants cache, especially if you are doing multimedia or gaming. There are few scenarios where cache doesn’t help.
I had some of the lower-end Steamroller & Excavator parts, and the small cache was the killer of the entire platform. It is interesting because Excavator, which most people didn’t have, was actually really fast; it was just hampered by only being released in SKUs with low cache. I think they did this intentionally to make Ryzen seem like a much bigger leap when they first launched it, since all the construction-equipment architectures (Bulldozer through Excavator) had a bad rep associated with them.

Yeah, but I am saying that the chips hadn’t even felt modern when they were new. The only ones that I like are the custom ones built for Apple, and that’s because Apple gave Intel the finger. One of the few things Apple did that legitimately helped the tech industry was pushing for more cache.

It’s like THE reason I hate Intel CPUs. I like P4s because they were my first PC, and I like researching where tech started, so Pentium Ms are cool, but both were still hampered.

As for the FX chips, I think there’s more to it than that. The 8-cores are just hyperthreaded quad cores, but the 6-cores ACTUALLY HAVE SIX CORES. I think there’s some backend tomfoolery happening there, and I think the unreliability of the boards at the time had to do with the mobo manufacturers trying to keep up with however many SKUs AMD put out.

I think their plan was that they knew they were gonna go dark for a while, so they put the extra effort into the console industry as fast as possible to make sure the company would have guaranteed output. Then, on the CPU side, after they had released what at the time was good, Intel just kinda went ‘hi, we optimize threads now’ and steamrolled until Skylake. So AMD stuck to consoles and mass office sales with as many 860Ks as they could make.

Ryzen, I think, is possibly leftovers from when they were trying to make real 8-cores work in desktop SKUs and they just had to give up. And when, desktop-wise, E-450s and Athlons are what sell, well then OK, what else can we min-max?

It had more to do with keeping the stock price up than propping up a later platform. I have seen interviews where the gist of the story was ‘when it finally worked we screamed and cried’. Put that together with all the info from AdoredTV and the overclockers out there, plus the huge amount of untapped multicore potential back then: of course they are gonna work on increasing core count, of course they are going to research cache setups, and duh, when they figured out that they didn’t have control of their fab anymore, they had to do a different design than Intel to make their multicore infrastructure as good as it needed to be.

Basically, Intel was making the same chip for 12 years and not fixing it, knowing they were putting their competition out of business via lawsuits. So their competition went into low-power mode, and when their batteries were charged they drilled through the wall of bullshit Intel had built up and were like ‘hi, remember Bulldozer? Here’s its cousin.’ The same flip-flop fighting happened again, just like the 6100 vs the 2500, but instead of being sued out of the ability to do research, Intel literally couldn’t do anything, panicked, and did what they did, i.e. Xeon Gold and all the other stupid bullshit to try and get stupid people to pump money into the fabs.

It’s a joke when you understand it.

I’m sorry to disagree here, but none of your analysis is even close to what happened to these companies at that point in time.
As multi-core CPUs were taking a foothold (due to AMD making the first real multicore chips affordable), developers were trying to learn how to make software multi-threaded and take advantage of multiple cores at the same time.
AMD’s idea with FX was to make two smaller cores that SHARED physical resources, so they could offer more cores more affordably and run at higher clock speeds without such a deep pipeline. Then operations like FMA (which was a big upcoming extension at the time) would use both cores’ resources to finish faster. Software wasn’t so widely threaded yet, and basic operations are still integer, so each “core” in a module kept its own integer pipelines while the two cores shared one FPU whose two 128-bit units could combine for wider AVX and FMA work. That way the design would have both the benefits of many integer cores for multitasking and more throughput for the single bigger, heavier operations that were starting to take root.
Intel saw AMD’s preliminary FMA performance and decided to basically obsolete FMA because they wouldn’t be able to compete. So they essentially took FMA out of their compiler, sold their compiler to many companies, and de-optimized for AMD by pushing AVX instructions, which they knew would halve FX throughput on each operation.

Remember also that Intel’s overall anti-competitive tactics had made AMD’s budget shrink significantly. Most OEM PCs were still selling garbage P4 and Core 2 chips at a huge premium when AMD was faster and cheaper. Most computer stores in the USA barely carried AMD products due to “Intel rebate” programs and the rest of their nonsense that put AMD PCs in the back of stores, pushed propaganda about “incompatibility”, etc.
So AMD went all-in on a design that would provide benefits on both sides, and Intel just stifled all PC innovation for years with their tactics. This is why Ryzen came out swinging: they built cores that Intel couldn’t de-optimize against without screwing themselves over, cores that beat Intel at their own game in every way. This is also why they sandbag on performance now, keeping details secret and making obscure demonstrations that show them only partly ahead when they are really beating Intel in other ways behind closed doors.

The console market only came after AMD made Jaguar (which was actually a lower-power revision of the Phenom II / K10 core) with a dual-stage front end and penalties for heavy math operations. These chips handily beat everything in the “low power” market at the time, including offerings from ARM and Intel, so Intel launched “Contra Revenue”, which was basically a scheme to give away over $1BN of Atom CPUs (again, to shut AMD out of the market). AMD had spent money on Jaguar designs that they now couldn’t really get into any machines, because companies were getting free Atom chips + designs (yes, Intel literally would give away free schematics and help with board design just to keep AMD out of the market), so they decided to leverage Jaguar another way.
Console makers had no new IBM CPUs to work with, as IBM had stopped making lower-power POWER chips and the CELL processor from the PS3 had proved very expensive to build and support… so both Microsoft and Sony were looking for an “all in one” company design, as MS’s Intel + Nvidia design had led to mass console failures & returns, and neither company provided great support for their hardware to MS (which is pretty amazing considering it’s MS).
So AMD was the only company with both a competitive CPU and GPU that could fit a good power envelope, be mass produced, and wouldn’t risk burning out like the XB360 did… Jaguar (based on K10) was actually surprisingly good in games that didn’t use specific instructions, so it just needed a bit more clock speed and it was good to go. Jaguar was cheap, and AMD needed to fulfill their GloFo contracts. So they gave MS and Sony a great deal on the GPU + CPU design, and the rest is history.

Ryzen is a full ground-up redesign by Jim Keller (a CPU architecture legend), who was also responsible for AMD’s K8 (Intel’s first real kick in the pants) and a co-author of x86-64 & HyperTransport.
The rest of it, them “having to do something because they don’t have control of their fab”, just isn’t correct at all. They spun off GloFo, and I explained what happened before.
Intel sold revisions of the same design all the way until 12th gen, basically because they got stuck on their “10nm” process for a long time. Ryzen was simply a more efficient design from the ground up, and chiplets made it possible to create high core counts for cheap, which is something Intel couldn’t do. To this day Intel cannot do it as effectively as AMD, and this is why they went with the hybrid design for 12th gen. They simply cannot compete with AMD’s chiplet design in terms of manufacturing efficiency, yields, and therefore cost.

5 Likes

Is there anywhere I can read docs on these chips? A lot of my info is based on other sources and info from a lawsuit that happened 4-ish years ago, so some of my memory could be hazy. However, I don’t forget documentation. I can save that :slight_smile:

Yeah, I don’t have that sort of info. And by “not having control of their fab” I meant they literally didn’t own it anymore. It was its own company, and with things split up like that, the company’s path forward was to fill the gaps in the market that made sense.

I think we’re on the same page; we just have different levels of understanding.

Blackbird does not use POWER8 chips

The original Talos I design would have used POWER8 chips, but it did not succeed in crowdfunding on CrowdSupply. It was a dramatically different board, and even needed to embed IBM’s Centaur memory buffer because POWER8 was not designed to interface with DDR RAM directly at all.

Both Talos II and Blackbird boards use POWER9 Scale Out chips (native DDR4 rather than Centaur-buffered), packaged in Sforza processor modules, socketing into a 2601-pin LGA-style socket.

There was an effort by Raptor to design a third POWER9 board (Condor) that would socket the larger LaGrange module, still containing a Scale Out (DDR4) chip, exposing IBM’s fancy OpenCAPI interface rather than purely PCIe, but that was cancelled.

1 Like

I’m honestly not sure where you can read about it specifically. I have been in the scene quite a long time and keep up with the markets on a daily basis.
Maybe there are some tech history websites; I am a history buff and like to do deep dives into these things as well.

Please write some posts!

Depends on what you want to learn about. I can’t just write posts randomly without any context lol

Sure you can; this forum is crawled by search engines fairly well, so any bit of information would also benefit the world-wide hivemind and those who do not frequent the forum. If you worry it would make too meandering a thread, you could always post it in the #blog category. #wiki is a good choice for particular kinds of informative posts as well.

For example, I had a thread in the #blog section keeping track of orbital launches last year: 2021 Orbital Launches (COSPAR quick reference). Perhaps a bit niche, but I thought it could be of interest to others.

I would say, however, that further discussion about AMD chips probably belongs elsewhere than this thread.

1 Like

Apparently suspend to RAM is not currently available on POWER9 boards, but the newer revision of Blackbird boards (1.02) makes it at least feasible.
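For anyone curious what their own board actually exposes, Linux advertises its supported sleep states through `/sys/power/state` (writing `mem` to that file is what triggers suspend-to-RAM, so the sketch below only *reads* it; paths assume a reasonably recent kernel):

```shell
#!/bin/sh
# Show every sleep state the running kernel advertises, e.g. "freeze mem disk".
cat /sys/power/state 2>/dev/null || echo "no /sys/power interface"

# Check specifically for suspend-to-RAM support without triggering it.
if grep -qw mem /sys/power/state 2>/dev/null; then
    echo "suspend-to-RAM advertised"
else
    echo "suspend-to-RAM not advertised"
fi
```

On platforms where the firmware or board revision lacks support (as described above for pre-1.02 Blackbirds), `mem` simply won’t appear in the list even though the file exists.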

2 Likes

It is a bit curious why a workstation / server mobo would need to suspend; that’s a feature for laptops. Actually, I’m more curious why they are even trying to implement this.

1 Like

Presumably so you don’t have to reboot and start all your stuff from scratch every time.

The alternative is running your desktop 24/7.

2 Likes