Can AMD Pull Off SoC? Or... Schrödinger's Rabbit

Hi, new here.

Let's chat about the oft-sung potential of SoC.

Premise one: there is no rabbit in AMD's hat.

AMD have often let their fanbase down. I say often, but I really mean three times.. still, that hurt a bit, didn't it? Phenom I, Zambezi FX (on release), and their gaffe in the server market, which means we don't get Opteron side-tech the following year for desktop.

The Phenom I is the best example: they had *every* opportunity to make an x64 quad-core that would make Intel regret the 'nap' that was the Pentium series.. but they just couldn't do it. It wasn't that they weren't trying; they just couldn't do it. If I were to guess, the money they made on the first x64 dual cores went into buying ATI.. which *WAS* smart! Now they have the same opportunity, with no other 'lions' in sight!

Intel can't compete *if* AMD manages x86/x64 parallel computing, estimated* to be 100 to 1000 times faster than standard CPUs. For this reason alone.. I think there is no rabbit in AMD's hat. AMD suffer from 'back luck/poor execution'*, and NO company is *that* lucky.

I think there might be a 10% increase in computing power, possibly, and AMD will not unlock the potential of the GPU until Intel have managed to move to/past graphene and past the nanometer (2019-2025).

GREAT HOUDINI'S GHOST :O .. Premise two: they manage it.

I'm actually an AMD fan. I like the compatibility and not having to change my whole system when I upgrade; I have a mental block about new Intel gear because of this.

The stuff they are doing with APUs is something special to behold. IGPs have traditionally been the equivalent of eating out of the bins at McDonald's; now you can game.. properly game, with options-and-resolutions-and-3D-and-everything!.. all the gamer jazz we like to hear. 720p is easy now. 1080p is okay.. read that again: '1080p'.

AMD have been planning SoC for *many many* years.. in fact, they are the only major player to have read books on blackjack. *

They have money pumping at them from Xbox One and PS4 sales, not to mention their own graphics card sales. They have every opportunity now.. the 'darlings', I think, should now be courting AMD financially, because they have so much weight within the industry. I don't mean the plain 'desktop' market or the shrinking AMD server space.. but the general consumer is now getting to know AMD, oh wow!.. and investors and potential engineers like that. Imagine you're a genius engineer and you work for AMD: you have to explain to your family who AMD are.. you don't have to with Intel. That might, and I think probably will, change soon.

So to conclude..

The next 3 to 4 years are AMD's 'big moment'.. rise and shine, or fall and red face*.

I root for them.. go AMD!

peace

 

Dava

 

Notes ------------

*I hear they've patched it now, but I'm not sure, as I didn't own one; I hear it's only about 10% behind Vishera in performance now.

*I watched a YouTube video about GPUs; the guy was pretty convincing.

*accidental pun

*(you could say a couple of others but money)

*rise and shine: 'Here Comes the Sun'; fall and red face: the Tom Green movie.

Intel can't compete if AMD manages x86/x64 parallel computing, estimated* to be 100 to 1000 times faster than standard CPUs.

Please explain further.

 

 

GPUs can compute much, much faster than CPUs.. the reason we need CPUs is that they are designed for specific tasks (and the instruction set, etc.).

 

Think of all the pure data that pours through a graphics card while you're gaming.. while that's happening, the only things the CPU is doing are 'flowing' the data, doing physics, and keeping the OS stable.. everything else is the graphics card/NB/SB and RAM.

 

GPUs read and process that data in 3D space, in real time, in response to your movements. If CPUs had that kind of power, we wouldn't ever see any kind of little spinning Windows wheel while loading.
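
To make that a bit more concrete, here's a rough sketch in CUDA (just generic code I'm making up for illustration, nothing AMD-specific; the names rotate_y and rotate_y_cpu are invented): the GPU version hands one vertex to each of thousands of threads, while the CPU version grinds through the same maths one vertex at a time on a single core.

#include <cuda_runtime.h>
#include <math.h>

// GPU version: one thread per vertex, all of them running at once.
__global__ void rotate_y(float3* verts, int n, float angle) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float s = sinf(angle), c = cosf(angle);
    float3 v = verts[i];
    verts[i] = make_float3(c * v.x + s * v.z, v.y, -s * v.x + c * v.z);
}

// CPU version: the same rotation, but one vertex after another on one core.
void rotate_y_cpu(float3* verts, int n, float angle) {
    float s = sinf(angle), c = cosf(angle);
    for (int i = 0; i < n; ++i) {
        float3 v = verts[i];
        verts[i] = make_float3(c * v.x + s * v.z, v.y, -s * v.x + c * v.z);
    }
}

// Launch sketch: rotate_y<<<(n + 255) / 256, 256>>>(d_verts, n, 0.01f);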

Well, it is not that simple.

 

GPUs are made of purely SIMD hardware, which is only an extension to CPUs.

 

GPUs are incredibly fast at doing certain tasks, and cannot do regular computing.

 

CPUs are doing much more than you described.

 

 

GPUs are basically just a lot of vector lanes sharing a couple of CUs. All the vectors in the same cluster need to do the same instruction, or else they turn themselves off.

 

Actually, you would need an entirely new way of storing and transporting data if you want to get rid of the Windows spinning wheel.

 

The NB (and, in some situations, the SB) is already on the die of SoC CPUs.

 

 

 

Are you trolling me?

 

I understand what I have written; you are not educating me.

Well, then you certainly would understand what I have written, because your statements are false (more or less).

I thought CPUs were insanely good at computation. And I thought GPUs were great at graphics-related computations, but when it came to number crunching, CPUs were supreme. Someone explain if I'm incorrect, using science/engineering (not your opinion).

From Wikipedia:


General-Purpose Computing on Graphics Processing Units (GPGPU, rarely GPGP or GP²U) is the utilization of a graphics processing unit (GPU), which typically handles computation only for computer graphics, to perform computation in applications traditionally handled by the central processing unit (CPU).[1][2][3] Any GPU providing a functionally complete set of operations performed on arbitrary bits can compute any computable value. Additionally, the use of multiple graphics cards in one computer, or large numbers of graphics chips, further parallelizes the already parallel nature of graphics processing.[4]

OpenCL is the currently dominant open general-purpose GPU computing language. The dominant proprietary framework is Nvidia's CUDA.
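
If anyone wants to see what that looks like in practice, here's a minimal CUDA sketch of the GPGPU idea (my own toy example, not tied to AMD or HSA; an OpenCL version would do the same with more boilerplate): copy two arrays to the GPU, let thousands of threads add them in parallel, copy the result back.

#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

// Each thread adds one pair of elements: the classic GPGPU "hello world".
__global__ void vec_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                       // about a million elements
    size_t bytes = n * sizeof(float);

    float* h_a = (float*)malloc(bytes);
    float* h_b = (float*)malloc(bytes);
    float* h_c = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    float *d_a, *d_b, *d_c;                      // copies in GPU memory
    cudaMalloc(&d_a, bytes); cudaMalloc(&d_b, bytes); cudaMalloc(&d_c, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    vec_add<<<(n + 255) / 256, 256>>>(d_a, d_b, d_c, n);  // thousands of threads at once
    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %f\n", h_c[0]);               // expect 3.0
    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    free(h_a); free(h_b); free(h_c);
    return 0;
}

Notice how much of that is just moving data between the CPU's memory and the GPU's memory, which is exactly the overhead the shared-memory ideas below are trying to remove.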

It's really about each thread for each core. AMD have been investing in this strategy (for a few years now, with APUs) not just as a means to add in a GPU, but for the future.

I'm not trying to 'mislead' anyone. System on Chip means a whole lot more than a fancy IGP. I am talking up to 4/5 years in the future, depending on schedules and the consumer climate. But yes, AMD's plan is to move to HSA:

https://en.wikipedia.org/wiki/Heterogeneous_computing

the first step of which is the system on chip having hUMA:

https://en.wikipedia.org/wiki/Uniform_memory_access
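
As a rough illustration of the shared-memory idea behind hUMA (I'm using CUDA's unified memory here because it's the closest thing I can sketch; it's an analogue, not AMD's actual implementation): the CPU and the GPU both touch one allocation, with no explicit copies between separate memory pools.

#include <cuda_runtime.h>
#include <cstdio>

__global__ void double_all(int* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2;
}

int main() {
    const int n = 1024;
    int* data;
    // One allocation visible to both CPU and GPU -- no explicit cudaMemcpy.
    cudaMallocManaged(&data, n * sizeof(int));

    for (int i = 0; i < n; ++i) data[i] = i;        // CPU writes it...
    double_all<<<(n + 255) / 256, 256>>>(data, n);  // ...GPU works on it in place...
    cudaDeviceSynchronize();
    printf("data[10] = %d\n", data[10]);            // ...CPU reads the result (20).

    cudaFree(data);
    return 0;
}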

 

Soooo.. 'is this step wrong for AMD?' should be your question.. not whether or not I have any clue what I'm talking about.

 

Good day, gents.

 

On an unrelated note, I just read your profile bio, dava4444. Did you really invent the Windows recovery partition? I would have thought a team of people developed it.

SIMD = single instruction, multiple data.

 

You have a single CU (control unit) for, what, 32(?) vectors in a GPU.

The vectors in a cluster (the ones sharing the CU) need to do the same instructions, for example an add or something similar.

 

This is very effective for certain tasks; however, it leads to an issue when you are running multiple processes.

You can execute MIMD (multiple instruction, multiple data) thanks to some compiler hacks, but it cannot utilize the GPU efficiently enough.

 

A GPU doesn't have the integer ALUs that a CPU offers. GPUs don't have the same memory rights that CPUs have. In the end, GPU computing is still only helpful for certain tasks.
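
A quick sketch of the divergence problem (generic CUDA, just to illustrate the "same instruction or switch off" point; the arithmetic is made up): when lanes in the same warp/wavefront take different branches, the hardware runs the two paths one after the other with the non-participating lanes masked off.

#include <cuda_runtime.h>

// Threads in one warp/wavefront share an instruction stream.  With this branch,
// half the lanes want one path and half want the other, so the hardware executes
// the 'if' side with the odd lanes masked off, then the 'else' side with the even
// lanes masked off -- roughly twice the work of a branch-free kernel.
__global__ void divergent(int* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (i % 2 == 0)
        out[i] = i * 3;      // even lanes run this first...
    else
        out[i] = i + 7;      // ...then odd lanes run this, serially
}

// Launch sketch: divergent<<<(n + 255) / 256, 256>>>(d_out, n);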

Hi.

I can prove my 3D TV does 6k.. here.. 

https://onedrive.live.com/?cid=2606815C874684A6&id=2606815C874684A6%21149

I have given feedback after feedback to NV about the drivers not keeping all resolutions.

 

And ASH.. the public beta had a feedback system, and I have a story about how this happened.. but over the course of 4 months I sent MS over 40 suggestions.. one being adding 'Downloads' to the Start menu; another was that they had taken out visualizations in WMP for the beta, that was a group one.. well, we all had an invite to TechNet, and a much longer story I usually only tell people when I'm in a good mood.

 

 

And vmN.. you are 100% correct, man.. these are the problems that AMD is facing and has to crack to unlock the potential of GPU computing.

 

Also, Prince Vultan.. I thought a video of my TV might help you believe what I have said, if you think I just grabbed the pics from the web. Here is my TV.. 6400x1600p @ 40Hz, 7680x1080p @ 50Hz.. HDMI.

https://www.youtube.com/watch?v=xsVctOcdOCU

 

7680 × 4320 is what you're looking for.... and a 6k TV is insanely expensive.

 

Hi :) .. yeah, I know, man, it was just an offhand comment in my profile because, as you can see in the vid.. I can't correctly quantify what 'k' it is.

Not sure you believe it, but it's enough that I'll retract my statement.

statement retracted.

 


Appreciated. Have a good day, man!

I don't mean to sound rude, but your resolutions don't make sense to me. 7680x1080 would be four 1080p monitors side by side, but it appears that you are just running one 16:9 monitor, so I wouldn't put any stock into that resolution. 6400x1600 (the only scenario I can even come up with to explain this resolution is two rows of five 1280x800 monitors stacked) is also a really wide aspect ratio. It seems to me that you are just forcing whatever resolution you want, and either your monitor or GPU is scaling it down so it will fit on your screen. That is why Borderlands looks so distorted when you run it. Turn off all scaling on your GPU and monitor and I'd wager you will find that you can only see a portion of the image, if any. I'm just curious, what is the model of TV you are running? I'm having a hard time even finding rumors of a 6k TV. Maybe I'm ignorant, but your video pretty much just convinced me you don't have a 6k TV.

Hi crazymobster!.. what you're saying is basically correct, but.. I actually don't have any distortion (unless you mean 'screen squeeze'); if you saw some, it was probably the camera on my phone. Height-wise, my resolutions go up to 1920x2400i @ 24Hz.. that's the tallest my TV will do.. I have 3200x2400, but it makes my TV and whole system go nuts, black screen, no recovery..

But on the width side, ooft.. this TV is only *supposed* to be 1920x1080p.. the highest (call it 'lateral'?) width is the one you saw in the video.. the best combo of both is 6650x2160i @ 24Hz.. my TV can also do super high refreshes too.. but the NV driver doesn't keep them.

Not bad for a TV that cost me £279.. I feel blessed.

 

..but this is all waaay off topic. PM me for more info, man.

 

If your TV is only supposed to be 1080p, then that is all it will do. Software hacks won't change how many pixels you have. You can modify the refresh rate, and you can even modify the output resolution of your GPU (which is what I suspect you are really doing), but the pixel count of your TV is a fixed property of the panel. In fact, I have forced a 2560x1440 resolution on my 1080p monitor just as you have, so I could see how games run at the higher resolution. And yes, what I was calling distortion is what you call screen squeeze, though this is clearly not from your phone unless your TV really is 7 times wider than it is tall.
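
Just to put numbers on it (plain back-of-the-envelope arithmetic, nothing pulled from your driver): here is roughly how many pixels each of those modes would need compared to a 1920x1080 panel, which is why anything larger has to be squeezed back down onto the same grid.

#include <cstdio>

// Back-of-the-envelope pixel math for the resolutions mentioned in this thread.
int main() {
    struct Mode { const char* name; int w, h; };
    Mode modes[] = {
        {"native 1080p panel", 1920, 1080},
        {"claimed 7680x1080 ", 7680, 1080},   // = four 1080p screens side by side
        {"claimed 6400x1600 ", 6400, 1600},
    };
    const double panel = 1920.0 * 1080.0;
    for (const Mode& m : modes) {
        double px = double(m.w) * m.h;
        printf("%s  %.2f Mpixels  aspect %.2f:1  = %.1fx the panel's pixels\n",
               m.name, px / 1e6, double(m.w) / m.h, px / panel);
    }
    // The panel itself still only has 1920x1080 physical pixels, so anything
    // larger that the GPU renders has to be scaled down to fit that grid.
    return 0;
}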