Can AMD Pull Off SoC? or... Schrödinger's Rabbit

+1

I don't think any company should waste their time investing in SoCs, because parts of a device will always sell more often than the device as a whole. That said, I have no problem if AMD makes tablets.

It's not as simple as "just making a better chip".

x86 chip manufacturers (Intel controls the x86 license; AMD and VIA are the only licensees) have to drag along the Wintel platform, which is really weighing them down.

Intel and AMD have been making SoCs for a while now, and they're doing ever more functional integration with every new generation, but they just can't move as fast as platforms that grew up post-Linux. They have Windows to worry about, so they can only move very slowly.

Microsoft is not interested in innovation; they are only interested in a locked-down platform that doesn't cost them much in terms of developing new products, but on which they can charge huge license fees. They've been holding x86 down basically since Linux came out, and are suffocating the personal computing market more than ever before. Intel and AMD have to put up with that; they have no choice, because their US shareholders wouldn't want them to develop for Linux only, which is a luxury ARM-based chip developers have.

The most-sold GPU in the world at the moment is arguably the Adreno family. This GPU line from Qualcomm is the result of research that dates back to the days when Imageon was still owned by ATI Canada... yup, Qualcomm gets its GPU-building experience from exactly the same source as AMD. The second most-sold GPU on the planet is arguably ARM's Mali GPU family, developed by ARM Norway.

The most-sold x86 platform GPU is arguably the least advanced: the Intel iGPU family. Intel has not been able to tap into longstanding GPU-making experience, and they're having HUGE problems making any progress at all. They are making progress on Linux with Beignet, but they can't do anything with it, because Windows is their primary technology focus, and they couldn't bring out a product that does something on Linux that it doesn't do on Windows, even though they could. That's why their products evolve so slowly. Bringing a new SoC to market takes years for Intel, months for Qualcomm. And when it comes out, Intel faces a lot of incompatibility problems that lead to problematic performance, whereas Qualcomm doesn't have those problems, because everything based on Linux is just technologically far less fragmented, the toolchains for development are present and free, and the 1.4 million userspace applications are already there...

Nooo, that's not what's happening... man, why all the hostility here?? So many of you have griefed me now... why bother... you have all destroyed my will to contribute to YOUR community.

 

And as a FINAL OUT... the TV IS 3D!!! I worked out that the panel needed to be able to output 1080p twice, once for each eye!! ... I think there is something wrong with the community here... you all seem so instantly hostile. I've had ONE person be friendly to me since I got here (I signed up over a year ago but have been busy)... I THINK the community here ENJOYS treating people like NOOBS... GLORIES IN IT (I've seen you ALL BASHING a guy looking for help in a thread... he wasn't even doing anything!!!)... even when they actually have experience. That's not enjoyable. BYE AND OUT.

 

PROFILE DELETED BY ME!

We're not being hostile on purpose. You have written some controversial ideas and we're reacting to them.

  • When a user challenged your idea you took it as a personal attack rather than criticism.
  • You claim to own a TV with extremely high resolution that doesn't seem to be available on the consumer market.
  • You wrote that you invented the Windows recovery partition and that your suggestion to include the Downloads folder in the Start menu was your original idea.

Of course Tek users are going to question these assertions. But we're not personally attacking you, just your claims.

Another post full of bullshit. Do I really need to take the time to explain why you are bullshitting?

 

 

Yes, please explain :D

Intel made a deal with IBM to make sure Intel wouldn't be the only processor manufacturer (that's how AMD got its x86 license as a second source), which is also the reason Intel will never have full control over the processor market.

 

Zoltan, are you implying Intel is slowing down the x86 manufacturing process?

 

SoCs are still considered new for both manufacturers.

Windows is in no way slowing down the development of SoC processors. (Where did you get this rumor from?)

 

Microsoft adapts its code to support CISC (x86 is a CISC architecture), and it can support all CISC processors.

RISC (which ARM uses) is typically used on smaller devices. An OS can support both CISC and RISC.

Only certain things require small tweaks to function well, like SMT (Hyper-Threading) and CMT (clustered multithreading, what AMD calls a module).

 

Are you saying Intel's iGP isn't making any progress? Because it certainly is. If I remember correctly, their plan is to increase iGP performance by ~40% each generation. Intel won't jump into the discrete GPU market not because they aren't able to compete in it; they won't because it won't be worth it.

 

They would need to invest heavily in a GPU manufacturing process and do a ton of research to make their GPUs competitive. That would require a ton of time and cash.

 

Intel and AMD aren't focusing on Windows specifically; they can help Microsoft adopt certain features the CPU supports (like SMT and CMT).

 

I guess you've made up your mind about this community, but for anyone who is interested, a 1080p 3D TV does not output 1080p twice (once for each eye). 3D TVs deliver their image in one of two ways.

In passive 3D TVs, both images are displayed at the same time at half of 1080p. The image for the left eye might be dedicated to every odd line and the image for the right eye to every even line, so each image is in essence 1920x540. You then wear polarized glasses, and each lens blocks out the light dedicated to the other eye, so the brain sees a 3D image.

In an active 3D TV, a high refresh rate (120 Hz if I recall correctly, but this could be wrong) is used to alternate between the two images at full 1080p. You then put on battery-powered glasses, which block the light from reaching the wrong eye for each given frame. The end result is that each eye sees a slightly different image at 60 fps, giving the 3D effect.
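To make the arithmetic above concrete, here's a minimal sketch. The 1920x1080 panel, the line-interleaving scheme, and the 120 Hz refresh rate are assumptions for illustration, not the specs of any particular TV.

```python
# Back-of-the-envelope arithmetic for the per-eye image on a 1080p 3D panel.
# Panel figures here are illustrative assumptions, not specs of any specific TV.

panel_w, panel_h = 1920, 1080

# Passive (line-interleaved): both eyes are displayed simultaneously,
# each eye getting every other horizontal line -> half vertical resolution.
passive_per_eye = (panel_w, panel_h // 2)            # 1920 x 540

# Active (frame-sequential): frames alternate between eyes at full
# resolution, so an assumed 120 Hz panel gives each eye 60 frames per second.
assumed_active_refresh_hz = 120
active_per_eye = (panel_w, panel_h)                  # 1920 x 1080
active_per_eye_fps = assumed_active_refresh_hz // 2  # 60

print(f"Passive: {passive_per_eye[0]}x{passive_per_eye[1]} per eye, shown concurrently")
print(f"Active:  {active_per_eye[0]}x{active_per_eye[1]} per eye at {active_per_eye_fps} fps")
```

Running it prints 1920x540 per eye for passive and 1920x1080 per eye at 60 fps for active, which is the "half the resolution vs. half the frame rate" trade-off in a nutshell.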

To my knowledge there is no implementation of 3D TV that displays multiple full 1080p images concurrently, unless that TV is a passive 4K 3D TV, in which case each image would be 3840x1080 (half of 4K). And for the record, it wasn't my intention to bash anyone. In fact, I did my very best to articulate clearly why I thought you misunderstood what was happening when you applied your custom resolutions. I do apologize if I came across as abrasive. That being said, when you make extraordinary claims, such as owning a 6K TV and developing the Windows recovery partition, people are going to call those claims into question, and you need to be prepared to defend them. I basically live my life by the idea that extraordinary claims require extraordinary evidence.

To the OP, if you've not deleted your account by now, concerning the nature of the community and what you perceive as 'open hostility': This community, for the most part, is filled with people who want to learn more and more about technology in an open forum with free discussion. There are several members of this community who have rather specialized knowledge about certain fields and topics, such that their input holds a certain degree of credibility. The thing of vital importance to realize is that, if we question your sources or the statements that you make, it is not an attack on your person. We simply want to make certain that the information presented here is accurate and understandable.

As for what I think about your personality based on your interactions on this single thread? I think that you may be a little too sensitive to any critique, since you immediately went on the offensive after vmN sought to clarify the differences between the architecture of processing units used by a CPU and vector units used by a GPU.

Take it easy; we aren't all out to get you, but if you play the victim routine, you will get trolled by some of our members who have a low tolerance for melodrama.

I'll admit, reading this was a little painful...

"Intel wont jump in the GPU market because they aren't able to compete in it, they wont because it wont be worth it."

So true. It amazes me how few people understand this. Intel is improving their CPU graphics so people can actually see some improvement in performance. Most people do not need anything close to the power of an i3, but the low-grade graphics on Intel chips are noticeable.