Why are there no laptop AMD FX processors?

I purchased a laptop about a year ago that has an A10-4600M and a 7970M, expecting the processor to perform roughly on par with Intel's mobile high-end i3 / low-end i5 range. However, I noticed while playing some games that my old Dell was able to run them faster and smoother than the laptop I bought. Thinking it was just the GPU, I looked up benchmarks and found that the 7970M performs twice as fast as the 9800GTX in my Dell. Looking up benchmarks for the A10-4600M, I found that it performs nowhere near the capabilities of Intel's mobile i5 processors and in some cases even worse than some of AMD's old laptop processors.

So I guess my real question is: why has AMD allowed Intel to practically take over the laptop processor market? I don't hear as much about their laptop processors as I used to. How come no one over at AMD decided to make the FX line into a laptop processor, instead of only the underperforming A series? This laptop is a prime example of what Logan describes as a tricycle with a rocket attached to it.

I'm pretty sure it has to do with the heat AMD's FX processors give off.

FX processors use too much power and put off too much heat to go in a laptop. The laptop would need a large, heavy power supply, would be loud, and would basically not have a battery.

Since the no-FX mystery has been solved: why have they only decided to make laptop A-series processors, instead of also making a new line like they did before with Turion (unless I'm mixing that up with something else)? Intel-based laptops have gotten way more expensive from what I've seen. There is basically no competition in that market.

A quad-core FX at a similar clock speed to Intel's mobile CPUs wouldn't be prohibitively hot or power hungry. I don't know why they never did that.

We can experiment. Can anyone here with a desktop FX chip run some benchmarks with four active cores at, say, 2.5-2.8 GHz and report what kind of temperatures you get compared to normal? A sketch of one way to do it is below.
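For anyone who wants to try, here's a minimal sketch in Python (Linux only, psutil assumed) that pins busy loops to four cores and logs CPU temperatures while they run. Capping the clock to the 2.5-2.8 GHz range would have to be done separately, e.g. with cpupower.

```python
# Minimal sketch (Linux, psutil assumed): load four cores and log temperatures.
# Cap the clock separately beforehand, e.g.: sudo cpupower frequency-set --max 2.8GHz
import multiprocessing
import os
import time

import psutil


def burn(core: int) -> None:
    """Pin a busy-wait loop to a single core to generate sustained load."""
    os.sched_setaffinity(0, {core})
    while True:
        pass


if __name__ == "__main__":
    # Spin up one worker per core on cores 0-3.
    workers = [multiprocessing.Process(target=burn, args=(c,), daemon=True)
               for c in range(4)]
    for w in workers:
        w.start()

    # Sample temperatures every 5 seconds for about a minute.
    for _ in range(12):
        time.sleep(5)
        for name, entries in psutil.sensors_temperatures().items():
            # Sensor names vary by board; AMD chips typically show up as 'k10temp'.
            for entry in entries:
                print(f"{name} {entry.label or 'temp'}: {entry.current:.1f} C")
```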

The reason you don't see FX processors in laptops is that the eight-core parts are essentially server silicon. The FX parts also consume too much power and run too hot: the lowest-power FX processors are 95 W, and while they are cheap, that is too much for a laptop. AMD is probably looking for new laptop and mobile solutions as we speak. Intel has always been about making really fast parts that are as energy-efficient as possible, while AMD is oriented toward being a great option at a cheap price; they aren't as concerned about power efficiency. The FX-9590 is 220 W, which is an insane amount of power. The Core i7-4930K only draws about 130 W.

They would still need to scale it down further to match Intel's heat and power output, and a quad-core FX (4300?) clocked at 2.5-3.0 GHz would still be way behind an Intel CPU at similar clocks.

Bulldozer and its successors aren't great at low clock speeds (that's the Bobcat family's territory).

And the limited cooling solutions in laptops are generally horrible.

Not saying they have to stick eight or six cores in a laptop. I guess what I'm really trying to get at is: why hasn't AMD released an i7-4900MQ equivalent, or at the very least something that can perform on par with an i5-4200M, instead of just processors that are barely more capable than the Phenom series of laptop chips?

The problem is primarily the inefficiency of AMD's processors per clock cycle. They simply aren't able to perform as many operations per clock as Intel's, which is why a 3.1 GHz Intel Core i5 can squash a 4 GHz AMD FX-4350 in synthetic benchmarks. Of course, synthetic benchmarks are traditionally seen as important, yet paradoxically they can say little about actual performance in everyday tasks that aren't optimized for such testing.

I would love to see AMD take a greater interest in next-generation transistor gate and sink design. Intel still holds the patent for its tri-gate design, but that may not necessarily be the best design possible. AMD shouldn't chase the extreme miniaturization that Intel is so gung-ho about; instead, it should shrink transistors only to the threshold where heat severely hampers higher clock speeds and transistor leakage becomes the primary limitation, then find ways to optimize and lower the power usage at that clock rate. I know Intel touted its use of hafnium-based gate materials to help prevent leakage when it shrank its transistors to 45 nm.
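To put the IPC point in numbers, here's a back-of-the-envelope sketch in Python. The model (performance roughly equals IPC times clock) is the standard first-order approximation, but the IPC values below are illustrative assumptions, not measured figures.

```python
# First-order model: single-thread throughput ~ IPC x clock (GHz).
# The IPC numbers below are illustrative assumptions, not measurements.

def throughput(ipc: float, clock_ghz: float) -> float:
    """Approximate billions of instructions retired per second."""
    return ipc * clock_ghz

i5_score = throughput(ipc=1.6, clock_ghz=3.1)  # hypothetical Intel IPC
fx_score = throughput(ipc=1.1, clock_ghz=4.0)  # hypothetical FX IPC

print(f"i5 @ 3.1 GHz: {i5_score:.2f}")  # ~4.96
print(f"FX @ 4.0 GHz: {fx_score:.2f}")  # ~4.40
```

Even with a near-gigahertz clock deficit, the higher-IPC chip comes out ahead in this toy model, which matches what the benchmarks show.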