The Tek 0214: This Does Not Stay in Vegas | Tek Syndicate

Regarding Comcast talking about using more electricity: I'm not trying to defend ISPs, I'm just trying to understand some things.
I know that, on a computer, the amount of electricity being used depends on the system load.
Does the power consumption of an ISP's network depend on the traffic?
Is there an estimate of how traffic affects power consumption for, let's say, Comcast's network?
I'm just curious.

I'd say one of the big reasons we aren't likely to see add-on/modular GPUs for consoles, or other similar upgrade capabilities, is that similar attempts have been made before, and most have been commercial failures.

Peripherals, or anything that doesn't come with the default system, usually don't do well on consoles for a number of reasons. The last console generation gave us a good example of the difference between an included peripheral and a later add-on peripheral, and the relative success of each: of the motion controls for the Wii, PS3, and Xbox, by far the one that did the best was the built-in Wii motion controls, even if the others were in some ways superior.

But for better examples closer to what Wendell mentioned, and as Qain touched on very briefly when he mentioned the N64 memory upgrade, we need to look further back. There was a console specifically designed around the idea of using modules to upgrade its capabilities over time, so that consumers could buy one base console and have it improved as new technology warranted, without having to buy a new console. That console was the ColecoVision, introduced in the early 1980s. Some here may have heard of it, others may not have; I doubt most of the general public has ever even heard of it. It was a decent competitor to the Atari 2600 and a cool system. It could even play Atari 2600 games, which was a huge deal (think about being able to play PS4 games on an Xbox One or vice versa). But it sold fewer than 2 million consoles over its entire short-lived life, compared to the Atari 2600's 30 million and even the Intellivision's 3 million.
It had a great slogan on the back of the box calling it the "most advanced system of today and the future!", hyping its modular upgradeability. Sadly, only three modules ever came out (the one allowing it to play Atari 2600 games, one that was really just a steering wheel, and one that turned it into a Coleco Adam computer with keyboard and printer capabilities). No real performance-boosting modules ever came out; it didn't last long enough to get to those.

It's not the only attempt like that either; some have had more or less success in different markets. The Disk System for the NES, I hear, did decently in Japan, though the N64's Disk Drive didn't do much anywhere (they didn't push it much either). The next big performance-boosting module failure came from SEGA, with all the add-ons the then-dominant Genesis received (the 32X in the U.S., the SEGA CD everywhere), which ended up hurting the company pretty badly overall. By comparison, the Super FX chip the Super Nintendo used to improve graphics and processing for select games, by supplementing the console's processor with one on the cartridge, did a lot better, but still only a handful of successful games used it. The N64's memory expansion was a neat idea, but even fewer games used it at all, even though Nintendo was smart enough to bundle the Expansion Pak with the first major game to use it (Donkey Kong 64). It seems Nintendo learned after that to stop making that kind of extra add-on, since it's rumoured the motion controls were originally designed as an add-on for the GameCube, but Nintendo made the smart decision to release them with, and as a core part of, a new console. There are more examples too, but they get even more obscure (and thus less successful).

So a modular-GPU console would be pretty cool, but I doubt we'll see it, since the console makers are mostly well aware of how similar attempts to improve systems later with optional add-ons have played out. Even the good ones usually didn't make much profit, and some are believed to have hurt companies pretty badly (as with SEGA, which no longer makes consoles).

I've always felt @paul just belonged on the Tek Syndicate team. Paul is cool peeps. Paul is good. More Paul.


A good drink for you guys to have is

a Black/White Russian for @Logan:

coffee liqueur and vodka (a White Russian adds either double cream or Baileys, etc.)

The reason they failed is that console gamers want the console to be as simple as possible.

One of the complaints they have about PC gaming is that you have to know what CPU, what GPU, and how much RAM you need for every single game. That's not actually a problem, since you can tell a store what games you want to play and the store will assemble a PC that can play them, installing Windows and all the drivers too, but apparently console gamers don't know they can do that.

They don't want to know if a certain game requires a certain module for a console, they want a box that can play all the games that were made for that box.

Another complaint is that, when you have variable hardware, games need video quality settings and you have to know how to tweak them, even though that's no longer a big problem, since most modern games have hardware detection.

Console gamers are casual; they don't want to bother with too much information, even though, like I said earlier, they don't have to.

Sure, that's one reason, though I think there are more reasons and there have been simple enough attempts that even console gamers didn't mind that aspect in certain instances.

Another problem is simply that you have to convince developers to make games for the add-on/variation, and you need to convince a lot of them to make games people want to play. Otherwise, console gamers will resist shelling out extra money for an extra that doesn't let them play new games they want (the exclusives angle is a big deal). But since fewer people own the add-on than the base console, developers are more likely to just make a game for the base console, as it has the wider market. So you end up in a tough spot getting the add-on to catch on, since consumers don't want something without the games, and developers (in general) don't want something with a smaller market.

Now, something that has been done pretty smartly is to package a new add-on with a particularly desirable game (or a proportionally desirable one). Nintendo did this in a couple of instances with the Wii, bundling new controller add-ons like the Motion Plus with Wii Sports Resort, among other cases. Guitar Hero and Rock Band did this with guitar controllers and their games, and some other games have done it as well (though we didn't end up seeing many other games developed for most of those controllers, so in a certain sense that didn't spread the way you'd want a modular GPU to). Sony tried to do this with several Move titles, but it didn't work out for them, so even that is a risky move (and thus all the more reason to avoid the field).

But, whatever the reason, the attempts have failed in the market, and so companies are going to avoid going into it in most cases.

I haven't heard of any local multiplayer games on consoles apart from FIFA in a long time. Could you name some?

I have played local multiplayer many times at a friend's house on a laptop connected to the TV, mostly games with up to 8 people using PlayStation, Xbox, and PC controllers via Bluetooth or USB hubs. For example: Team Buddies, Starwhal, Rocket League, Castle Crashers, and many more. I don't get your argument for why consoles are better at this.

I don't know about current versions of sports games, but I know that older versions of FIFA and PES for the PC supported two players on the same PC: each player can have their own keyboard or gamepad, or you can have one player on a keyboard and another on a gamepad. Is that still supported in the latest versions of FIFA and PES?

In order for that to work, you have to include a module with every game that requires it.
But what if you buy two different games that both include the same add-on controller? You only need one controller.
Now imagine that, instead of a controller, those two games both include an expensive GPU module. You only need one of those modules per console.

Absolutely, which is why that model doesn't carry over to everything. Typically there is only ever one game that has the add-on bundled in; I don't recall anything other than Donkey Kong 64 coming with the memory expansion, for instance, nor anything else coming with the Motion Plus packed in, but I could be wrong. The point of packing it in with a game is to give the person something to do with it and show off the capabilities, and if it's a popular enough game that people want anyway, suddenly the add-on is in the hands of enough people to tell other developers it's worth using. This doesn't always work out either (Final Fantasy XI did this with a PS2 hard drive, I believe, and that didn't catch on).

With a GPU, I'd say this would be harder, due to the greater expense that is likely for the component. It would likely leave you in the 32X situation, which was one of the worst failures in console history.

Though, on second thought, the Super FX chip essentially did this too, including the better processor/graphics capabilities on a chip in every game that used it. SEGA criticized it, saying the 32X was a "one time purchase giving lots of capable games", but the Super FX turned out to be a better fit. However, you can't do that on disc-based consoles.

Looks like you've only posted YouTube comments until now.
The reply button on the bottom of the page is to reply to the entire thread, not to a specific user. That's how you make your own comment.

It's mostly for games like Halo, COD Zombies, other FPSs, and racing games. It's true that a lot of indie games are closing the gap in this respect, especially in the last five years, but for the most part, if people want to play split screen on any of the AAA titles, it has to be on console. I guess the big game companies don't realize there's a large number of people who want a living room gaming experience with their PC, or maybe they do but they just want to push consoles for that. If PC got some good split-screen FPSs I would be happy. Maybe there are some; I just don't know about them yet.

Thanks so much for providing the source <3!

Yeah, local multiplayer is an aspect PC gaming is not very good at. Though it seems to mostly be an awareness problem: the perception that you can't do local multiplayer on PC means people who want it don't even ask, so few games support it.
A good example of this is Portal 2. While I don't recall whether it had a simultaneous launch on all platforms, it had local co-op on consoles but not on PC, even though you could get it working somewhat through console commands. Later in the game's life, local co-op was officially patched in on PC, but as previously mentioned, it was always present in the PC version, just not officially supported or exposed.

Another thing to consider is that, if a game already has online multiplayer, adding local play is going to lose some sales, since you only need one copy for local and at least two for online.

Something else related to this is the issue of typical PC input with a keyboard and mouse. I was quite surprised when I found out that using multiple keyboards and mice as separate input sources is entirely possible on PC (well, at least on Windows; I don't know about other OSes) and has been for a long time. But how many applications/games do you know that support that? Yeah, none seems about right, unfortunately (I only know of a single one).
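
For anyone curious how that works on Windows, here's a minimal sketch of the Win32 Raw Input API, which is what lets a program tell physical keyboards and mice apart instead of getting the single merged "system" keyboard/mouse. This isn't from any particular game, just an illustration; it assumes you already have a Win32 window with a message loop, and the function names and printf logging are placeholders for the example:

```c
/* Minimal Raw Input sketch (assumes an existing Win32 window and message
 * loop; printf logging and function names are placeholders). */
#include <windows.h>
#include <stdio.h>

/* Call once after creating the window: ask Windows to deliver WM_INPUT
 * messages for all keyboards and mice, instead of the merged input that
 * normal WM_KEYDOWN / WM_MOUSEMOVE messages give you. */
void register_raw_input(HWND hwnd)
{
    RAWINPUTDEVICE rid[2];

    rid[0].usUsagePage = 0x01;           /* generic desktop controls */
    rid[0].usUsage     = 0x06;           /* keyboard                 */
    rid[0].dwFlags     = RIDEV_INPUTSINK;
    rid[0].hwndTarget  = hwnd;

    rid[1].usUsagePage = 0x01;
    rid[1].usUsage     = 0x02;           /* mouse                    */
    rid[1].dwFlags     = RIDEV_INPUTSINK;
    rid[1].hwndTarget  = hwnd;

    RegisterRawInputDevices(rid, 2, sizeof(rid[0]));
}

/* Call from the window procedure when a WM_INPUT message arrives.
 * raw.header.hDevice is different for each physical keyboard/mouse,
 * so a game could map "keyboard A = player 1, keyboard B = player 2". */
void handle_wm_input(LPARAM lParam)
{
    RAWINPUT raw;
    UINT size = sizeof(raw);

    if (GetRawInputData((HRAWINPUT)lParam, RID_INPUT, &raw, &size,
                        sizeof(RAWINPUTHEADER)) == (UINT)-1)
        return;

    if (raw.header.dwType == RIM_TYPEKEYBOARD)
        printf("key 0x%x from keyboard %p\n",
               raw.data.keyboard.VKey, (void *)raw.header.hDevice);
    else if (raw.header.dwType == RIM_TYPEMOUSE)
        printf("mouse move (%ld, %ld) from mouse %p\n",
               raw.data.mouse.lLastX, raw.data.mouse.lLastY,
               (void *)raw.header.hDevice);
}
```

The key point is that `raw.header.hDevice` stays the same for a given physical device while it's plugged in, so per-player input routing is possible; the API has been there since XP, it's just that almost no games bother to use it.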


Sony and Microsoft also bled money for 5 years before turning a single penny of actual profit (as opposed to revenue), and Microsoft lost huge amounts of money on the original Xbox as well. I can't find the original source anymore, but Microsoft lost billions on their consoles, and I don't think the Xbox business has ever actually turned a net profit overall.


This is why Sony and Microsoft are (or were) so scary in the console market: they can afford to waste billions of dollars getting a trojan horse into people's living rooms in the hope of making that money back a decade later. Companies like Nintendo or Sega simply can't keep up without charging laughably outrageous prices.

Anyway, I'm hugely disappointed that a Pachter "article" made it onto The Tek. The guy exists purely for clickbait. Pachter is the guy who claimed the PS3 would kill PC gaming, and in 2010 he claimed the 5-year-old 7th-gen consoles were better than 90% of PCs (http://www.playstationlifestyle.net/2010/03/23/pachter-ps3-is-more-powerful-than-90-of-pcs/). Pachter is the guy who said every year "THIS IS THE YEAR THE WII HD RELEASES!" only to give up and declare his dumbest comment ever: "PS3 is the Wii HD!" (http://www.eurogamer.net/articles/being-michael-pachter-interview?page=3)
Pachter is usually so incredibly wrong that it's not outside the realm of possibility that he's trying to influence people, companies, and markets.

That is the dumbest thing I ever read. 2010 saw graphics technology make a pretty big leap with DX11; by the time he wrote that article, there were things running circles around the last-gen consoles graphics-wise. A Radeon 5450, a low-budget, low-profile GPU, offers more than double the performance of an Nvidia 7900 GTX, which was around the performance of the graphics card in a PS3. Hell, I had to do an upgrade in 2011 just to run games because of the graphics leap; I got a 550 Ti back then.

Shows how small PC gaming was back then; most of the comments are defending him, lol.

4TB SSD? I found myself a 13TB SSD. I don't think I can get one, though, as I am a consumer.
http://www.fixstars.com/en/ssd/spec/

Another aspect I think folks are missing is the children. I know when I was 6 or 7 and got my first SEGA Genesis I was happy as shit, and my four-year-old son loves playing PS4 now, but he can't play PC games. He doesn't know how Steam keys work. He wouldn't know how to troubleshoot or what settings to tweak, etc. All the benefits of PC gaming would be lost on him and it would cease to be fun for him.

I think the PC crowd always acts like they're the hottest shit ever, and honestly it's that attitude that enrages me.


Do I really have to repeat myself:

What? Steam keys? Knowing how keys work is only useful if he wants to buy games. If he just wants to play games that you've purchased, he doesn't need to know what keys are. Why should a 4-year-old know how to purchase games? Do you let him buy his own games on your PS4?

That's because, when you were a child and the SEGA Genesis was released, PC gaming was done on MS-DOS, and it sucked.
I think somebody called that period the dark age of PC gaming. Back then, consoles had a purpose, even for tech-savvy people.
But now, Windows and Steam have made PC gaming so easy that consoles don't have a purpose anymore.

I'm from Romania, and I was born right after communism fell, in 1990. Before that, people didn't have PCs or consoles.

I, and other children of my generation, got our first computers at a time when games were being developed for Windows. Nobody could afford a computer when I was 4 or 5; we got our first computers when we were around 10, so around the year 2000.

There was almost no gaming culture before I got my first PC, because our parents had lived under communism, and even years after it fell most people couldn't afford a console or a PC, so it was basically a blank slate: nobody had any preconceived notions about consoles or PCs.

I also had an old Nintendo console; I forget what it was called, that's how much I cared about it. I played on it for a while, but when I got my first computer, I immediately dumped it.
I dumped it because, even as a young child, I realized that consoles can only play video games, and I would still need a PC to do everything else. I preferred having one device that can do everything instead of two different devices.

I only knew one other guy who owned a console, a PlayStation; everyone else I knew was playing, and is still playing, on PC.

My Nintendo and that guy's PlayStation are the only consoles I have ever seen in real life; everything else I've only seen on the Internet. Before I had Internet access, I didn't even know that other consoles existed, or that in developed countries there's a fan base for those things. It's all very weird to me.

Then I went to college, where I lived in a dorm, and guess what? I visited a lot of rooms and there were no consoles to be seen; it was like they never existed. Every single student had only a laptop and nothing else, with some exceptions where somebody had a desktop instead.

If, during my childhood, PC gaming had been on MS-DOS instead of Windows, I probably would've kept using consoles.

I still game on my i7 920 @ 3.9GHz and I run pretty much everything at Very High or Ultra at 1080p... There were a couple of upgrades along the way, like a SATA 3 PCIe card and an SSD. I recently bought a 970; it might bottleneck, but I can't detect it when all the games run smoothly at 1080p. So yeah, it's possible to build something that'll last a long time when all you need to change is the GPU from time to time.