Building a home media server

That's exactly the point! That's exactly it! That is what everyone misses.

People who think "oh, my game sucks, I'd better upgrade my CPU" are flat-out wrong 99% of the time. The same holds true for CPU loads while gaming (more so for Plex/Emby than for game streaming).

If the CPU isn't doing much because the bottleneck is the GPU, then it doesn't matter whether there is encoding or streaming or whatnot going on in the background. The fact that the CPU that was idling is now under load just doesn't matter, because the bottleneck is still the GPU. The only GPUs that can shift any of that bottleneck onto the CPU (even an FX 8350 @ 3.4 GHz) are the $500+ graphics cards. Even an FX 8350 @ 3.4 GHz (widely derided as terrible for gaming) can actually get every frame out of a 1050 Ti (or 99% of them).

If you have anything less, like a GTX 690, GTX 770, GTX 960 or 1050 Ti (see the trend?), then CPU load will NOT affect gaming performance. People do not understand that, because the other reviewers only ever test with $500 GPUs; they are deliberately shifting the bottleneck onto the CPU, since otherwise every CPU would perform exactly the same.
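To put the logic in a more concrete form, here is a toy sketch (my own framing, with made-up numbers, not from any benchmark): per frame, the CPU prepares work and the GPU renders it, and whichever takes longer sets your frame rate.

```python
# Toy bottleneck model: frame rate is set by whichever of CPU or GPU
# takes longer per frame. All of the numbers here are made up purely
# for illustration.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Approximate FPS when CPU and GPU work mostly in parallel."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# GPU-bound case (think mid-range card): GPU needs 16 ms, CPU only 8 ms.
print(fps(cpu_ms_per_frame=8, gpu_ms_per_frame=16))   # ~62 FPS

# Background encoding slows the CPU down, but 12 ms is still under 16 ms,
# so the frame rate barely moves.
print(fps(cpu_ms_per_frame=12, gpu_ms_per_frame=16))  # still ~62 FPS

# Only with a high-end GPU (~8 ms/frame) does the CPU become the slower
# side, and only then does the extra load actually show up in FPS.
print(fps(cpu_ms_per_frame=12, gpu_ms_per_frame=8))   # ~83 FPS instead of ~125
```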

That is why I recommended to the OP that a cheaper alternative would be to just put the media server software on their home PC. Unless they have some card ("card x") that scores well above a 1050 Ti (~6k points) but below a 1080 Ti (~13k points), the system will still game the same even under CPU load.

And by that point (say a GTX 1070), you are already seeing such high average FPS counts that it doesn't actually matter even then. Humans can't really differentiate between 100 fps and 85 fps after all.

As someone with a 144 Hz monitor who can tell the difference between 144 Hz and 120 Hz, I LAUGH at that. I've accidentally tested it on myself. After moving my system around to make some changes, it switched down to 120 Hz without my noticing (fucking DVI), but after playing CS for a few minutes I could tell something was wrong.

Yeah, but most people don't pair a $300 8-core/16-thread CPU with a <$150 graphics card...

That's just plainly wrong. My Xeon 1231v3 bottlenecks a 980 Ti (a rough equivalent of a 1070) at 1080p and even at 1440p (using DSR to simulate it), without any transcoding task going on, just playing games. In Battlefield 1 and 4, my Xeon 1231v3 holds the 980 Ti to the 80% utilization range, sometimes dipping as low as the 60% range at 1440p in multiplayer. The 1070 prices are back in the $400s and will soon drop back into the $300 range as the mining craze passes; that's inevitable.

Yeah, it is one of the first-gen Chromecasts, which only operate on 2.4 GHz; the second-gen one does both 2.4 GHz and 5 GHz.

This is either placebo or actually a frame-time issue. Frame times are more closely related to minimum frame rates than to averages. For example, to the human eye, 24 fps video looks smoother than 40 fps in games. Why? Frame times. Some games have issues with frame times, especially poorly optimized ones.

It is not that you can tell the difference between 120 fps and 144 fps; you can't. It's that sometimes a given frame takes longer to render than the others, so even though the minimum or average frame rates do not show it, the actual experience is watching a specific frame stay on the monitor longer than the frames that come faster in succession. I did not do enough testing to draw definitive conclusions about frame times under load. However, for minimums (as reported) and averages, the conclusion is clear.
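If anyone wants to see what I mean by averages hiding this, here is a rough sketch of pulling average, minimum, and 1% low figures out of a per-frame time log; it assumes a plain text file with one frame duration in milliseconds per line (adapt the parsing to whatever your capture tool actually writes).

```python
# Rough sketch: turn per-frame render times into the numbers people argue
# about. Assumes "frametimes.txt" contains one frame duration in
# milliseconds per line (the file name and format are placeholders).

def load_frame_times(path: str) -> list[float]:
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def summarize(frame_ms: list[float]) -> dict[str, float]:
    slowest_first = sorted(frame_ms, reverse=True)
    worst_1pct = slowest_first[: max(1, len(slowest_first) // 100)]
    return {
        "average_fps": 1000.0 * len(frame_ms) / sum(frame_ms),
        "minimum_fps": 1000.0 / max(frame_ms),                     # single worst frame
        "1pct_low_fps": 1000.0 * len(worst_1pct) / sum(worst_1pct),
    }

if __name__ == "__main__":
    print(summarize(load_frame_times("frametimes.txt")))
```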

Agreed. Most people who PC game do so at 720p using integrated graphics. They pair their $200 Intel CPU with nothing.

Now, within that group of "most people" there is the sub-group of PC gamers, within that the sub-group of PC gamers with decent hardware (builders), and then another sub-group that cares about getting the best performance they can out of their games.

edit: typo

No, it's not. There is a reason people actually buy monitors above 60 Hz: BECAUSE THERE IS A DIFFERENCE. I'm playing CS:GO on a 980 Ti and a Haswell quad-core Xeon with HT @ 3.6 GHz, and there are no frame-time issues. I sustain higher than 400 fps constantly, without dips, ever.

So your counter-argument is that there is a small subsection of PC gamers with good hardware? ARE WE NOT TALKING ABOUT THIS SPECIFIC SUBSECTION RIGHT NOW? The dude is building a FreeNAS box in addition to his pre-existing system; you think he has shitty hardware? His leftovers are a fairly competent system by themselves.

Also, you're plainly wrong that most people play with integrated graphics...

http://store.steampowered.com/hwsurvey


The last bit. If you couldn't tell, I was addressing the first part of the text I quoted, not the second, specifically this bit:

I never addressed the second part of the quote; it was left in by accident. No, I'm not going to do benchmarks to prove common sense; your benchmarks are ridiculous and prove no non-obvious point. Yes, if you pair a half-decent 8350 or a very, very good 1700 with a basic (or years out of date) graphics card, there is going to be CPU overhead to spare that could go towards encoding. But if you're working with a modern graphics card at the 1070 level or higher, you are going to have issues. Once again, most people with a good CPU aren't pairing it with a $150 graphics card. There is a REASON people say the graphics card should be the most important component in a gaming desktop.


I'm done with this thread; these arguments are ridiculous. @ OP, if you decide to go FreeNAS and want help setting it up, tag me and I'd be happy to help.

Cool story bro. Any benchmarks? Any controlled testing? Anything besides saying nuh-uh and using caps lock?

The idea is that we are talking about a really niche group and we have not asked about his specific hardware.

Knowing that CPU loads like encoding do not actually affect gaming performance for any relevant CPU (i.e. not dual-cores), and are not even noticeable in benchmarks except on ultra-high-end setups where FPS counts are so high they do not matter anymore, I gave every option, including recommendations supported by benchmarks, and let him pick based on his available resources, knowledge of his hardware, etc.

So your counter-argument is that there is a small subsection of PC gamers with good hardware that would be able to measure a difference in benchmarks? And that the OP is totally, probably, within that niche within a niche, without asking? Wait, what?

I am so very glad we both agree on this point.

It was exactly to put actual numbers on that very obvious point, that the CPU just does not matter for gaming even while encoding in the background, that made the benchmarks I did worth doing. :slight_smile:

So far, with the Chromecast and the Fire TV Stick, some videos tend to stutter (the Fire Stick showed it even with Direct Play). Running the desktop app, it's butter smooth, which is probably due to the wired connection. I can't seem to connect the second-gen Fire Stick to the 5 GHz network to see if that changes things. Anyway, with the older hardware I have done about 12 Blu-ray rips at 3-4 GB each; comparing screenshots of the straight MKV rip vs the Handbrake encode (no cropping, 1920x1080, RF 20, x264 preset Medium), I can't seem to tell the difference.

Some people can tell the difference; I am one of them.

I would love to move it to Slow or Slower, but man, it already takes 1 hr 30-40 min; I bet Slow is like 2 hours.
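For what it's worth, this is roughly what that workflow looks like scripted: a sketch that assumes HandBrakeCLI is installed and on the PATH, with the two folder paths as placeholders; double-check the flags against your HandBrake version.

```python
# Sketch of batch-encoding MakeMKV rips with HandBrakeCLI, matching the
# settings discussed above (RF 20, x264, preset Medium, no cropping).
# Assumes HandBrakeCLI is on PATH; the paths below are placeholders.
import subprocess
from pathlib import Path

SRC = Path("/mnt/rips")    # straight MKV rips from MakeMKV
DST = Path("/mnt/media")   # finished encodes for the media server

for mkv in sorted(SRC.glob("*.mkv")):
    out = DST / mkv.name
    if out.exists():
        continue           # skip titles that were already encoded
    subprocess.run([
        "HandBrakeCLI",
        "-i", str(mkv),
        "-o", str(out),
        "-e", "x264",
        "-q", "20",                    # RF 20 constant quality
        "--encoder-preset", "medium",  # change to "slow" if the extra time is worth it
        "--crop", "0:0:0:0",           # no cropping
    ], check=True)
```

Moving from Medium to Slow mostly buys a somewhat smaller file at the same RF, which is part of why the picture barely changes while the encode time balloons.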

Results of 2 concurrent streams to 2 desktop PCs. Waiting to test over the internet next.

Media PC Specs:
AMD Phenom II X6 1090T Black Edition
Asus M4a87TD EVO ATX Motherboard
8GB DDR3 Rip Jaws
1TB WD Blue
64GB ADATA SSD for jails (since it wasn’t being used :slight_smile: )
CX550M PSU

[Screenshots: CPU usage, disk usage, memory, processes]



Hardware used: Xeon 1231v3, 16GB DDR3, 980 Ti, 1080p, Fraps as FPS benchmark

Game: Battlefield 1, 1080p, DX11 (due to stuttering in DX12 mode and overall higher performance in DX11), ultra preset, on a full 64-player multiplayer server on the map Empire's Edge; test run from the default spawn to the A objective and back

OCCT was used to occupy 1 to 4 threads, simulating one or several encoding tasks. It serves as a generic CPU load since a media server is not set up on this PC, and CPU load is CPU load.
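(If anyone wants to reproduce the generic-load part without OCCT, a few busy-loop worker processes do the same job. A minimal sketch, where the worker count is whatever you pass on the command line:)

```python
# Minimal stand-in for an OCCT-style load: spin up N worker processes
# that each peg one core with pointless arithmetic until Ctrl+C.
import multiprocessing
import sys

def burn() -> None:
    x = 0
    while True:
        x = (x * 31 + 7) % 1_000_003  # meaningless math, just to keep a core busy

if __name__ == "__main__":
    workers = int(sys.argv[1]) if len(sys.argv) > 1 else 4  # e.g. 1-4 simulated "encodes"
    procs = [multiprocessing.Process(target=burn, daemon=True) for _ in range(workers)]
    for p in procs:
        p.start()
    try:
        for p in procs:
            p.join()
    except KeyboardInterrupt:
        pass
```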


So a Haswell quad with hyperthreading running at 3.6 GHz, that's a relevant CPU, isn't it? It's around the same speed as a stock i5 7600K or a stock 1500X. As you can see, when you use a balanced CPU and GPU combo, not a ridiculous 1050 Ti and 8-core Ryzen 7 combo, there is definitely a performance difference when you add external CPU load. It's not like a 980 Ti is a super-powerful top-of-the-line card anymore either; it's equal to or slower than a mid-tier GTX 1070 or R9 Fury X.

-_-


Ran it in BF1 as opposed to CS:GO because it is impossible to get a repeatable test in a live CS:GO game, whereas BF1 has areas on the perimeter of the map where continuous, uninterrupted benchmarking can be done.

86% preferred 120 Hz to 60 Hz and 88% were able to tell which was which. Sorry mate, but that's pretty definitive.
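For what it's worth, the statistics back that up even for a modest sample. The sample size below is my own assumption (it isn't stated in the quote); the point is just that an 88% hit rate against 50/50 guessing is astronomically unlikely by chance.

```python
# Exact binomial tail: probability of getting at least k answers right out
# of n by pure guessing (p = 0.5). n = 50 is an assumed sample size for
# illustration; the 88% figure is the one quoted above.
from math import comb

def p_at_least(k: int, n: int, p: float = 0.5) -> float:
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

n = 50
k = round(0.88 * n)  # 44 of 50 correctly identified the 120 Hz panel
print(f"P(>= {k}/{n} correct by chance) = {p_at_least(k, n):.2e}")
```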


So Xeon 1231v3, 980 Ti
BF1 @ 1080p under OCCT
unknown OS

  1. Props for actually doing the benchmarks. ^.^
  2. OCCT is stress-testing software. The tests I did were literally encoding in the background + games, which is the use case I described to the OP, as opposed to stress-testing software + games, so mine are more “real-world” than theoretical like yours.
  3. The biggest difference should be that OCCT produces significantly lower minimums than encoding software does. Given how important minimums/frame times are to gaming, it is not advisable to extrapolate from OCCT + gaming results to actual encodes + gaming.
  4. I don’t really want to highlight this, but it has to be said: in terms of CPU architecture, remember that the FX series is an 8-core CPU where every 2 cores share an FPU. As per the original TekSyndicate FX 8350 streaming benchmarks [YouTube video], that architecture performs better under load than a 4-core + hyperthreading architecture. True 4-core architectures (w/HT) should see a higher % drop in average and minimum FPS when streaming. The OP has a 6-core CPU with no HT that is 1 generation older than the FX series. So that gives us a very good idea of how it will perform, but it is difficult to draw hard conclusions.
  5. Remember that different applications respond differently; each one has its own quirks. So testing different CPU architectures, with a different GPU, on a different OS (this can actually matter), across different applications makes it very difficult to compare results directly. We can only compare the trends, due to all of the differences noted above.

With the disclaimers above, let’s combine your results with mine. The numbers and percentages themselves are not comparable, but the trends should be. Here are my synthetic GPU results for idle vs load:

It would actually be helpful if you did Unigine Heaven benchmarks too. But anyway, the Unigine Heaven averages do not take the minimums into account, so they can be completely ignored.

Metro Last Light is horribly unoptimized and always CPU-bound…just always. It is an example of how not to program games:

SMT does not play well with MLL. So anyway, 3 graphs, at least 3 conclusions, right?

  1. OCCT / Haswell quad-core w/HT + 980 Ti: the difference under 100% core load is measurable across core counts. The average stays above 60 FPS in all cases, even while running 4 threads that fully load every core.
  2. Unigine Heaven chart (remember that Unigine does not take minimums into account when reporting averages): minimums are always affected by CPU load.
  3. MLL: in CPU-bound games, both averages and minimums drop significantly.

The 4th conclusion is of course the one from the Shadow of Mordor charts:

  4. In GPU-bound games, CPU encoding load does not affect gaming significantly.

And you confirmed my suspicion or “ours” rather:

So what you mean when you say “balanced” is a $200 CPU and a $650 GPU? Yeah… it’s pretty clear which component games scale off of and which component they do not care about. Gamers Nexus did some benchmarks on the $70 G4560 (a modern Intel dual-core with HT) and found the scaling cap to be at around a $200-250 GTX 1060 6GB. My tests show an FX 8350 @ 3.4 GHz (a 5-year-old $200 CPU) does not bottleneck a 1050 Ti (a modern $120 card). They did some other testing showing a 30-50% performance increase when switching from the FX to an i7-7700K given a 1080 Ti. Your tests are not conclusive enough to draw scaling information, but they do show that the performance hit for games is measurable when using a very high-end GPU under CPU load.

By the way, it is not that I thought there was no difference at all; rather, my point was that the difference is merely measurable, as opposed to something you would actually ever care about during gameplay. I did not draw any conclusions about minimums, due to lack of data, beyond noting that they are affected even more.

So your results show that you are above 60 FPS even while under load, in a CPU-heavy title, paired with a GPU that was originally $600? Yeah, the CPU does not matter for gaming, even while under load.

It serves the purpose of taking CPU cycles; the specific task does not matter as long as CPU cycles are being taken.

The average was, yes, still above 60 with 50% of the CPU kneecapped, but I was on an extremity of the map, and the variance in fps made it near unplayable. When turning the character, the fps would go from above 60 down to sub-50, while staying above the minimum. This gave massive tearing, stutter, and an overall bad playing experience. If I could measure frame-time consistency, it would be appalling.

The problem with Heaven is that it only really uses one to two threads, as can be seen in Task Manager while running it. It’s optimized for benchmarking the graphics card, not the CPU at all (unlike a game, which taxes both), and thus is a fair test of GPU performance but not of system gaming ability.
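You can check that without Task Manager too; a quick loop like this (needs `pip install psutil`) prints per-core utilization once a second while the benchmark runs, so you can count how many cores Heaven actually keeps busy.

```python
# Print per-core CPU utilization once a second while a benchmark runs,
# to see how many threads it actually keeps busy. Requires psutil
# (pip install psutil).
import psutil

try:
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        busy = sum(1 for load in per_core if load > 50)
        print(f"{per_core} -> {busy} core(s) above 50% load")
except KeyboardInterrupt:
    pass
```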

Price is not a good metric for determining balance. A $300 7700K will game better with a 1080 Ti than a $1000 7900X, and a $300 7700K will game better with a 1080 Ti than a $500 1800X. But I think most people would say a 1231v3 (now roughly 7600K/1500X equivalent) is a pretty balanced pairing with a 980 Ti (now roughly 1070 equivalent).


So what happened in the end? Did you build a temporary junk server to see if it will suit your needs before you drop a grand on some new gear? How is it going?

Yeah, I built one with some leftover parts and a 1TB drive to test it out. I’ve put about 25-28 movies on it. It’s working out so far, streaming to a wired desktop in the house. It has some issues with Chromecast, even on the newest version. I am at the point, though, where going from MakeMKV to Handbrake is a bit time consuming at 1 hr 30 min per movie. I’d rather just go through MakeMKV and call it done, but the bitrate of 15-24 GB movies is a bit of a strain. It looks good at the settings I have it on, pretty much equal to the straight MKV rip.

I have never tried to do any of this, even though I use freenas extensively to watch movies, TV, listen to music, archive data etc.

I guess it depends what you are trying to achieve and why, and what limitations you have to deal with.
Do you want to watch perfect copies of your Blu-rays without changing discs, etc.? That is going to take work; there is no easy one-click automatic solution.

I can think of some fantastical solutions that may or may not work.
It seems the ripping software only works on Windows/Mac, but you could run it in a VM.
Or you could rip on a PC and send the files to FreeNAS, where Handbrake will transcode them and dump them in your media folder.
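A rough sketch of that second idea, assuming a jail or VM with HandBrakeCLI installed and the two dataset paths below as placeholders; it just polls an incoming folder and drops finished encodes into the media folder.

```python
# Watch-folder transcoder sketch for the "rip on the PC, transcode on the
# server" idea. Paths and preset name are placeholders; assumes
# HandBrakeCLI is installed in the jail/VM and that uploads are complete
# before they are picked up.
import subprocess
import time
from pathlib import Path

INCOMING = Path("/mnt/tank/incoming")  # where the ripping PC drops raw MKVs
MEDIA = Path("/mnt/tank/media")        # the folder the media server serves

while True:
    for mkv in sorted(INCOMING.glob("*.mkv")):
        out = MEDIA / mkv.name
        if out.exists():
            continue
        subprocess.run(
            ["HandBrakeCLI", "-i", str(mkv), "-o", str(out),
             "--preset", "Fast 1080p30"],  # swap in whichever preset you prefer
            check=True,
        )
        mkv.unlink()                       # drop the raw rip once the encode succeeds
    time.sleep(60)                         # poll once a minute
```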

For the effort put in to get all of that working, you could probably just go and download it, knowing you have bought the item.

Even then you run a legal risk. Even if you own the physical medium, downloading the exact same content is illegal in several countries.

Getting an ISO or remuxing the Blu-ray is also still illegal in several countries (because you have to circumvent the copy protection), so in those countries there is technically no legal way to obtain a copy for a NAS… though an ISO might be a little easier if it’s a block-level copy, because then the copy protection is intact.

What are you, a boy scout?

@anon54210716 Don’t worry too much about the legal garbage esp. if you are ripping your own discs. Downloading is so much easier, tho


I’m just saying, some people want to be cautious or just care about it :stuck_out_tongue:

What anyone does with that information is no business of mine :wink:

Also, it kinda bothers me that there is technically no legal way (for me), not that I haven’t ever downloaded anything…