Does server tick rate in FPS games affect user hardware/bandwidth?

With BF4 finally testing 30Hz tick rate servers, up from 10Hz, in a bid to mitigate some of the bad netcode problems (getting shot behind cover, etc.), is there a reason why they can't raise the tick rate to 60 to 100Hz like Counter-Strike: Source? Or is there a hardware/bandwidth bottleneck on the server or client side that prevents this?

Well, I believe it would be more demanding on the servers, certainly in terms of bandwidth, and probably in terms of hardware as well.

As for raising the tick rate that high: intuitively, at least, it would be far more intensive. The differences between BF4 and CS:S (weapons, vehicles, map destruction, etc.) could put a much bigger strain on players' internet connections. At any rate, it would likely require much more powerful servers, which EA doesn't want to spend a lot of money on.

Is it possible to roughly estimate the bandwidth required for a 60Hz tick rate?

Keep in mind that doubling the tick rate uses roughly twice as much bandwidth, and the server has half as much time to send out all those packets and still compute the next server frame. I imagine they don't waste money on servers that sit idle half the time, so they run as many game servers on the same box as the hardware will allow. Raising the tick rate would therefore mean either more hardware, or software improvements that free up the necessary resources.
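To make the "half as much time" point concrete, here's a minimal sketch of how the per-tick time budget shrinks as the tick rate goes up. The tick rates come from the discussion above; the comparison itself is just an illustration, not measured BF4 data:

```python
# Per-tick time budget at different tick rates.
# The server must simulate the frame and send out all packets within this window.
TICK_RATES_HZ = [10, 30, 60, 100]

for rate in TICK_RATES_HZ:
    budget_ms = 1000.0 / rate  # milliseconds available per server tick
    print(f"{rate:>3} Hz -> {budget_ms:6.2f} ms per tick")
```

At 10Hz the server has 100 ms per frame; at 60Hz that drops to about 16.7 ms, so everything has to finish roughly six times faster.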

Let x represent the required bandwidth to run a single server at 60Hz.

Let y represent the measured bandwidth used by a single client at 30Hz.

Let z represent the number of users serviced by a single server.

x = (60Hz / 30Hz) * y * z
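A minimal sketch plugging hypothetical numbers into that formula; the per-client bandwidth figure below is an assumption, not a measured BF4 value, and 64 players is just a common server size:

```python
# Estimate total server bandwidth at 60 Hz from a per-client measurement at 30 Hz,
# using x = (60 Hz / 30 Hz) * y * z from above.

def estimate_bandwidth_60hz(per_client_kbps_at_30hz: float, players: int) -> float:
    """Return the estimated total server bandwidth (kbit/s) at 60 Hz."""
    scale = 60.0 / 30.0  # doubling the tick rate roughly doubles the traffic
    return scale * per_client_kbps_at_30hz * players

# Hypothetical example values:
y = 40.0  # kbit/s used by a single client at 30 Hz (assumed, not measured)
z = 64    # players serviced by a single server

x = estimate_bandwidth_60hz(y, z)
print(f"Estimated server bandwidth at 60 Hz: {x:.0f} kbit/s (~{x / 1000:.1f} Mbit/s)")
```

With those assumed numbers the estimate comes out to roughly 5 Mbit/s per 64-player server, which then multiplies across however many server instances share the same box.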