What's up guys.
I know that you can run an 8ohm speaker on a 4ohm amplifier or unit just fine,
but running a 4ohm speaker off an 8ohm unit is not good for your setup.
I was reading some Amazon responses to some of the user questions, and some guy was asking about the impedance of a certain powered subwoofer. The response: "Doesn't matter(what the impedance is), the internal amp and speaker are already impedance-matched".
How accurate is this? Am I safe to just ignore the impedance of an amplifier's load in a self-powered unit?
Lastly, if this is indeed correct, am I safe to hook an amplifier with a 4ohm load to a unit that has a dedicated subwoofer line out for an 8ohm load?
Thanks a lot people!
How will you be attaching the subwoofer unit to the receiver unit?
If your receiver/amplifier has a dedicated low-level subwoofer output, a la RCA cable, then it's perfectly fine and safe to connect any self-powered subwoofer to it that has a low-level input. This is because, within the receiver/amp, the audio signals for the speaker outputs and the dedicated subwoofer output are usually run through entirely different amplifier chips.
Yeah, it'll be an RCA cable with the ends spliced into the unit's proprietary connector. So I'll have to chop up one of my spare cables.
Thanks for your clear response. I appreciate your help!
Most subwoofers are 'self powered'. If it plugs into the wall, then it has its own amplifier (and it is safe to assume the manufacturer matched it accordingly...I hope!).
The 8-ohm and 4-ohm specs are for unpowered speakers receiving an amplified signal. It doesn't matter for unamplified (line-level) signals, i.e. the cables sending the audio signal to the amp, or from the receiver to a powered subwoofer.
Unamplified signals carry very little power (watts). Amplified signals can carry a hundred watts or more (this is what the watt rating on speakers and amplifiers refers to). 100 watts is very, very loud, fyi. Anyway, an amplifier's output acts roughly like a voltage source, so when the load impedance is halved, the amp has to push about twice the current (and gets hot) to hold the same output voltage, which roughly doubles the power delivered. This is where problems can come in if the amplifier isn't built for a 4-ohm load. Naturally, if the amp is built into the sub, the manufacturer should have an amp that matches the needs of the sub.
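A quick back-of-envelope sketch of that current/power point, treating the amp's output as an ideal voltage source (a simplification, ignoring real-world output impedance and current limiting):

```python
# Why halving the load impedance stresses an amplifier:
# for a fixed output voltage, current and power both double.

def load_demand(voltage_rms, impedance_ohms):
    """Return (current in amps, power in watts) into a resistive load."""
    current = voltage_rms / impedance_ohms       # I = V / R
    power = voltage_rms ** 2 / impedance_ohms    # P = V^2 / R
    return current, power

v = 20.0  # 20 V RMS works out to ~50 W into 8 ohms
for ohms in (8, 4):
    i, p = load_demand(v, ohms)
    print(f"{ohms} ohm load: {i:.1f} A, {p:.0f} W")
# 8 ohm load: 2.5 A, 50 W
# 4 ohm load: 5.0 A, 100 W
```

Same volume knob position, twice the current and heat in the output stage, which is exactly why an amp not rated for 4 ohms can run into trouble.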
Alrighty, that makes sense. Thanks a lot!
It can be; it will very much depend on driver sensitivity (SPL in dB, measured 1 meter away when driven with 1 watt).
My Yamaha SV115s push 90 dB at 1 meter when driven with 1 watt. That's about as loud as a freight train from 15 meters away. Every +10 dB is perceived as roughly twice as loud. A driver with 90 dB sensitivity on a 50-watt amp will be much louder than a 70 dB driver on 150 watts.
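You can check that comparison with the standard rule of thumb that SPL grows by 10*log10(watts) over the 1-watt sensitivity figure (a sketch that ignores thermal compression and distance):

```python
import math

def spl_at_1m(sensitivity_db, watts):
    """Estimated SPL at 1 m: sensitivity (dB @ 1 W / 1 m) + 10*log10(watts)."""
    return sensitivity_db + 10 * math.log10(watts)

print(round(spl_at_1m(90, 50), 1))   # sensitive driver, modest amp -> 107.0 dB
print(round(spl_at_1m(70, 150), 1))  # insensitive driver, bigger amp -> 91.8 dB
```

So the 90 dB driver on 50 watts comes out roughly 15 dB louder, despite running on a third of the power.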