# Pre-Out Voltages - More pre-out voltage makes my amp work less and get louder!



## FAUEE

Here's one that always amuses me.

Higher pre-out voltages DO NOT make your system louder. They DO NOT make your amplifier work less. The benefits of a higher pre-out voltage are that any distortion signals that may be transferred into your RCAs will be a lower percentage of your original signal, and that if you are running a large number of amps (i.e. more than one off each pre-out) you will have a higher voltage signal at each amp.

Here's why: your amplifier can only output up to a certain voltage before it clips. That's a more or less "set in stone" property of the amplifier itself. For the sake of argument, we'll say it's 20V. The gain on your amplifier is used to match the amplifier's output voltage to its maximum by compensating for different input voltages. So if we have an input voltage of 2V, you could set the gain to 10; if you had a 5V pre-out, you would set it to 4. If you had a 2V pre-out, then upgraded to a 5V pre-out without adjusting your gains, you would be trying to make your amplifier run up to 50V, which is MEGA clipping.
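That gain arithmetic is easy to sanity-check. A minimal sketch (the 20V clipping ceiling and the 2V/5V pre-outs are just the made-up numbers from the paragraph above):

```python
# Gain needed to drive an amp to (but not past) its clipping ceiling.
def required_gain(v_clip, v_preout):
    """Voltage gain that maps the pre-out's full swing onto the amp's maximum."""
    return v_clip / v_preout

print(required_gain(20.0, 2.0))  # 2 V pre-out -> gain of 10
print(required_gain(20.0, 5.0))  # 5 V pre-out -> gain of 4

# Upgrading to a 5 V pre-out while leaving the gain at 10 asks the amp
# for 5 V * 10 = 50 V, far past the 20 V ceiling: heavy clipping.
print(5.0 * required_gain(20.0, 2.0))  # 50.0
```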

It will not make your amp "work less" or run cooler. Your amp is only capable of a certain voltage. If you want to make your amp work less, turn down your volume. A higher pre-out voltage will actually make your amp work harder and run hotter, as it will reach its maximum output voltage faster.

Also, your gain knob is NOT a volume knob!


----------



## instalher

you realize that clipping the amp is the best way to run the signal.. as the transistors are fully open and operating at as close to 100% efficiency as possible..


----------



## Sarthos

True. It may seem louder if you don't set your gains right, but that's only because you start clipping.

However, there is one way that higher pre-out voltages can make your system louder. If you do something like have a stereo with a 0.5 volt signal output, then run RCA splitters and split the signal to 50 different amps, the gain knob at max won't reach RMS power. But that's not a common problem for people to have.

There is a single big advantage to high-voltage outputs: noise rejection. I know lots of people have problems with noise in their setups, especially if the RCAs get close to power wire. For laughs, I wrapped my RCAs around my power wire, and got no noise. 8V pre-amp outputs really help with that; the lower your gains are, the less noise you get in the RCA cables.


----------



## instalher

here is a good example of high output voltage... ok, let's say we are in a machine shop... and you are the manager trying to get the attention of one of the workers... so you are up on the mezzanine and you yell as loud as you can to get his attention.. that's a low AC output deck. but take the same yell and add a megaphone to your voice and now he can hear you.. has the noise in the system changed? no, but the output has masked the noise of the system... so high output doesn't lessen the noise, it's still there, the signal just goes higher above it..


----------



## SQ_Bronco

instalher said:


> you realize that clipping the amp is the best way to run the signal.. as the transistors are fully open and operating at as close to 100% efficiency as possible..


Wait, what?


----------



## instalher

this is the premise of class D amps, as they are running at almost full power and are very efficient.. make sense?


----------



## trojan fan

instalher said:


> this is the premise of class D amps, as they are running at almost full power and are very efficient.. make sense?


Is that why they sound like $hit?


----------



## subwoofery

instalher said:


> this is the premise of class D amps, as they are running at almost full power and are very efficient.. make sense?


Well, actually, class D amps are less efficient when stressed. They're said to be closer in efficiency to a class A/B at full power.

Kelvin


----------



## ChrisB

instalher said:


> this is the premise of class D amps, as they are running at almost full power and are very efficient.. make sense?


Actually, you have it backwards. At lower volume levels, class D amplifiers are more efficient than their class A/B counterparts due to the switching nature of their output stages.

At max volume, the most inefficient component will be the subwoofer since it will take all that power and convert most of it to heat.


----------



## Pdogg

The majority of the original post is correct, with the exception of the claim that high pre-out voltage reduces distortion.

The purpose of high pre-out is to reduce noise. The signal traveling to the amplifier can pick up all kinds of stray signals. Also the amp's front end gain is high so it can introduce more hiss, etc. By increasing the output voltage from the signal source, we can reduce the amp gain and thus reduce noise.
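In rough numbers (the 1mV of induced cable noise below is an invented figure, purely for illustration): going from a 2V to an 8V pre-out, with the amp gain turned down to match, improves the signal-to-noise ratio on the interconnect by about 12dB:

```python
import math

# Signal-to-noise ratio on the interconnect, in dB, for a given pre-out
# level and a fixed amount of noise induced into the cable.
def snr_db(v_signal, v_noise):
    return 20 * math.log10(v_signal / v_noise)

v_noise = 0.001  # assume 1 mV of induced noise (made-up figure)
print(round(snr_db(2.0, v_noise), 1))  # 2 V pre-out: 66.0 dB
print(round(snr_db(8.0, v_noise), 1))  # 8 V pre-out: 78.1 dB (~12 dB better)
```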

They use this concept on satellite receivers by putting the LNA at the antenna before the cable.


----------



## emilimo701

FAUEE said:


> Also, your gain knob is NOT a volume knob!


Well, if you are going to be technical, you should say "your gain knob is NOT [MEANT TO BE USED AS] a volume knob!"

It's a potentiometer that, although serving a different purpose, is designed like and works the same as those same mystical little knobs we see labeled "volume".


----------



## Gary S

As a couple of people have stated above, all higher voltage pre-outs will do is lower certain types of noise... but that's only if you have noise to begin with... if you have no noise, it will do absolutely zero for you.


----------



## ryomanx

Gary S said:


> As a couple of people have stated above, all higher voltage pre-outs will do is lower certain types of noise... but that's only if you have noise to begin with... if you have no noise, it will do absolutely zero for you.


i haz teh noize! i hatz it!!


----------



## RongGe

emilimo701 said:


> Well, if you are going to be technical, you should say "your gain knob is NOT [MEANT TO BE USED AS] a volume knob!"
> 
> It's a potentiometer that, although serving a different purpose, is designed like and works the same as those same mystical little knobs we see labeled "volume"


I think this is another possible myth: "your gain knob is NOT a volume knob!"
But this myth serves a good purpose, because it makes it easier for the installer to deter customers who don't know what is going on from touching the settings.


----------



## ryomanx

RongGe said:


> I think this is another possible myth: "your gain knob is NOT a volume knob!"
> But this myth serves a good purpose, because it makes it easier for the installer to deter customers who don't know what is going on from touching the settings.


I find that a lot of customers will change all the settings regardless of what I tell them. I try to explain that the gain is a sort of multiplier of the signal level coming from the head unit. That explanation seems to satisfy most, but I'm sure it's not the most correct answer.


----------



## Reach

So... (noob question time) ...if you have a balanced differential signal going from HU to an amp that accepts this noise-rejecting signal, there won't be any benefit from increasing the voltage with a line-driver? I can just add in a little more input gain until I reach the edge of clipping, right?


----------



## 96jimmyslt

"gain knob is NOT a volume knob"

Damn, I know this is not a thread dedicated to that, but I am having some issues with the gain on my new amp.


----------



## sonikaccord

Reach said:


> So... (noob question time) ...if you have a balanced differential signal going from HU to an amp that accepts this noise-rejecting signal, there won't be any benefit from increasing the voltage with a line-driver? I can just add in a little more input gain until I reach the edge of clipping, right?


Probably not... I'm pretty sure your balanced signal has enough juice to drive the amp to its max. The point of this thread is noise, and you have a balanced connection, so you're good.

Yes just add gain.


----------



## St. Dark

Reach said:


> So... (noob question time) ...if you have a balanced differential signal going from HU to an amp that accepts this noise-rejecting signal, there won't be any benefit from increasing the voltage with a line-driver? I can just add in a little more input gain until I reach the edge of clipping, right?



A balanced differential input will reject noise incurred prior to the input stage. You would still have whatever level of noise the various circuits in the amp themselves produce; so, if you have a really low input signal then boosting it would help you keep more of the music well above the amp's noise floor.
Typically, though, unless your amp has a really low S/N and your input voltage is really low, I wouldn't worry about it.


----------



## Troon

Sarthos said:


> However, there is one way that higher pre-out voltages can make your system louder. If you do something like have a stereo with 0.5 volt signal output, then you run RCA splitters and split the signal to 50 different amps, the gain knob at max won't reach RMS power. But that's not a common problem for people to have


It's also not really true.

Using splitters sends the same voltage to each output with an ideal source with zero output impedance. If you are using so many splitters that the combined load is getting close to, or lower than, the source's output impedance, you're in trouble regardless of the source output voltage.

Nor is it true that higher output voltages are intended to reduce the % distortion. That will remain the same - or even get worse - when the internal signals are amplified for output. The only argument for running higher voltages across your interconnects is to improve signal to noise ratio - the noise from external EMI is constant for a given installation, and independent of source voltage.

This doesn't really have any significant effect either. The whole output voltage phenomenon is a marketing scam designed to ensnare the ignorant. Just avoid running interconnects at unnecessarily low voltages (low output on HU, high gain on amps, for example) and you shouldn't have noise problems.

The proper way to solve this problem, if it existed in a car environment, is with a balanced line.

Studio equipment runs millivolts of microphone signal over enormous lengths of electrically noisy environment by doing it properly - with a balanced line connection. This runs two signal lines per channel, one an inverted form of the other. Here's how it works.

Conductor 1 carries a signal _x_
Conductor 2 carries its inverse _-x_

Along the line, interference _y_ affects both lines equally:

Cond 1 becomes _x+y_
Cond 2 becomes _-x+y_

At the "far" end, the receiving equipment subtracts Cond 2 from Cond 1 to get the signal:

output = _(x+y)-(-x+y)_ = _2x_

Boom. No interference. That's the real way to do it.
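Troon's algebra can be checked in a couple of lines of plain Python (the sample values are arbitrary):

```python
# Balanced-line noise cancellation, using the post's symbols: signal x, noise y.
def balanced_receive(x, y):
    cond1 = x + y    # conductor 1: signal plus induced noise
    cond2 = -x + y   # conductor 2: inverted signal plus the same noise
    return cond1 - cond2  # the differential receiver subtracts the two lines

print(balanced_receive(1.0, 0.5))  # -> 2.0, i.e. 2x with the noise gone
```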


----------



## Gary S

A gain control is used to match the input curve to the output curve for maximum unclipped output with reasonably low noise. Turning the amp gain (or other processor gains) higher will actually result in reduced output.


----------



## RongGe

Your first statement is correct.
However, after the gains are set properly, increasing the gain does not reduce output as you stated. Instead, the output slowly hits a plateau and stops increasing.
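A toy model of an amp with a hard output ceiling shows that plateau (the 20V rail and 2V input below are arbitrary illustration values, and real clipping adds distortion rather than limiting this cleanly):

```python
# Idealized amplifier: linear until it hits the supply rail, then hard-limited.
def amp_output(v_in, gain, v_clip=20.0):
    return min(v_in * gain, v_clip)

v_in = 2.0  # pre-out voltage
for gain in (5, 10, 15, 20):
    print(gain, amp_output(v_in, gain))
# Past the gain that just reaches the rail (10 here), the output stays
# pinned at 20.0: a plateau, not a further increase.
```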


----------



## Streetbeat Customz

Running multiple amps drops RCA amperage more than RCA voltage correct?


----------



## SoulFly

I swapped out my 1200 watt MTX Elite amp for a 300 watt MTX XD Thunder. The old amp worked fine at a gain of 0, but with the Thunder I noticed I had to turn the gain up at least 3/4 of the way just so I could hear it (there is no sound below 1/2 gain). I cranked it almost all the way just to match my fronts, which run off the 14 watt per channel HU. The HU has a sub-out RCA @ 4V. There is no high-level input switch on the XD amp, so it's not like it's set for a high-level input.

Why is that?


----------



## Troon

Streetbeat Customz said:


> Running multiple amps drops RCA amperage more than RCA voltage correct?


1) It's "current", not "amperage".

2) Sort of, yes, but that's irrelevant until taken to extremes. These aren't speaker loads we're talking about: it's a milliamp or two.

If you split an output between multiple inputs, all of the inputs are in parallel and all see the same voltage. The problem arises when you have too many inputs being fed, and their combined impedance drops below what the output can source. Effectively, you're partly short-circuiting the output, and it's likely to do things like distort or show varying frequency response.

This is true whatever the preamp output voltage. A more robust output need not have a higher voltage, just be able to drive a lower impedance load.

Think of it like the difference between a 100W 1-ohm capable power amp (a low output voltage but capable of lots of current) and a 200W 4-ohm capable amp (high output voltage, but can't put out as much current). If you have a 1-ohm sub, it'll only work with the 100W amp.


----------



## envisionelec

Wow. _This thread is supposed to dispel myths?_

There is a lot of misinformation here!

Ideal source impedance is zero ohms (someone mentioned this). However, the best source units are 50 ohms with most in the 200-1k range.

A typical amplifier input impedance (Z) is 10k-50k ohms per channel. Do the math: with a 200 ohm source impedance and 10k input impedance, you can drive roughly 10k/200 = 50 channels before serious signal amplitude loss. But with 1k output Z and 10k input Z, you can only drive about 10 channels before signal loss. For the SPL crowd with 30 amplifiers... this matters! They need line drivers. A line driver isn't supposed to make everything louder; it is simply a current amplifier for line-level voltage. It keeps the output voltage up so that it can drive hundreds of amplifier inputs!
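That loading math works out as a simple voltage divider (a rough sketch; real inputs aren't purely resistive). One caveat: at the 50-channel point the paralleled inputs exactly equal the 200 ohm source impedance, so the signal is already down by half; well below that count the loss is negligible:

```python
def divider_ratio(z_source, z_input, n_channels):
    """Fraction of the source voltage left after paralleling n identical
    amplifier inputs off one pre-out (simple resistive voltage divider)."""
    z_load = z_input / n_channels  # n equal input impedances in parallel
    return z_load / (z_source + z_load)

# 200-ohm source driving 10k-ohm inputs:
print(divider_ratio(200, 10_000, 1))   # ~0.98: essentially no loss
print(divider_ratio(200, 10_000, 50))  # 0.5: combined load equals source Z
# 1k-ohm source: the same halving point arrives at only 10 channels.
print(divider_ratio(1_000, 10_000, 10))  # 0.5
```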

So where do HV output headunits actually make a difference? 

It depends on how crappy your amp is. 

Think about it this way: You are simply transferring the gain increase to another place. If your headunit is quieter than your amplifier at full gain, then get that HV output and turn the amplifier gain down. Vice versa? Save your pennies and keep that 1V output HU.


----------



## Brian10962001

Another thing I would like to point out: most "4 volt" output head units I have dealt with are no different from their regular "2 volt" cousins. It's mostly marketing BS at this point. Also, the main benefit I found with the AudioControl line driver I have was the ground isolation (adjustable jumpers inside). It would quiet things down quite a bit when it was installed on old Fosgate Power back in the day. I still have that thing in the closet; I may throw it in my truck just for old times' sake.


----------



## subwoofery

Here's what I've read from Ray @ LP: 


> I am not sure you will be able to tell from the manufacturers' specs, but by tearing them down or looking at the schematics as I do every day, more and more radios are doing this to save money, instead of building a high-end pre-amp section with its own power supply to get the high output voltage. It's much easier and cheaper for them to steal it from the high-level output signal from the audio IC.
> 
> The problem here is that anytime you increase the signal, aka increase the voltage or power, and then reduce it again, noise filters into the equation. Also, audio ICs are well known not to have the high fidelity that you want for SQ. So it's not as clean a sound as a nicely done pre-amp voltage section provides; that's problem number one. Problem two is that you have amplified the signal to 20 volts or higher through a less-than-ideal set of electronics, then you are buffering the voltage back down through a cheap set of resistors to get it to a usable line voltage of, say, 3 to 8 volts. This adds noise and takes away dynamics. It's just a nasty way to get what appears to be a high-end signal very cheaply for the manufacturer.
> 
> Almost all of your radio manufacturers are going to this on low- to middle-level units to save a buck and to give the lower-end audio customer what he thinks is a high-end feature. Your better, high-end radio units will still have true preamp sections producing a very clean high-voltage signal.


Kelvin


----------



## slipchuck

when you talk about pre-out voltage, do you all mean total voltage or voltage per channel... my blaupunkt puts out .5 volts per channel x 4 channels

thanks

randy


----------



## envisionelec

slipchuck said:


> when you talk about pre-out voltage, do you all mean total voltage or voltage per channel... my blaupunkt puts out .5 volts per channel x 4 channels
> 
> thanks
> 
> randy


Per channel. 500mV is a tad on the low side for car audio, but still usable.


----------



## AKheathen

well, most amps are rated to reach full power with as little as 200mV. as for the comment that "this is not like wiring speakers" - actually, it is. the same thing happens as with your speaker outputs, with the same limitations. your source will put out x amount of voltage, say 2V. if you hook up 10 different amps, they are all each getting 2V; the current is just allowed to increase via the reduced resistance. the only thing that would drop that 2V is if the resistance gets low enough that the power is tapped out of the source and the voltage begins to "sag", exactly like the output of an amp.

and on "current" vs "amperage" - amperage is a measuring unit of current, so using either term works fine, imho.

as for class D vs class A/B: the efficiency of the two power supplies is nearly the same. you can take a class A/B power supply and run it in a class D amp, since what makes it class D is how the output signal is converted into a high-frequency square wave that averages out to the lower-frequency signal. typically they use more logic-regulated power supplies, as that tech is from about the same period as the class D output tech, but it was actually in use before class D.

now, although a higher voltage signal does not make your amp run differently, it is usually the user that makes the amp run differently. if the user is given a clean high-voltage signal to work with and turns the gain down, rather than pushing a low-voltage signal into clipping, the end result is a cleaner, cooler, less-clipped amp - though it is the end user's action causing the effect, not the equipment. as stated, the higher the signal voltage, the less the low-voltage noise will be played. that is the point of high voltage.

there is also one other byproduct - i have seen/installed/worked on a handful of amps that would need about 3-5V input on the most sensitive gain setting to actually reach full output, and without it, the closest you can get is feeding them a clipped signal from the head unit. years back, i modified the input and bias sections of low-power non-HCCA amps to crank out their real potential, before all this new tech made that 100x more complicated. as an example, i pulled 600 watts out of an Alphasonik 2035 ([email protected]) until it shredded the cones on a pair of old-school 4-ohm Punch 12's. more recently, i ran into a Cadence that would not reach full power with 2V and bass boost on. made for bad sound, so the system got turned down to a reasonable sound quality, gain still maxed. one day i'm going to get that truck back and put a line driver in to compensate, but i digress...... lol


----------



## AKheathen

trojan fan said:


> Is that why they sound like $hit?


they don't all sound like shiz. which ones have you listened to? i can tell you some of the first class D amps sounded awesome. there were some that had power supplies that were not quite as responsive to the output, giving a little bit of a muddy sound with the fluctuations in music, and then there is bad signal processing, but that's the manufacturer's fault, not class D's.


----------



## t3sn4f2

[NwAvGuy: Testing Methods](http://nwavguy.blogspot.com/2011/02/testing-methods.html)

"HIGH-END BENCH DMM (revised 4/15): A surprising number of people are trying to make audio measurements with typical portable DMM’s. And the readings are often grossly wrong without even realizing it. True RMS measurements are not trivial. In effect, the meter has to accurately measure the “area under the curve” and time average it—see True RMS Measurements for more information. This proves to be rather difficult across a wide range of frequencies if you want to maintain reasonable accuracy at high frequencies and not have the reading “hunt” up and down at low frequencies. The fact is, most DMM’s priced under a few hundred dollars that claim “True RMS” are really only accurate around 60hz—i.e. power line frequencies. Some will measure sine waves accurately across the audio band, but many will not even do that. I have a $150 “True RMS” Extech meter--a relatively well regarded brand--that’s off by nearly 6 dB at 20 Khz compared to 60 hz on a sine wave and is a joke above 1 Khz on non-sinusoidal waveforms. And really complex rapidly changing waveforms like white/pink noise or real music drive such meters crazy. To do it right, you need expensive true RMS circuitry and the ability to optimize the sample rate and averaging for the waveform being measured. Good high end bench DMM’s, like the Agilent 344xx series, let you set these parameters. They also read directly in dB. I use a 6 1/2 digit Agilent true RMS bench DMM that's extremely accurate and flat from 10 hz - 100 Khz for exact levels and other measurements. It has resolution down to 0.1 microvolts so it can even be used to measure noise."
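The quoted complaint is easy to reproduce numerically. The sketch below assumes the common average-responding meter design (rectify, average, multiply by the sine form factor pi/(2*sqrt(2)) ≈ 1.111): such a meter reads a sine correctly but overstates a square wave by about 11%:

```python
import math

def true_rms(samples):
    """Root-mean-square: the 'area under the curve' measurement done properly."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def avg_responding_reading(samples):
    # Rectified average scaled by the sine form factor pi/(2*sqrt(2)) ~ 1.111,
    # which is exact only for a sine wave.
    return (sum(abs(s) for s in samples) / len(samples)) * math.pi / (2 * math.sqrt(2))

n = 10_000
sine = [math.sin(2 * math.pi * i / n) for i in range(n)]
square = [1.0 if i < n // 2 else -1.0 for i in range(n)]

print(round(true_rms(sine), 4), round(avg_responding_reading(sine), 4))      # both ~0.7071
print(round(true_rms(square), 4), round(avg_responding_reading(square), 4))  # 1.0 vs ~1.1107
```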


----------



## AKheathen

^^^ and this is why you go for the $800+ Fluke. there is also the option on many good meters for calibration to keep them accurate, especially one BNIB.


----------



## CAudio

I am using a modern Nakamichi headunit, an MB-X, with 5 Volt Pre-Outs.
I intend on using a mid 1980s era Nakamichi amplifier for the active part of the front stage...2 x midrange drivers and 2 x tweeters. 

A highly respected Nakamichi aficionado and friend told me that the mid 1980s Nak amplifiers were designed for their TD cassette head units of the 1980s and that I may have an issue with input sensitivity.

Thoughts?

If this is the case, should I have signal reach my other amplifier first, and then the vintage 1980s Nak amplifier, so that it sees a lower voltage being "last in line"?

Thank you.


----------



## firebirdude

Good lord. Why did you post this in this thread??


----------



## ttocs388

the only thing I want to add to this is that if you're using a line driver to boost the voltage, it HAS to be as close to the signal source as possible to keep the noise level down. too often I see them in the back, 6" from the amp, and then all you are doing is boosting the noise that was already picked up along the way.


----------



## Jepalan

```
HU~~~~CABLE~~~~AMP
HU~LINE DRIVER~~~~CABLE~~~~AMP
```

In Summary:
Most amps (all?) can output full power when fed with "normal" or "high voltage" line level signals.

High voltage line outputs do NOT increase volume/power if gains are tuned properly.

High Voltage outputs *can* help reduce the effects of *noise induced in the cable* between the HU and the AMP by keeping the signal level above the *induced* noise level and requiring less gain at the amp input. But shielded cable will help here as well.

True balanced differential pair connections help reduce common-mode noise induced into the cable between the HU and the AMP. The HU & AMP must have balanced differential outputs and inputs. The principle is simple: identical noise is induced onto the + and - wires in the cable between HU & AMP. The amp's input circuit sums the + signal with the inverted - signal, effectively doubling the desired signal and cancelling out the common-mode induced noise.

High voltage outputs and/or balanced differential connections will not fix a noisy source. They also will not fix noisy circuits in a cheap amp.

Did I miss anything?


----------



## firebirdude

Jepalan said:


> Did I miss anything?


The fact this thread is almost 6 years old and has been covered many many times.


----------



## dsw1204

I have a couple of questions regarding this subject:

1: How important is the line-out impedance on the head unit with regard to noise? I never quite understood that specification. I am using an Eclipse CD8051, whose line-out impedance is lower than any other head unit's I've ever seen. But, like I said, I just don't understand the importance of the line-out impedance specification.

2: I also have an Eclipse CD8053 without the BLA. I was considering getting the BLA, but I also started considering upgrading my amp. One of the amps I was considering was an old-school Zapco SymbiLink amp. For noise purposes only, would I be better off going with the BLA and a non-SymbiLink Zapco amp, or getting the SymbiLink amp and not the BLA?

Can anybody here educate me?


----------



## Jepalan

dsw1204 said:


> I've a couple of questions regarding this subject:
> 1: How important is the line out impedance on the head unit with regards to noise? I never quite understood that specification. I am using an Eclipse CD8051 who's line-out impedance is lower than any other head unit's I've ever seen. But, like I said, I just don't understand the importance of the line-out impedance specification.


My 2 cents regarding your first question...

When it comes to line-level signals, lower output impedance is better, but in most cases doesn't make a big difference.
With line-level connections, we are trying to keep the RMS volts out of the HU higher than any noise that might get picked up along the way to the next device in the signal chain.
The logic behind wanting a lower output impedance is simple. Think voltage-divider.

Consider a simple single-ended connection scenario:
HU is modeled as a perfect voltage source followed by a series resistor equal to the output impedance. 
Downstream device input(s) are modeled as a shunt resistor to ground equal to the device input impedance.

HU_RMS_VOLTS<-->100ohms<-->RCA cable<-->10Kohms<-->GND

The higher the output impedance, the lower the signal voltage on the RCA cable due to the voltage divider formed by the source's output impedance and the end-device's input impedance.
For example: If the source output impedance was equal to the load input impedance then the RCA signal voltage would be cut in half.

In practical application you don't need to worry too much about line-level output/input impedance unless you are driving a bunch of downstream devices via RCA splitters. In that case, each downstream device's input impedance sums in parallel and if the source device also has non-ideal (high) output impedance you may be attenuating the RCA signal voltage down too close to the noise floor.


----------



## glockcoma

firebirdude said:


> The fact this thread is almost 6 years old and has been covered many many times.




I've always wondered on forums why people are bothered when old threads are resurrected. 
One of the ways to find this thread buried deep in its hole would be to use the search function. 

If he would have posted a new topic with a question that's been answered a thousand times everyone would be preaching to use the search feature. 

So which one is preferred?


----------



## nineball76

glockcoma said:


> I've always wondered on forums why people are bothered when old threads are resurrected.
> One of the ways to find this thread buried deep in its hole would be to use the search function.
> 
> If he would have posted a new topic with a question that's been answered a thousand times everyone would be preaching to use the search feature.
> 
> So which one is preferred?


Bitching and nagging is the preferred method.


----------



## firebirdude

glockcoma said:


> I've always wondered on forums why people are bothered when old threads are resurrected.
> One of the ways to find this thread buried deep in its hole would be to use the search function.
> 
> If he would have posted a new topic with a question that's been answered a thousand times everyone would be preaching to use the search feature.
> 
> So which one is preferred?


he wasn't asking a question. he was trying to answer a topic that was covered 6 years ago.

And heck yes, bitching is the way to go.


----------



## dsw1204

Jepalan said:


> My 2 cents regarding your first question...
> 
> When it comes to line-level signals, lower output impedance is better, but in most cases doesn't make a big difference.
> With line-level connections, we are trying to keep the RMS volts out of the HU higher than any noise that might get picked up along the way to the next device in the signal chain.
> The logic behind wanting a lower output impedance is simple. Think voltage-divider.
> 
> Consider a simple single-ended connection scenario:
> HU is modeled as a perfect voltage source followed by a series resistor equal to the output impedance.
> Downstream device input(s) are modeled as a shunt resistor to ground equal to the device input impedance.
> 
> HU_RMS_VOLTS<-->100ohms<-->RCA cable<-->10Kohms<-->GND
> 
> The higher the output impedance, the lower the signal voltage on the RCA cable due to the voltage divider formed by the source's output impedance and the end-device's input impedance.
> For example: If the source output impedance was equal to the load input impedance then the RCA signal voltage would be cut in half.
> 
> In practical application you don't need to worry too much about line-level output/input impedance unless you are driving a bunch of downstream devices via RCA splitters. In that case, each downstream device's input impedance sums in parallel and if the source device also has non-ideal (high) output impedance you may be attenuating the RCA signal voltage down too close to the noise floor.


Thanks for the information. It was helpful.


----------



## Jepalan

firebirdude said:


> he wasn't asking a question. he was trying to answer a topic that was covered 6 years ago.
> 
> And heck yes, bitching is the way to go.


Actually, I was responding to the poster ahead of me (ttocs388) and failed to notice *he* had resurrected a 6yr old thread... my bad... and now we are completely off-topic and bitching about nits in a 6yr old thread


----------



## BMW Alpina

glockcoma said:


> I've always wondered on forums why people are bothered when old threads are resurrected.
> One of the ways to find this thread buried deep in its hole would be to use the search function.
> 
> If he would have posted a new topic with a question that's been answered a thousand times everyone would be preaching to use the search feature.
> 
> So which one is preferred?





nineball76 said:


> Bitching and nagging is the preferred method.





firebirdude said:


> he wasn't asking a question. he was trying to answer a topic that was covered 6 years ago.
> 
> And heck yes, bitching is the way to go.





Jepalan said:


> Actually, I was responding to the poster ahead of me (ttocs338) and failed to notice *he* had resurrected a 6yr old thread... my bad... and now we are completely off-topic and bitching nits in a 6yr old thread


I was reading this thread seriously from the beginning, but couldn't help laughing at the end.
Well, at least I gain more knowledge.


----------



## geshat00

BMW Alpina said:


> I was reading this thread seriously from the beginning, but couldn't help laughing at the end.
> Well, at least I gain more knowledge.


Was that a pun "gain more knowledge"?

Sent from my HTC6535LVW using Tapatalk


----------



## thornygravy

I'm sure, like with anything, there are diminishing returns at a certain point, but going from a 1V pre-out to 2.5V and finally, recently, to 4V pre-outs... each has sounded so much better (cleaner at louder volumes) than the previous.

I know this is an old thread, but it's still something that gets asked about a lot.


----------



## KVH69

I just want to know where people are getting balanced cables for car audio...? RCA is unbalanced so do you guys make your own XLR or TRS cables? Quite the mystery for me.


----------



## DarmoZ

I have three amps, front(D), rear(AB), sub(D).
HU is 4v pre out.

The system sounds **** (muddy, distorted) when setting gains correctly. However, can someone explain why my system sounds so much better (cleaner, better dynamic range, louder) when I go into the head unit's crossover section and reduce the level on all channels by -8dB (effectively lowering the head unit's pre-out voltage) and then increase the amp gain to compensate?

So essentially, at a like-for-like volume level:
Option A (HU pre-out normal, amp gains lower) - music sounds ****
Option B (HU pre-out lowered, amp gain higher) - music sounds better, the way the tracks should sound.

Note: I thought it was something wrong with my HU, so I swapped it for another brand/model, also 4V. I even replaced my iPhone 6 with an iPhone 8 in case it was the music source. The same thing still happens. So something to do with the amps, maybe? The electrical system?


----------



## instalher

the head unit output is clipping at 4 volts... just because it puts out 4 volts AC doesn't mean it's a clean, unclipped signal... you will need an oscope to verify that.. your ears are your best tool here... My Pioneer head unit is 4 volt out as well, but clipping starts at about 32 on the volume out of 40. No way am I getting the claimed 4 volts out, probably closer to 3, but I will take 3 clean vs 4 clipped any day.


----------



## thornygravy

DarmoZ, cheaper 4-channel class D amps tend to exhibit the behavior you're describing, in my experience.


----------

