This post is a part of the series on audio amplifier feedback. The contents of the series can be found here.
In my previous posts, I mentioned that dominant pole compensation limits loop gain, which increases distortion in two ways:
- Lower loop gain means the feedback loop is less able to correct distortion, as the Error Transfer Function $ETF={1 \over {1 - LG}}$ grows.
- Lower loop gain means the input stage needs to deal with a larger signal.
In this post, I will focus on the second point and look at how lower feedback affects the input stage.
The input signal seen by the input stage of a feedback amplifier with loop gain $LG$ is $V_{diff}$, the difference between the input signal to the whole amplifier, $V_x$, and the portion of its output signal produced by the feedback network, $B \times V_y$ (see also the first post of the series; in this sign convention, $B$ includes the loop inversion, so the sum below is effectively a difference): $$V_{diff}=V_x + B \times V_y = V_x {1 \over {1 - LG}} + V_{err} {B \over {1 - LG}}$$
Clearly, the input signal seen by the input stage of a feedback amplifier depends on the loop gain - the larger the loop gain, the smaller the differential input signal.
For example, for an amplifier with 30dB (≈ 32) of feedback and 2V peak input voltage, the peak difference voltage the input stage sees would be (assuming the distortion is small compared to the signal and ignoring the sign) $$V_{diff}=V_x {1 \over {1 - LG}} = {2V \over {1 - 32}} \approx 65mV$$
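As a quick sanity check, here is a minimal Python sketch of this arithmetic (the helper name is mine, not from the post):

```python
# Peak differential input seen by the input stage, ignoring the sign
# and the (comparatively small) V_err term.
def v_diff_peak(v_in_peak, loop_gain_db):
    lg = 10 ** (loop_gain_db / 20)      # 30 dB -> ~31.6, i.e. approx 32
    return v_in_peak / (lg - 1)         # magnitude of V_x / (1 - LG)

print(v_diff_peak(2.0, 30) * 1e3)       # ~65 (mV)
```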
Is that large or small? Let us look at a typical input stage - a differential amplifier a.k.a. long tail pair (LTP) - and at how its output voltage (between collectors) changes with the differential input signal (between bases):
It can only handle about 60mV of peak input voltage before clipping, and is visibly nonlinear even at much smaller inputs:
Oops. With 30dB of loop gain, the input stage will clip at relatively low input signals!
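To see where the ~60mV figure comes from, here is a small Python sketch of the textbook model of an undegenerated LTP, whose differential output current follows a tanh of the differential base voltage (the tail current is my assumption for illustration):

```python
import numpy as np

V_T = 0.026        # thermal voltage at room temperature, ~26 mV
I_TAIL = 1e-3      # assumed 1 mA tail current (0.5 mA per transistor)

def ltp_diff_current(v_diff):
    """Differential collector current of an ideal, undegenerated LTP."""
    return I_TAIL * np.tanh(v_diff / (2 * V_T))

for vd in (0.01, 0.03, 0.06, 0.1):
    print(f"{vd*1e3:5.0f} mV -> {ltp_diff_current(vd)/I_TAIL:5.1%} of tail")
# Past roughly +/-60 mV the tanh is nearly saturated: the stage clips,
# and its curvature is visible well below that.
```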
The standard way of fixing this is to degenerate the input stage by adding resistors into the emitters of the LTP:
A typical value is about 10x the intrinsic emitter resistance, which depends on the emitter current: $r_e={26mV / I_e}$. For an emitter current of 0.5mA, $r_e$ is about 50Ω, so 470Ω degeneration resistors are a reasonable choice.
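The sizing works out as follows (a back-of-the-envelope sketch using the values from the text):

```python
I_E = 0.5e-3              # emitter current, 0.5 mA
r_e = 0.026 / I_E         # intrinsic emitter resistance: ~52 ohm
R_E = 470                 # nearest standard value to ~10x r_e

# Degeneration trades gain for linearity by roughly this factor:
print(f"r_e ~ {r_e:.0f} ohm, (r_e + R_E)/r_e ~ {(r_e + R_E)/r_e:.0f}x")
```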
With degeneration, the gain decreases but the LTP can handle much larger input signals:
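There is no closed-form inverse for the degenerated pair, but its transfer curve can be sketched numerically. With emitter resistors, the input voltage splits between the base-emitter junctions and the resistors: $V_{diff} = 2 V_T \operatorname{atanh}(d) + d \, I_{tail} R_E$, where $d$ is the normalized differential output current. A minimal sketch, inverting this relation by interpolation (same assumed component values as above):

```python
import numpy as np

V_T, I_TAIL, R_E = 0.026, 1e-3, 470.0

# Forward relation (monotonic): v_diff as a function of the normalized
# differential output current d; invert it by table interpolation.
d_grid = np.linspace(-0.999, 0.999, 20001)
v_grid = 2 * V_T * np.arctanh(d_grid) + d_grid * I_TAIL * R_E

def degenerated_ltp(v_diff):
    """Normalized differential output current of the degenerated LTP."""
    return np.interp(v_diff, v_grid, d_grid)

for vd in (0.065, 0.2, 0.4):
    print(f"{vd*1e3:4.0f} mV -> {degenerated_ltp(vd):5.1%} of tail")
# The usable input range grows from tens of mV to hundreds of mV.
```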
The nonlinearity is not so evident, so let's have a closer look:
This is 0.0074% of total harmonic distortion (THD). Unlike the distortion of the output stage, this distortion cannot be corrected by the feedback loop, as it appears at the amplifier's input and is indistinguishable from the input signal. That is, the whole 0.0074% will appear at the amplifier's output.
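The THD figure can be roughly cross-checked with an FFT of the model above. This is a sketch under the same simplified transistor model, not the author's simulation, so the exact number depends on the actual operating point:

```python
import numpy as np

V_T, I_TAIL, R_E = 0.026, 1e-3, 470.0
d_grid = np.linspace(-0.999, 0.999, 20001)
v_grid = 2 * V_T * np.arctanh(d_grid) + d_grid * I_TAIL * R_E

# One exact period of a 65 mV peak sine through the degenerated LTP.
N = 4096
v_in = 0.065 * np.sin(2 * np.pi * np.arange(N) / N)
d_out = np.interp(v_in, v_grid, d_grid)

# THD: harmonics 2..10 relative to the fundamental (bin 1, since the
# record holds exactly one period).
spectrum = np.abs(np.fft.rfft(d_out))
thd = np.sqrt(np.sum(spectrum[2:11] ** 2)) / spectrum[1]
print(f"THD ~ {thd*100:.4f} %")   # on the order of 0.01%, the same
                                  # ballpark as the quoted 0.0074%
```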
Nevertheless, there are ways to decrease this distortion:
- One is to make the input stage intrinsically more linear. This usually means making it more complicated or, if the input stage is an opamp, selecting a better (and more expensive) opamp.
- Another is to make the differential input signal smaller by increasing the loop gain.
For example, with 100dB of loop gain, the distortion of a simple two-transistor differential input stage is unmeasurable:
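A quick back-of-the-envelope computation shows why (the $HD_3 \approx a^2/12$ formula follows from the series expansion of tanh, with $a$ the drive amplitude normalized to $2V_T$):

```python
# With 100 dB of loop gain, the same 2 V input leaves only microvolts
# across the input pair:
v_diff = 2.0 / (10 ** (100 / 20) - 1)     # ~20 uV peak
print(f"V_diff ~ {v_diff*1e6:.0f} uV")

# Even for the bare, undegenerated tanh stage, the dominant third
# harmonic at that level (HD3 ~ a^2/12, with a = v_diff / (2*V_T))
# is vanishingly small:
a = v_diff / 0.052
print(f"HD3 ~ {a**2 / 12 * 100:.1e} %")   # ~1e-6 %, far below measurement
```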
With high loop gains, there is no need for complicated input stages or expensive opamps. Even a single transistor will do the job perfectly.