When people talk about "biasing" an amplifier, they are referring to setting the "idle", or quiescent, current in the power output tubes. All tubes must be biased, both preamp and output tubes, but it is not always clear whether or not the bias needs to be adjusted when changing tubes.
Why do you need to bias a tube?
Tubes have to be properly biased in order to function as amplification stages. A tube is biased by setting the amount of DC current that flows in the tube when there is no signal present at the tube's grid with respect to its cathode. This DC bias current can be set in a number of ways. The bias point determines several things about a tube amplification stage: the power output, the amount of distortion, the headroom (the size of input signal that can be applied before the output signal clips), the efficiency of the stage (the amount of output signal power vs. DC input power), the gain of the stage (the magnitude of the output signal for a given input signal), the noise of the stage, and the class of operation (class A, AB, etc.). The proper bias point is a tradeoff between all of these factors, and selecting the optimum point can be difficult; it will vary depending on the requirements of the amplification stage.
There are two main types of biasing: fixed biasing and cathode biasing. Fixed biasing does not mean the bias is not adjustable; in fact, it usually means the opposite. Cathode biasing is usually fixed and not adjustable, while fixed biasing is usually adjustable with a small trimmer potentiometer, or "trimpot". It is no wonder the subject is confusing to people!
Fixed biasing means the tube is biased by means of a DC voltage, usually a negative voltage applied to the grid of the tube with respect to the cathode. As the negative grid voltage is adjusted, the bias current will increase or decrease, depending upon which direction the bias voltage is moved. In general, as the bias voltage becomes more negative, the bias current becomes smaller, and the tube is biased "colder". As the bias voltage is adjusted less negative, towards zero volts DC, the bias current becomes larger, and the tube is biased "hotter". This is because a tube is a "normally on" device; that is, it allows current to flow from the cathode to the plate when the grid is at zero volts with respect to the cathode. The tube can be turned off, and the current flow stopped, by making the grid voltage sufficiently negative with respect to the cathode. The tube can also be biased by referencing the grid to ground, or zero volts DC, and applying a positive DC voltage to the cathode. This is equivalent to keeping the cathode at ground and applying a negative DC voltage to the grid, because it is the grid voltage with respect to the cathode that determines the amount of bias current in the tube.
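The "more negative grid, colder bias" relationship can be sketched numerically. The snippet below uses a simplified 3/2-power-law triode model; the perveance `k` and amplification factor `mu` are illustrative values chosen for this sketch, not measured data for any real tube:

```python
# Simplified triode plate-current model (3/2-power law).
# k (perveance) and mu are illustrative, not data for a real tube.

def plate_current_ma(v_grid, v_plate=400.0, mu=8.6, k=1.5e-3):
    """Idle plate current in mA for a (negative) grid-to-cathode voltage.
    Uses I = k * (V_plate/mu + V_grid)^1.5, clamped to zero at cutoff."""
    drive = v_plate / mu + v_grid
    if drive <= 0:
        return 0.0  # grid negative enough to cut the tube off entirely
    return k * drive ** 1.5 * 1000.0

# A more negative grid voltage gives a smaller ("colder") bias current:
for vg in (-30.0, -40.0, -50.0):
    print(f"Vg = {vg:5.1f} V  ->  {plate_current_ma(vg):6.1f} mA")
```

The exact numbers depend entirely on the assumed constants; only the trend (current falling as the grid goes more negative, reaching zero at cutoff) matters here.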
Since vacuum tubes are "normally on" devices, a trick can be used to bias them without having to supply a negative DC voltage source to the grid. If a resistor is placed between the cathode and ground, and the grid of the tube is referenced to ground (usually by connecting a large value resistor, such as a 1Meg, from grid to ground), the tube will try to conduct a large current from cathode to plate, since the grid and cathode are initially at ground potential. However, this cathode current flow will cause a voltage drop across the cathode resistor, making the cathode voltage positive with respect to the grid. Since the cathode voltage is now positive with respect to the grid, the current flow will decrease, and the tube will head back towards cutoff. A point of equilibrium will quickly be reached where any tendency for the current to increase is exactly offset by the resulting increase in cathode voltage, and the bias current will stabilize at some particular value. It will remain at this value unless the resistor value is changed, or a different tube with different characteristics is plugged in. This allows the desired bias point to be set by varying the value of the cathode resistor.
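This equilibrium can be found numerically. The sketch below reuses a simplified 3/2-power-law tube model (the constants `mu` and `k` are again illustrative, not data for a real tube) and steps the current toward the point where the cathode-resistor voltage drop exactly balances it:

```python
def cathode_bias_current(r_k, v_plate=300.0, mu=8.6, k=1.5e-3, n_iter=200):
    """Equilibrium idle current (amps) for a cathode resistor r_k (ohms).
    With the grid referenced to ground, grid-to-cathode voltage = -I * r_k."""
    i = 0.0
    for _ in range(n_iter):
        v_k = i * r_k                         # drop across the cathode resistor
        drive = v_plate / mu - v_k            # effective drive voltage in the model
        i_target = k * drive ** 1.5 if drive > 0 else 0.0
        i += 0.1 * (i_target - i)             # damped step so the loop settles
    return i

# A larger cathode resistor yields a colder (smaller) bias current:
for r_k in (470.0, 680.0):
    i = cathode_bias_current(r_k)
    print(f"R_k = {r_k:5.0f} ohms -> {i * 1000:5.1f} mA, V_k = {i * r_k:4.1f} V")
```

Changing the cathode resistor moves the equilibrium point, which is exactly how the bias is set in a cathode-biased stage.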
When do you use fixed biasing instead of cathode biasing?
Since cathode biasing eliminates the need for a special negative DC bias supply, why don't all amplifiers use cathode biasing? Well, cathode biasing is not without its faults. It turns out that in order to keep the DC bias voltage at the cathode constant while the input signal is changing, the cathode resistor must be bypassed with a large capacitor. This capacitor effectively "shorts" the AC signal component to ground, while allowing the DC voltage to remain relatively constant. If the capacitor is removed, the cathode DC voltage will have a signal voltage superimposed on it, which will subtract from the grid-to-cathode signal voltage, and reduce the gain of the stage.
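The size of the bypass capacitor sets how low in frequency that "short to ground" remains effective. A rough check is the RC corner frequency of the resistor/capacitor pair; this sketch ignores the tube's own cathode impedance (which appears in parallel and raises the actual corner), so treat it as a ballpark figure:

```python
import math

def bypass_corner_hz(r_k, c_farads):
    """Approximate -3 dB corner of the cathode resistor/bypass cap pair."""
    return 1.0 / (2.0 * math.pi * r_k * c_farads)

# Illustrative values: a 1.5k cathode resistor bypassed with 25 uF puts
# the corner at a few Hz, well below the low E of a guitar (~82 Hz).
print(f"corner = {bypass_corner_hz(1500.0, 25e-6):.1f} Hz")
```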
The problem comes in when there are large signal level changes, and the average DC level of the cathode voltage changes. This causes a bias shift, usually in the direction of a colder bias point. This bias shift can be audible, but is sometimes desirable for guitar amp use, as it adds varying harmonic overtones to the sound. If the bias shift is severe, the tube will go into cutoff, and large amounts of "crossover" distortion will occur. In addition, the current flow through the cathode resistor necessarily generates a large bias voltage on the cathode for proper tube operation (typically 30-50 volts for most higher-power output tubes). This voltage subtracts from the total plate voltage, which decreases the available output power. Between this voltage decrease and the bias shift, the output power in cathode biased operation is reduced when compared to fixed bias operation. Therefore, fixed bias is usually used for higher power amplifiers (50W and higher), and cathode bias is usually used for lower power amplifiers.
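The power penalty from the cathode voltage can be estimated with a back-of-the-envelope calculation, assuming output power scales roughly with the square of the effective plate-to-cathode voltage (a simplification that ignores the load line and class of operation):

```python
def power_ratio(v_supply, v_cathode):
    """Fraction of output power remaining after the cathode bias voltage
    is subtracted from the supply, under the square-law assumption."""
    return ((v_supply - v_cathode) / v_supply) ** 2

# e.g. a 400 V supply with 40 V dropped across the cathode resistor:
loss_pct = (1.0 - power_ratio(400.0, 40.0)) * 100.0
print(f"roughly {loss_pct:.0f}% less output power")  # about 19%
```

Even before accounting for bias shift, losing 10% of the supply voltage to the cathode costs nearly 20% of the available output power, which is why fixed bias wins out in higher-power designs.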
Preamp tubes are almost universally cathode biased, because they are used for signal amplification, not power amplification, and the side effects of cathode biasing are not as important. Also, cathode biasing makes the circuit less dependent on tube parameters, and more forgiving with respect to bias point. This allows replacement of tubes without having to rebias the amplifier stage. Output tubes that are cathode-biased should always be checked when replacing tubes, however, because tubes vary widely in idle current for a given cathode resistor value, and it may be necessary to change the cathode resistor value to return the output stage to its proper bias current.
Copyright © 2000 Randall Aiken. May not be reproduced in any form without written approval from Aiken Amplification.