It seems to be "common internet wisdom" that decreasing a preamp tube's cathode resistor will warm the bias and lead to more gain. But drawing tube load line charts doesn't seem to bear this out. On the charts, changing the cathode resistor value has a very minimal effect on the plate load line slope that determines the tube circuit's gain:

1.5k cathode resistor: 250v / (100k + 1.5k) = 2.46ma
2.7k cathode resistor: 250v / (100k + 2.7k) = 2.43ma

These values form the left end of the red plate load line in the chart below, so the slope of the line (and therefore gain) barely changes.

The main effect of altering the cathode resistor value is that the operating point (the intersection of the plate load line and cathode load line) is shifted along the plate load line, which obviously affects headroom. A center bias allows the most voltage swing without distortion, giving the maximum headroom. Moving the operating point away from the center bias point will cause earlier distortion and thus less headroom.

The chart below shows a common 12AX7 guitar amp gain stage with a 250v supply voltage, 100k plate load, and 1.5k and 2.7k cathode resistors (unbypassed in this example for simplicity).

Am I missing something, or does changing the cathode resistor value have minimal effect on gain?
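For what it's worth, here's a quick sketch of the arithmetic, using the values from the circuit in question (250v supply, 100k plate load, the two cathode resistor values). The load-line slope is -1/(Rplate + Rk), so the left-end current barely moves between the two Rk values:

```python
# Load-line arithmetic for the stage described above:
# 250 V supply, 100k plate load, 1.5k vs 2.7k cathode resistor.
V_SUPPLY = 250.0   # supply voltage (volts)
R_PLATE = 100e3    # plate load resistor (ohms)

for r_k in (1.5e3, 2.7e3):
    # Current at the left end of the plate load line (plate voltage = 0).
    # The line's slope is -1/(R_PLATE + r_k), so these two endpoints
    # show how little the slope changes between the two resistor values.
    i_max = V_SUPPLY / (R_PLATE + r_k)
    print(f"Rk = {r_k / 1e3:.1f}k -> {i_max * 1e3:.2f} mA")
```

Running it prints 2.46 mA and 2.43 mA, about a 1% difference in slope between the two cathode resistor values.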