Resistor question

Mark C

Member
Messages
4,417
Ok, bit by bit I am learning about working on tube amps. My question is about wattage ratings on resistors. From what I have read, the wattage in conjunction with the resistance determines the voltage drop across the resistor. If I am looking at a layout or schematic and the wattage is not specified, I am wondering what things to consider when choosing the resistor. Also, does it make much of a difference to go up in wattage for the same value resistor - i.e. using a 2 watt resistor in place of a 1/2 watt?

Thanks for any help.
 

John Phillips

Member
Messages
13,038
The wattage rating has nothing to do with the voltage drop, by itself - it's just the maximum amount of power that the resistor will take before burning out.

(In fact, it's related to the environmental temperature too - which matters inside something like a tube amp).

The power developed in any resistor is the product of the voltage across it and the current through it, which is also equivalent to the square of the voltage divided by the resistance, or the square of the current multiplied by the resistance (P = V*I = V^2/R = I^2*R) - so you only need to know two of voltage, current, and resistance to work out the power.
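For example, here's a quick sketch of the arithmetic in Python - the values are made up just to show all three forms agree:

```python
# Power in a resistor, three equivalent ways (values are arbitrary examples).
V = 300.0    # volts across the resistor
R = 100e3    # resistance in ohms (100k)
I = V / R    # Ohm's law gives the current through it

p1 = V * I       # P = V * I
p2 = V**2 / R    # P = V^2 / R
p3 = I**2 * R    # P = I^2 * R

print(p1, p2, p3)   # all three come out to ~0.9 (watts)
```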

Typically, if the resistor power ratings are not marked or specified, they will be assumed to be 'signal' - i.e. no significant power developed - and usually 1/4W or even 1/8W. There is no reason why you can't increase the power rating - it just increases the safety margin.

In fact, if any resistor in an amp I'm working on has burned out, I do often increase the power rating, unless the cause of the fault is clearly something else. Many amp makers under-specify resistors for the simple reason that higher power-rated ones are more expensive.

Some types of resistors are far more robust and tolerant of short-term overloads too - wirewound especially. I tend to use these for high-stress jobs such as screen-grid resistors, the B+ chain, cathode resistors, low-voltage supply resistors etc. - even when the dissipation is low enough that a carbon or metal-film resistor would do. They're just more reliable.

(It's often stated that you should use 'non-inductive' wirewounds, but I've never come across a single case in a guitar amplifier where this matters. I wouldn't put them in the direct audio path though.)

One thing to be careful of is that sometimes the voltage rating of a resistor matters, especially for things like plate resistors where there is a very large voltage drop - even though the resistor value is very high, so the power developed is still low. In these cases you may have to use a larger power rating than you really need, since higher-powered resistors generally also have higher voltage ratings.
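A quick illustration of that point, with assumed plate-load numbers (not from any particular amp):

```python
# Voltage rating vs. power rating, with assumed example numbers.
V_drop = 250.0   # volts across a 220k plate resistor (assumed example)
R = 220e3

P = V_drop**2 / R
print(round(P, 2))   # ~0.28 W - a 1/2W part covers the power with margin

# But small 1/2W film resistors are often only rated for roughly
# 250-350V continuous (check the datasheet for the actual part);
# here the voltage, not the power, may force a physically bigger resistor.
```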
 

Wakarusa

Member
Messages
1,458
More tidbits...

A good safety margin is 100%, so if under normal operating conditions the current through and voltage across a resistor will make it dissipate 1/2W, then a 1W component is (IMHO) called for.
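In code form, that rule of thumb looks something like this - just a sketch, and the list of standard ratings is my assumption, so use whatever series your supplier actually stocks:

```python
# Pick a resistor wattage with a 100% safety margin (sketch; the
# ratings list below is an assumption, not a universal standard).
STANDARD_RATINGS_W = [0.125, 0.25, 0.5, 1.0, 2.0, 3.0, 5.0, 10.0]

def pick_rating(dissipation_w, margin=1.0):
    """Smallest standard rating >= dissipation * (1 + margin)."""
    needed = dissipation_w * (1.0 + margin)
    for rating in STANDARD_RATINGS_W:
        if rating >= needed:
            return rating
    raise ValueError("dissipation too high for this ratings list")

print(pick_rating(0.5))   # 1.0 -> use a 1W part for 0.5W dissipated
```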

As noted above, a high-temperature environment calls for derating components. A resistor's power rating is a measure of its ability to dissipate heat - something that's harder to do if its surroundings are already hot. So resistors near the power section or hot transformers get more margin.
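Datasheets usually express this as a derating curve. As a rough sketch (the 70°C/155°C breakpoints are typical film-resistor numbers, not universal - check the curve for your actual part):

```python
# Linear derating sketch: full rated power up to 70C ambient, zero at 155C.
# These breakpoints are typical datasheet values, not universal.
def derated_power(rated_w, ambient_c, full_to_c=70.0, zero_at_c=155.0):
    if ambient_c <= full_to_c:
        return rated_w
    if ambient_c >= zero_at_c:
        return 0.0
    return rated_w * (zero_at_c - ambient_c) / (zero_at_c - full_to_c)

print(round(derated_power(1.0, 100.0), 2))   # ~0.65 W usable at 100C
```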

Also, there are various sources of noise in resistors. One of these (contact noise?) goes down as the cross-sectional area of the resistor goes up. In general, higher wattage ratings mean bigger resistors and less noise.
 

JamesPeters

Member
Messages
1,397
Originally posted by Wakarusa Amp
In general, higher wattage ratings mean bigger resistors and less noise.
This may be true in some circumstances, but it's not what I've experienced in general. Provided the resistor is rated well enough for the job (usually at least two to three times what it's going to dissipate), going with a higher power rating doesn't reduce the noise further. For instance, using 1W resistors in the preamp vs. 1/2W generally doesn't improve noise performance.
 



