Discussion in 'Amps and Cabs' started by gunslinger, Oct 9, 2018 at 6:02 PM.
As opposed to other methods?
No. Use a meter to get in the ballpark, then tweak with your ears.
I dare anyone to hear the difference.
The only time a guy biased my Marshall with a scope, it came back sounding terrible with the bias set too low. That’s when I bought a meter and learned to do it myself.
I learned that you actually have to know what you are doing to bias with a scope.
You know the oscilloscope is just a tool, and doesn't dictate whether the biasing is "good" or not, right?
No. If you had a lot of experience biasing amps, one thing you'd realize is that it's not the great on/off switch of tone it's made out to be by some. First of all, it will only make a noticeable difference under one of two scenarios: 1) it's EXTREMELY over- or under-biased, or 2) you play with the output cranked all the way. And even then, IMO it's too subtle to bother worrying about. Simply bias it to 70% and spend your time worrying about things that REALLY matter. Seriously, it's just not a big deal. You'd be better off worrying about what tube model/brand you're using than how it's biased.
Yep, I just figured the guy knew how to use his tools. He didn’t.
As theory has it, setting bias with a scope will minimize crossover distortion. But who really cares about that in a guitar amp, they're non-linear in frequency response anyway...I want it to sound good and a scope doesn't have ears.
There's no right or wrong though...do what works best for you.
Biasing an amp properly is knowing where your maximum is, and setting it where it sounds best, anywhere below that.
That "max" isn't a number set in stone either. It depends on the tubes you're using, the plate & screen voltage, how hard you push the amp (and how often), and how much risk you're willing to accept in terms of premature tube wear and potential failure.
For reliability, it's a science. For tone, it's more of an art.
Using a scope to minimize crossover distortion is subjective and not reliable. The best way to bias is to use bias probes and set the tubes to 60-70% of maximum plate dissipation. As long as the tube currents are within 10% of each other, you are good. Any other method is highly subjective and will not produce consistent results. Also, adjusting to minimize crossover distortion with a scope does not guarantee that the bias won't end up too hot or too cold.
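The arithmetic behind the percent-of-max-dissipation method is simple enough to sketch. This is only an illustration of the math, not tech advice for a specific amp: the tube rating (EL34-style, 25 W max plate dissipation) and the 450 V plate voltage below are example values, and real biasing should account for screen current and the actual voltages measured in the amp.

```python
# Rough bias-target math for the "percent of max plate dissipation" method.
# Tube rating and plate voltage are illustrative example values.

def target_idle_current_ma(max_plate_w, plate_v, percent=0.70):
    """Target idle plate current (mA) at a given fraction of max dissipation."""
    return max_plate_w * percent / plate_v * 1000

def currents_matched(currents_ma, tolerance=0.10):
    """True if all measured tube currents fall within `tolerance` of each other."""
    lo, hi = min(currents_ma), max(currents_ma)
    return (hi - lo) / hi <= tolerance

# Example: a 25 W tube at 450 V on the plate, biased to 70%
target = target_idle_current_ma(25.0, 450.0, 0.70)
print(round(target, 1))                 # ≈ 38.9 mA

# Example pair of bias-probe readings, checked against the 10% rule
print(currents_matched([37.0, 39.5]))   # True: within 10% of each other
```

A colder bias just means a smaller `percent`; the 10% matching check is the same regardless of where you land in the 60-70% window.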
Same story here! Bought a Biasrite soon after.
Many techs are good at fixing amps but can't play one and sound terrible when they test the amp out. It's like getting a haircut from a blind barber. They rely on you to tell them what sounds good to you.
Correct me if I'm wrong, but the only "bad" you can do with biasing is too much, resulting in "red plating" the tubes. As long as you're not in the red plate zone, you're 100% safe... so bias for tone and wiggly lines be damned.
A tube can run too hot and not exhibit visible red plating. It’s entirely possible to bias an amp so the tube life will be abysmal and/or tubes will be likely to fail, but see nothing wrong visibly.
You can usually hear and feel it with the extra heat they put off. But it’s not always visible.
You heard it on TGP first, boys and girls.
Using specialized measuring equipment is extremely subjective
I get medical and use a stethoscope.
These miss the point.
Using a scope for biasing is about adjusting for maximum clean output power, which is directly observable at the dummy load. It is not about biasing for least crossover distortion (which shouldn't be visible anyway until both the idle current is very low and the drive signal is made quite small). Checking for crossover is an extra step.
I do agree with others that the scope is just a tool, and using it won't make the amp "sound better". The benefit is that scope biasing sees what the tubes being used are actually doing when they amplify; it doesn't treat all tubes the same with respect to a target current. When the tubes are biased for max clean output power and that power is compared to what's typical for that tube type in that circuit, the tech can get a sense of whether the tubes are very strong or near the end of their lifespan. This is actually a better test of output tubes than any tube tester.
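The "max clean output power" reading from the scope reduces to one formula: read the peak voltage of the largest undistorted sine across the dummy load and compute RMS power. A minimal sketch, with an assumed 8-ohm dummy load and an illustrative 28 V peak reading:

```python
# Power arithmetic behind scope biasing: the largest clean sine's peak
# voltage across the dummy load gives RMS output power directly.
# Load impedance and the voltage reading below are illustrative.

def rms_power_w(v_peak, load_ohms):
    """RMS output power of a sine with peak voltage v_peak into a resistive load."""
    return (v_peak ** 2) / (2 * load_ohms)

# Example: 28 V peak across an 8-ohm dummy load
print(round(rms_power_w(28.0, 8.0), 1))  # 49.0 W
```

Comparing that number against the typical clean output for the tube type and circuit is what lets the tech judge tube strength, as described above.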
Just as long as you don't get anal & use a colonoscope!
The good ones do, at least. Why would you want your amp matched to what someone else happens to like? (assuming it's within the appropriate range)
I started a thread a while back asking the same question, but on the Tech Corner forum (https://www.thegearpage.net/board/index.php?threads/bias-set-by-meter-or-scope.1907813/). I saw a Mr Carlson's video (love this dude's YouTube channel) on how he set up a tube audio amp with a scope and why (crossover distortion). Basically it came down to this: on a guitar amp it really doesn't matter whether you use a scope or a meter; on a high-fidelity amp it would.
Bias setup is around 59:00 into the video.
You would have to first observe the waveform on an oscilloscope in order to know what cathode current was optimum for those tubes in that amp. Using a number published in a book 50 years ago is pretty arbitrary. Just my opinion.