Originally posted by hipfan
Sorry to resurrect an old thread, but this seemed to be the appropriate place for this follow-up question.
Is there also an issue with the impedance the OT wants to see when switching from 6L6's to EL34's in an amp designed around 6L6's? For instance, in a thread on the HC amp forum, James Peters mentioned that when running EL34's in a Peavey 5150, it could be a good idea to use the 16 ohm tap for an 8 ohm cab.
I'm wondering what the technical reason is for this. Does this hold true for all 6L6-designed amps where the user is trying to use EL34's?
Specifically, I have a Demeter TGA2 50 watter that is designed to use 6L6's. At my request, the guys at Demeter set the amp up for use with EL34's (bias circuit, pin connections), but it still has the same transformers as "stock" TGA2's. Should I use the 16 ohm tap for my 8 ohm cab when using EL34's? Are there any reasons (other than sound quality preference and the heater current issue noted by Clay) why it might be a bad idea to run EL34's in an amp designed around 6L6's?
Thanks for any help.
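For what it's worth, the arithmetic behind the 16 ohm tap suggestion is just the transformer's reflected impedance: the plate-to-plate load the power tubes see is the turns ratio squared times the speaker load, and each secondary tap is wound so that its rated load reflects the same designed primary impedance. Hanging an 8 ohm cab on the 16 ohm tap therefore presents roughly half the designed primary load, which is the direction EL34's are generally said to prefer relative to 6L6's. Here is a minimal sketch of that arithmetic, using an assumed 4k plate-to-plate figure rather than the actual 5150 or TGA2 transformer spec:

```python
# Reflected-impedance sketch behind the "8 ohm cab on the 16 ohm tap" idea.
# The tubes see: Z_primary = (Np/Ns)^2 * Z_speaker, and for a given tap
# (Np/Ns)^2 = designed_primary / tap_rating.

def reflected_primary(designed_primary_ohms, tap_ohms, cab_ohms):
    """Plate-to-plate load seen by the tubes when a cab of cab_ohms
    is connected to a tap rated for tap_ohms."""
    return designed_primary_ohms * cab_ohms / tap_ohms

designed_primary = 4000  # ohms plate-to-plate; assumed value, not a real spec

print(reflected_primary(designed_primary, tap_ohms=8, cab_ohms=8))   # 4000.0 (matched)
print(reflected_primary(designed_primary, tap_ohms=16, cab_ohms=8))  # 2000.0 (halved)
```

Whether that halved load is actually closer to what EL34's want in a given amp depends on the specific transformer and operating voltages, so this is only the general reasoning, not a recommendation for the TGA2.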
Originally posted by hipfan
Mike - Thanks. I take it you mean the output impedance is close enough that running an 8 ohm cab from the 8 ohm tap is the right way to go?
Is that generally true for 6L6 amps running EL34's, or is your statement specific to the Demeter? Just curious. Thanks again!