
Saturday, August 25, 2012

Amp distortion

In the '50s, guitar amps were only available in what we'd now consider the lower power range. Less power means less clean headroom, so when you needed a little more volume, or played hotter humbucker pickups, you pushed the amp into the range where distortion was present. Unintentionally, amp distortion became part of the sound we grew accustomed to.

But distortion is also an interesting sound in its own right: more harmonics, a little bit sax-like, and players liked exploring it intentionally. (Add to that the fact that it was a bit rebellious to play loudly enough that the grown-ups yelled at you to turn it down.) I've read stories of John Lennon, Eric Clapton, and Keith Richards fighting with recording engineers to leave the distortion in the recording. The Beano album, with Clapton playing a Les Paul through a cranked Marshall, is still considered a hallmark of guitar tone, but it wouldn't be had the "common sense" engineering of the time won out.

So the situation was one where distortion was present and being explored. Imagine, though, if more powerful amps had been available during rock and roll's formative years. There would have been more clean headroom, and distortion would have been harder to come by without being really loud. The sound of rock and roll would be quite different today. By the mid-'60s, when 100-watt amps were finally readily available, it was too late for this effect to happen. Distortion had been dyed into the fabric of rock and roll; it was part of the tone. So in order to get the tone with the bigger amps, you had to turn up, and instead of getting cleaner as it would have in the '50s, in the '60s rock and roll got louder. The same bit of technology introduced at a different time had a different effect.

There was still some uncertainty about the role of distortion in the '60s. Around 1965 Leo Fender sold his company to CBS. CBS brought in engineers who tweaked the circuits with the intent of making them more hi-fi--they hadn't caught on to what was happening--and the CBS silverface amps got a bad reputation among musicians. Pre-CBS blackface amps were sought after then and still command a higher price than the silverface models, although enough time has passed that this is starting to fade. (A lot of silverface amps have been modified to be more like the blackface circuitry, and they used to be a bargain if you didn't care about the cosmetics.) Modeling amps still pay homage to the sound--nearly every digital modeling amp has a "blackface" or "California" sound on it.

In the '70s came a watershed moment for distortion--it was accepted. Randall Smith was heavily modifying Fender Princeton amps to add more gain stages. In a December 2005 Guitar Player article, he said he was doing this because blues and rock players wanted more distortion at lower volumes. The cascaded tube stages he added produced the distortion, and a master volume control limited the signal going to the power amp. So the preamp tubes were distorting while the power amp, working less hard, stayed clean. But not all distortion is the same: when they distort, the small preamp tubes sound different from the larger power tubes. In essence, this is the sound the Mesa Boogie Mark I brought to the table, and the later Mark II amps made adjustable with the amp's knobs. Distortion tone had become adjustable.
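As a rough illustration--not Smith's actual circuit--a cascaded preamp can be sketched as a few soft-clipping stages followed by a master volume. Here `tanh` is just a stand-in for a tube stage's gradual clipping, and the gain, stage count, and master setting are made-up numbers:

```python
import math

def gain_stage(sample, gain):
    """One soft-clipping stage: boost the signal, then let
    tanh round off the peaks, roughly the way a tube does."""
    return math.tanh(gain * sample)

def preamp(sample, gain=3.0, stages=3, master=0.2):
    """Cascaded gain stages followed by a master volume.
    Each stage adds more clipping; the master volume then
    scales the result down so the power amp stays clean."""
    for _ in range(stages):
        sample = gain_stage(sample, gain)
    return master * sample

# A moderate input comes out heavily clipped, but the master
# volume keeps the level low: |preamp(x)| never exceeds 0.2 here.
print(preamp(0.5))
```

The key point the sketch captures is the separation of jobs: the stacked stages set how distorted the signal is, while the master volume sets how loud it is.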

The palette of distortion available in amps now involves several approaches. Bands with a vintage sound either play really loudly (AC/DC) to get the power tube crunch, or use power attenuators to dump some of the volume as heat. Metal bands rely more on cascaded tube stages to develop the sound (although they play loudly as well). Master volume controls give control over the character of the distortion. And all this is without considering the huge role played by stompboxes in front of the amp.

Cascaded gain stages had another effect on amps: you lose touch sensitivity. As you play the strings more lightly, or roll back your guitar's volume, the amp doesn't clean up the way a vintage amp does, so the range of tones available from your fingers is reduced. I find this a limitation: it reduces dynamics, sounds more compressed, and a series of notes comes out all with the same tone. On a vintage amp, the dynamics of your playing influence the tone of the notes. But to each his/her own.
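To make the touch-sensitivity point concrete, here's a toy comparison using `tanh` as a stand-in for a tube stage's soft clipping (the gains and stage counts are made up for illustration). Halving the input level noticeably drops the output of a single low-gain stage, but barely moves a cascaded high-gain chain--which is why the high-gain amp doesn't clean up when you roll back your volume:

```python
import math

def amp_out(level, gain, stages):
    """Peak output of a chain of tanh soft-clipping stages
    driven by an input peak of the given level."""
    s = level
    for _ in range(stages):
        s = math.tanh(gain * s)
    return s

# Full guitar volume vs. rolled back to half:
vintage_full = amp_out(1.0, gain=1.0, stages=1)
vintage_half = amp_out(0.5, gain=1.0, stages=1)
hi_gain_full = amp_out(1.0, gain=5.0, stages=3)
hi_gain_half = amp_out(0.5, gain=5.0, stages=3)

# The single vintage-style stage roughly tracks your playing level...
print(vintage_half / vintage_full)   # well below 1
# ...while the cascaded stages stay pinned: same loudness, same tone.
print(hi_gain_half / hi_gain_full)   # very close to 1
```

In other words, the cascade acts like a hard limiter on your playing dynamics, which is exactly the compression described above.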

To get clean tones back on a cascaded amp you need different circuitry, and this is done by having multiple channels. Most high-gain amps have more than one channel, usually footswitchable, with increasing amounts of gain and names like "clean," "crunch," and "overdrive." So when you want the clean sound that a lighter touch gives you on a vintage amp, you step on a switch to change channels. The advantage of this approach is that the range of distortion sounds available is much greater than a vintage amp can offer.

An electric guitar without its amp is only half the instrument. The foundation of your tone starts with your fingers, but it's shaped by your guitar and amp--and the settings you choose.
