
THX Deep Note Part 1


This episode was written & produced by Kevin Edds.

Since 1983, one of the most memorable parts of going to the movies has been the THX certification played during the previews. The accompanying sound logo, called “The Deep Note,” has fascinated, terrified, and mystified audiences for over three decades. What is THX, really? How was “The Deep Note” created? And why does it elicit such a reaction from those who hear it? Featuring Andy Moorer, creator of “The Deep Note,” and Rob Cowles, global director of marketing for THX.
 

MUSIC IN THIS EPISODE

Time (Instrumental) - Joy Like
Flicker (No Oohs & Ahhs) - Airplanes
Drift - Tony Anderson
Southern Queen (Instrumental) - Lost Terra
Light Bridges - Dexter Britain
Wide Eyed Wonder - Dustin Lau
Hydrogen Sulfide - Steven Gutheinz
The Weight of it All (Instrumental) - Kaleigh Baker
Nothing Ever Happens - Lost Terra

20K is made out of the studios of Defacto Sound and hosted by Dallas Taylor.

Follow the show on Twitter & Facebook. Our website is 20k.org.

Consider supporting the show at donate.20k.org.

To get your 20K referral link and earn rewards, visit 20k.org/refer.

Check out wetransfer.com for all of your file sending needs!

Go to forhims.com/20k for your $5 trial month.


TRANSCRIPT

[SFX: Popcorn popping, little chatter]

Imagine yourself in a movie theater. You’re settling into your seat - it's one of the nice ones with a headrest, and it leans back - and you’ve managed to balance your popcorn on the armrest.

[SFX: fade in subtle atmosphere and context]

The lights dim. [SFX: Movie Previews] Soon the previews are over, and the lights fade out completely.

[SFX: THX Deep Note]

You’re listening to Twenty Thousand Hertz. The stories behind the world’s most recognizable and interesting sounds. I’m Dallas Taylor. This is the story...of the THX Deep Note.

[SFX: Deep Note finishes and rings out]

[music in]

If you’ve been to the movies any time since 1983, it’s likely that you’ve encountered the announcement that your theater is THX certified. The visual is simply a three-letter logo. But the sound is unforgettable.

Still, before we dive into what’s behind that sound, let’s imagine what movies might sound like without THX...

[music out]

Darth Vader:[in static and muffled Vader speak] If you only knew the power of the dark side. Obi-Wan never told you what happened to your father.

Luke:[in static and muffled speak] He told me you killed him.

Darth Vader: [in static and muffled Vader speak] No, I am your father.

In 1980, that type of bad theater sound might have been what you experienced as Darth Vader revealed that he was Luke Skywalker’s father... if it wasn’t for George Lucas.

[Continue Star Wars clip]

[music in]

Andy: For Empire Strikes Back, George Lucas also hired an audio engineer, besides myself, Tom Holman. They were scouting theaters in San Francisco for the debut of Empire Strikes Back and they went to a well-known theater, one of the old majestic theaters, to make sure that the sound system was okay.

That’s Andy Moorer. In the early 80s he was the head of the Digital Audio Department in the Lucasfilm Computer Division.

Andy: When they got there, they were a bit horrified. The sound systems of the day consisted of left, center, right and surround. So there should be three speakers behind the big screen. Of the three speakers he found, one was disconnected, [SFX: wire short circuiting] one had fallen over [SFX: speaker falling thump], and the other one was turned around backwards [SFX: speaker turning around and changing tone]. They were completely horrified by that.

[music out]

So, Tom Holman said, "Heck with this. Look, let's invent something, or let's come up with a system ... a standard by which we can measure the theaters, and we can assure that the sound sounds the same in the theater as it does in the mixing theater when the artists were mixing it."

But this wasn’t the first time George Lucas had used his influence to change the cinematic experience. Before the release of the original Star Wars, it was still common for a lot of movie theaters to have a basic mono sound system. So he knew something had to change.

Andy: George Lucas liked to use his muscle if you will, when bringing out a new film.

That’s how Dolby Stereo came to be. George Lucas insisted that if you wanted the 70 millimeter first run of Star Wars, you had to put Dolby Stereo in your theater.

Andy: And that was exactly what happened. The first 160 or so theaters that played Star Wars, did so on Dolby Stereo.

By the time Return of the Jedi was in production, Star Wars was a cultural phenomenon and George Lucas was a filmmaking rock star. [SFX: Opening of John Williams’ Star Wars score] Because of this rock star status, Lucas was able to use his weight and influence again. This time insisting that, to be able to show his new film, theaters would have to go a step further - and become THX certified.

[SFX: Star Wars theme]

Andy: And that's where THX came from. The name was just made up, taken loosely from George Lucas' student film, THX 1138. [SFX: clip of THX 1138] Sometimes, at least in house, it was called Tom Holman's experiment.

Tom Holman was the engineer in charge of research for this new endeavor. And to be clear, THX is not a system for encoding or decoding audio. It has nothing to do with how sound is recorded. It’s all about how a movie is played back to an audience.

It encompasses everything from the quality of the speakers, to the quality of the acoustics, to the quality of the picture - almost everything about the moviegoing experience.

George Lucas funded this research to guarantee that what you experience in your theater is exactly what the filmmakers intended.

Rob: Cinema technology has changed dramatically over the past 35 years.

That’s Rob Cowles, from THX.

Rob: Originally some of the challenges with cinemas were they weren't properly insulated, so the acoustic ability of the room was poor at best. A lot of times the acoustics of two theaters would actually compete with each other, [SFX: two movie tracks playing simultaneously] so you'd be sitting in one cinema and you could hear what was going on in the one next to you. And then there were a lot of other design elements, like doors to the theater used to let in the light, so that you'd be watching a movie and every time someone came in and out, [SFX: doors opening] it would be kind of washed out. Simple things like that. Also, a lot of cinemas didn't really have properly installed HVAC systems. [SFX: air conditioner turns on] So you'd have this kind of ambient noise in the background you wouldn't really understand, but it was actually inhibiting you from having a really good cinematic experience.

So Lucasfilm’s new invention would be installed at a handful of theaters across the country. But this wasn’t a solo effort on the part of George Lucas; he hired a team of engineers to work out all of the details.

Andy: The story starts long before THX was a company. George Lucas decided to start a research institute. He really wanted to advance the state of the art in cinema, and entertainment in general.

[music in]

George didn't want to do the production, the post-production, in Hollywood anymore. So he built this building in San Rafael, California. And coincidentally it housed the computer division as well.

Today this building is known as Skywalker Sound. At the time Andy was working alongside legendary sound designers like Ben Burtt.

Andy: I asked him what he needed and he gave me a laundry list of things. I had put together an audio processing system that we call the ASP, the audio signal processor, which Ben had been using.

He would come in in the mornings and use it up to noon. He used it on Indiana Jones. One of his requests, initially, was for extending sounds. Like, he had the sound of an arrow being shot.

[SFX: arrow shot]

It goes, "Shoop!" I mean, it's gone instantly, and he wanted something that persisted. He wanted the sound of an arrow that went on for 15, 20, 30 seconds.

[music out]

So he asked me if I could do that and I said, "Yeah, I know a way of doing that." [SFX: long arrow shoop] I gave him two minutes.

The Audio Signal Processor that Andy had invented made completely new sounds possible: from extending the sound of an arrow or an airplane in freefall, [SFX: long airplane dive] to spatialization that made sounds in a theater move from one side of the room to the other. [SFX: lightsaber from Ch. 1 to Ch. 2 in a crossfade] These tools have shaped the way movies are produced to this day. That’s part of why the THX Deep Note sounds so unique - no one else on the planet had technology like that in 1983.
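For the curious, here’s a rough sketch of how you might get a similarly “extended” sound today with off-the-shelf tools. This is not Andy’s ASP algorithm - just a modern phase-vocoder time stretch in Python, with a made-up file name for illustration.

```python
# A rough modern stand-in for "extending" a short sound - not Andy's ASP method.
# Assumes a hypothetical arrow.wav containing a short "shoop."
import librosa
import soundfile as sf

y, sr = librosa.load("arrow.wav", sr=None)               # keep the file's own sample rate
stretched = librosa.effects.time_stretch(y, rate=0.05)   # roughly 20x longer, pitch unchanged
sf.write("arrow_long.wav", stretched, sr)
```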

Andy: George wanted some kind of video or some kind of logo that plays before the feature comes on, that says, "This is a THX certified movie theater."

[music in]

My suspicion, and I don't know that this is true, but my suspicion is that he spent all the money on the visuals and didn't have any money left for the audio, so he picked someone who was on salary, on staff and said, "Look we need some sound for the animation. It's 35 seconds long, and I want something that comes out of nowhere and gets really really big."

And I said, "Well, I think I know how to do that."

To create the soundtrack, Andy went to the Audio Signal Processor.

Andy: As soon as he mentioned it, I knew exactly what I wanted to do. That is, I wanted to start with something that would thoroughly bewilder everyone - they wouldn't even be sure that the sound was being played properly. That is, to start with chaos and then evolve into the big chord, like a great organ chord.

[music out]

[SFX: big organ chord]

I had always been impressed by the big pipe organs and the sounds they could produce, so that was sort of the idea I had in the back of my mind.

The producer gave Andy the timing of the animation.

Andy: So I got a road map up of the intent of the animation. Of course when it showed up, all the timing was wrong.

Of course it was.

Andy: So this was one of those cases. I'm sitting in the mixing theater and they play the animation, and I sit there with my stopwatch [SFX: stopwatch] and I notice that all the timing is wrong.

So while I'm sitting there, I typed the new times [SFX: typing] into the computer and ran off a new copy of the logo theme [SFX: Long Deep Note, but fade it down after a few seconds] right then and there. We synced it up, recorded it onto six-track, and that was that.

That sounds straightforward, but it had taken Andy four days of work leading up to this session. Two days to get the basic sounds imported and modified, and another two days to tune it exactly how he wanted it. And remember, Andy had literally invented the technology that made this possible in the first place.

So how did he come up with this idea?

Andy: What do you say, "Steal from the best"? I remember the end of A Day in the Life from The Beatles, [SFX: a few seconds of “A Day in the Life”] with the big sweep, and I remember how much I liked that. And I remember [SFX: a few seconds of “D Minor Fugue”] Bach's Fugue, which after noodling around a little builds this huge chord that resolves into just this massive, massive chord. So I combined those two ideas.

And then the cluster at the beginning. This is similar to stuff we did while I was at Stanford. We had done a lot of experimentation in music and one of the things that we fiddled with were clusters, because with a computer we could get immensely thick textures that would have been very, very difficult to do any other way. I mean, you couldn't buy enough synthesizers to make a sound that big or that massive, [SFX: beginning of deep note] but with a couple hours of computer time we could build sounds that had that kind of thickness or that kind of texture to it.

That was the idea for the cluster: just a dense cluster of instrumental tones that would rise and fall, so that you wouldn't be able to track any one of them for any length of time. It would just sound like a mess, like chaos.

I had the idea for synthetic sound. I didn't envision flutes and oboes playing in it. I envisioned a completely synthetic sound because I don't know how you would do what I wanted to do with regular instruments.

I had some recordings of cello tones. I pulled one out that sounded rich and all 30 oscillators are using the same tone. It is a cello but you would never know it because one of the distinguishing things of a cello is the sound it makes when the bow hits the string. Since I eliminated that, it's a little hard to tell what's going on there. And that was the idea, except that I wanted it to be rich and natural sounding.

[music in]

With the advent of computer technology in the 70s and 80s, sound designers were able to create sounds that just weren’t possible a few decades earlier. Andy Moorer’s invention changed cinema sound forever. We’ll discover how Andy dreamed up this technology, after the break.

[music out]

MIDROLL

[music in]

In the early 80s, there was no standard way of making any kind of sound on a computer. In fact, you couldn’t even buy a computer built for audio - if you needed one, you had to build it yourself.

[music out]

Andy: Yeah, well, see, the big problem with computers at that time is that they weren't designed to do ... audio. Most of the audio, like on the Apple IIe, was squeaks and pops [SFX: Apple IIe squeaks and pops] and sort of Atari kinds of sounds. [SFX: Atari sounds] They weren't rich, and they weren't lifelike, and they weren't natural sounding.

I knew that we could do all this on the computer, except that computers were just horribly, horribly slow in doing basic arithmetic. What we needed was a machine that doesn't do much else, but does basic arithmetic really, really fast.

[music in]

The machine I put together does arithmetic, it's called a DSP, a digital signal processor. There were a couple of other examples of this, special purpose devices that were built for the military. But for the most part they were designed to process images. They didn't have the full 24 bit sound that we like to hear in modern audio.

So I designed it from scratch, starting from the converters and working back through the processing chain, to the point that I had a device. It was capable of 20 million computations per second, just raw arithmetic, multiply and add, to do audio at that kind of speed.

Even then, at 20 million per second, it was limited to 30 voices; that was as much as I could get out of the machine at that time.

[music out]
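To get a feel for why 30 voices was the ceiling, here’s a quick back-of-envelope calculation. The 20 million operations per second and the 30 voices come from Andy; the sample rate is our own assumption, since the episode doesn’t give the ASP’s actual figure.

```python
# Back-of-envelope: how much arithmetic each voice gets per output sample.
ops_per_second = 20_000_000   # raw multiply-adds per second, as Andy describes
sample_rate = 50_000          # assumed sample rate; the ASP's real rate isn't stated here
voices = 30

ops_per_sample = ops_per_second / sample_rate   # about 400 operations per output sample
ops_per_voice = ops_per_sample / voices         # about 13 operations per voice per sample
print(f"{ops_per_sample:.0f} ops per sample, {ops_per_voice:.1f} ops per voice")
```

With only a dozen or so multiply-adds available per voice per sample, 30 voices really was about the practical limit.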

Building that computer took two years and something like 200,000 lines of computer code. The Audio Signal Processor was used on many films produced at Skywalker Sound, so even though it was groundbreaking, the technology was familiar to the mixers and sound designers there. When the producer gave Andy the job of making the THX logo theme, it was almost a routine operation in their studio.

Andy: He said, "Andy we have to record it on Friday." So I said okay let’s do it. I came in, punched a button. He wasn't even there. Gary Rydstrom was, however.

Gary is an Oscar-winning sound designer.

Andy: He was there that day, almost coincidentally. I don't think anyone had told him to go in and QA it or listen to it or tell me if they had to hire somebody else real quick. But he just happened to be there and we were chatting. Well, I ran the thing off, [SFX: 9-second Deep Note] and it was funny. He didn't say anything for quite a long time, until he finally said, "Can I hear that again?" and I said, "Sure."

So I punched the button, they recorded it, 15 minutes later I walked out.

[music in]

In 1983, THX was a brand new company, and they had never done any marketing. Today, most marketing campaigns go through a rigorous internal process: ad agencies are brought in, different concepts are developed and pitched, focus group testing is commonplace, and many levels of management weigh in on the pros and cons of the advertising. The Deep Note had a much less formal process - the producer assigned to the job pretty much gave Andy free rein.

Andy: Subsequently I got a lot of questions about how it was done. But no, there was no ... I didn't pitch anything. But to tell you the truth, there wasn't really much quality control in the process. He literally gave me the task and then four days later I walked into the theater and mashed the button and that's what came out.

So, maybe that's good. If anyone had heard it, they might not have gone for it. I remember Tom Holman quipping that the part of the sound system that he was really the proudest of was the tinkly, crisp highs, but that's okay, this'll do.

[music out]

The reactions of Gary Rydstrom, Ben Burtt, and Tom Holman were positive, but the Deep Note had not yet been played for the big man… the head honcho… George Lucas himself.

Andy: I wasn't there when the VIPs were brought in, but what I do know is he started inviting people down to my studio. So, I played it for a number of people.

I played it right off the synthesizer, [sfx: 9-second Deep Note] just synthesize in real time right then and there, just mostly for effect, to show them what the capabilities of digital audio were.

I played it for Ray Dolby one time. I played it for ... actually, I played it for Michael Jackson. Oh, Michael Jackson enjoyed it, but when I played the Star Wars theme, he enjoyed that better. So yeah, I played the THX logo theme for a number of people. So I guess George liked it, because he was constantly bringing people down there to hear it.

I don't think they expected what I ended up with there. They kept bringing people. "Come in, come here, listen to this, listen to this!"

[music in]

Creating a score within a computer program today can look a lot like a conventional musical score. However, Andy had programmed the Deep Note to play back randomly each time - so every guest heard a completely new Deep Note.

Andy: In the first couple of days I put the cello tone in and I wrote the program for generating the score. And the score was generated from a random number, since I didn't really care where the notes went in the cluster, as long as they were in a certain range. So I just wrote a program that went around all the instruments once a second and gave each one a new pitch. Each oscillator, or each cello, would receive a new pitch, it would slowly start winding towards the new pitch.

[music out]

That's what gives it that feeling of voices going up and going down. Once a second, each voice gets a new pitch.

Andy: And then I assign them the final pitches, which was the final chord. Now that one I did compose of the 30 voices. I said, "You know, we'll do three voices on this tone and two voices on that tone, and one voice on that tone, and so on." I gave them discrete pitches.

One of the first "Oh, gee" or "Duh" moments was when I had collapsed all the oscillators to be exactly on the target pitch to three decimal places. [SFX: pitch examples softly underneath] Well, then it collapsed into an electronic chord. It sounded not like an organ, but like an electric organ. So I had to de-tune them slightly. And that's what makes the final chord shimmer, too, because they're still getting new pitches every second; they're just within a very tight range, going up and down within maybe 100 cents or so on each pitch.
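To make that concrete, here’s a minimal sketch of the scheme Andy describes: 30 voices, each handed a new random pitch once per second and gliding toward it, then converging on slightly detuned chord tones. It’s written in modern Python with plain sine waves standing in for the cello recording, and every number in it - the pitch range, the chord, the detuning amount - is an illustrative guess, not Andy’s original code.

```python
import numpy as np
import soundfile as sf

SR = 44_100
VOICES = 30
CHAOS_SEC = 5    # seconds of wandering cluster
CHORD_SEC = 4    # seconds settling onto the final chord
rng = np.random.default_rng(0)

# Final chord built from pure 2:1 and 3:2 ratios above a low root (illustrative, not the real chord).
root = 37.5  # Hz
ratios = (1, 2, 3, 4, 6, 8)
targets = np.array([root * r for r in ratios] * 5)[:VOICES]

def glide(f_from, f_to, seconds):
    """Exponential glide between two frequencies, returned as per-sample frequencies."""
    n = int(seconds * SR)
    return f_from * (f_to / f_from) ** np.linspace(0, 1, n)

audio = np.zeros(int((CHAOS_SEC + CHORD_SEC) * SR))
for v in range(VOICES):
    # Chaos section: a new random pitch target every second, glided into.
    segments = []
    f = rng.uniform(200, 400)
    for _ in range(CHAOS_SEC):
        f_next = rng.uniform(200, 400)
        segments.append(glide(f, f_next, 1.0))
        f = f_next
    # Then glide to this voice's chord tone, detuned by a few cents so the chord shimmers.
    detuned = targets[v] * 2 ** (rng.uniform(-5, 5) / 1200)
    segments.append(glide(f, detuned, CHORD_SEC))
    freqs = np.concatenate(segments)
    phase = 2 * np.pi * np.cumsum(freqs) / SR    # integrate frequency to get phase
    audio[:len(phase)] += np.sin(phase) / VOICES

sf.write("deep_note_sketch.wav", audio, SR)
```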

What makes the Deep Note even more complex has to do with something called temperament. Most instruments today are tuned in what is known as “equal temperament”. The most basic way to think about it is this [SFX: chromatic scale, played underneath VO] - we have 12 musical notes, and all of them are the same distance apart.

For the Deep Note, Andy changed the tuning system so the ratios between pitches form perfect harmonies, using a system known as “Pythagorean tuning”.

Andy: This hasn't been used routinely since the middle ages, because it doesn't allow you to change keys.

Like when barber shop quartets sang, [sfx: Barber Shop quartet] they typically sang in a kind of a floating just temperament. That's what makes those chords so sharp and so crisp. They don't use vibrato and they sing in these exact pitches ... These are called Pythagorean relations.

And it's these crisp relations that make the sound of that chord sort of bigger than you would expect. It's actually bigger than an organ chord. Bigger than the [sfx: repeat Bach D-Minor chord softly under explanation] Bach chord, because he's playing it on an organ that's in equal temperament. So the pitches can't fuse as tightly.

So with Pythagorean tuning, in very simple terms, those same 12 notes might be tuned slightly differently depending on the key you’re in. Within a given key, the intervals are mathematically pure.

The truth is it’s very subtle and a lot of people can’t hear the difference. However, when you stack up almost 10 octaves of notes, the effect becomes MUCH more obvious. That’s another way Andy was able to give the THX Deep Note such a big sound.
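As a rough illustration (our own arithmetic, not anything from the episode), here’s the gap between an equal-tempered fifth and a pure 3:2 fifth - tiny on its own, but it compounds as intervals stack across nearly ten octaves.

```python
import math

# An equal-tempered fifth versus a pure (Pythagorean) 3:2 fifth.
equal_fifth = 2 ** (7 / 12)   # 7 semitones in equal temperament, about 1.4983
pure_fifth = 3 / 2            # exact 3:2 ratio, 1.5

cents_off = 1200 * math.log2(pure_fifth / equal_fifth)   # about 1.96 cents
print(f"equal: {equal_fifth:.4f}  pure: {pure_fifth:.4f}  difference: {cents_off:.2f} cents")
```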

Andy: I knew that that's what I wanted for the big chord because I knew what it was gonna sound like and that would be the formulation with the most impact. It would sound bigger than an organ chord or bigger than an orchestra chord.

[music in]

The computer program allowed Andy to create something he could never have done with a live orchestra, but there was also a problem. Since the program was random, there was no real way to recreate the sound. He had to record the output from the computer, and that recording would be the only record of it from that point forward. Even if he ran the same program again, the output would sound slightly different than before, because it was coded to be random. It wasn’t like a guitar lick that a human performer could replicate; it was a computer program designed to be different each and every time. He only had one recording of it.

So when Andy submitted the original Deep Note track, that was it. There were no backups. George Lucas and the team loved The Deep Note, but then something horrible happened—they lost it. Andy Moorer’s masterpiece was gone. [music out] And he wasn’t sure if he could get it back.

So how does the story end? We’ll find out, in our next episode.

[music in]

CREDITS

20K Hz is produced out of the studios of DeFacto Sound, a sound design team dedicated to making television, film, and games sound incredible. Find out more at DeFactoSound.com.

This episode was written and produced by Kevin Edds… and me, Dallas Taylor with help from Sam Schneble. It was sound designed and mixed by Nick Spradlin.

Many thanks to Andy Moorer: software engineer, musician, and sound designer of the THX Deep Note. And thanks also to Rob Cowles, Global Director of Marketing for THX. To learn more about THX certification, visit THX.com.

The music in this episode is courtesy of our friends at Musicbed. Having great music should be an asset to your project, not a roadblock. Musicbed is dedicated to making that a reality. That’s why they’ve completely rebuilt their platform of world-class artists and composers with brand-new features and advanced filters to make finding the perfect song easier and faster. Learn more at musicbed.com/new.

To see Andy Moorer’s sheet music for the deep note, visit our website, 20 k dot org. You can also catch up on past episodes, read transcripts, and buy a t-shirt!

If you’re on Facebook or Twitter, be sure to follow us at the username 20k org. I love hearing from you, and I read each and every comment.

Finally, I need your help. Seriously, don’t zone out right now. What I need you to do is go to your phone and tell everyone you know to subscribe to Twenty Thousand Hertz. This show cannot grow without you doing that. Text them, call them, write on Facebook, Twitter, Reddit, or wherever. Use the link 20k.org/subscribe. It will give you a bunch of options of where to find the show.

Thanks for listening.

[music out]
