Wow, more than a year without updates. But then again, nothing has changed in the system at all during that time. Just been enjoying music and movies here when I can. 20+ years of tweaking and changes finally plateaued for a bit.
But I do have something related to this system, and that is an Atmos headphone processor that I received in the summer of 2020, having backed it on Kickstarter waaaay back in 2016.
The Smyth Realizer A16 is an amazing piece of kit: it can render the ‘sound’ of a room in full 3D, complete with head-tracking. So if I look left, the sound remains anchored to the same location as when I look forward, just as if I were listening to real speakers. And it does full Atmos 3D audio rendering in up to 16 channels. It is amazingly effective at that over a pair of headphones.
The real bonus is its ability to map an actual existing room and the actual Head-Related Transfer Function (HRTF) of the person doing the recording. They call this a Personalized Room Impulse Response (PRIR), a term only an engineer could love. These guys are geniuses, but marketing is not their forte.
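The core idea behind a PRIR can be sketched in a few lines. This is a conceptual illustration, not the A16's actual firmware: each speaker feed is convolved with that speaker's measured left-ear and right-ear impulse responses (which capture both the room and the listener's HRTF), and the results are summed into a binaural stereo pair. All names and data here are hypothetical.

```python
import numpy as np

def render_binaural(channels, prir_left, prir_right):
    """Conceptual sketch of PRIR-style binaural rendering.

    channels:   dict of speaker name -> mono samples (1-D arrays)
    prir_left:  dict of speaker name -> left-ear impulse response
    prir_right: dict of speaker name -> right-ear impulse response
    Each speaker feed is convolved with its measured ear responses,
    then all speakers are summed into one left/right pair.
    """
    # Output length: longest (signal * impulse response) result.
    n = max(len(sig) + max(len(prir_left[name]), len(prir_right[name])) - 1
            for name, sig in channels.items())
    left = np.zeros(n)
    right = np.zeros(n)
    for name, sig in channels.items():
        l = np.convolve(sig, prir_left[name])
        r = np.convolve(sig, prir_right[name])
        left[:len(l)] += l
        right[:len(r)] += r
    return left, right
```

In a real renderer the convolution runs block-by-block at audio rate, but the principle is the same: the impulse responses *are* the room-plus-ears map the measurement session captures.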
My goal for this unit is to be able to watch content in my other media room, with that awesome OLED screen, yet still enjoy Atmos content (quietly), and even better if it sounds like I’m listening to the big rig in the HT.
So, I set aside a morning to go map the XStatic Theater and create my own PRIR, and one for my wife as well, which meant she also had to sit through the 15-minute process.
I won’t bore you with the details, but suffice it to say, the process is daunting and requires hours of study to learn how to do the first one. But once one has accomplished that, the process takes about 15 minutes per PRIR to record. I wrote some documentation on that if anyone winds up with an A16.
I pulled the main listening row seats out and put in a swivel office chair so I could better map the room, and to allow for swiveling to certain look-angles during the process.
In this shot, we see the A16 sitting on a stand to the left of the chair. I’ve yet to plug in the in-ear mics, and the blue cord is a grounding wrist strap to null the body’s own ‘hum’ from impacting the recording.
I sit in the chair, with mics stuck in my ears, and the system plays a series of tones, and instructs me where to look for the next set of tones. When it’s done, I have a customized map of my room that matches my head and ears. With 7.1 channels, that is a lot of tones to sit through 3 times.
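The head-tracking side of this can be sketched too. This is a simplified illustration of the general technique, not Smyth's actual algorithm: with PRIRs captured at a handful of look-angles (the swivels during measurement), the head tracker's yaw reading selects the closest measurement, which is what keeps the sound anchored to the room as the head turns. The three-angle list is a hypothetical example.

```python
# Hypothetical set of look-angles captured during the measurement
# session, in degrees of head yaw (0 = facing the screen).
MEASURED_ANGLES = [-30.0, 0.0, 30.0]

def nearest_prir_angle(yaw_degrees):
    """Return the measured look-angle closest to the current head yaw.

    A renderer would load (or crossfade toward) the PRIR captured at
    this angle, so the virtual speakers stay fixed in the room while
    the head moves.
    """
    return min(MEASURED_ANGLES, key=lambda a: abs(a - yaw_degrees))
```

With 7.1 speakers and three look-angles, that is 8 × 3 = 24 sweeps to sit through, which squares with how long the tone sequence feels.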
This is what it looks like from the front of the room.
The results are pretty amazing. I sit in my small media room, listening to Atmos movies & music, and it sounds as if I were in the HT. The quality of the sound is not as good as the real thing, as the dynamic headphones I use are certainly no ESLs. And when I turn my head, dialog stays anchored to the screen, reinforcing the ‘I’m listening to speakers in this nice room’ sense. About half-way through a movie, I completely forgot I was wearing headphones; the effect is that good.