Hill’s, America’s leading pet nutrition brand, and its long-time production company, Harvest Productions, had chosen this moment to pioneer a software-based tool that aims to catapult amplified sound into the twenty-first century. As a means of corporate presentation, this was something never before heard in either Europe or the USA.
“We have always done 360, in-the-round style presentations for Hill’s,” explained Josh Koan, Harvest’s Project Leader for the event. “And we always wanted to create a totally immersive experience for the delegates, one that placed them within the context of the content we created. When the release version of Soundscape became available, we invested immediately in this amazing technology.”
Harvest Productions is an installation partner for d&b audiotechnik, who produce d&b Soundscape. “We saw the potential at the first demonstration, while it was still in the development phase. Knowing how d&b work, Harvest owner Ron Davis had no hesitation in jumping straight in,” said Koan.
d&b Soundscape is a new software platform that enables two different, but related, forms of sound processing to produce defined enhancements to the listening environment.
Based on the d&b DS100 signal engine, Soundscape comprises two distinct software modules: En-Scene, a sound object positioning tool; and En-Space, an acoustic room emulation tool that provides the means to add real room reverberation signatures to a given space.
“For the Hill’s event we were in a large ballroom at a hotel in Lisbon, Portugal,” continued Koan. “A central stage was used for all speech-based presentation, the delegates surrounded the stage, and they in turn were surrounded by a 360° video panorama. The presentation was not just product oriented; the intention was to support education and research in the field of animal health and welfare. What we wanted to achieve was an audio system that worked emotionally with the video, but the meat of the show was the Hill’s presenters and what they had to say. The video content was a mix: images of pets doing what they do, maybe cats transiting from one screen to another by walking in and out of wardrobes; plus, our team shot footage in and around Lisbon to give the delegates a sense of place. We then paired that with multi-track recordings so we could paint an accurate audio portrait, giving all delegates – wherever they sat – a spatially accurate experience.”
“The whole system comprised point-source loudspeakers, mainly Y10P with some E12-D,” explained Ryan Hargis from the d&b audiotechnik US Education and Application Support team, who supported Koan on this, his first use of d&b Soundscape. “Harvest flew in all the control, including the R1 and DS100s that are the backbone of Soundscape, while we arranged for Iberian d&b Sales partner Tuix & Ross to provide the loudspeaker hardware.”
As Josh described, the presentation had two elements, and the physical sound design likewise had two components. An inner, out-firing ring for speech-based presentation from the central stage area used all Y10P, chosen for size and directivity; the dipole control of a Y10P placed close to the presenters was of particular value. Then there was an in-firing system of E12-D and Y10P on the video perimeter. Each screen was approximately four meters wide and positioned north, south, east, and west of an oval shape, with the loudspeakers evenly spaced between them.
“It’s important to understand that although this appears to be two loudspeaker systems for two different roles, they in fact work together. Even though the inner ring fired outwards, it was used to support the Soundscape environment. So if, for example, we wanted a sound object to originate in the north position, then as that sound transited the room and the delegates, a loudspeaker from the inner ring would pick up that role, supporting the sound object across the 120-foot throw. Both systems were in a circle, so from the perspective of the d&b Soundscape software, elements of both systems always formed part of the environment. That approach really emerged when Josh and I first ran the simulations in ArrayCalc back in the US.”
For a first-time, ‘out of the box’ experience, Koan was delighted. “In reality it was very smooth. Yes, of course, there were a couple of things we needed to adjust on site, but in terms of slotting Soundscape into the d&b workflow, it really was ‘system reality’…”
“…The trickiest thing was doing this in-the-round, and figuring out how ArrayCalc would handle it took time in the planning stage, but that brought further benefits when we got to Portugal.”
“Besides the corporate presentation we also found local artists and musicians to further engage the delegates. The violinist and guitarist of a folk group, both using radio mics, started their performance by entering from the video perimeter and walking through the delegates to the stage. We audibly choreographed that movement using En-Scene, so all delegates had a geographical experience of the performers as they moved through the room, and then we added En-Space to create a reverberant concert hall experience in what was a quite dry acoustic environment,” said Koan.
He continued, “That worked so well in rehearsals that we also applied it to the speech presentations. A touch of wet to the voices made the listening experience much more natural. Instead of sounding like an amplified voice in a typical hotel ballroom, it sounded more like a professional presenter with all the inherent power and emotion they are accustomed to using when throwing their voice unamplified. The delegates may not have consciously acknowledged the effect, but that is good; it was that transparent. We found it so easy to do and so stable to work with that we used it for the entire show – in that sense it was a total illusion.”
“Fundamentally we used QLab to program the sound moves. The video playback spat out timecode; QLab took that and output OSC to the DS100, and on to the matrix, so the whole process became automatic. That functionality was already built into QLab. What it meant was you just needed to drag the path of the audio through the listening area, and QLab wrote the OSC commands to make that happen. This was not simple ‘snap to a new position’ audio; this was real-world transiting through space from one location to another, and without any need for complex programming to achieve that effect. For the show,” concluded Koan, “we just sat back and let it run. It really was that easy.”
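The cue chain Koan describes – timecode into QLab, OSC out to the DS100 – can be sketched in outline. The snippet below is a hypothetical illustration, not Harvest’s actual show programming: it interpolates an object’s (x, y) position along a drawn path and packs each step into a minimal OSC message. The address pattern, mapping-area number, and UDP port follow d&b’s published DS100 OSC scheme as the author understands it, and should be checked against the current DS100 OSC protocol document.

```python
import socket
import struct

def osc_message(address: str, *args: float) -> bytes:
    """Minimal OSC 1.0 encoder: address, type tags, big-endian float32 args."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", a)
    return msg

def position_at(waypoints, t):
    """Linearly interpolate an (x, y) position along waypoints, t in [0, 1]."""
    if t <= 0.0:
        return waypoints[0]
    if t >= 1.0:
        return waypoints[-1]
    f = t * (len(waypoints) - 1)   # which segment, and how far along it
    i = int(f)
    u = f - i
    (x0, y0), (x1, y1) = waypoints[i], waypoints[i + 1]
    return (x0 + (x1 - x0) * u, y0 + (y1 - y0) * u)

# Assumed DS100 conventions (verify against the protocol document):
# mapping area 1, sound object 1, device listening on UDP port 50010.
ADDR = "/dbaudio1/coordinatemapping/source_position_xy/1/1"

def send_move(ds100_ip, path, steps=100):
    """Stream position updates so the object glides, rather than snaps."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for k in range(steps + 1):
        x, y = position_at(path, k / steps)
        sock.sendto(osc_message(ADDR, x, y), (ds100_ip, 50010))
```

In practice QLab generated these messages itself from a drawn path, which is why Koan needed no programming; the point of the sketch is only that each “move” resolves to a stream of small, timed position messages rather than a single jump.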