Transcript: Digital Production Buzz - March 2, 2017

HOST
Larry Jordan

GUESTS
Andy Cochrane, Director, The AV Club
Nick Bicanic, Founder, RVLVR Labs
Brian Glasscock, User Experience Researcher, Member of the VR Mic Development Team, Sennheiser
Duncan Shepherd, Editor/Creative Director, Duncan Shepherd Films
Chris Bobotis, CEO and Co-Founder, Mettle
James DeRuvo, Film and Technology Reporter, DoddleNEWS

==

Larry Jordan: Tonight on The Buzz, we are talking virtual reality and 360 degree video. We start with Andy Cochrane, a director working in interactive and immersive media like virtual and augmented reality. Tonight, Andy sets the scene for us by explaining what VR is, what it takes to make it successful, and how existing production techniques need to change to make VR work right.

Larry Jordan: Nick Bicanic is the founder of RVLVR Labs, a company that specializes in creating VR movies. Recently, he wrote about best practices for creating 360 degree video and tonight he explains what’s possible, and what isn’t.

Larry Jordan: Brian Glasscock is a member of the VR Microphone Team at Sennheiser. Recently, they released a new mic specifically for Ambisonics and tonight Brian tells us about this new technology that not only provides 360 degree audio, but allows us to hear exactly where a sound is coming from.

Larry Jordan: Duncan Shepherd specializes in editing immersive VR experiences and shares his thoughts on the challenges in editing VR material.

Larry Jordan: Chris Bobotis is the CEO and co-founder of Mettle, which makes plugins for 3D and VR video, and tonight Chris explains what they are and how they work.

Larry Jordan: All this, plus James DeRuvo on our DoddleNEWS update. The Buzz starts now.

Announcer: Since the dawn of digital filmmaking – authoritative – one show serves a worldwide network of media professionals – current – uniting industry experts – production – filmmakers – post production – and content creators around the planet – distribution. From the media capital of the world in Los Angeles, California, The Digital Production Buzz goes live now.

Larry Jordan: Welcome to The Digital Production Buzz, the world’s longest running podcast for the creative content industry, covering media production, post production and marketing around the world. Hi, my name is Larry Jordan.

Larry Jordan: This week, we’re talking about 360 degree video and its cousin, virtual reality. Now, I will confess that I’ve been a skeptic on this format until I talked with our first guest, Andy Cochrane. He helped me see VR from an entirely new perspective, which helps me better understand where it fits in today’s media landscape, and I know you’ll enjoy his comments as well.

Larry Jordan: Also, there is a new form of audio called ambisonics. We’re all familiar with mono, stereo and surround audio. Ambisonic audio is sound that surrounds you, but unlike surround sound, an ambisonic mix changes in real time as you move your head and body. You not only hear the audio, you also hear where it’s coming from. Sennheiser just released a new ambisonic microphone, and Brian Glasscock joins us tonight to explain this new technology and how it works.

Larry Jordan: We have a fascinating collection of guests this evening, because when we shoot or edit VR, all the traditional rules of production change. Our goal tonight is to help you understand this evolving landscape and provide some tips on how to make the most of it.

Larry Jordan: Thinking of making the most of the latest technology, it’s time now for our DoddleNEWS update with James DeRuvo. Hello James.

James DeRuvo: Hello Larry.

Larry Jordan: So what have we got that’s news this week?

James DeRuvo: Well, Blackmagic did a streaming presentation today and announced a brand new video camera, the URSA Mini Pro, which they say is a cinema film, broadcast and studio camera all rolled into one. It also has easily swappable lens mounts, so you can switch from EF to PL to B4, and in the summer they’re going to have a Nikon F mount with a special mechanical aperture ring. You can just swap between them whenever you need to, which seems to be a thing now. I’m noticing a lot of camera companies are starting to release cameras where you can hot swap the lens mounts, which is pretty cool.

Larry Jordan: Very much so.

James DeRuvo: It’ll record to SD or CFast 2.0 cards, or you can use an optional SSD that attaches to the back. DaVinci Resolve also got two new color panel controllers, which start at $995, and Resolve now supports Linux as well. So if they’re making this announcement a month before NAB, you can only imagine what’s coming on the first day when they make their big annual press announcement.

Larry Jordan: I remember talking to Dan May, the President of Blackmagic US, and he said they’re starting to space out announcements so that they don’t all occur at a trade show, because they tend to get lost. Whereas, like now, the URSA Mini Pro is able to get its own moment in the sun without getting blown away by all the trade show noise.

James DeRuvo: It’ll let the other products have their own shining moment when they do get announced at the trade shows too.

Larry Jordan: So what else we got?

James DeRuvo: Sigma has added three new lenses to their Art lens line. You can get two primes, a 14mm F1.8 and a 135mm F1.8, and a 24-70mm F2.8 HSM zoom. They have super fast autofocus with manual override, and nine-blade rounded diaphragms. Like I said before, the mounts for these lenses can go from EF to Nikon F to Sigma, and you’ll be able to convert the mounts when needed. Pricing and availability are to be determined, but in the neighborhood of the other Art lenses, I imagine, so probably between $1,000 and $1,300, and with these kinds of lenses, indie filmmakers will be able to up their game considerably.

Larry Jordan: I like the idea of switchable lens mounts because I got myself stuck one time where I had a camera of one mount, and lenses with another, and the only way to fix it was like a $3,000 adaptor. So this is a really good trend. I encourage this. What else we got?

James DeRuvo: It looks like it’s also really easy to change them. When they did the Blackmagic presentation, it took them about 30 seconds to change the mount, it was incredibly easy.

Larry Jordan: Very cool. What else?

James DeRuvo: The last story that I have is that the My RODE Reel short film contest is back with half a million dollars in prizes, the biggest short film competition in their history. To give you a few details of the competition: you submit two videos. The first is a short film of up to three minutes, and the second is a three minute behind the scenes video that prominently features the RODE gear you used in filming the short. Genres include drama, horror, comedy, all the usual ones, plus new categories including vlog and virtual reality, and they want people to try to make a TV commercial about a RODE product.

Larry Jordan: Very cool.

James DeRuvo: The contest is going on till midnight, June 30th, Australian Eastern Standard time, which is June 29th in the US and it’s a fantastic competition and a great excuse to buy yourself some new RODE gear. Back to you.

Larry Jordan: James, for people that want information about the RODE contest, or all the rest of our news, where do they go on the web?

James DeRuvo: All these and other stories can be found at doddlenews.com.

Larry Jordan: James DeRuvo is the senior writer for DoddleNEWS, and returns every week with our DoddleNEWS update. James, thank you so much, we’ll talk to you next week.

James DeRuvo: OK Larry, take care.

Larry Jordan: Bye bye.

Larry Jordan: Enter the new digital ecosystem of media, entertainment, and technology, where behavior and business have merged to redefine content, workflow and revenue streams. It’s the M.E.T. Effect, a cultural phenomenon fueled by hybrid solutions and boundless connectivity that’s changing the very nature of how we live, work and play.

Larry Jordan: Join more than 100,000 attendees from 160 countries at the NAB show. Conferences are April 22nd to the 27th and exhibits are April 24th through the 27th, at the Las Vegas Convention Center in Las Vegas, Nevada. Let’s thrive and I’ll see you there.

Larry Jordan: Andy Cochrane is a director working in interactive and immersive media. Some of his recent credits include directing a VR tour of Google’s retail program, the intro for Google’s Jump 360 video platform, and a commercial for the Barco Escape theater system featuring M&Ms. Hello Andy, welcome.

Andy Cochrane: Nice to be on the show. Thank you for having me.

Larry Jordan: What first got you interested in VR?

Andy Cochrane: Honestly, it was love at first sight. As a teenager growing up in San Jose, in the Bay Area, I got to use some of the early VR systems, and I always thought it was cool, but it was arcade cabinet cool. Big, impractical, really fun stuff. But in 2013 I was working at a company called Mirada doing a lot of digital and interactive work, installations and apps, and the first Oculus Rift developer kit came out. I’m not kidding, I put it on, tried the Tuscany demo, which takes ten seconds to understand and you’ve pretty much done the whole thing in a minute, and by the time I took the headset off, I was already asking, “How do we shoot, how do we edit, how do we do the effects?” Trying to figure it out immediately. It was that clear that this was a big thing.

Larry Jordan: I accept that VR has captured your attention. What is it about VR that has caught the attention of audiences?

Andy Cochrane: VR has an enormous stumbling block, which is that explaining it doesn’t do the medium any justice. I can explain the power of VR and it’ll sound a bit lame and silly. It’ll sound like a kid’s toy. But a lot of the questions that people ask, like “How is this not stereo TV? How is this not a fad?” are answered by simply putting it on and trying it. That’s a terrible answer, because it means every single person has to try it to know why they should use it. It really is that transformative and immersive and exciting, even in the really rough early stages like we have now.

Larry Jordan: That gets to a really core point. Is VR best used for telling stories? Or best used for providing experiences?

Andy Cochrane: That’s actually one of my big scream-from-the-mountaintop crusades. The word ‘storytelling’ does not apply in virtual reality, because you’re not telling an audience anything. It’s not passive entertainment; it’s not theater or film or TV that you watch or listen to. Even if it’s 360 video and there’s no interactivity, the very fact that you are fully immersed in that video, that you can look anywhere, that you have the freedom to not look at the actor, you can look at the floor if you want, takes it out of the realm of storytelling, where we’re crafting stories, and into a realm of pure experience. That’s not to say that narrative doesn’t belong. There’s absolutely story happening, but often the story is the experience. If you ask somebody what a new movie is about, they’ll say there’s a character, and he does this and that, and then falls in love with a girl. If you ask somebody about a virtual reality experience, they’ll say it’s really cool, you’re teleported to another planet and you do this and that. It’s all first person and experience based, and the story is your story. It’s your experience. That’s my big crusade. People who craft stories are absolutely welcome in the medium, but if they expect to be telling a story, they’re going to really burn themselves out.

Larry Jordan: I watched several of the movies on your website which were very nicely done by the way.

Andy Cochrane: Thank you very much.

Larry Jordan: But I noticed that from a production point of view, shots are wider, pacing is slower, camera movement is almost nonexistent. How do we need to change our production techniques when we’re shooting VR scenes?

Andy Cochrane: There are different types of VR. On one end of the spectrum is 360 video, which looks a lot like movies and TV shows and music videos. At the other end of the spectrum is fully immersive, fully interactive, game engine driven, real time rendered virtual reality. On the more familiar end of the spectrum, the 360 video side of things, you can storyboard, you have a script, you have a screenwriter, you have actors and sets and lights. But the audience experience is not the same as cinema, in the same way that if you’re shooting something for TV versus IMAX, your framing and your pacing are going to be slightly different. If your audience is viewing what you’ve filmed in an IMAX dome, for instance, you have to be very careful about camera movement, because you can make the audience sick, and because it’s such a large image that the audience is looking around, you don’t want to cut too quickly or you’ll confuse them.

Andy Cochrane: 360 video has similar restrictions, from the standpoint that the audience is not in a theater watching it. They are standing, as far as they’re concerned, where the camera is, and so the traditional concepts of framing and camera movement are not framing and camera movement; they’re audience placement and audience movement. If you dolly in, in cinema, that is a move with a certain language and a certain meaning, emotionally and narratively, that people associate with it and that we’re used to. We understand a dolly in means getting closer and more intimate, and depending on the scene and the music, we understand it as a push in. If you move the camera in VR, you’re putting the audience on a conveyor belt or a tractor beam, and their response to that is not going to be intellectual, it’s going to be physical. So you can absolutely use camera movement, but it’s not done to get a better shot or a better angle. It’s done with the understanding that you are going to be physically moving and affecting the audience. In a horror environment, slowly creeping the camera forward puts the audience on a slow conveyor belt moving towards things they don’t want to move towards. So it’s not a dolly, it’s actually an anxiety inducing tractor beam sucking the audience towards the scary stuff. We have a lot of the same tools and a lot of the same capabilities and technologies, but the way the audience experiences them is much more physical, much less based on their understanding of the history of film. You have to rethink everything. Even though a dolly is a dolly, a dolly move is not a dolly move when you get into 360.

Larry Jordan: Andy, I could talk with you for probably the next hour and a half and learn something during the entire conversation. For people that want to know what you’re doing and what you’re thinking, where can they go on the web?

Andy Cochrane: My own website is Andrew-cochrane.com and I use the name AVclubvids pretty much everywhere, Twitter, Instagram. If you can find an AVclubvids, that’s me.

Larry Jordan: That website is all one word, Andrew-cochrane.com, and Andy Cochrane himself is the voice you’ve been listening to. Andy, thank you so much, this has been fascinating.

Andy Cochrane: Absolutely, it’s been my pleasure, thank you so much.

Larry Jordan: Nick Bicanic is an award winning film director and software entrepreneur. He’s also the founder of RVLVR Labs, a virtual and augmented reality storytelling company. He’s currently working to figure out how to make virtual reality storytelling compelling for audiences. Hello Nick, welcome.

Nick Bicanic: Hey, thank you very much for having me.

Larry Jordan: Let’s jump right into the heart of it. What projects are you working on right now that involve either VR or 360 video?

Nick Bicanic: We’re working on a couple of different projects, but my primary focus has actually been on the 360 video side to pick up from what you were talking about with Andy earlier. I’m focusing on scripted narratives and specifically trying to figure out how to take audiences on an emotional journey, giving them a little bit of control, because such is the nature of the medium. But obviously within the constraints of physical production, as you have to when you’re making an experience that allows the viewer to look in any direction they choose.

Larry Jordan: Andy would say that we can’t do storytelling in VR, but we can provide experiences. It sounds like you disagree with that?

Nick Bicanic: There are multiple ways to skin a cat. At the end of the day, storytelling the way I see it is about making people feel certain emotions. You know, when we watch, for the sake of argument, Ygritte die in an episode of ‘Game of Thrones,’ and I hope I haven’t spoiled that for anybody who is still watching it, we’re feeling an emotion, and the way we feel that emotion is a combination of mise en scène and montage, actor performance, music. Those are storytelling tools, and at the end of the day, those tools are available to us in 360 video as well as virtual reality. They’re used in different ways, but ultimately if we bore the audience, if we confuse the audience, if we don’t make them feel a compelling emotion, we lose. We lose just as much in a book, or in a piece of printed media, or in a theater play, as we do in 360 video. So I think Andy would agree that emotion is absolutely possible. Quite how you get there, of course, remains to be determined.

Larry Jordan: Well that I think is a core point. Many of the production techniques that we’re used to, whether it’s fast cutting or moving the camera, don’t work in 360 video. So what does?

Nick Bicanic: I actually think they work more than you would think. They’re just tough to understand how to use. I’ll give you an example. It’s quite common for producers trying to do 360 video to think, “Well, I have this tool, so I might as well use it.” In many things you might watch online, you’ll see that they put a dancer in front of you, a performer to the left, another performer to the right, and something else happening behind you. Not only is that not a realistic depiction of normal life, because in normal life there’s an area of interest, unless you walk into the middle of a cave of wonders, you tend to have something happening directly in front of you, but on top of not being a depiction of real life, it’s also an incredibly confusing thing for somebody to try to experience, because they don’t know what direction to look in. The way I view this is that our human eyes can only see about 170 degrees, but our ears can hear 360. So while it is quite important to consider the space, I view the full 360 degrees as being about context, whereas content can sit far more squarely within about 150 or 170 degrees, and that has significant implications for how you produce things and how you construct the story. It’s also worth pointing out that if you go to a film festival or a demo of virtual reality equipment, everybody is sitting in spinning chairs and wearing fancy headsets, but the fact is that today, and it will stay that way for at least a year or 18 months, the vast majority of 360 video experiences are consumed not in spinning chairs but sitting on the sofa, watching a mobile phone. When you do that, guess what? Our necks and shoulders are not really well constructed to turn around comfortably through 360 degrees, which is another reason why I believe that, without damaging the experience, we can creatively constrain activity and storytelling points to something far less than 360 degrees.

Larry Jordan: I had an interesting insight, at least to me, while you were talking. It sounds like the best way to approach 360 VR is a radio play, because the audio surrounds us 360 and then once we’ve caught somebody’s attention with an audio cue off to one side, we can then bring the video over to that side for them to see it. Would an audio focus at the beginning of planning make a difference?

Nick Bicanic: Audio is very important and somewhat underdeveloped. There are a lot of tools out there, but many people are not using them, and it’s sort of considered after the fact, in the mix. So I absolutely think it’s important, perhaps not necessarily in the exact analogy you made with a radio play, but I do think it’s an important part of storytelling. It also illustrates something very interesting. In much the same way that when 3D filmmaking started looking like it was happening, it demanded a massive amount of technological understanding from not just directors but also screenwriters and cinematographers, something very similar is happening here. To be able to conceive of a story, whether in pre-production or just writing a concept, you have to understand a lot more about experiences than people have had to up to this point. Sometimes I ask people, “What do you consider the phrase UX to mean?” If I get the answer, which I frequently do, “It’s the buttons and the menus that you have to wade through to get to the content,” I realize they’re missing the point, because in 360 and in VR, the UX is the content. The way in which our audiences consume this stuff, whether they are turning around, or whether they are using a trackable headset and handset, for example the Oculus Touch or the HTC Vive controllers, or the new Samsung Gear VR with a controller, all of these things are a significant aspect of the way in which people consume this new medium, be it an experience or a story or a 360 video, it doesn’t matter. If you don’t take that into consideration when you’re making them, you’re going to make stuff that can look, unfortunately, quite lackluster, and that’s something both Andy and I, I think, are fighting against.

Larry Jordan: Nick, I could talk with you about this a whole lot longer, and we will invite you back, but for people that want more information about where you are and what you’re working on, where can they go on the web?

Nick Bicanic: Like many other companies, we’re in flux right now, and the final website is not yet finished. But I’ve had a lot of interesting things published on Medium, so the link I’m going to give you points there, with lots of information. It’s bit.ly/RVLVR2.

Larry Jordan: That’s bit.ly/RVLVR2, and Nick Bicanic is the founder of RVLVR Labs. Nick, thanks for joining us today.

Nick Bicanic: Thank you very much for having me.

Larry Jordan: Take care, bye bye.

Larry Jordan: Brian Glasscock is a user experience researcher for Sennheiser, and he’s a member of the VR Mic Development Team. He is part of the team that developed their 360 degree VR microphone, the Ambeo VR Mic, which we want to learn more about tonight. Hello Brian, welcome.

Brian Glasscock: Hey Larry, how’s it going?

Larry Jordan: I am really curious what is a user experience researcher?

Brian Glasscock: I work in the part of Sennheiser that looks about three to five years out into the future, and we look for technology trends as well as user needs, and we bring new prototypes out and experiment with them. So my main role as a user experience researcher is determining what user needs are by talking to users and working with users, as well as taking prototypes out into the field and field testing them.

Larry Jordan: Brian, I hate to break it to you, but we’ve had microphones for the better part of 100 years. What new stuff do we need to discover?

Brian Glasscock: There’s all different kinds of things and as we look to more immersive mediums that are being created these days, whether it’s virtual reality or 3D sound for sports broadcast, there will be new needs and new microphone techniques that’ll be necessary for those kinds of things.

Larry Jordan: I was just looking at the website and I do want to talk about your VR mic in just a minute, but we’ve had omnidirectional mics for decades. Why do we need something for 360 VR?

Brian Glasscock: The difference is that a 360 mic actually gives you spatial information. An omnidirectional mic will pick up sound regardless of what direction it comes from, but you don’t have any information about where that sound is coming from. In a 360 degree setting, it’s crucial that you have information about where a sound is coming from so that, at playback, you can render that soundscape, if you will, appropriately for the user.

Larry Jordan: Give me an example of what you mean.

Brian Glasscock: Say you’re taking a 360 video at a beach somewhere and you want to be able to have your user look around and look at the beach or up at the shore, but be able to explore that experience immersively. If you were using an omnidirectional microphone, as the user turned their head and got a different visual experience, they wouldn’t be able to get any kind of different audio experience. But if you’re using a 360 degree microphone, like the Ambeo VR Mic, what that allows you to do is at playback, render different audio perspectives as the user turns their head. So they get the appropriate audio experience for whatever direction they’re looking in their 3D experience.

Larry Jordan: So the sound of the waves would be in front of them if they’re looking at the water, and behind them if they were looking at the shore behind the water?

Brian Glasscock: That’s exactly correct, yes.

Larry Jordan: Tell us about this new mic you’ve developed called the Ambeo VR Mic. What is it?

Brian Glasscock: The Ambeo VR Mic is a first order ambisonics mic that allows you to capture a fully spherical sound field that at time of playback, like we just mentioned, can be rendered in 3D. So it’s a perfect complement to 360 video capture that allows you to give a matching audio experience to your high quality 360 video experiences.

Larry Jordan: You’ve used a term I’ve never heard used in audio before, which is render. I know what rendering is with video, but a mic is a mic and an audio stream is an audio stream. What does render do in audio?

Brian Glasscock: This microphone has four capsules on it. But these four capsules can’t be plugged or piped into a speaker like a normal microphone. Each of these capsules is used in a process called ambisonics, which is a way of representing a sound field, a certain sound perspective or a certain place in sound, that has to be rendered at playback. So rendering takes information about which way the viewer is looking, which way their head is tilted, whether it’s up, down, or to the side, and then uses that information to generate an appropriate binaural render of what that soundscape sounds like. As you turn your head, there are some calculations, some math based on a principle called head-related transfer functions, or HRTFs, that gives lifelike, realistic sound from every direction.

Larry Jordan: So in the background you’re holding all four of these channels ready to go, and then as the person turns their head, you’re determining which of the channels to play back to give the illusion that the audio is changing in space?

Brian Glasscock: Yes, it’s a little more complicated than that, but you’re correct about the end result.

Larry Jordan: It’s always more technical than it seems at first.

Brian Glasscock: Yes, absolutely.
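For readers who want to see the shape of the math Brian is describing, here is a minimal sketch, in Python with numpy, of the first step a renderer performs: counter-rotating the four first order B-format channels as the listener’s head yaws, so sound sources stay fixed in the world. This is an illustration only, not Sennheiser’s renderer; the function name is made up, channel ordering and sign conventions vary between tool sets (FuMa versus ambiX), and a real renderer follows this with the HRTF-based binaural stage.

```python
import numpy as np

def rotate_bformat_yaw(w, x, y, z, yaw_rad):
    """Counter-rotate a first order B-format sound field about the
    vertical axis so sources stay put as the listener's head turns."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    # W (omni) and Z (up/down) are unchanged by a pure yaw rotation;
    # only the horizontal components X (front/back) and Y (left/right) mix.
    x_rot = c * x + s * y
    y_rot = -s * x + c * y
    return w, x_rot, y_rot, z
```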

Larry Jordan: Is the mic shipping, and what is the cost?

Brian Glasscock: Yes, the mic is shipping. In the United States it’s $1,650.

Larry Jordan: I can see how putting a microphone in an environment, and your beach is a wonderful example, that we’re able to pick up sound vertically, up and down, and left and right, and back and forth, so we’re getting that spherical sound that you’re talking about. But most of the VR that we’re working with is game related where they’re putting a whole mix together. There, the microphone isn’t particularly useful because we have to take all these different sound elements and put them together in a traditional mix. What would be a good application of using this mic where putting it in a single location’s going to pick up all the sound that we need?

Brian Glasscock: There are certain situations where having just the mic is fine, and that would be in a quiet environment where the subjects that you’d like to pick up are close to the mic, and the ambient mix if you will, already sounds pretty good. But in most situations, we see users using the mic as kind of an ambient bed, and using lavaliers or other spot microphones for sound effects, mixed in on top of it. So you can use post production tools that work in ambisonics to spatialize other sounds whether that be from a lavalier, or from foley, so that they also sound spatialized and are coming from an appropriate place in the sound field.

Larry Jordan: So ambisonic means this spherical pick up where we have a sense of direction as well as simply hearing the sound? We can mix ambisonic sound in things like Pro Tools?

Brian Glasscock: Yes, you can mix it in Pro Tools. We see a lot of people using Reaper these days because of the high channel count. First order ambisonics like the VR mic is just a four channel signal, but you need a set of special post production plugins that allow you to both render, so you can preview while you’re mixing, as well as take mono or stereo sources and pan them into ambisonics.

Larry Jordan: Does Sennheiser provide those software tools?

Brian Glasscock: We don’t provide those software tools. There are a number of partners that provide them, ranging from free for a simple set of tools, such as the Facebook 360 Spatial Workstation, up to more expensive tools that run around $1,000, such as the Blue Ripple Sound tool set, which includes additional features such as spatial compression or spatial EQ.
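As a rough illustration of what the panning side of those tools does, here is a minimal sketch of encoding a mono source, say a lavalier track, into first order B-format at a chosen direction. It assumes numpy, uses the traditional FuMa weighting where the W channel is attenuated by 1/sqrt(2) (ambiX/SN3D tools normalize differently), and the function name is hypothetical, so treat it as a sketch of the idea rather than a drop-in for any particular plugin.

```python
import numpy as np

def encode_mono_to_bformat(signal, azimuth_rad, elevation_rad):
    """Pan a mono signal to a fixed direction in a first order
    ambisonic (B-format) sound field, using FuMa-style weighting."""
    w = signal / np.sqrt(2.0)                                 # omni component
    x = signal * np.cos(azimuth_rad) * np.cos(elevation_rad)  # front/back
    y = signal * np.sin(azimuth_rad) * np.cos(elevation_rad)  # left/right
    z = signal * np.sin(elevation_rad)                        # up/down
    return np.stack([w, x, y, z])
```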

Larry Jordan: I was looking at the picture on the website. There’s a single XLR connector at the bottom of the mic, yet four capsules at the other end. What kind of gear do we have to record this signal on, and what are we recording?

Brian Glasscock: Out of the bottom of the mic you actually get what’s called a DIN12 connector, which is a 12 pin connector, and we provide a 1.5 meter extension for that, as well as a breakout cable. The breakout cable takes that DIN12 and converts it to four balanced XLRs. For recording equipment, you simply need a recorder that provides phantom power and can record four channels at once. We have a couple of recommendations. Because ambisonics depends on precise differences between the capsules, where a sound is coming from is encoded in the signals received at each capsule, you’re going to want a recorder that has digital gain, so you don’t apply any external influence on that, and one where you can link the channels together, so you can turn them all up and down together.

Larry Jordan: What would be some recommended recording gear?

Brian Glasscock: Most of the Sound Devices recorders will allow you to do that. We also see a lot of people using the Zoom F4 and F8, as well as some Tascam DR-680s.

Larry Jordan: When you’re doing 360 recording, where are you putting the mic? Is it suspended above the set, or buried on the floor? What’s the position?

Brian Glasscock: Just as in a 360 video, where you place the camera defines the visual perspective for the end user, the same is true with the ambisonics mic. You’re going to want to place the microphone wherever you’d like the listener to have their perspective from. Now in most cases, that’s coaxial with the camera or located in the same axis as the camera and as close as possible to the camera itself.

Larry Jordan: So we should consider this mic basically as our ambience mic? Of picking up the whole environment? And then supplement it with additional mics which are clipped to the actor if it’s an acted type piece, to get the tight sound that we need to get the clean audio and clarity that we would want? Am I hearing this correctly?

Brian Glasscock: That’s exactly correct.

Larry Jordan: What kind of training does Sennheiser provide for this mic?

Brian Glasscock: We provide some basic training on our website. If you go to sennheiser.com and look at our Ambeo blueprints, we provide a bunch of different tool sets and instructions on how to mix and produce 3D audio. We also plan on producing a series of webinars for users, covering not only how to record with the mic, but also how to work with the signal afterwards. More information about those webinars will be coming soon if you follow us on Facebook or sign up for our Ambeo newsletter.

Larry Jordan: Brian, where can people go on the web to learn more about this microphone?

Brian Glasscock: If you just go to sennheiser.com, you should be able to find out everything you need to know.

Larry Jordan: Brian Glasscock is the user experience researcher for Sennheiser, and a member of their VR mic development team. The website is sennheiser.com, and Brian, thanks for joining us today.

Brian Glasscock: Thanks Larry.

Larry Jordan: Duncan Shepherd is an old, bold video editor from the land of music videos and commercials. Last year he discovered VR and was so taken with it, he’s just devoted his time to developing post production techniques for both VR and 360 video. Hello Duncan, welcome.

Duncan Shepherd: Good evening Larry, how are you?

Larry Jordan: So what makes editing VR or 360 degree video different from editing normal HD?

Duncan Shepherd: I think the experience of viewing the content is different. The way I’ve always approached a lot of my editing is to think about what you want at the end, whether you’re selling something, or you’re trying to create an emotional, empathic experience. Because of that, the way you approach everything in VR is slightly different. If you want an empathic experience, you can allow the viewer much longer shots than would traditionally be allowable in normal HD video, because you can settle yourself into the scene a lot more fully. Especially with high quality audio and sophisticated sound design, you can allow the viewer a much more engaging experience as an experience. Whereas if you’re trying to sell something with narrative, that’s a different set of problems altogether, because you have no real ability to direct the viewer’s attention apart from with audio. You can’t really tell where people are looking.

Larry Jordan: Well Nick was saying earlier in the show that audio was an essential part of 360 video, and what I just realized listening to you is that we’re used to thinking about moving the camera, to get to the actor. In 360 video, we need to move the actor to get to the camera, is that a true statement?

Duncan Shepherd: I think broadly speaking that’s true. It’s much more theatrical really. In the sense that a theater director will probably be better able to really rehearse a set of performances and just leave a set of actors to run without any more interaction if he’s done lots of workshops or things like that. Although, actually, because we’re still at the very beginning of this whole technology and this art form, people have started off with static cameras, and they’ve started to move the cameras a lot more now, and there are limitations to do with making the audience feel nauseous or being confused or overdoing things a little bit. But, as people get more experienced in the ways of picking up an idea and then shooting it, and then seeing how that works in post production, people are also getting more adventurous with the way they’re using movement with the cameras.

Larry Jordan: What editing techniques do we need to avoid?

Duncan Shepherd: Principally, moving the horizon around too much. That’s the first thing that will make somebody feel ill, and from my point of view it’s really one of the only things to worry about: the viewer feeling some sense of nausea that takes them out of the enjoyment of the thing. Other than that, I think you can use all sorts of editorial techniques that have been developed over the last 100 years. The thing I always think with editorial is, just try it, and if it works, it’s working. There’s no real set of rules. There never was a rule book with editing anyway. You can always experiment, see if it works, and if you like it, go with it. I think that’s the key: don’t be held back by too many dogmatic rules, apart from the obvious things, like I say, where somebody’s actually feeling ill.

Larry Jordan: What software do you like to use for your edits?

Duncan Shepherd: I cut in Final Cut X, and I’ve been doing that for several years now because I think it’s a really powerful tool for editorial. As it happens, the good tools for cutting VR are spread between Final Cut and the Adobe ecosystem, so on all of my VR projects now I have to bounce between Final Cut X, After Effects and Premiere, just to make sure I’m covering all the bases, because there’s no one solution that fixes everything for me. I think everything you need is probably in Adobe, apart from the things I love about Final Cut X, which are more to do with the creative side of editing than the nuts and bolts technology of doing effects.

Larry Jordan: Duncan, this has been a fun visit. I want to thank you so much for your time. Duncan Shepherd is the creative director of Duncan Shepherd Films, and we’ll touch base in a year or a little less and see how VR is continuing to improve.

Duncan Shepherd: Look forward to it Larry.

Larry Jordan: Thank you Duncan, bye bye.

Duncan Shepherd: Thank you very much, bye.

Larry Jordan: Here’s another website I want to introduce you to. Doddlenews.com. DoddleNEWS gives you a portal into the broadcast, video and film industries. It’s a leading online resource, presenting news, reviews and products for the film and video industry. DoddleNEWS also offers a resource guide and crew management platforms specifically designed for production. These digital call sheets, along with their app, directory and premium listings, provide in depth organizational tools for busy production professionals. DoddleNEWS is a part of the Thalo Arts Community, a worldwide community of artists, filmmakers and storytellers. From photography to filmmaking, performing arts to fine arts, and everything in between, Thalo is filled with resources you need to succeed. Whether you want the latest industry news, need to network with other creative professionals or require state of the art online tools to manage your next project, there’s only one place to go. Doddlenews.com.

Larry Jordan: Chris Bobotis is the CEO and co-founder of Mettle, and the chief architect behind Mettle SkyBox Cinematic 360 VR software for Adobe After Effects and Premiere Pro. We’ve talked about how to shoot VR, how to handle audio for VR, and how to edit VR, but now let’s take a look at some of the effects that are available to us. Hello Chris, welcome.

Chris Bobotis: Hey Larry, how are you?

Larry Jordan: What was it that got you to start Mettle in the first place?

Chris Bobotis: I started in advertising originally and I was fast tracked, which means you become a creative director and you don’t get to do much creative. You get to do a lot of hand holding in the industry, keeping clients calm and cool and collected. I met Nancy, my partner at Mettle, and we were like minded. We started Mettle as a facilitator for advertising agencies, basically. We got them to desktop early on, and Mettle started off as a creative and production facility.

Larry Jordan: It started that way, but what’s it doing now?

Chris Bobotis: In ’96 we created proprietary tools for ourselves, because we’d adopted the Adobe suite of products right from way back when. From After Effects through to Photoshop and Premiere Pro, we’ve used pretty much every version from version one.

Larry Jordan: Amazing there was even electricity that far back.

Chris Bobotis: Desktop solutions were amazing, but there were certainly holes, and we started to create tools in ’96. One of the Adobe evangelists caught wind of this and said, “There’s a marketplace for this kind of stuff,” and we said, “Wonderful, let’s put it out into the wild and see what happens.” So we did, and fast forward to about 2008, Adobe approached us and asked if we would bundle one of our products with After Effects, so we did that for a couple of years. It was very encouraging, because you get this incredible validation and exposure to a space in the marketplace that we couldn’t possibly afford on our own. That encouraged us, and we created a couple more 3D plugins that run on the GPU, so they’re fast and pretty cool. But even those didn’t get our undivided attention. We were running a pretty lucrative but very demanding service business, and we were always looking for a way to move over from being a service company to a product company. Then about 16 months ago, one of our agency customers turned to us and said, “Can you help us out with some 360 work that we have to do?” We had no experience with 360, but we’ve done pretty much every workflow that’s come our way, from IMAX to web to print, you name it. So you analyze the objective and the challenges, and devise a process or workflow to meet the criteria, creatively as well as technically. This was around the Oculus Rift DK2 era. The Oculus Rift had not quite launched yet, and YouTube 360 was not in play at the time. We were creating this project under the impression that it was going to be delivered on the Oculus Rift at its launch, but the customer divulged to us near the end of the project that YouTube was going to be supporting 360 and that we were deploying there; the specifications are pretty much the same from one to the other. Now, we had always been looking for a way to transition from a service company to a product company, and I turned to Nancy, my partner, and said, “This is the opportunity. This is it. If YouTube is supporting 360, this means it’s going to go mass market, so we need to take this gnarly mess of a process that I concocted to deliver one project, and make a product out of it.” And that’s what we did, Larry.

Larry Jordan: Your website says that your technology uses what’s called 3DNAE. Is that what you were developing, and what is it?

Chris Bobotis: 3DNAE is basically a foundation. When you look at some of our earlier products like Freeform Pro and Shape Shifter, they take on the guise of a plugin, but to build that kind of functionality you really have to build a whole 3D rendering pipeline underneath. Well known products like Element 3D, for example, are doing exactly the same thing. They’re actually a full 3D rendering system, but only what is necessary is exposed to the user. That was born from production pipeline issues as well. For example, we had a lot of very talented 3D artists, and then very talented 2D artists. At different points in our progression, we would inundate the 3D artists with what we thought were mundane tasks, things for which you really didn’t need to be launching Maya or Houdini or 3ds Max. Freeform Pro was born from that. In other words, you can empower your 2D artists to do some pretty neat things in 3D as long as you don’t expose the whole of the Maya feature set to them. If you limit what you expose, you can get a 2D artist working in 3D very quickly, especially if you expose it in a way that makes sense to them, presenting a lot of 3D notions in a 2D UX paradigm. They’re born out of necessity, if you follow. To answer your question, 3DNAE is a lot more than we present. We present things as necessary, so when we were creating the 360 VR products, we tapped into a whole 3D rendering system that we had pre-built for the other products.

Larry Jordan: Now what level of knowledge is necessary to run your plugins?

Chris Bobotis: What we try hard to do, and we’ve always done this from the get go, is to not introduce any new UX or new UI unless there’s a compelling reason to do so. It’s always felt like our products are part of After Effects, for example. We use the exact UI paradigms that After Effects uses, as best as possible. We don’t introduce new windows, we don’t introduce new notions; we just try to stick to what you know and love about After Effects. It’s very much the same with the 360 VR products. We present things as if you’re working in flat cinema, and we take care of all that 360 madness in the background for you. With Premiere Pro, if you can cut and do transitions and post effects and things like that, I don’t think we’re introducing anything new, UX wise. We’re just resolving a lot of the dilemmas.

Larry Jordan: I can understand the technical dilemmas, but there are also creative dilemmas in terms of what you need to keep in mind that’s different in 360 VR versus standard 2D video. What are the key things people need to keep in mind?

Chris Bobotis: There was tremendous opportunity, as we were moving along and talking to more content creators, to create new tool sets that not only resolved technical issues but also helped narrative. A very good example: because you’re in a frameless environment, the storyteller has to relinquish the notion that you can frame anymore. The viewer can look pretty much anywhere they want in this 360 space, so when you’re trying to do a cut or a transition, if the viewer’s not looking in the right place, the transition doesn’t resolve properly, nor does the cut. When we were building our transitions, for example, we were solving technical issues, basically pinching and seam issues caused by filters that are not built to address spherical space. Filters that ship with After Effects and Premiere Pro and other NLE and animation packages are built for flat cinema; they’re not built for seamless environments. As a result, when you apply them, they cause seams, and transitions do the same thing. So since we were addressing these technical issues anyway, we saw an opportunity to also help influence gaze, which means I can have a point of interest control in my transition that animates the transition and influences the viewer to look where I want them to look, so I can resolve that transition properly.
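To make the seam problem concrete, here is a toy sketch, in Python with numpy, of why a 360-aware filter has to wrap: the left and right edges of an equirectangular frame meet behind the viewer, so a blur that pads at the edges leaves a visible vertical seam, while one that wraps does not. This is an illustration under those assumptions, not Mettle’s implementation, and the function name is made up; a production spherical filter also has to compensate for distortion near the poles, which a plain horizontal wrap ignores.

```python
import numpy as np

def wrap_blur_equirect(frame, radius=3):
    """Box-blur an equirectangular frame (H x W x C) horizontally,
    wrapping across the left/right edge so no seam appears where
    the two sides of the frame meet behind the viewer."""
    k = 2 * radius + 1
    # np.roll wraps columns around the frame edge, so averaging the
    # shifted copies blurs straight across the 0/360 degree boundary.
    shifted = [np.roll(frame.astype(np.float32), s, axis=1)
               for s in range(-radius, radius + 1)]
    return (sum(shifted) / k).astype(frame.dtype)
```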

Larry Jordan: Duncan, in the segment just before yours, made a really good point. He said “360 VR is much more like live theater in how it’s staged, than it is traditional film and video.” Would you agree with that?

Chris Bobotis: Very much so. Even actors find themselves coming up with new acting techniques that are more of a hybrid, cinema meets theater, because you don’t have as many takes as you used to. You can’t sit there and do 50 takes in 360 the way we did with flat cinema. And it’s like theater performance in another way: because the cameras are limited in scope right now, you can’t be as close to some cameras as you would be with regular DSLRs or film cameras, so you have to project a little bit more, but not too much. So much of what we knew about acting techniques and post production has changed. It’s amazing, because when we first started working in 360, I quickly realized that I know nothing, even though I’ve been at it for over two decades. And this was exciting. This was incredible.

Larry Jordan: There’s so much to talk about. Chris, for people that want more information about the products that you make, where can they go on the web?

Chris Bobotis: That’s www.mettle.com. If you can, take a look at the blog. Great customer success stories there.

Larry Jordan: That website is mettle.com. Chris Bobotis is the CEO and co-founder of Mettle, and Chris, I guarantee we’ll bring you back. Thank you so much for your time.

Chris Bobotis: Thank you sir.

Larry Jordan: Take care, bye bye.

Chris Bobotis: Bye.

Larry Jordan: This has been a fascinating conversation for me, because there are so many insights that our guests have shared. I love the focus on the audio being 360 even when the video doesn’t have to be, the focus on moving actors, not cameras, for more of a theatrical experience, and the fact that we’re really creating experiences, not overt storytelling. It’s been a fascinating discussion, I’ve learned a lot, and I hope it’s brought some insights to you as well.

Larry Jordan: I want to thank our guests this week: director Andy Cochrane, Nick Bicanic from RVLVR Labs, Brian Glasscock from Sennheiser, freelance editor Duncan Shepherd, Chris Bobotis, co-founder of Mettle, and James DeRuvo with DoddleNEWS.

Larry Jordan: There’s a lot of history in our industry and it’s all posted to our website at digitalproductionbuzz.com. Here you’ll find thousands of interviews all online, and all available to you today. And remember to sign up for our free weekly show newsletter that comes out every Friday. Talk with us on Twitter @DPBuZZ and Facebook at digitalproductionbuzz.com.

Larry Jordan: Our theme music is composed by Nathan Dugi-Turner with additional music provided by Smartsound.com. Text transcripts are provided by Take1 Transcription. Visit Take1.tv to learn how they can help you.

Larry Jordan: Our producer is Debbie Price and a special thanks to Kevin Burke of Burke PR for his help in finding guests for this week’s show. My name is Larry Jordan, and thanks for listening to The Digital Production Buzz.

Larry Jordan: The Digital Production Buzz is copyright 2017 by Thalo LLC.
