Why I migrated 2 of my domains to Medium.com

I just migrated two of my personal domains, AllanTepper.com and AllanTépper.soy, from self-hosted WordPress installations to Medium.com. (My other domains remain with WordPress or Sparkle.) This article explains why I migrated these two particular domains (the major benefits apply to those two cases, not to the others), addresses the doubts and objections that many people have about this idea, and shares some details about the migration process.

In this article

  • A short comparison with other platforms
  • Advantages of bringing your domain (or subdomain) to Medium.com for blogging
  • Advantages of eliminating the old-fashioned www prefix
  • Relief to many people’s concerns about “not being on your own turf” & integration with the rest of your website
  • Medium.com’s partial support for non-English websites
  • Some observations regarding the migration from WordPress (self hosted) blog to Medium.com

A short comparison with other platforms

Every website platform has its unique advantages and disadvantages. Here is a brief summary of a few:

  • WordPress (self hosted) is free, but you need to host it somewhere. It has the advantage of being one of the most popular and most flexible CMSes (Content Management Systems) for a website or blog. Being a CMS means that it’s great for collaborating with multiple contributors (content, design or technical) who may be located throughout the world. WordPress also has the advantage of being open source, so even if Automattic (intentionally spelled with a double t) closed its doors for any reason, the WordPress project could continue via community efforts. There are countless plugins (free and paid) for all sorts of specialized applications, including e-commerce, a unique and sophisticated database with automatic customized PDF generation after a visitor inputs information (as I recently implemented for a client), and even creating an iTunes-compliant RSS podcast feed. (Podcasting is just one important piece of the puzzle for the new radio, but not the only one.) However, the more plugins you use with WordPress, the more high-maintenance the website becomes. That’s why, if a site doesn’t really need any of those special benefits, you may be better served by another type of platform. I want to clarify that several of my websites and those of my clients are still with WordPress (self hosted) because those websites really need those special capabilities.
  • Sparkle is a Mac application available in free and paid versions. The free version works with a single website only, while the paid version allows use with multiple websites. Just to clarify: even with a single website, you can have multiple web pages within the site. Sparkle reminds me of Apple’s now-defunct iWeb software, since Sparkle is so simple to use and so WYSIWYG (What You See Is What You Get). Given that Medium.com (described in the rest of this article) is really just for blogs (sequential posts) and not appropriate for a static website, I actually created certain elements using Sparkle together with subdomains to accommodate web pages not offered by Medium.com. For example, I created the subdomains contact.AllanTepper.com and payments.AllanTepper.com using Sparkle, to be free of WordPress’s maintenance. Both are on the server I use at DreamHost, and both have the free SSL certificate I have described in prior articles, for HTTPS and a green padlock in most browsers, which has also been a Google SEO factor since 2014. Once you create and publish a website with a program like Sparkle, it becomes maintenance-free unless you have content changes to make, apart from any future web-standard changes. That is a major advantage for people with a static website who don’t require any special functionality via a plugin.
  • Medium.com is a complex platform to describe. According to its Wikipedia entry: “Medium is an online publishing platform developed by Twitter co-founder Evan Williams, and launched in August 2012. It is legally owned by A Medium Corporation. The platform is an example of evolved social journalism, having a hybrid collection of amateur and professional people and publications, or exclusive blogs or publishers on Medium and is regularly regarded as a blog host.” Although I don’t have official statistics, to my knowledge, most writers who publish on Medium.com do not do so with their own domain or subdomain on the Medium.com site as I do now with two of my websites. However, this article is all about using it with your own domain or subdomain, which is a fairly recent option.

Advantages of bringing your domain (or subdomain) to Medium.com for blogging

Here are some advantages I like about having my sites AllanTepper.com (a standard domain) and AllanTépper.soy (an IDN, or International Domain Name, with an accent mark) on Medium.com, rather than on self-hosted WordPress installations as before:

  • Much greater potential exposure and sharing.
  • Zero maintenance, other than my own content.
  • When you follow the instructions to migrate your old blog from WordPress to Medium.com, all permalinks of your old posts are retained, whether they were previously at the root of YourDomain.com or at a location like YourDomain.com/blog.
  • No need to repost the same content between your own site and Medium.com, because they are one and the same. (I have always re-posted the first paragraph on my own site to promote full articles that are in ProVideo Coalition magazine, but that is a special case.)
  • Once you have successfully set up your own domain to work with Medium.com, you can upload your own favicon image to be displayed in place of the Medium icon in the browser bar, or when saving a bookmark to a smartphone’s home screen. I did this immediately after migrating my domains.
  • No problem to include your own advertising on Medium.com. (I do.)
  • No problem to link to any other external websites.
  • Forced SSL for HTTPS is standard.
  • Clean look and branding without the old-fashioned www prefix. (See the next section for details.)

Advantages of eliminating the old-fashioned www prefix

Here are some branding advantages of removing the old-fashioned www prefix, which is really an unnecessary subdomain (while retaining compatibility for those people who still enter www by habit, or have an old permalink somewhere; a quick way to verify that compatibility is sketched right after this list):

  • It saves time when saying the URL aloud.
  • It’s cleaner looking in the browser window.
  • It’s cleaner looking and takes less space when printed anywhere.
  • Not having the www prefix makes the naked or “apex” domain make even more sense when another subdomain exists for a true purpose. For example, it is clear to people what’s going on when they go to AllanTepper.com as opposed to books.AllanTepper.com, contact.AllanTepper.com or payments.AllanTepper.com.
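To make that compatibility point concrete, here is a quick sanity check using only Python’s standard library. It is a hedged sketch, not an official tool: example.com is a placeholder for your own domain, and whether you see a 301 or a 308 depends on how your host (or Medium.com) configures the redirect.

```python
# Confirm that the www host still answers and permanently redirects to the
# naked (apex) domain. "example.com" is a placeholder; use your own domain.
import http.client

conn = http.client.HTTPSConnection("www.example.com", timeout=10)
conn.request("HEAD", "/")
resp = conn.getresponse()

# Expect a 301 (or 308) with a Location header pointing at the apex domain,
# e.g. "https://example.com/", so old www permalinks keep resolving.
print(resp.status, resp.getheader("Location"))
conn.close()
```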

The above-mentioned advantages hold in any language (including English). There is also one additional advantage for Castilian speakers, since the letter W in that language is even more problematic than the letter Z in English. Most people from the United States know and accept that the letter Z has a different name outside of the US (“Zed” instead of “Zee”). However, in the Castilian language (commonly but imprecisely called “Spanish”), there are about five different names for the letter W, and many people are oblivious to the fact that the letter has other acceptable names. That becomes distracting and can cause conscious or unconscious disdain for the person speaking.

Around 2014 (when I added SSL for HTTPS to most of my websites after the Google decree), I removed the www prefix.

Relief to many people’s concerns about “not being on your own turf” & integration with the rest of your website

Many people are concerned about what might happen if Medium.com ever closed its doors in the future, or changed its policies for the worse. If you take the following two steps, your domain remains on your own turf; it is essentially the same as having it with WordPress on any other contracted server, which could just as easily close its doors in the future:

  1. Have your domain on an independent registrar. (In fact, at least for now, Medium.com doesn’t sell domains anyway.) So you own your domain independently, regardless of what may happen to Medium.com in the future.
  2. Write your content offline with a writing tool that can publish directly to either Medium.com or to a WordPress site. I use one called Ulysses for Mac. There are probably others for Windows or even for the web.

I am very optimistic about Medium.com’s future, but I know that there is a way out if that ever changed, since I could re-point my domains anywhere and republish all of my articles by pressing a button. This doesn’t even depend upon Medium.com’s export feature, which may or may not be better. I am not concerned, since I have both options available, and I hope I’ll never have to use either of them.
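For readers who would rather script that “way out” than rely on a particular writing tool: at the time of writing, Medium offered a v1 REST API that could create posts using an integration token. The sketch below is a minimal illustration of that idea, not my actual workflow; the token, user ID, and post content are placeholders you would replace with your own.

```python
# A minimal sketch of creating a post through Medium's v1 REST API.
# TOKEN and USER_ID are placeholders obtained from your Medium settings.
import json
import urllib.request

TOKEN = "YOUR-INTEGRATION-TOKEN"   # placeholder
USER_ID = "YOUR-USER-ID"           # placeholder

payload = {
    "title": "A republished post",               # placeholder content
    "contentFormat": "markdown",                 # "html" is also accepted
    "content": "# A republished post\n\nBody text goes here.",
    "publishStatus": "draft",                    # review before going public
}

req = urllib.request.Request(
    f"https://api.medium.com/v1/users/{USER_ID}/posts",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
        "Accept": "application/json",
    },
    method="POST",
)

# On success, the API returns the new post's metadata, including its URL.
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["data"]["url"])
```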

The other concern is how to handle the other elements of a website. Since both sites are blog-centric, I pointed the “naked” or “apex” domains AllanTepper.com and AllanTépper.soy to Medium.com, and separately created the subdomains contact.AllanTepper.com and payments.AllanTepper.com. Both can be linked from the main sites at Medium.com, and vice versa. If your website is not blog-centric, you can do the opposite and point a subdomain like blog.YourDomain.com to Medium.com. For these two sites, I also wanted to eliminate the time investment in WordPress maintenance, although I still do that for other sites, like that of my radio show CapicúaFM.

Medium.com’s partial support for non-English websites

Fortunately, Medium.com now works 100% with IDN domains, i.e. those with accent marks, diacritical marks or other alphabets. However, if you see what appear to be “garbage characters”, you may be interested in reading the following subsection:

IDN domains, and some browsers’ attempt to prevent phishing

If instead of seeing allantépper.soy in your browser’s URL bar you see the ugly xn--allantpper-g7a.soy, it’s because your browser (or version of your browser) is trying to prevent phishing attempts and isn’t currently set to include the Castilian language (sadly called “Spanish” or “español” in the majority of browsers). Some older browsers were designed to show IDNs in Punycode when the domain’s language isn’t included in your browser settings. This was done to avoid phishing attempts via what Wikipedia calls a homograph IDN attack (which is not to be confused with a Big Mac Attack from the 1970s era). For example, if a malicious individual purchased a domain like BankofAmerica.com using the Greek o instead of the standard one, to the naked eye you could innocently believe that you had reached the proper BankofAmerica.com website. Fortunately, I have observed that most newer browsers (with the notable exception of Microsoft Edge) now deal with this potential problem (at least with my IDN domains) via other means and no longer display IDNs in Punycode, even if the internaut hasn’t previously added Castilian to the list. Microsoft Edge now gives you a suggestion to add another language to the browser’s allowable list.
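To see the Punycode mapping concretely, Python’s built-in IDNA codec reproduces the exact conversion described above; this is just an illustration of the encoding, not anything browser-specific.

```python
# Converting an IDN between its Unicode and Punycode (ASCII) forms using
# Python's built-in IDNA codec. The domain is the author's own example.
unicode_domain = "allantépper.soy"

ascii_domain = unicode_domain.encode("idna").decode("ascii")
print(ascii_domain)   # xn--allantpper-g7a.soy (what some browsers display)

round_trip = ascii_domain.encode("ascii").decode("idna")
print(round_trip)     # allantépper.soy
```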

However, Medium.com unfortunately still doesn’t allow translation/localization of several strings, including:

  • The 12 months of the year (January, February, March, April, May, June, July, August, September, October, November, December) when automatically dating posts.
  • About
  • Archive
  • Latest stories
  • Privacy
  • Terms

Fortunately, the last five strings mentioned appear only in the footer of the page by default. Medium.com has told me that the company will add localization capabilities in the future. I hope it happens sooner rather than later.

Some observations regarding the migration from WordPress (self hosted) blog to Medium.com

I am not going to attempt to cover all of the steps (as many other writers have), because many of them have changed recently and may change again. The most important thing is that it is quite simple to export all of your Posts from WordPress into an XML file and then import them into a new Publication you have created at Medium.com. It must be a Publication, not just your user name at Medium.com. As of the publication time of this article, no payment is required to create a Publication at Medium.com. As long as you do this part before re-pointing your domain, all permalinks continue to work, and all images are also included and re-hosted by Medium.com, with the notable exception of WordPress’s Featured Images in Posts, which are currently excluded. As a result, what I did was re-add the Featured Images of the past five Posts manually, after importing the XML file at Medium.com and before publishing them all with a single click.
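To see why the Featured Images deserve special attention before importing, here is a rough sketch that lists the posts in a WordPress export file and flags which ones reference a featured image. It assumes the standard WXR 1.2 export namespaces and a hypothetical file name; adjust both if your export differs.

```python
# List posts in a WordPress export (WXR) file and flag featured images,
# which live in postmeta under the _thumbnail_id key and are the one thing
# the Medium.com import currently skips.
import xml.etree.ElementTree as ET

NS = {"wp": "http://wordpress.org/export/1.2/"}

tree = ET.parse("wordpress-export.xml")  # hypothetical file name
for item in tree.getroot().iter("item"):
    if item.findtext("wp:post_type", default="", namespaces=NS) != "post":
        continue
    title = item.findtext("title", default="(untitled)")
    has_thumb = any(
        meta.findtext("wp:meta_key", namespaces=NS) == "_thumbnail_id"
        for meta in item.findall("wp:postmeta", NS)
    )
    flag = "featured image: re-add manually" if has_thumb else "no featured image"
    print(f"{title}: {flag}")
```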

Another important point is that in all of the articles I read on the topic, Medium.com was charging nothing to accept the domain. That has now changed: Medium.com now charges a one-time fee of US$75 per domain, which covers their internal adjustments and the SSL certificate. On the one hand, it hurt to have to pay US$75 x 2 when others before me did not. On the other hand, the fact that Medium.com is now charging makes me realize that its financial future is probably much safer from here forward.

Upcoming articles, reviews, radio shows, books and seminars/webinars

Stand by for upcoming articles, reviews, and books. Sign up to my free mailing list by clicking here. Most of my current books are at books.AllanTepper.com, and my personal website is AllanTepper.com.

If you would like to subscribe to my list in Castilian, visit here. If you prefer, you can subscribe to both lists (Castilian and English).

Follow @AllanLTepper on Twitter.

Listen to his CapicúaFM show at CapicúaFM.com in iTunes or Stitcher.

FTC disclosure

As of the publication date of this article, there is no commercial relationship between Allan Tépper or TecnoTur LLC and Medium.com, other than the fact that Allan Tépper/TecnoTur LLC paid a total of US$150 to have both domains there. No manufacturer is specifically paying Allan Tépper or TecnoTur LLC to write this article or the mentioned books. Some of the other manufacturers listed above have contracted Tépper and/or TecnoTur LLC to carry out consulting and/or translations/localizations/transcreations. Many of the manufacturers listed above have sent Allan Tépper review units. So far, none of the manufacturers listed above are sponsors of the TecnoTur programs, although they are welcome to become sponsors, and some are, may be, or may have been sponsors of ProVideo Coalition magazine. Some links to third parties listed in this article and/or on this web page may indirectly benefit TecnoTur LLC via affiliate programs. Allan Tépper’s opinions are his own.

Copyright and use of this article

The articles contained in the TecnoTur channel in ProVideo Coalition magazine are copyright Allan Tépper/TecnoTur LLC, except where otherwise attributed. Unauthorized use is prohibited without prior approval, except for short quotes which link back to this page, which are encouraged!

Hasselblad: four new XCD lenses for the X1D

Photographers and videographers using the Hasselblad X1D will have a macro, a zoom and two more fixed focal lengths available during 2017, extending the creative options of the camera. In 2018 Hasselblad expects to have a total of 19 lenses available.

The four new lenses announced by Hasselblad are the XCD 120mm Macro, the XCD 35-75mm zoom, the XCD 65mm, and the XCD 22mm wide angle. Of the four, only information for the macro lens is available, as it will be the first to arrive on the market next June. As for the others, beyond the focal length there is not much information to share, as Hasselblad indicates that detailed XCD specifications will be announced later this year.

Hasselblad will dedicate the coming months to finishing development of the lenses. The company says that “by the beginning of 2018, the X1D will have access to seven dedicated XCD lenses and all twelve HC/HCD lenses using the XH lens adapter.” The lenses extend the creative options available to X1D camera users. The camera, launched in 2016, is the world’s first mirrorless digital medium format camera.


Back to the first lens to become available: the 120mm f/3.5 brings together the compact format of the XCD range and maximum optical quality across the frame, with a flat image field. Giving new versatility to the X1D user, the lens is suitable both for close-up work up to a 1:2 image scale and as a mid-range telephoto lens for portraits or other photography requiring a longer focal length. Auto or manual focusing goes from infinity to 1:2 without the need for extension tubes.

Like the other XCD lenses, the XCD 120mm Macro has an integral central shutter offering a wide range of shutter speeds and full flash synchronisation up to 1/2000th of a second.

Hasselblad Product Manager Ove Bengtson commented: “The XCD 120mm Macro lens complements the existing XCD dedicated autofocus lenses which were developed to support optical quality and portability. This is the first addition to the X1D range of lenses in 2017 and we are excited to launch more lenses later in the year.”

Cineo Lighting debuts MavX at NAB 2017

Offering the ability to dynamically color tune from 2700K to 6500K, with accurate presets at 3200K, 4300K and 5600K, the new Cineo MavX has no color shift or flicker at any output level.

Built on the award-winning Cineo Maverick line, the MavX provides a mid-power color-tunable soft source with the same powerful, accurate light output as the Cineo Maverick Remote Phosphor fixture. Within the MavX, users will find all the color rendering and output benefits of Remote Phosphor Technology (RPT), with accurate color tuning across a range from 2700K to 6500K and presets at 3200K, 4300K and 5600K.

Leveraging years of experience in solid-state lighting and material sciences, Cineo has created a compact, 1K equivalent soft light source with the same beautiful color rendering and extended deep-red spectrum as their Remote Phosphor fixtures. Unlike other color-tunable sources, MavX remains consistent throughout the life of the fixture with no color shift or need for calibration.


The MavX uses, according to information provided by Cineo, “passive thermal management for soundless operation, delivering a volume of light equal to a traditional 1K soft source with no color shift or flicker at any output level. It is designed for small studio and portable applications and features the optimized output needed for film, video and still photography. MavX supports Cineo’s photo-accurate dimming, which matches the dimming curve to camera stops. Local control is simple and intuitive, complimented by both 5-pin wired and wireless DMX control built-in.”

“Cineo’s RPT has quickly made its mark in the industry and we are happy to enhance our Maverick line with the color-tunable MavX,” says Rich Pierceall, CEO, Cineo Lighting. “We invite all expo attendees to visit the Cineo booth at NAB to observe the clear quality of all Cineo lighting systems.”


For studio applications, MavX delivers up to 8,000 lumens from a lightweight, compact package — perfect for lighting broad areas where grid height is challenging. The MavX is also ideal as a portable, battery-powered soft source. Cineo’s SmartPower™ technology provides consistent light output during battery operation, regardless of the charge state of the battery. MavX also protects a battery fleet by adjusting the fixture’s current draw to match the output capability of the batteries.

Other features of the MavX include a 150-degree light spread; silent, passive cooling; and built-in LumenRadio wireless DMX control. The included 80/20 slots provide maximum mounting flexibility, as well as easy removal of the yoke.

Accessories for the MavX include AC adapter, V-Lock battery adapter, Gold Mount battery adapter, barn doors, grids, louvers and soft boxes.

Think Tank Photo upgrades Helipak and TakeOff bags

Think Tank Photo’s Airport Helipak V2.0 backpack for the DJI Phantom 4 packs the drone and accessories in a size fit for both U.S. domestic and international airline travel.

As quadcopter design evolves, the bags that carry them have to adapt to new drone dimensions. Think Tank just did that, presenting the Airport Helipak V2.0 backpack for the DJI Phantom. This newly updated backpack accommodates a DJI Phantom 4 and similarly sized drones. Its design is carry-on compatible for both U.S. domestic and international airline travel, which is of utmost importance for two reasons: you can keep the backpack with you, and you abide by the strict regulations in place these days. Its custom divider set allows maximum room for extra accessories and gear, including a small camera kit, and it features a dedicated 15” laptop compartment. The backpack is available for $199.75.

“We updated the backpack to carry a DJI Phantom 4, a 15” laptop, controller, GoPros, chargers, spare rotors, extra batteries, jacket, tools, and more”, said Doug Murdoch, Think Tank Photo’s CEO and lead designer. “The contoured adjustable harness with lumbar support, articulated air-channel, removable padded waistbelt, and height-adjustable sternum strap enable comfort for travel into remote areas.”


The second product updated by Think Tank Photo is the Airport TakeOff, now at version 2.0. This model answers the needs of photographers and videographers who are forced to carry their rolling bags over uneven terrain: the TakeOff V2.0 converts from a roller to a backpack. And since the new version is 15% lighter, users can carry their gear to the next location more comfortably. Sized to meet the needs of the traveling professional, the Airport TakeOff V2.0 fits most U.S. domestic and international airline carry-on requirements. The rolling bag/backpack costs $369.75.

“The Airport TakeOff V2.0 features a custom designed retractable handle with inset channel on aluminum tubing for added strength and durability, making it easier to roll down the street,” said Doug Murdoch. “A newly added tablet carry in the laptop pocket and dedicated smart phone pocket in the front flap insure photographers’ tools are easily accessible and at the ready.”


The Signature Series, just announced, is a modernized version of the classic shoulder bag. Whether one is a young urban professional or a corporate photographer, the Signature shoulder bag is as fashionable as it is functional. Hand-sewn, advanced fabrics blend weather protection and durability with the classic feel of fine wool. Genuine leather detailing and metal hardware add character and stand up to the rigors of daily use.

“The new Signature Series features a modern fabric that is soft to the touch and yet is durable,” said Doug Murdoch.  “In addition, the zippered flap provides full closure and security to the main compartment, or tucks away when not in use.  It is a next generation design for today’s discerning photographer.”

The Signature Series comes in two sizes: Signature 10, for $249.00, and the Signature 13, for $279.00.

ART OF THE CUT with “Mr. Robot” editor, Philip Harrison

ART OF THE CUT talks to Philip Harrison about the distinctive cutting approach of the binge-worthy TV series Mr. Robot. Harrison’s work on the series was nominated for an ACE EDDIE for Best Edited One Hour Series (Commercial) for his episode, “eps2.4_m4ster-s1ave.aes.”

All images from Mr. Robot, courtesy USA Network.

HULLFISH: Your previous editing gigs were quite different from Mr. Robot.

HARRISON: Previous to Mr. Robot, I had been working on Glee, which is obviously a much different show in terms of its content and style. Ryan Murphy, who produced that show, was going for a bright, fun, immediate feel, so the cutting is briskly paced with important dialogue always on camera; moment to moment, the viewer should always know where they are in the story. For musical sequences, the rhythm of the music guided the cutting pattern, and I was always on the lookout for interesting camera movement that sold the emotion of a particular song. When I started working on Mr. Robot, I had to let go of that sort of rhythm and shift over to the Mr. Robot rhythms. That was actually a big leap for me initially: to let things pause and let shots play, to take in what the characters might be thinking. It was a question of trusting the story. I’m always concerned that the audience is understanding, and I’m always worried if they’re ahead of the story or behind the story. Editing Mr. Robot, these concerns were tricky to judge because the storylines play out over a long amount of time and it isn’t always clear what the character motivations are. The creator, Sam Esmail, wanted that ambiguity. It’s inherent to the show – our lead character, Elliot, isn’t always forthcoming and honest with the audience or himself. Anyway, it always takes extra passes of your cut to feel like you’ve really gotten the right rhythm and the right story and character beats. And that the subtext of the story is going to hold the viewer.

HULLFISH: You were nominated for an ACE EDDIE for an episode that had a really interesting series of jump cuts in a hotel break-in. (Season 2, Episode 6, “eps2.4_m4ster-s1ave.aes”)

HARRISON: That scene was a little unusual for Mr. Robot. As opposed to the usual dark moodiness, it had a sort of classic “heist” scene feel, like something you’d see in Ocean’s 11: the character is using her charisma and hacking skills to break into a hotel room. We wanted to be able to accelerate the energy of it all – really manipulate the audience, where we could ramp up the energy and then slow it down and then ramp it up again – and the jump cuts were a logical extension of that. It was really fun: for my first cut I just built the sequence “as is,” without jump cuts. Then I just played with the energy and tried to figure out how much information we actually needed and how little we could get away with telling. The nervous energy created by the jump cuts in rhythm with the song propelled the scene. A little later in the sequence, we go in the opposite direction with a 3-4 minute take with no cuts that has a different way of creating tension.

HULLFISH: Well, it’s interesting that you keep mentioning energy, because I was just talking to Joi McMillon, who edited Moonlight, and there’s a bunch of jump cuts in that piece, and they’re all done for energy and to get into a character’s head. Thelma Schoonmaker says the same thing about a lot of the jump cuts she does: it is definitely a way to infuse some energy into a sequence. I’m trying to figure out what the psychology of that is.

HARRISON: It feels very physiological as well. There’s something about a jump cut – it’s a break in our perceptual reality of what’s going on in front of us. It can feel like when you bump your head and feel a little jolt, you know? You’re also compressing time – anytime you compress time in a cut, you feel a surge of energy leaping forward. And if you have a series of jump cuts, it’s a rhythm; you feel it musically, like a drumbeat.

There’s also a more thematic intention with some of the jump cuts in Mr. Robot. My first episode in season 1, “eps1.3_da3m0ns.mp4”, had a series of scenes where Elliot is going through morphine withdrawal. He’s really in a bad place, sweating, starting to hallucinate. We wanted to come up with a way to really accentuate that feeling and place the viewer in that experience. I experimented with step printing the image which gave it a very visceral kind of feeling, but then I also added jump cuts and duplicated frames to give it a sort of a stutter feeling; almost like a tremor. It gave us this feeling of getting into Elliot’s mind a little bit, that this was a fractured experience for him. In the season 1 finale, there’s a moment where Elliot is on the subway, he’s in crisis mentally and completely locked inside himself. Rami Malek’s performance in the dailies went through a range of emotions: happiness, crying, despair, blank. That was an opportunity to jump cut the footage so that you felt Elliot’s splintered personality – that his brain was quickly shuffling through these different emotions almost like a hard drive. The jump cut has definitely become a part of the Mr. Robot language.

HULLFISH: The use of music in this series is really interesting to me. One of the hardest things with music is determining exactly when to start it and stop it, and there are some really cool examples in that FBI headquarters sequence in your nominated episode.

HARRISON: That is a very playful stylized sequence. The song, “Gwan” by The Suffers is a very uptempo soul track with a big, brassy, female vocal. It helps justify these really broad shifts in energy. There are moments in the action where the hacking scheme feels like it might be derailed and the sudden cuts to silence help amp up the suspense. This is the oner scene I was talking about so the music also helps support that long take – gives it a pulse.

HULLFISH: Another scene I wanted to talk about was the scene with the car ride at the end of the episode with a young actor playing Mr. Robot as a child. You were able to get some great performances out of the kid. Can you talk to me a little bit about the little plays of subtle emotion across his face and trying to screen dailies and gather that stuff?

HARRISON: That scene is one of my favorites. At the end of the day, underneath the paranoia and cynicism, the show has a really big heart and to have a scene that’s about the relationship between a father and his son – to have it played out in such a sweet, sensitive way was really gratifying for me. They shot it with a process car, 2 cameras for every setup; I probably had 8 hours of footage for a 4-minute scene. I went through and broke down the footage into the tiniest portions possible for every line of dialogue and every beat where I thought there was subtext that required an emotional reaction. Doing that lets me get really familiar with the footage. Some of the reactions and some of the lines of dialogue we had at least 10 different possibilities for young Elliot and for his father. I kind of knew from the beginning that the boy’s face was going to be the most important part, so a lot of the dialogue plays just over his face and you just read so much into him. It was interesting to experiment with the dialogue against the boy’s reactions – the father talking to him about being fired and that he has cancer – and just trying different pieces of the boy and seeing what feels real and what gives you that little emotional sort of jigger in your stomach.

HULLFISH: You said that you broke the dailies down into granular segments. How are you doing that? With selects reels or sub clips or what are you doing?

HARRISON: I like to use Avid ScriptSync when I’m working with a director because it’s really fast for showing options. But when I’m cutting on my own I don’t use ScriptSync at all, because I need to just go through the footage to understand what I have and kind of internalize it all. I’ll watch the dailies and then I will create what I call “breakdown” sequences for each scene. In the breakdown sequences, I will line up smaller sections of all the takes from each camera position. So, one sequence of all of Elliot’s dialogue, one for Mr. Robot’s, one for the wide shots, etc. The breakdown is the individual lines of dialogue or reactions all lined up together, but it can also be longer hunks of the scene, according to what seems the most helpful. I find I need to compare performances directly against each other to make a choice. By putting lines of dialogue, reactions or moments of action together, side by side, the right take seems to pop out. I compare it to photographers when they look at photos on a contact sheet: they have their loupe and they go from photo to photo, and by just popping from one to the next to the next you can suddenly see, “Oh, that’s the one I want to use.” That’s kind of my process. Things get vague for me if I don’t break down into fairly small pieces. But it’s also just a way to make sure that I have seen everything and that I really have internalized the material.

HULLFISH: A lot of people use KEM rolls or selects rolls where you just kind of run all the dailies from action to cut, one after another, but for a decent-size scene that can be 25 or 45 minutes, and you can’t keep 45 minutes of performances in your brain. But if you break that 45 minutes down into 5-minute sections and go, “Okay, this is 5 minutes of the first 10 or 15 seconds of the scene,” then all of a sudden you’re like, “5 minutes. I can figure out 5 minutes’ worth of performances.”

HARRISON: Yeah, I’ve tried those KEM rolls, but I sort of get lost in that method. By breaking it down into smaller pieces, I’m better able to manage things.

HULLFISH: So basically what you’re doing is selects reels broken down to lines.

HARRISON: I’ve edited documentaries with a huge amount of archival material. In that situation, if you try to cut something without getting your ducks in a row and breaking it down into pieces, you can just forget it. You won’t be able to cut a documentary. I come from that mindset. I like to get everything in order, get all the pieces in my brain, but then have them easily available, and then I’m ready to do it. I know I’ve got everything within arm’s reach.

HULLFISH: There’s a shocking edit at the end of this episode. Tell me about that. It’s a decision of finding the perfect moment, right, to make that cut?

HARRISON: I’d love to be able to take a huge amount of credit for that but that was Sam Esmail’s plan. At the end of this dramatic scene, 13-year-old Elliot’s father (Christian Slater) gives him the chance to name his new computer business. Elliot closes his eyes to think through what it could be called and just as he opens his eyes…we cut to black before he can say, “Mr. Robot”. Actually, in my first cut, I had him saying the line even though I knew we would want to cut that off. But they shot it with the dialogue and I wanted Sam Esmail to have a chance to see it that way. By cutting off the line, you give the audience the chance to make the connection. Anyone who’s been watching the series will make that leap.  You also put the focus on what’s going on with Elliot. He’s not just naming this computer business, he’s starting to build the whole idea of his alter ego, Mr. Robot.

HULLFISH: Tell me a little bit about the beginning of that episode – it’s a spoof on an 80s sitcom. Did you watch any 80s sitcoms to prepare?

HARRISON: When I was working on Glee, I cut a sitcom spoof sequence where a character gets bumped on the head and all of a sudden you’re in a Friends episode, including a replica of the Friends title sequence, with all of our Glee characters taking the place of the Friends characters. For that, I looked at Friends episodes and matched the style. It had the laugh track, music hooks, and the 3-camera cutting pattern. So I had some familiarity with the style. Flash forward to Mr. Robot: every season Sam likes to do something that’s going to totally throw the train off the rails and just really screw with the audience’s expectations. Last year I got to cut the episode where Elliot is shot in a drug den and the audience is thrown into a 10-15 minute dream sequence.

This season we opened the episode, “eps2.4_m4ster-s1ave.aes”, with Elliot lost in a 20 minute 90s sitcom episode where he’s with his family on a road trip. Since I already had some familiarity with the sitcom style I did a limited amount of research. Instead, my focus was on amplifying the contrast between the upbeat style of the sitcom with the dark undertone of our Mr. Robot story- The main solution was to play it straight on both fronts. For the title sequence, Sam had me match the Full House title sequence exactly.  But instead of the family house in the Full House sequence, we have a wide shot of the town with a nuclear power plant! Then Sam connected with the original composers of the Full House music to write a whole new song called “A World Gone Insane”! They also did all the musical bumpers in this self-contained “sitcom”.  I temped my cut with a laugh track, but in the sound process they brought in the actual people who do laugh tracks for all of the sitcoms today and they made a track that is authentic to these shows.

The next thing was to get the tone of the performances right – Sam shot a range from all the actors. Some were slightly more in the “winking” direction, but we chose performances that were as straight as possible. That has its exceptions, but we definitely leaned more towards the straight performances. We didn’t want it to be a wink; we wanted it to play as this contrast of the sitcom style with the dark, real undertone. For instance, in the opening sequence, we have the shot introducing Elliot as he reacts to falling through a glass window. We had a sort of “Gee whiz!” take, but we went with the take where Elliot seems confused and upset, consistent with the character. The larger effect comes from the contrast of this reaction with the upbeat theme music. It’s more of a warped, upsetting feeling – a joke you don’t really want to be in on.

HULLFISH: You mentioned a couple of other episodes or at least one other episode that you wanted me to take a look at. What specifically was in that that you were proud of or that you feel is interesting to talk about?

HARRISON: My first episode of season 2, “eps2.1_k3rnel-pan1c.ksd”, was massive. My first cut was an hour and a half long, and I think eventually it came down to about an hour and four minutes, which is long for television. It had a lot of characters and storylines. There were some pretty hefty montages. There were flashback scenes that only tangentially related to the rest of the show. We also had scenes that shifted quickly between “reality” and “fantasy”, where we were manipulating the audience’s perceptions.

Originally, the script indicated a sort of checker-boarded pattern of the different characters’ arcs that would work in a very straightforward manner: dispense a little information, move on to the next character, a little more information, shift to the next character, etc. But when we got into the process of editing, it felt too arbitrary. We realized that we needed to restructure so the viewer felt like they were in the moment – we wanted to create the feeling that we were moving to the next piece of story because the story required it, and that the next scene we went to was the only logical scene we could go to next. It took quite a bit of trial and error to get the cut to feel well balanced.

HULLFISH: What did it end up being that delivered that sense that, instead of being arbitrary, it was necessary to go to the next part of the story? Was it a sense of answering a question, almost like one scene would pose a question the next scene would answer?

HARRISON: First we simplified the story arcs. Instead of intercutting Elliot’s story with Dom and Angela’s in acts 1-3, we condensed his storyline into acts 1 and 2 and held off introducing the Dom and Angela arcs until act 3 or act 4. By playing out Elliot’s story longer, we found that you were able to track it better. And it felt like there was a more satisfactory ending to his storyline before you moved on to the new character. You were ready to move on to the other characters. So, there was a lot of that type of work.

In respect to the reality/fantasy issue, we needed to make some alterations to accentuate the audience manipulation we were going for. In one sequence, Elliot is being confronted by his alter ego, Mr. Robot, and he takes a fist full of Adderall as a way to disconnect. This propels Elliot into a fantasy sequence where “men in black” apprehend him and force cement down his throat before a smash cut takes us back to reality. Originally we had that section as just one long sequence, but it played much too long and it felt like the audience would be ahead of things. You could tell that it was “Oh, it must be in Elliot’s head. This has to be a fantasy.” One thing we did was add some ADR when the “men in black” appear that misleads the audience into thinking, “Oh, maybe this is the FBI, so maybe this is real.” Then we split the sequence up with another unrelated scene with other characters. Putting that interruption scene in there broke it up like a sort of cliffhanger – it put the audience back behind the story, not sure if the events are real or not, instead of in front of it. Finally, we whittled down this section to the barest essential elements. When you have fantasy elements, a little goes a long way. Too much and the audience feels the fantasy. Keeping it minimal kept the viewer in the moment until we were ready to reveal the truth. We felt the audience would be behind the story at that reveal.

HULLFISH: And that’s a critical place for the audience to be, right?

HARRISON: Absolutely critical for that sequence. Overall, this was a process that the entire episode had to go through. It was in the subtleties of restructuring and manipulating everything so that you felt the threads were being told in the right order. So that was the most interesting problem-solving episode of the season.

HULLFISH: You were talking about all this restructuring of this episode and it makes me think: as an editor, you’re a technician who knows how to run the equipment, you’re a visualist who knows how to find the most compelling imagery and cut it in the most interesting way, you’ve got a musician’s sense of pacing and rhythm, but you also have to have real storytelling chops to be part of a room that is radically restructuring a story. Talk to me about being a storyteller.

HARRISON: I’m very much a process-oriented editor. In my first passes, I’m most focused on trying to make sense of the storylines, not worrying about the bells and whistles – looking for the character motivations and really cutting everything based on that. Then on repeated passes… this stuff doesn’t all happen at once. I remember when I was a younger editor I thought I had to know how everything was going to go together, and what I found is you don’t. You just need to make logical sense one cut at a time and the totality is revealed to you. I like those first passes to really get things laid out: screen direction, and most important, character – where are the characters at emotionally? Subtext. Who’s the actor? Who’s the acted upon? What are their reactions to these things? Keep it simple, and then you go to the beginning again and watch things narratively and see how those threads are feeling across the entire episode. I never dip into a scene randomly. I always like to go from the very beginning because that informs, for me, the narrative sense, and I feel like I’m on firm footing. So I continue to make passes of the material, and as things continue to fall into place my attention can turn more towards the specifics. I start adding sound effects. By the way, after my first scene passes, I handed them off to the assistant editor, Gordon Holmes, to lay in backgrounds, sound effects, all the stuff that makes it feel physically, viscerally real. Then I can start really getting in there and playing with the scenes in the musical sense that you’re talking about. I’ll see things that I didn’t see previously. More and more you’re going in and doing the detail work that really starts bringing it to life. Then the director comes in and it becomes a collaborative process. On Mr. Robot we make many passes before we get to the final version because it has so many layers.

One fun thing on Mr. Robot, coming from a documentary background, is the cinematic montages that are part of many of the episodes. It’s a thrill to take a bunch of different montage pieces and find the rhythm, find the flow, play things off of each other and invent moments. In a montage sequence where Elliot is flying high on Adderall, we had one moment where Sam had designed a wide profile shot on the side of the street with Elliot walking down the sidewalk. Sam wanted to literalize the idea of computer imagery in this moment, so he wanted to create a visual effect that blurred Elliot walking across frame – like a human screen-saver image. As I worked with the footage, I noticed that Rami Malek had performed the takes with increasing energy. I got the idea that we could split-screen together all of those different takes so you see this spectrum of Elliots. It was another way to show the splintered personality. It sort of started with Sam’s intent and evolved into another thing that related more to the character’s inner life. It’s one of the fun things of editing when you can come up with a way to cut something and suddenly it comes to life. And that sort of work comes in the later passes, when you’ve removed some basic obstacles and you can just get in the groove and be inventive.

HULLFISH: This new environment of TV with much more complicated and interesting plots and more cinematic style – the schedules are also radically different now.  Tell me a little bit about the schedule. The kind of restructuring you are talking about can’t possibly happen in 4 days.

HARRISON: No it doesn’t.

HULLFISH: I’m really interested in specifics.

HARRISON: Season one was a little more typical in that each episode was directed by a different director and they shot for eight days per episode. I probably had 3-4 days after that to pull my editor’s cut together, and then we had four days with each individual director, and then we would have however much time was needed in the producer’s cut with Sam Esmail. In season two things were very atypical in that we were block-shooting the episodes. We had three blocks. The first four episodes were shot in the first block, the second three episodes were shot in the second block, and then the final four episodes were shot in the final block. As it turned out, my first episode bled into the second block of shooting and my second episode bled into the third block. So we had multiple episodes shooting over three or four months, with airdates starting during production. On my first episode, I worked on the editor’s cut for a couple of months because Sam was shooting and unavailable, and I kept getting new scenes. But then I was starting to get dailies for my second episode, and then all of a sudden Sam would have availability to start working on his version of the first episode. It took a lot of stamina to stay focused while jumping around between episodes like that. When I’m alone editing, I just get in a much more intuitive place. When you’re working with a director, you have to do your best to keep that part alive while also using a good portion of your brain to communicate, to let the director know that you’re listening to them. Because we had a fairly small editing room, I actually had a mirror set up on my desk so that I could see Sam when he was talking to me and watch his reactions sometimes during screenings. We all thought it was a little funny and weird, but it was the easiest way for me to be able to listen and really hear what he was saying without having to crane around all the time. I really needed to keep my hands on the keyboard the whole time if we were going to meet our deadlines. The other editors – Franklin Peterson and John Petaja – and I would always look at each other’s work. The whole post department, actually: Sam would have screenings of everybody’s cut and we would do a round table and talk through what the issues were, and then go back into the rooms and work our way through those notes. We had a really smart crew, so it was a really great way to hone each of the episodes!

HULLFISH: The mirror idea is very funny. That’s a great idea.

HARRISON: As you know, a big part of the job is the diplomacy and communication skills you bring to working with the director, even as you have to be looking at a screen and a keyboard. I think a big part of human communication is looking someone in the face so they actually see that you’re listening and so you can actually take in what they’re saying. There’s a feeling of trust that comes from that. So I generally try to set up my cutting room so that whoever I’m working with is not behind me, because I find it difficult to crane around. I much prefer if they’re just to the side. And I try to ingrain in assistants who work with me that it’s always more important, when working with a director, for them to know that I’m listening to them and taking it in and really trying to support what they’re asking for. Even if I have ideas that I think might be helpful, often I will hold off on expressing those ideas. I know that by being so centrally involved with the process, I’m automatically going to put my stamp on the project. I can always bring my ideas up at any time. It’s more important to let the director know that you’re there for them and you’re really taking in what they want.


HULLFISH: Yeah, I was just talking to my assistant about this, because I sent out an edit in a certain state and he was asking me, “Well, don’t you want to tell them this is wrong or this has got to be changed?” I said, “No.” He will see that, and he will have his own ideas of what has to be changed and what doesn’t have to be changed. Get your ego out of the way and let the director do their job. If some of the stuff gets missed by the director then I will bring it up, but I’m not going to bring it up right now.

HARRISON: Absolutely. I’ve definitely learned the hard way. If your director doesn’t think that you’re there for them it’s very difficult to get work done and to feel the collaboration happening. You really have to be available.

HULLFISH: Exactly.

To read more interviews in the Art of the Cut series, check out THIS LINK and follow me on Twitter @stevehullfish

Special thanks to Noah Adams for his transcription of this interview.

The first 50 Art of the Cut interviews were curated into a book – broken down by topic. This book reads like a virtual roundtable discussion of editing and is a rare glimpse into the art of modern film and TV editing. Together these editors have won more than a dozen Oscars and have been nominated for more than three dozen more, not to mention numerous Emmys and Eddies. All told, the book contains advice gleaned from more than 1,000 years of editing experience.

Phottix Ares II: new flash trigger for manual shooting

With 16 channels, 4 groups and a range of 150 meters, the Phottix Ares II Flash Trigger takes the promise of the original Ares to a whole new level.

When Phottix launched the original Ares in 2012, the flash trigger was a standalone product within the Phottix lineup. Presented as an affordable solution for using off-camera flash, the Ares transmitter and receiver units were popular and lauded by some of the biggest names in the industry for their design and reliability.

The original Ares had its limitations. It offered only 8 channels, and although the transmitter and receiver units had a range of 200 meters, they used a “fire-all” channel function, meaning all flashes connected to the units would fire at once. The just-introduced Ares II reduces the working distance, at least on paper, but offers a more flexible approach.

One important change is that the Ares II is no longer a standalone solution but part of the Phottix ecosystem, meaning it works with other Phottix products. That matters if you start with the Ares II and later want to expand your system to newer and more sophisticated solutions.

One such example is the connection to the Strato II family of triggers, one of the most popular within the Phottix brand. The Ares II’s first four channels are compatible with the Phottix Strato II Receiver. Furthermore, the Ares II can be used to trigger products such as the Mitros+ flashes and Indra series of studio lights with built-in Strato II receivers. With 16 channels and Digital ID, the Ares II gives users full control of their lights. Use the Digital ID function for secure triggering: no one can trigger your flashes without your four-digit Digital ID code.

The Ares II transmitters and receivers have been designed to be compatible with most major camera and flash systems (including Sony). According to Phottix, the Ares II is at home on Canon, Nikon, Sony (MIS), Pentax, Panasonic, Fuji and Olympus cameras, and is compatible with most hot shoe flashes (triggering from the X-sync pin).

Both units offer an LCD display for control of functions and a maximum sync speed of 1/250, and both are powered by two AA batteries. Price-wise, the new Ares II reflects the new and more ambitious specifications: while the original Ares, which is still available, costs $62.00 for the transmitter and receiver pair, the new Ares II will cost you $54.95 for the receiver and the same again for the transmitter.

Using Blackmagic’s DaVinci Resolve For Editing, Grading, and Finishing https://www.provideocoalition.com/using-blackmagics-davinci-resolve-for-editing-grading-and-finishing/ https://www.provideocoalition.com/using-blackmagics-davinci-resolve-for-editing-grading-and-finishing/#respond Tue, 28 Feb 2017 05:18:55 +0000 https://www.provideocoalition.com/?p=46878

Blackmagic’s DaVinci Resolve 12.5 was the program of choice for editing, coloring, and finishing the original short-form comedy series “Please Tell Me I’m Adopted!” The Wraptastic Productions series premieres March 6, 2017 on Amazon. For a damn near entirely DIY production, DaVinci Resolve may have been the glue keeping the production pieces together.

What is this comedy series? Well, here is the lowdown on the show itself. The series was created by Nicole Sobchack, who is also the star. “Please Tell Me I’m Adopted!” follows free-spirited disaster magnet Tiffany, who loses her boyfriend, job, and home in a single day. She is then forced to move in with her newly married sister Emma and Emma’s husband, and the newlyweds get pulled into Tiffany’s many outrageous and hilarious situations.

Originally, the show was a one-off YouTube video, which transformed into the short-form episodic comedy series it is today. This was an all-out DIY production. From friends and family lending a hand to a crowdfunding campaign, the production looked to keep costs down while still getting the tools they needed. “Which is why we chose DaVinci Resolve Studio,” according to Chris Sobchack, Executive Producer.

Because this was a downright DIY indie project, there were the usual limitations on the amount of time they could afford to shoot.  “We shot quickly, and several of DaVinci Resolve Studio’s tools were lifesavers when it came to repurposing footage,” Chris said. “The sharpening and blur tools made shots come alive that otherwise weren’t usable, and image stabilization was huge.”

“We would find something in the trash folder and think, ‘Wait a minute, we can use that!’ We would stabilize it and zoom in to cut out anything not needed. We also used temporal noise reduction,” he continued. “For example, there was one shot with three people on a couch, and we cut that into singles, twos and threes, zoomed in because everything was shot in 5K, and thanks to the temporal noise reduction, you can’t tell it was done that way.”

Chris said they took full advantage of DaVinci Resolve Studio’s streamlined workflow. “It’s an easy-to-use NLE, and it’s laid out for a great workflow that’s as simple as bringing footage into the media page and then moving it along through the edit page, into color and finally, delivery.”

Since Chris is also the Drum and Percussion Technician for Elton John, he was forced to complete a large portion of the post-production while on tour. He took to editing and grading in hotel rooms and even on tour buses, on a MacBook Pro, using portable drives, USB hubs and Blackmagic Design’s UltraStudio Express capture and playback device.

“The whole project took roughly two years to complete, with Nicole and I doing everything in post ourselves: CGI, sound design, foley, dialogue, the score, VFX, color, editing, titles, output, and mastering,” said Chris. “I would edit and grade both on tour and at home, but as the creator, Nicole had final cut on the footage. DaVinci Resolve Studio’s trim tools were great because she could provide direction in the post suite, and I could instantaneously pull left and right a little, cut back or move around to finesse the edit.”

Like many who have taken the time to learn the DaVinci Resolve toolset, Chris took full advantage of DaVinci Resolve’s Power Windows. Seriously, if you’re using the program and haven’t learned to embrace Power Windows, I highly suggest you stop reading this right now and learn this invaluable feature. Chris said, “Power Windows with HSL qualifiers were amazing for pushing the skin tones. I could make a quick Power Window, set the HSL qualifier to match, and DaVinci Resolve Studio tracked it seamlessly.”

Chris concluded, “On a technical level, taking all the raw elements, putting them into DaVinci Resolve Studio and then seeing our vision come to life was very rewarding. On a personal level, for Nicole and I to be able to do this ourselves and be able to tinker to our hearts’ content in order to get it perfect was equally as rewarding.”

Workflow-28 Weeks of Post Audio-Week 16 https://www.provideocoalition.com/workflow https://www.provideocoalition.com/workflow#comments Mon, 27 Feb 2017 22:52:01 +0000 https://www.provideocoalition.com/?p=46799

There really is no “standard” workflow for anything in filmmaking. Every project, in every aspect, is a unique set of situations. This post will discuss the differences between movies and TV; it is a general discussion, though, and each project should be understood as a thing unto itself. Typically, the process and approach to feature film and television audio post are quite different.

Audio post actually encompasses a wide number of very different skills. There are recordists, editors, and mixers who work with the audio elements of a soundtrack: the dialog, the music and the sound effects. On $100 million Hollywood features there might be several of each: perhaps an ADR recordist, an ADR editor, a dialog editor and a dialog mixer, and so on. On small, no-budget projects, it’s more likely to be an army of one who does all of that work and more. However, make no mistake: no matter how many people are dedicated to any one task, depending on the source material and the project, each of those skills will be required.

In feature film, a supervising sound editor facilitates creating the filmmaker’s vision of the story in sound. It begins with the director and supervising sound editor going through the movie scene by scene, from start to finish, in a session typically called spotting. Through spotting, all of the audio is discussed at length: is there dialog that needs to be rerecorded? What type of musical score will it be, and where will it be placed, scene by scene? Are there special sounds, like spaceships, alien landscapes or monsters, that will need to be created? The supervising sound editor then determines what work will be required and the various audio crafts that will have to be deployed to create the soundtrack that tells the story the director needs to tell.

A dialog editor will be tasked with making the location audio work for the edit choices that were made in picture editorial. The dialog editor will fix whatever audio can be fixed, flag lines that may need to be ADR’d, dig through alternate takes to save location audio if possible, add room tone for dialog, ADR and Foley tracks, and generally prepare the dialog for a mixer to use in the soundtrack. Dialog editing and ADR are covered more fully in this series here. Feature films also require extensive sound effects design and usually Foley. Foley is the recreation and recording of human sounds, such as cloth movement, footsteps, hand pats and claps, in sync with the picture. Foley is, essentially, the sorts of sounds that humans uniquely make.

Foley and ADR are quite intensive tasks. Both require an artist (a Foley artist or an actor) for the performances, and both require recordists to capture the various takes in a manner that tries to match the quality of the location audio. In ADR, all of the takes will need to be sorted and chosen by the director. Finally, they need to be edited into the scenes and mixed to match the ambiance of whatever scene they are meant for. It takes a highly skilled set of hands to make all of those different elements ultimately blend seamlessly. I mention this because if you are a first-time film director or producer, the more intensive the audio post needs, the more time and money it will take to make the movie you need to make. I will reiterate a common theme: fix it in post is a last resort, not a plan for getting through the day. You can kick the can down the road, but sooner or later the road will end. It is always best to get it right on location, within reasonable confines. An extra few minutes on set, here and there, can save a fortune in post production time and money.

Original musical scores are typically written specifically for a given feature. They are an artistic creation in their own right, written for emotional connection and scene pacing, in sync with the story being told. The creation of the musical score requires a whole different set of talented people: not only the artistic work of the composer, but also the talents of the musicians, the recording engineers, and the mixers. Finally, the music editor coordinates all of that for sound editorial and sound mixing.

Narrative television programs approach post audio in a similar fashion to feature films, except that the schedules are very accelerated. The types of shows out there vary considerably, of course; however, once a show has been established, routines can in many cases help speed the process of sound editing and mixing.

Unscripted television, talk shows, and news programming, which make up the vast majority of the daily, bread-and-butter programming on US television, are created quite differently. Many of these programs use non-diegetic sounds, like whooshes or hits, chosen and cut in by the picture editor to match their transitions: film effects, wipes or cuts. Those sounds then become the library for that show, so each subsequent episode matches specific sound effects with specific transitions. The same goes for the music tracks.

Since television programming has many decision makers at every step of the process, many of the elements of a show get decided early on, as the pilot episode is created. Decisions like the editing style, color palette, font styles, and sound effects then stay consistent across subsequent episodes. Because of that particular process in TV, in most cases these initial choices are made between the producing staff and the network executives.

Regarding music in this type of programming, the production company and networks will typically choose and place the music tracks, often from a library with a blanket usage license. In television, audio post is often called “sweetening,” since the basic concept of the show, including non-diegetic sound sources, has already been fully created. It is the job of audio post to embellish and enhance the choices already made and to enhance the storytelling with sound. A major aspect as well is meeting the various delivery requirements, including loudness standards and audio sub-mixes or stems. More extensive discussion of that can be found here.

Documentary films, often labors of love and passion projects, can have any number of audio needs. Depending on the “world” a particular documentary requires, it might call for extensive editing of backgrounds and effects, or keep the effects simple and to a minimum. Some documentarians want to stay strictly true to the location or story that has been shot, using only location recordings to create the world. Other filmmakers want each moment filled with non-diegetic sound to reflect the mood shifts and pacing. As with most filmed entertainment, the recorded dialog tracks are a main effort, and this holds especially true for documentary films. I have a post specific to documentary sound here.

No matter the genre of the project, the goal is always the same: to create a great-sounding final track, meeting the audio specifications required for distribution, creating whatever versions of dialog, music and effects are needed, and, most importantly, helping to tell the story through sound.

This series, 28 Weeks of Post Audio, is dedicated to discussing various aspects of post production audio using the hashtag #MixingMondays.

Woody Woodhall is a supervising sound editor and rerecording mixer and a founder of the Los Angeles Post Production Group. You can follow him on Twitter at @Woody_Woodhall

Clarity 800, the world’s first miniature HFR camera for live production https://www.provideocoalition.com/clarity-800-worlds-first-miniature-hfr-camera-live-production/ https://www.provideocoalition.com/clarity-800-worlds-first-miniature-hfr-camera-live-production/#respond Mon, 27 Feb 2017 19:24:59 +0000 https://www.provideocoalition.com/?p=46851

The Clarity 800 camera system offers sports and event producers high-quality, real-time HFR video. It’s one of multiple products Bexel will have on show at the 2017 NAB Show.

The Bexel Clarity 800 Camera is part of Bexel’s specialty camera initiative. The new model was first used at the 2017 NBA All-Star Weekend, with two cameras capturing key moments of the D-League All-Star Game. “It’s a proprietary camera that we have been developing for more than a year; it is about the size of a smartphone and can capture HD images at 4x, 6x, and 8x speed,” says Lee Estroff, Bexel VP of account development. “We think it is going to be a game changer.”

The 12 oz camera system, which expands the possibilities of slow-mo imaging, is the world’s first miniature, high-frame-rate (HFR) point-of-view camera for live production. Building on Bexel and Camera Corps’ experience of using specialty cameras in live broadcast, the company developed the standard-setting Clarity 800 camera system to meet sports and event producers’ increasing demand for high-quality, real-time HFR video.


The Clarity 800 offers HFR processing in HD up to 8x (480 fps), delivers 1080p for superior quality, and can handle all common video formats, including 1080i and 720p. The camera packs complete camera functionality into a form factor that is only 4.7 inches high, 2.56 inches wide, and 1 inch thick. It integrates easily into a live event ecosystem, operating as a broadcast camera system with real-time processing via fiber optics and integration with industry-standard video servers. Key features include a full-function camera remote control panel for paint control and a positive-lock lens mount with lens control of focus, iris, and zoom motors.

While the Clarity 800 is the star of Bexel’s presentation at NAB 2017, the company will also have its wireless audio and communications solutions on display, including an example of Bexel’s exclusive RF audio solution developed with Quantum5X. Intended as player and coach mics for a variety of sports applications, the QT-5100 PlayerMic is a rubberized, flexible player microphone system that weighs less than 2 ounces and is only a third of an inch thick. The PlayerMic has been field-tested by the NBA and can be wirelessly controlled to adjust frequency, mic gain, RF power level, mode, and grouping.


Bexel has recently made a substantial fiber-optic equipment investment, following the growing need to accommodate 4K and its requirements for increased bandwidth and 12G support. At NAB the company will showcase its expertise with its newly built Fiber Mini Booth Kit, a solution for interfacing an announce booth with audio, video, intercom, and IFB equipment to connect with a mobile unit up to 3,000 feet away without signal degradation — all over six strands of single-mode “tactical” fiber-optic cabling.


According to the information available, “the Mini Booth Kit gives productions the ability to send and receive 12 bidirectional video paths (six each way), 16 audio paths (eight microphones, eight intercoms), four IFB channels, and an optional robotic camera interface with Ethernet control. The Fiber Mini Booth Kit is extremely user-friendly and can be moved around and set up within minutes, making it ideal for smaller productions with only a single on-site technical manager to set up the various fiber paths.”

Also from Bexel, expect to see the Creative Studio, an IP-based, cloud-managed solution for the distribution and consumption of both live and recorded video content. On display at the 2017 NAB Show, the Creative Studio is a complete production tool that is user-friendly, cost-effective, and remotely managed.

Flicker: Why On-Set Monitors Fail Us https://www.provideocoalition.com/flicker-set-monitors-fail-us/ https://www.provideocoalition.com/flicker-set-monitors-fail-us/#respond Mon, 27 Feb 2017 17:46:50 +0000 https://www.provideocoalition.com/?p=46789

While working as a young second camera assistant on a TV series shot on film in the 1990s, I noticed the DP set the camera shutter to 144 degrees (a 1/60th of a second exposure at 24fps). I asked him why. “I’ve been burned by HMI flicker in the past,” he told me. “I’m never going to let that happen again.”

A few years ago, shortly after the RED ONE was released, I found myself shooting a short film for an internal ad agency film contest. It was a “just for fun” project, and I borrowed some gear from my regular crew to pull it off on a “short film budget.” As part of the lighting package I found myself with two 1200w HMI PARs, both with magnetic ballasts.

At one point during the shoot I noticed one of the HMIs was flickering. I noticed this while looking at a white wall (large flat surfaces are great for detecting flicker by eye) but I didn’t see it on our LCD monitor. I swapped the light out for the other 1200w head, changed my shutter from 180 degrees to 144 (1/60th second), and kept on shooting.

It wasn’t until I saw the footage on my computer monitor at home that I noticed something weird: faint roll bars drifting through the image. These weren’t visible on the on-set monitor, and I couldn’t see them on my computer monitor unless I quickly scrubbed through the timeline.

About nine months later I found myself on a job with a Sony F55, which has a global shutter. One scene, in a bedroom, saw us overcranking at 48 frames per second. The image looked great on our OLED monitor, but a little while later my DIT came to me for a quick chat. “The practical lamps are flickering,” he said. I walked quickly to his station, where he showed me that the lamps on either side of the bed were drifting in and out of phase: they’d look fine for a few seconds, and then flicker would appear, grow stronger, peak, and gradually disappear.

This was completely invisible on the on-set monitor.

Recently, while researching an article on HDR, I found some articles on the Internet that may explain why these kinds of flicker don’t always show up until post. Links to those articles can be found at the end of this one, in case you’d like to read them yourself, but I’ll try to summarize the key points and touch on the things that are particularly relevant to filmmakers.

There are at least five things working against us when trying to detect flicker on the typical on-set monitor.

MOTION BLUR

This appears to be the key to detecting flicker on-set. It’s a deceptively complicated subject.

We rarely stare at one spot on a display for any length of time; our eyes constantly scan the frame. A monitor with a refresh rate of 48Hz will display each 24fps video frame twice, for 1/48th of a second each time. That’s a fairly long exposure time to fast-moving eyes, and it results in blurred pixels as our eyes dart around the screen.

REFRESH RATE

This is the “frame rate” of the monitor. At 60Hz, which is a fairly common refresh rate, the monitor displays 60 discrete frames per second, at 16.66ms per frame. At 30fps, each progressive frame is displayed twice.

At 24fps, the monitor refresh rate changes to 48Hz, so each frame is displayed twice, for about 21ms each time.

DITs tend to use LCD displays that run at a higher refresh rate, typically 120Hz. At 24fps this means each frame is displayed five times, at 8.33ms per refresh. As each pixel appears sharper, differences between frames become more obvious.

RESPONSE TIME

This is the amount of time it takes for a pixel to change from one hue and intensity to another. For example, on a 60Hz display, if a pixel takes 8ms to transition from black to white, then on its first refresh it will appear as white for only about half of its 16.66ms screen time.

The shorter this transition period is, the sharper the image will look as the eye sweeps across it. For example, if the same pixel takes only 4ms to change, it sits in its intended state for almost 13ms of each refresh.

Longer transition times result in apparent motion blur, which can conceal flicker.
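The arithmetic in the last two sections is easy to sanity-check. Here is a minimal sketch in plain Python; the numbers are the ones quoted above, and the function names are my own shorthand rather than anyone’s API:

def frame_display_time_ms(refresh_hz, fps):
    """How long each video frame stays on screen, in milliseconds."""
    repeats = refresh_hz / fps              # refreshes per video frame
    return repeats * (1000.0 / refresh_hz)  # ms per refresh times repeats

def effective_state_time_ms(refresh_hz, response_ms):
    """Time per refresh a pixel actually sits at its target value."""
    return (1000.0 / refresh_hz) - response_ms

print(frame_display_time_ms(48, 24))   # 41.67ms: two 20.83ms paints per frame
print(frame_display_time_ms(120, 24))  # 41.67ms again, but as five sharper 8.33ms paints
print(effective_state_time_ms(60, 8))  # 8.67ms: about half of the 16.66ms refresh
print(effective_state_time_ms(60, 4))  # 12.67ms: a faster pixel spends longer "correct"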

SAMPLE AND HOLD

Most LCDs and OLEDs employ this technique, where a pixel is painted and left static until it is told to change. On a 24fps display this means that each pixel is left effectively unchanged for 1/24th of a second: the first 1/48th of a second sees the pixel change from one state to another, and the next 1/48th of a second sees no change at all.

This is different from how CRTs functioned, where a phosphor pixel’s brightness would start to decay almost immediately after being painted, resulting in a “dark gap” between frames. This effect was largely hidden by painting the odd scan lines during the first half of the frame’s display time and the even scan lines during the second half (interlacing). If every line had been painted from the top down, the top would have started dimming by the time the electron beam reached the bottom of the frame, resulting in a noticeable roll bar.

This “dark gap” was still visible, though, in that the display time for any given line was less than 1/60th of a second. This translated into the appearance of increased sharpness to the roving eye. Sample-and-hold eliminates this effect: a frame that is meant to be displayed for 1/24th of a second appears for exactly that long.

Some OLED displays add artificial CRT flicker through a process called Pulse Width Modulation (PWM), where “dark gaps” are introduced between frames by turning the pixels off. This artificially-induced flicker reduces each pixel’s “exposure time” and decreases motion blur.

Likewise, some high-end LCD monitors will add “dark gaps” through the use of a strobing backlight.
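To put rough numbers on why those “dark gaps” sharpen motion: the smear a tracking eye perceives on a sample-and-hold display is roughly its tracking speed multiplied by how long each pixel stays lit. A tiny illustration, with a made-up tracking speed:

def blur_width_px(tracking_speed_px_per_s, lit_time_s):
    """Approximate perceived smear while the eye tracks motion."""
    return tracking_speed_px_per_s * lit_time_s

print(blur_width_px(1000, 1 / 24))         # ~41.7px smear: full sample-and-hold at 24fps
print(blur_width_px(1000, 0.25 * 1 / 24))  # ~10.4px: a strobed display lit 25% of the time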

PHASE

When the camera’s shutter is in phase, or synced, with a flickering light source, the flicker disappears. When the camera is out of phase with a flickering light source, flicker appears. When the shutter rate and lamp flicker rate are slightly mismatched, they will slide in and out of phase, such that the light may not flicker noticeably for long periods of time but will then ramp into and out of a period of flicker. Unless one is looking at exactly the right spot on the screen at exactly the right time, the flicker may not be immediately noticeable. Typically I have so much to do that I can’t spend a lot of time examining every portion of the screen for flicker, but an on-set DIT will often spot it when they scrub through clips to check them for integrity. High-speed scrubbing reveals flicker issues better than anything else.

Editors do this as well, which is why they notice flicker immediately.
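That drifting in and out of phase follows simple beat math: two nearly matched frequencies slide against each other with a period of one over their difference. A quick check with hypothetical numbers (a healthy lamp on 60Hz mains pulses at 120Hz; a troubled one drifts):

lamp_hz = 119.8    # hypothetical lamp that has drifted slightly off 120Hz
camera_hz = 120.0  # the camera's sampling harmonic, e.g. 48fps at a 1/120th exposure

beat_period_s = 1.0 / abs(camera_hz - lamp_hz)
print(beat_period_s)  # 5.0: flicker ramps in and out over roughly five seconds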

SOLUTIONS

ALWAYS SHOOT WITHIN THE HMI SAFE WINDOW

Long ago, in the days of magnetic ballasts, HMIs were prone to flicker if the power mains frequency drifted or the ballast wasn’t well maintained. Shooting with a shutter angle/exposure time that captured the same number of light pulses in each frame was the safest way to avoid flicker. By phasing the camera’s exposure to the light’s flicker there was much more latitude in how far the light’s flicker rate could drift before it became a problem.

This combination of frame rate and shutter angle/exposure time became known as the “HMI Safe Window.”

I always shoot within this window. For example, I never shoot at 24fps without setting the shutter to 1/60th of a second (144 degrees). I don’t care if I’m outdoors or indoors, or shooting with tungsten lights vs. HMIs, LEDs and Kino Flos: I habitually set the shutter to 144 degrees. Doing this consistently will eliminate most forms of flicker. The “normal” shutter angle of 180 degrees results in an exposure time of 1/48th of a second, and that doesn’t sync well with a 60Hz light that’s having problems. That’s literally living on the edge of the HMI safe window.

When shooting off speeds I will always aim to put the exposure within the HMI flicker-free window. I live in a 60Hz power country, so I’ll shoot 48fps with a 144 shutter or 60fps with a 180 shutter, both of which give me an exposure time of 1/120th second.

1/120th is a dangerous place to live. I will check the monitor very carefully for flicker, looking in particular at the on-set practicals and LED sources. When the camera captures images at 120fps, and the light is flickering at around 120fps, camera/light phasing is critical.
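For checking other frame rate and shutter angle combinations, here is a minimal sketch of that rule of thumb, assuming clean mains power and a lamp that pulses at twice the mains frequency; the helper names are mine, not an established tool:

def exposure_s(fps, shutter_angle):
    """Exposure time: the fraction of each frame the shutter is open."""
    return (shutter_angle / 360.0) / fps

def in_hmi_safe_window(fps, shutter_angle, mains_hz=60):
    """Safe when the exposure spans a whole number of light pulses,
    so every frame integrates the same amount of light."""
    pulses = exposure_s(fps, shutter_angle) * (2 * mains_hz)
    return abs(pulses - round(pulses)) < 1e-9

print(in_hmi_safe_window(24, 144))  # True:  1/60th catches two full 120Hz pulses
print(in_hmi_safe_window(24, 180))  # False: 1/48th catches 2.5 pulses
print(in_hmi_safe_window(48, 144))  # True:  1/120th catches exactly one pulse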

PRACTICALS MUST BE TESTED

Small halogen bulbs have become a serious flicker issue as of late, and my suspicion is that they achieve their energy efficiency with thinner filaments. A thin filament means a big change in brightness when the AC current cycle switches direction and the filament cools during that transition. (Larger filaments take longer to cool and are less prone to flicker.) The effect is especially pronounced if the bulb is dimmed, as the filament cools more between cycles.

I refuse to take practicals for granted: if I’m shooting at high frame rates I will always shoot a quick test and view playback on multiple monitors to see if any of them flicker.

LEDs can’t be dimmed by reducing voltage; they are best dimmed by adjusting current, and that kind of circuit can be expensive to build. It’s cheaper to flicker the LEDs at high speed (pulse width modulation) so they appear to dim. High-quality LED lights flicker at insanely high rates and are largely flicker-free even at high frame rates. Cheap LEDs tend to flicker at a rate that is invisible to the eye but easily visible to a camera.
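One way to see why the PWM rate matters is to count how many PWM cycles land inside a single exposure; when that count is not a whole number, successive frames catch different slices of light. A toy calculation with hypothetical fixture rates:

def pwm_cycles_per_exposure(pwm_hz, exposure_s):
    return pwm_hz * exposure_s

print(pwm_cycles_per_exposure(25000, 1 / 120))  # ~208.3: the leftover third of a cycle
                                                # is a negligible slice of the total light
print(pwm_cycles_per_exposure(400, 1 / 120))    # ~3.3: the leftover is roughly 10% of the
                                                # light, an easily visible exposure swing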

Whenever I work with a new motion picture LED fixture I always look at how it performs against the camera when dimmed to its lowest setting.

Industrial LEDs seem to do their own thing. Sometimes LED practicals in the background will phase at 1/60th of a second (144 shutter) and sometimes they won’t. I’ve noticed that exit signs seem particularly troublesome in this regard.

WAVEFORMS AND FALSE COLOR

Sometimes the best way to detect flicker is to watch a luma waveform. Global shutter flicker will turn up as a flickering trace on the waveform, and may be hard to detect as most in-monitor waveforms run at reduced frame rates to save on processing power. For example, an on-camera monitor’s waveform may only run at 12fps or 6fps, making flicker harder to detect.

Rolling shutter flicker will show up as a slow rolling movement in the trace, as if part of the image is breathing.

Sometimes false color will show flicker faster than anything else if the flicker causes enough of an exposure change that a portion of the frame transitions from one color to another. Rolling the lens aperture slowly may put flicker on the edge of a false color range and reveal it more quickly.
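The same idea can be automated crudely on recorded clips: track mean luma per frame and look for a strong periodic component. This is only a sketch, assuming the frames have already been decoded into grayscale NumPy arrays (the decoding step and the threshold you would alarm on are left out):

import numpy as np

def mean_luma_trace(frames):
    """One mean-luma sample per frame: the waveform's average level over time."""
    return np.array([float(f.mean()) for f in frames])

def flicker_strength(trace):
    """Largest non-DC spectral peak relative to mean luma.
    A strong isolated peak suggests periodic exposure flicker."""
    spectrum = np.abs(np.fft.rfft(trace - trace.mean()))
    return spectrum[1:].max() / max(trace.mean(), 1e-9)

High-speed scrubbing does by eye what the FFT does here: it compresses a slow luma oscillation into something fast enough to notice.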

WHAT YOU SEE IS NOT ALWAYS WHAT YOU GET

Some monitors may not show flicker very well. Some waveforms may not show flicker very well. Sometimes file compression may enhance flicker such that it appears more strongly on playback. There are no easy answers.

My solution: always shoot within an HMI safe window, no matter the frame rate. That solves 90% of the problems I’ve run into. Doing that, along with refusing to take a practical light for granted at odd frame rates, will give you the best odds of avoiding a late night call from post.

Further reading:
Factors Affecting PC Monitor Responsiveness
Why Do Some OLEDs Have Motion Blur?

Art Adams
Director of Photography

 
