
Sony Introduces SR 2.0 Initiative in LA


I went to the Sony event in Hollywood tonight where they introduced their “SR 2.0” initiatives, including the following:
-more datarate options for the codec
-upgradeable cameras (35mm and PL mounts)
-solid state recording
-a hardware/software media transcoder
-hints about a 4K camera
-3D and DPX on SR tape
-it’s allllllll about file based workflows

Details after the jump

Short version of the highlights:

-SR solid state – 1 TB of storage, RAID 5 protection, 5 Gbps (that’s 625 megabytes/sec), due 2011-12 timeframe, price TBD

-new 5800/2 deck with new features, such as 3D recording (L/R eyes), DPX data mode, extended metadata support, even ASC CDL via 3rd-party on-set tools

-the current SRW-9000 series camera will have two possible upgrades – a solid state option (as above), and an upgrade to 35mm sensor w/PL mount

-Multi-format transcoder – a big honkin dual Nehalem rackmount system with eight 16x PCIe slots for Cell-processor-accelerated cards – one for media transcoding, one for ingest/playout – including multiple simultaneous processing tasks

-new SR bandwidth option – a “Lite” version of HDCAM SR codec – 220 megabits – not so much for acquisition but for distribution, or for transcoding from HDCAM

-media (SR tape) costs to be cut – starting with the 40 min tapes, cut by up to 25% in cost

-hints of a Sony 4K camera, RGB not Bayer, 4:1 compression, fits onto the SR solid state stuff at 12 bits/channel/pixel

-Baselight Truelight On Set allows for:
-realtime previewing of LUTs on set
-without altering the source media
-the ASC CDL look is embedded in the metadata on tape with timecode etc.
-Baselight can read that on ingest, so shots come up pre-graded as desired on set, but you can still change all you want since the unaltered source material is recorded (see the sketch after this list)
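
For anyone who hasn't worked with ASC CDL before, here's a rough sketch of what that embedded "look" actually is – just the standard slope/offset/power/saturation math applied per pixel. The values and function names below are made up for illustration; this isn't FilmLight's or Sony's actual code.

```python
# Minimal sketch of the standard ASC CDL transform (slope/offset/power + saturation).
# The example values are hypothetical; a real on-set system would read them from
# the metadata stream recorded alongside the picture, keyed by timecode.

REC709_LUMA = (0.2126, 0.7152, 0.0722)

def apply_asc_cdl(rgb, slope, offset, power, saturation):
    """Apply an ASC CDL to one RGB triplet in the 0.0-1.0 range."""
    # Per-channel slope, offset, power
    graded = []
    for value, s, o, p in zip(rgb, slope, offset, power):
        v = max(0.0, value * s + o)   # clamp negatives before the power function
        graded.append(v ** p)
    # Saturation applied around Rec.709 luma
    luma = sum(c * w for c, w in zip(graded, REC709_LUMA))
    return tuple(luma + saturation * (c - luma) for c in graded)

# Hypothetical look a DP might dial in on set; stored as metadata, never baked into the source.
look = dict(slope=(1.10, 1.00, 0.95), offset=(0.01, 0.0, -0.01),
            power=(0.95, 1.0, 1.05), saturation=0.9)

print(apply_asc_cdl((0.18, 0.18, 0.18), **look))  # preview a mid-grey pixel with the look
```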

It's late and I'm going to work on getting the pics up (includes a buncha slides of very cool stuff, link forthcoming), so here's my raw (not RAW, ha ha) notes as I took 'em, typos and all.

By the way – does this sound like a response to any other company’s plans on the market? Hmm. Interesting stuff, here’s the deets:

=================================

Pictures, including key slides, can be seen here. Do take a look, good stuff in there not elaborated in text below.

Notes on HDCAM SR 2.0

they opened with the trailer for 2012 – “who will survive?”

shot digitally, recorded to SR tape

decided to call the event SR 2.0 since they have new stuff to announce

Rick Hardy

talk about change

entertainment imaging

the ongoing evolution of image capture and record

2000 – the F900 for 1080p24

had HDCAM record format

world’s most popular HD recording format

Lucas shot on it

F950 came out, 4:4:4, and SR in 2003

these were revolutionary – were tested/proven on Star Wars: Revenge of the Sith

in 2004, SR went mobile with SRW-1, which opened up to let competing systems record on it – Genesis, Viper, D21, etc. – a reliable, cost effective record solution

by 2005, SR was on the way to being the de facto standard for high end acquisition, mastering, distro

didn’t reach production pinnacle until F23/F35 digital cinema cameras in 2007/8

built specifically for filmmakers

to deliver ergonomics, functionality, accessory compatibility

F23/35/Genesis “are considered best of the best”

as good as the tech is, Sony accommodated post processes with the 5800 deck in 2008 – first dual system for digital and data recording up to 4K in one unit

SR became more than a format – a platform

looking into the platform to see how it’ll change

3 guests, first is Curtis Clark, ASC, board of governors, TV/commercial/features, has been head of the tech committee since it formed in 2003; he's key on the ASC/PGA camera assessments.

Curtis:
in 2003, the first meeting was to do a survey of the DI process – out of the many productions talking DI, only about 10-15% were doing DI. Everybody is nowadays for the most part. Decade-long inflection point to get us into digital. DI is hybrid – film to digital back out to film.

Then along came a next gen of digital cameras. They came out of an HD lineage – claimed to be digital motion picture cameras – for him personally, for any of these digital cameras to be worthy of "digital motion picture camera" it needs to move beyond Rec 709 and the tone scale reproduction therein, get into a film space of color and dynamic range – 12 stops minimum, shadow to highlight – and do it effortlessly and consistently, the same way you would with 35mm film.

The CAS reference was film, not each other – how do the cameras compare to film, not to each other. Film is The Standard to compare against. Always was. Now, there are a couple of cameras that are able to match film in terms of dynamic range and color space reproduction. Gotta move beyond HD and digital television limitations.

VP guy: Digital imaging needs to move beyond HD. Curtis: log encoding is an efficient way of encoding that dynamic range. S-Log in the F35 is pretty amazing and usable. Gotta combine that with a color gamut/color space (the F35's S-Gamut, a 3×3 matrix) – that provides the pedigree for the camera to be worthy of being called a motion picture camera. Can have the look and feel of film (personal feelings here) – he is very impressed that the F35's S-Log is able to create a film look that is extraordinarily close to the film reference.
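
A quick illustration of the log encoding point (using a made-up pure-log curve, not Sony's actual S-Log math): spacing code values logarithmically gives every stop roughly the same number of codes, which is how 12+ stops of scene contrast can survive a 10-bit recording.

```python
import math

# Generic illustration of why log encoding is efficient: linear scene values spanning
# many stops get spread evenly across a limited number of code values. This is an
# invented curve for illustration only, NOT Sony's published S-Log formula.

BLACK = 0.001   # assumed linear value mapped to code 0
WHITE = 8.0     # assumed linear value mapped to code 1023

def encode_log_10bit(linear):
    """Map linear scene exposure onto a 10-bit code value with a pure log curve."""
    linear = max(BLACK, min(WHITE, linear))
    normalized = math.log(linear / BLACK) / math.log(WHITE / BLACK)
    return round(normalized * 1023)

# Each doubling of exposure (one stop) lands roughly the same number of code values apart,
# instead of consuming half the remaining range the way straight linear 10-bit would.
for stops in range(-6, 4):
    exposure = 0.18 * (2 ** stops)   # stops around 18% grey
    print(f"{stops:+d} stops -> code {encode_log_10bit(exposure)}")
```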

VP: from a cinematographer’s perspective, where does SR need to go?

Curtis: from HD tape to data mode, where we're moving is file based workflow. When shooting with SR tape in S-Log, you generally convert it to DPX. Currently 10 bit DPX, maybe more later, but that's current. SR is an archival medium, the image is available in a master form; in the future we're moving into solid state and disk based recording – tape is great, but the march of history is data mode and file based image workflow.

There's always RAW data as it comes out of a Bayer pattern, but that data is maintained in the native capture format and converted in an unambiguous way into some file format that can be used without compromise. The issue of workflow is controversial – none of the cameras can be looked at as an island – all that matters is how the image can be handled within a given workflow – currently that's Cineon/DPX based – ultimately a film print of a digital master (plus DVD and Blu-ray and others). Getting out of log and linear etc. is a nightmare. The ACES spec, etc., holds the roadmap for where we need to go to get a color management system, interchange of imagery, unambiguous transfer between scene-referred linear, OpenEXR, log, 16 bit DPX, etc. – open, transparent, every facility, etc. – so the filmmaker can retain creative control. Make sure your vision is realizable by the power of these tools. You don't want someone else further down the workflow inadvertently limiting what that image can do. Image acquisition through final delivery is important. (Also, the filmmaker has to know what the tools are capable of doing.)

Sam Nicholson, CEO, Stargate Studios, (VFX guy, virtual backlots, etc.) – a director and DoP, has a studio setup in his facility

VP: when SR came along, how did it affect you?

Sam: first matte paintings were done on glass, latent-image matte paintings – doing all digital now – looking for where the shoe leather is in the process, why does it take so much time to see it, etc. – the most sophisticated work models are in sports and in daytime television – they are recording to solid state, shooting 100 pages, then there's Monday Night Football, etc.

SR for us has given us the filmic quality we've looked for; the discussion of what you're shooting on has kind of gone away, doesn't matter what you're mixing (stealing shots with an EX-3 or an F35). Had two XDCAMs, an F35, etc. shooting off a window cleaner's ledge – all came out great. Really a matter of picking the right tool; the assortment of Sony product lets us do anything from a tiny handheld up to a full featured camera, and it can all intercut successfully.

digital for TV VFX: the idea in TV (they do a lot of it) was to bring feature quality effects to TV. At the end of the day, it's like sudden death – 10 days from picture lock to it goes on air – to do something as complex as Ugly Betty in a virtual set, you make a bunch of promises and have a 200 shot, 5 minute sequence that you have to pull off – speed is your lifeline – your parachute before you hit the ground (air date is hitting the ground) – a couple of days in that process is an eternity. Anything that beats the time constraints of shooting film – the transfer, etc. – pushes them to digital to save a couple of days. The P1 does what a 950 used to do – as we move into 3D and extreme budgets (when did you get more money or time for a project?) – we're all working twice as hard in the same time. The technology is the only thing that lets you work faster, increase the vision, etc.

2 studios – one here and Vancouver and one in Toronto – a dozen or more shows floating through the studios; what does this platform need to evolve into to better help you? It's about time and money – smaller cameras, higher dynamic range; we're right at the point where you've matched film, but the workflow needs to be data centric (our whole thing is automation), and it all has to boil down to the same thing and be worked by 120 people in 2 or 3 countries at once and then back out to the client in 10 days. Automation allows us to have more time and creative options and save money. The same money allows for more creative applications – if transferring film or digitizing tape, with data you can automatically create 20 different resolution proxies that get distro'd immediately – frame accurate, done that night – dailies on your iPhone. Smarter, faster, more efficient.

Paul Chapman – senior VP of tech at FotoKem – recently made headway in 3D post

FotoKem is opening up a facility in Budapest –

SR has been embraced deeply – they have 43 decks in the company. It has been an enabling tech across the company – sometimes replacing film acquisition, has opened up a new tier of the market, has raised the bar on some of the productions. A lot of shows coming through on that format. S-Log is easy to do, the acquisition format is easy to do, etc. It's routine, isn't rocket science – they learned how to do it; an easy way to work.

Hannah Montana's 3D piece, did Final Destination 3D – SR on tape was the enabling tech on that – shot on various camera rigs, recorded left eye and right eye to the same tape, one frame right after the other (440mbit for each?), SRW-1 didn't have 9 pin deck control. Always make a backup in the process – making that backup tape lets you work on the original material as well. The editorial process wasn't done in stereo, but could be today. Left eye is usually what you cut from. The 5800 deck supports 3D in a more relevant way; you can make a 3D master on one tape now. With FD3D, done on a 5800 deck. DI-3D post – where do you see the need for this platform to evolve?
higher frame rate recording is a need; one project is a ride film for a theme park shot at 48fps, ingesting that onto their Pablo for conform, and the ability to deal with that frame rate is important – not quite there but will get there. Data recording is interesting. Higher frame rates make a big diff for imaging quality. This project was originally going to shoot film, wanted to project 4K, discovered that film is still very very good; the F35 was almost as good when blown up to 4K, close enough to 4K film (lots of helicopter footage), and it's efficient to shoot tape rather than film in a helicopter setup.

Yasumiki ???? (see pic)
senior manager of product planning, overseeing strategy for SR platforms in Japan – became a conduit for creative ideas back to the design teams.

see pix!

ASC CDL integrated on set for color management from FilmLight – Craig Risebury from Filmlight, Inc.

Craig Risebury

see pics

if you want to see more, come to their office on Cahuenga Blvd

scene to screen

Baselight, TrueLight, Northlight

Baselight – grading software, sell to DI, commercials, and long form TV

make sure DP has tools to communicate his intentions in a way that doesn’t get distorted further down the line

on-set monitoring – knowing what you're shooting and how far you can push it. When you have a camera w/14 stops and a viewing system designed for 7 stops, you can send LUTs and notes, but that easily gets botched.

The system lets you save the associated metadata and embed it into the picture stream, with timecode, for each take

lets you view images out of camera with the look applied, without the look being burned in (maintains the original raw source); can load into Baselight directly, each shot pre-graded with the original intent maintained

instead of the old way of hoping you knew what you got and finding out when you see dailies, you can now see the final output previewed live on set

TrueLight On Set is what this system is called
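
In pseudo-workflow terms, the pattern described above looks roughly like this – a purely illustrative sketch with hypothetical function names, not FilmLight's or Sony's real API: the LUT is applied only to the monitor feed, the clean frame is what gets recorded, and the look is logged against timecode so ingest can re-apply it.

```python
# Illustrative-only sketch of the on-set pattern: viewing LUT on the monitor feed,
# untouched frame to the recorder, look logged as metadata keyed by timecode.
# Function and field names here are hypothetical.

def preview_and_log(frame, timecode, viewing_lut, look_log):
    """Return the monitor image and the clean frame; remember which look was used."""
    monitor_image = [viewing_lut(sample) for sample in frame]   # burned in only for the monitor
    look_log.append({"timecode": timecode, "lut": viewing_lut.__name__})
    return monitor_image, frame                                 # the frame itself is untouched

def rec709_ish_lut(sample):
    # stand-in viewing transform: crude gamma lift so a log image looks "normal" on a 7-stop display
    return min(1.0, max(0.0, sample) ** (1.0 / 2.2))

look_log = []
monitor, recorded = preview_and_log([0.1, 0.18, 0.5], "01:00:10:05", rec709_ish_lut, look_log)
print(monitor, recorded, look_log)
```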

New Sony techs include 2x ingest, simple studio profile, and SR memory; they will show at NAB 2010

(see pics!!!!)

currently 5800 decks can do DPX frames, next is MXF

220 mbit Lite format – for distro, not so much for recording, unless you want to transfer from HDCAM; transferring at the Lite rate is sufficient for that

SRW-5800/2
middle of next year it gets MXF, 2X ingest, multiple bit rates

5800s will have an upgrade path

cutting tape costs 25%

1 TB in 2 years from now

4K camera done in RGB (not Bayer); with 4:1 compression at 12 bits it can be done on this
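
Back-of-envelope check on why that fits on the SR memory hardware. The resolution and frame rate below are my assumptions (Sony only hinted RGB, 4:1 compression, 12 bits per channel per pixel):

```python
# Back-of-envelope data rate for the hinted 4K camera. Raster and frame rate are
# assumptions; only RGB, 12-bit, and 4:1 compression were stated at the event.

width, height = 4096, 2160       # assumed 4K DCI-style raster
channels, bit_depth = 3, 12      # RGB, 12 bits/channel/pixel (stated)
fps = 24                         # assumed
compression = 4                  # 4:1 (stated)

uncompressed_gbps = width * height * channels * bit_depth * fps / 1e9
compressed_gbps = uncompressed_gbps / compression

print(f"uncompressed: {uncompressed_gbps:.1f} Gbps")   # ~7.6 Gbps
print(f"with 4:1:     {compressed_gbps:.1f} Gbps")     # ~1.9 Gbps, under the SR memory's 5 Gbps
```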

Michael Bravin – CTO of Band Pro

client came to look at F35, needed equipment to shoot under low light, to recreate the situation of a post-tornado search and rescue group

F35 DIT – “13 1/2 to 14 stops”

shot it with no support and brought it to Band Pro afterwards to finish the BTS

requests they get – the F35 is great, but need something smaller for Steadicam and handheld, and some kind of solid state recording. And a smaller B-camera to go with it.

SRW-9000 – built around a 2/3″ sensor with B4 mount; can migrate to a 35mm sensor/PL mount and SR memory over time

Multi-format transcoder – based on the Cell processor, lossless JP2K to other formats in realtime (right?)

can hook up multiple VTRs and start ingesting, and like the 5800 it can do 2x ingest. Can burn in timecodes and CDLs; has a powerful software interface to work within existing post infrastructure

can transcode to multiple formats – H.264 is on their wishlist but they can't commit at this point in time, for instance. But generating editorial content (certainly DNxHD, FCP less likely), etc., is their intent

can ingest multiple sources simultaneously, sometimes at 2x realtime

Overall, very interesting – Sony is getting more flexible and moving (slowly) away from tape, so this is good news – seems responsive to what we shall call “other occurrences in the marketplace by other companies.”

-mike
