The move to file-based workflows has put storage under the spotlight. Leading figures in the industry discuss best practice and what to do with the ‘digital sludge’.
ROUNDTABLE: THE PANEL
Chair: George Bevir Technology editor, Broadcast
Jeremy Bancroft Executive director (workflow consultant), Media Asset Capital
Riccardo Finotti Chief executive and president, Qstar Technologies
Steve Sharman Chief technology officer, Mediasmiths
Stephen Smallwood Senior storage specialist, The Mill
What impact has the rise in multiple viewing platforms had on broadcast storage?
Steve Sharman It is a massive infrastructure issue for organisations. The hidden secret of file-based workflows is that they need much more storage. We did a piece of work for one broadcaster where one of its production teams, working from one location and producing one type of content, was generating half a petabyte of rushes every six months. The content was factual – and needed to be kept for legal and compliance reasons – so that becomes a big issue.
We are always being told that the issue of storage is compounded by higher shooting ratios…
Sharman Yes, we’ve seen shooting ratios go up from 30:1 to 200:1. That’s partly because they can, and partly because nobody is generating best practice to tell them any different. For some productions, there’s no reason why you shouldn’t guard space on cards as jealously as you did with film.
Stephen Smallwood We certainly get a lot of data in the form of rushes – one example would be four hours of footage for a 30-second commercial. That has to be processed, stored and kept online for a long period. We find that if you move the data offline, onto tape, it can take longer to come back and, as a result, producers tend to be reluctant to do that until they are certain there is absolutely no work coming back. So you end up with terabytes of storage, which you can’t move in case someone wants to make a change at the end of the commercial.
Sharman We call that ‘digital sludge’ – the stuff that is hanging around because people might want it. By the time we need to do something about it, half the people who have created it are no longer in their positions. So we don’t know if it’s of any value or not.
Smallwood On average, we get a request for archived material for one in every 10 jobs – so 90% of the work that is taken offline is never requested back again.
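Smallwood's figure suggests a simple expected-cost trade-off between keeping material online and moving it to tape. The sketch below illustrates that reasoning under purely hypothetical per-terabyte prices (the function name and all tariffs are assumptions for illustration, not real storage costs):

```python
# Back-of-envelope sketch of the online-vs-tape trade-off: with only
# ~10% of offlined jobs ever recalled, tape wins on expected cost
# unless recall is extremely expensive. All prices here are
# illustrative assumptions, not real tariffs.

def expected_cost(size_tb, recall_prob,
                  online_cost_per_tb_yr=40.0,
                  tape_cost_per_tb_yr=5.0,
                  recall_cost_per_tb=20.0,
                  years=1.0):
    """Expected cost of keeping a job online vs. on tape."""
    online = size_tb * online_cost_per_tb_yr * years
    tape = (size_tb * tape_cost_per_tb_yr * years
            + recall_prob * size_tb * recall_cost_per_tb)
    return online, tape

# A 2 TB job with a 10% chance of being recalled within the year.
online, tape = expected_cost(size_tb=2.0, recall_prob=0.1)
```

On these assumed numbers tape is far cheaper in expectation; the reluctance Smallwood describes comes from the restore delay, not the economics.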
Can you future-proof against new technologies?
Smallwood One thing not considered in this industry is that all storage goes end-of-life. You forecast how much you need for the next couple of years but then, three or four months before it goes end-of-life, you realise you have to buy twice as much because you have to replace the stuff that is about to go out of warranty.
Riccardo Finotti If archive storage is done properly, your data will always be there. It’s just a matter of moving to the latest generation of storage, where we can do a background migration to LTO-4, 5 or 6, or a disk-based system, or a mixture of those technologies.
Jeremy Bancroft The financial model for storage in a file-based environment is a challenge. Next time you buy half a terabyte of storage, it will probably be cheaper than it was previously. But the problem is that you will need twice as much. You will also probably need someone to maintain it for you, which will account for 15-20% of the capital cost of that solution. And then, three years in, you have to replace it all again.
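Bancroft's cost model can be sketched in a few lines: each refresh cycle the price per terabyte falls, but capacity demand doubles and maintenance runs at roughly 15-20% of capital. The figures below (starting capacity, unit price, 40% price decay) are illustrative assumptions, not vendor pricing:

```python
# A minimal sketch of the cost dynamic Bancroft describes. All
# figures are illustrative assumptions, not vendor pricing.

def refresh_cycle_cost(capacity_tb, price_per_tb,
                       maintenance_rate=0.18, years=3):
    """Capital plus maintenance for one three-year refresh cycle."""
    capital = capacity_tb * price_per_tb
    maintenance = capital * maintenance_rate * years
    return capital + maintenance

def projected_spend(initial_tb, initial_price, cycles=3,
                    price_decay=0.6, growth=2.0):
    """Total spend over several cycles as prices fall but demand doubles."""
    total, capacity, price = 0.0, initial_tb, initial_price
    for _ in range(cycles):
        total += refresh_cycle_cost(capacity, price)
        capacity *= growth       # storage demand doubles each cycle
        price *= price_decay     # cost per TB falls each cycle
    return total

# Example: start with 100 TB at a notional 200 per TB.
print(round(projected_spend(100, 200)))  # → 112112
```

Even with the per-terabyte price falling 40% per cycle, the doubling of demand means each refresh costs more than the last, which is exactly the trap Bancroft outlines.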
Sharman It’s also a big change. Previously, people went out and bought bits of storage – particularly in post-production. Now they have to invest in a whole infrastructure at a time when capital is scarce.
Bancroft Those who do not want to worry about storage, changing drives, migrating tapes or data management software can choose to outsource.
It’s technically and commercially feasible to do that now, but the issue is how content owners feel about an entity holding their content and having it transmitted large distances over fibre – possibly over the internet – to get from one place to another.
What issues do broadcasters have with the private and public cloud?
Sharman Attitudes are starting to change. Two or three years ago, Adobe came out with a tool called Story for scriptwriting, which held scripts in the cloud. The BBC liked it at trial – but concluded it couldn’t use Story because it didn’t like the thought of its scripts being held in the cloud. ITV is using it, on Emmerdale and Corrie.
Is broadcast looking at commodity storage and IT infrastructure?
Sharman When I talk about commodity storage, I think of three main vendors: Hitachi, IBM and Dell. However, there is a continuing reliance in the broadcast industry on specialised storage – the likes of Harmonic bringing out blue boxes that are broadcast-certified.
Unless mainstream vendors work with people who understand that and gain trust in the industry over a long period of time, that’s not going to change. We have done quite a lot with commodity storage people who understand how to configure it to get the performance and reliability that is necessary in a broadcast environment.
Smallwood Once you have found someone who has put the storage in, it works, it is reliable and they are responsive, you tend to use them again and again. We also need storage that scales.
We’ve put in storage that looks OK up until the point where you start throwing thousands of render boxes at it, then it degrades quite abruptly. If you have people in your industry who have used the kit, hit it hard and it’s worked, then we say “great, let’s try that”.
Finotti Do you prefer open or customised solutions?
Smallwood We had a board member who was keen that everything should be off-the-shelf – it meant there was more general support. But when it came to putting it in, everyone started to talk about all the things they used to be able to do and that they couldn’t do now.
We have used some fully proprietary solutions such as Final Cut Server, but even that was fairly heavily customised. We do tend to end up doing things that we can control ourselves.
Sharman ‘Customise’ is perceived to be a dirty word, but it shouldn’t be. In some ways, it can be your key differentiator. Where I have a problem is with the massive premium attached to what are essentially commodity storage items that have a ‘broadcast’ label.
Bancroft But there are things that some of those broadcast manufacturers will do, which, for some applications – such as partial restore – are essential. We had a customer based in the Middle East who wanted to archive short clips and bring bits of them back later.
The client could have chosen a commodity solution relatively cheaply, but needed it for something quite specific that only a few vendors would do. The customer paid hundreds of thousands of dollars extra for that – but it saved hundreds of thousands of dollars in workflow because they could get the material off that they needed.
What kind of role do you think spinning disks will have in the storage infrastructure for people working in production?
Finotti Spinning disk in an object archive gives you more flexibility in terms of longevity and technology migration. On the data tape side for archive, IBM has come out with LTFS support. We are working closely with Japanese companies on an Optical Disc Archive. At NAB, we will see something more solid in production.
For companies of a smaller size, what’s the best storage solution?
Sharman Looping back to the cloud conversation, Amazon has just released a product called Glacier – if you can get content into the cloud and don’t need to get it back quickly, it costs a minimal amount to keep it there indefinitely. Those sorts of things are worth considering. Smaller companies should also make sure they invest in robust solutions such as LTO, rather than LaCie drives or G-Raids, for medium-to-long-term storage.
STORAGE: WHO MANAGES THE PROCESS?
Once the debate opened up to the audience, the subject moved onto the issue of ownership: who would take responsibility for the storage management process – the production company, the post house or a third-party cloud-based service such as A-Frame?
Shane Warden, general manager for IMG Media’s post facility, Mediahouse, predicted that production companies would soon govern and manage their own media.
“The Soho square mile doesn’t support large-scale, power-hungry environments well. We are already building mini facilities for production companies to cater for their storage requirements on location or on site. In the future, I see production companies managing their own media and distributing just what needs to be distributed in small clumps,” he said.
Workflow consultant Jeremy Bancroft pointed out that this was already happening in MediaCityUK, with producers able to access a pop-up office, network connection and a PC.
“They have access to a network, storage, resources, signals routing… the lot. That’s one way to go and it is starting to happen.” Bancroft added that cloud service provider A-Frame also offered this model for production and post companies, although Owen Tyler, operations director at Evolutions, had reservations about handing over a client’s work to a third party.
“We work through the night and some of these services lock off the editing process at 2am. If we are going to take on a third party like that, we need to have complete 24/7 control of the process as we deliver to our clients.”
There was a suspicion among the post houses in the room that a third-party cloud service could potentially cut them out of the loop if productions decided to deal directly with the cloud. According to Tyler, Evolutions is developing its own cloud-based service for its clients, which is how the facility sees its business model evolving.
Mediasmiths’ Steve Sharman agreed that providing the production community with storage and cloud-based services was a great opportunity for post houses. However, he warned that typical network connections struggle to carry even email reliably, let alone video.
“You go to your hotel room after a day’s shoot and upload your footage. But the problem with any hotel room I’ve ever been in is that I can hardly get my email, let alone send video. Until we get more ubiquity on high-speed networks, it doesn’t matter how good the processing is at the other end of it.”