Particle6 founder Eline van der Velden on embracing tech for Sky History’s short-form series
Production company: Particle6
Commissioner: Sam Pearson
Length: 5 x 8 minutes
TX: Monday 1 September, Sky History
Executive producer/Director: Eline van der Velden
Editor: Filip Parulski
Producers/Casting: Ella Smith, Louise Duncan
Director of photography: Dennis Griscenko
Post-house: Particle6 in-house
Particle6 was an early and proactive AI adopter within the UK production sector, and now, a few years down the line, we constantly use a wide variety of AI programmes and tools to help create the best – and most cost-effective – content possible.
We are learning interesting things about its capabilities all the time, never more so than on Two Sides of History, our latest short-form digital series for Hearst Networks EMEA’s Sky History.
Two Sides of History brings together two individuals from different backgrounds who hold expert yet conflicting views on the same historical or cultural event. Through personal testimony and perspective, they share their unique views and reflect on how the subject has impacted their lives.
At the end of each episode, there is a moment of reflection, where the guests step back to process what they have learned from each other.
With one subject, two stories, and two very different impacts, it provides a deeply human look at important periods and themes from our recent past, examining how class, race, gender, or social standing might influence our experiences. Topics covered include immigration, the COVID-19 pandemic and the monarchy.
We employed AI from the outset in tried-and-tested areas such as brainstorming (subject research) and episode design (refining the core format, set design and setup), as well as in workflow, production management and post-production, as we always do.
However, it really came into its own when it helped us overcome casting challenges.
So often, the success of any show will live or die on its casting, and we always knew finding the right protagonists for Two Sides of History would be the hardest part of making the series, and potentially very time-consuming.
Every episode hinged on finding two people with lived, often emotional connections to the same subject, but from radically different perspectives, who could engage in a meaningful and compelling unscripted debate. It was a tall order, but it was vital that we got it right.
It was essential to have interpersonal chemistry, narrative contrast and the right emotional tone for each theme. We had dozens of candidates for some episodes, but with time tight and budgets for a short-form series naturally modest, we didn’t have the luxury of testing every possible combination on film.
Yet, given the sensitive nature of some of the topics, we needed to explore what contributors might say and how they might react to each other, without putting them in a difficult situation on camera and potentially causing issues for us and our commissioner.
Our solution was to lean into our good friend AI. Not only did it surface some surprising contributors through an internet deep dive – especially when, after weeks of conventional research for one episode, we were struggling to find the right people – but it also proved a valuable collaborator, acting as a senior EP of sorts, helping us find creative new ways to reduce risk and support decision-making.
Using detailed casting profiles generated from pre-interview transcripts, we asked ChatGPT to simulate “mock debates” between different potential contributors. These sample conversations helped us see how tensions might arise, whether points of view would clash or harmonise, and what emotional beats could surface.
We then used AI to model the different directions that debates could take, ensuring each final pairing would create powerful and respectful TV.
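As a rough illustration of the mock-debate step described above, the sketch below assembles a prompt from two contributor profiles and a topic. The model name, profile text and prompt wording are illustrative assumptions, not the production's actual setup, which was not published:

```python
# Illustrative sketch of simulating a "mock debate" between two potential
# contributors. The profiles, topic and prompt wording here are invented
# for demonstration purposes.

def build_debate_messages(profile_a: str, profile_b: str, topic: str) -> list:
    """Assemble a chat prompt asking the model to role-play both contributors."""
    system = (
        "You are simulating an unscripted TV debate between two contributors. "
        "Write a short exchange (6-8 turns) in which each speaks in character, "
        "then summarise where their views clash, where they harmonise, and the "
        "likely emotional beats."
    )
    user = (
        f"Topic: {topic}\n\n"
        f"Contributor A profile:\n{profile_a}\n\n"
        f"Contributor B profile:\n{profile_b}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]


messages = build_debate_messages(
    profile_a="Historian who views the monarchy as a stabilising institution.",
    profile_b="Activist who experienced the monarchy as a symbol of inequality.",
    topic="The monarchy",
)

# Sending the prompt would look something like this (requires an API key):
# from openai import OpenAI
# reply = OpenAI().chat.completions.create(model="gpt-4o", messages=messages)
# print(reply.choices[0].message.content)
```

The useful part of this approach is that the same pair of profiles can be re-run with varied prompts to probe different directions a debate might take before anyone steps in front of a camera.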
With AI quickly turning interview notes into clean contributor profiles and running simulations, the casting team could move faster and with more confidence for each episode. Helpfully, this activity also created a shared language internally: everyone could read the same sample conversation and understand why a particular pairing might – or might not – work.
And we had all of this done before we even stepped on set.
We are a next-gen production business that has been using AI to make content better, faster and more economically for a while, but the integration of AI into casting has changed how we approach unscripted development, and I believe it could prove a game-changer for the wider industry. Instead of just relying on instinct, we now have a way to prototype human chemistry and stress-test creative decisions before filming.
For Two Sides of History, it meant we went into production with pairings that worked editorially, emotionally and ethically – and with some standout contributors that we would never have found, such as the folklore expert in our UFO episode. And the stress-testing had another welcome impact – it kept my stress levels to a minimum too.
Filip Parulski, head of post-production
At Particle6, AI has long been an incredibly valuable tool in post-production, saving time and cost, and allowing us to keep on top of paper trails. But, for me, over and above our usual editing tools, for Two Sides of History, I was most impressed with how AI helped us deliver brilliant audio.
Much of our filming took place in a compact studio. While this was perfect for the intimate look and feel of the show, the close proximity of crew generated ambient noise, as did the rather creaky chairs we had on set. This series is all about spoken debate – no gimmicks, no locations, nothing but voices – so capturing perfect sound was vital.
We got around all noise issues by deploying Adobe Podcast’s Enhance Speech tool, which analyses the input audio, removes noise and reduces reverberation, before applying an equaliser and compression to bring it up to studio quality. It even removed tiny bumps and clicks that usually require manual correction.
However, our most innovative use of AI for sound was in developing the pathway for our music score.
We needed a soundtrack that worked in harmony with the tone and pace of each episode, neither fading into nothing nor, conversely, having so much impact that it would distract from our contributors and storylines.
We started the process by running rough cuts and a series description through a Large Language Model (LLM) that analysed pacing, tone, and audience flow.
The AI was prompted to identify not only the best moments to introduce or drop music but also how to use it strategically to support viewer retention. It also assessed our rough cuts against top-performing content in the same format and for the same platform, and then recommended a music structure carefully optimised for the target audience. This included when to build tension, when to break for silence, and how to reinforce emotional shifts.
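To make a recommendation like this usable in the edit, the model's output needs to be turned into a timecoded cue list. The sketch below assumes the LLM was asked to reply in a simple JSON format; the field names and cue values are invented for demonstration:

```python
import json

# Illustrative sketch of turning an LLM's recommended music structure into a
# timecoded cue list for the edit. The JSON shape and field names are assumed;
# a real workflow depends on how the model was prompted to format its answer.

def parse_timecode(tc: str) -> int:
    """Convert an 'mm:ss' timecode into seconds."""
    minutes, seconds = tc.split(":")
    return int(minutes) * 60 + int(seconds)

def load_music_cues(llm_json: str) -> list:
    """Return (start_seconds, action, note) tuples sorted by time."""
    cues = json.loads(llm_json)["cues"]
    parsed = [(parse_timecode(c["at"]), c["action"], c["note"]) for c in cues]
    return sorted(parsed)

# Example response in the assumed format:
sample = json.dumps({
    "cues": [
        {"at": "00:05", "action": "enter", "note": "low underscore as debate opens"},
        {"at": "03:40", "action": "drop", "note": "silence for emotional beat"},
        {"at": "06:15", "action": "build", "note": "tension ahead of reflection"},
    ]
})

for start, action, note in load_music_cues(sample):
    print(f"{start:>5}s  {action:<6} {note}")
```

Asking for structured output in this way means the same cue list can be checked against the rough cut by an editor rather than taken on trust.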
All of this meant our soundtrack choices ended up being grounded not just in creative instinct but in platform-specific performance data, with a bespoke-designed sound arc engineered to hold attention and guide emotion.