CoSTAR National Lab: Advanced Production Technology Call Company Announcement
Meet the four companies selected to take part in CoSTAR National Lab's Advanced Production Technology call, exploring efficiencies within a forward-thinking, converged production process for screen audiences.
Posted: 29 April 2026


Last year our Advanced Production Technology call drew a record number of applicants for CoSTAR National Lab, indicating the growing demand for innovation support in the UK’s creative industries in response to rapidly changing technologies.
The call invited UK companies producing digital content for screens and live performance to pitch for up to £40k each in funding and access to CoSTAR National Lab expertise, studio and AI compute facilities.
We’re now excited to announce the four companies selected to take part in this programme:
- Hat Trick Productions who will be developing a proprietary, ethically trained AI-assisted hybrid 2D/3D workflow to reduce the cost and turnaround time of broadcast animation
- All Your Tomorrows who will be developing their LLM “NodeNinja” to make game design more accessible and inclusive
- Ten24 Media who will be developing their “SP-6M” AI model that can reconstruct high-fidelity 3D human geometry and texture from a single photograph
- Visualskies who will be developing a hybrid environment-capture and reconstruction pipeline to address the challenges of creating realistic lighting in virtual production when using 3D scans
Each company had to demonstrate how its technology or proposed innovation could be applied within a forward-thinking, converged production process, along with demonstrable engagement with EDI, sustainability, or both.
Meet the companies and find out more about their projects below.

Founded in 1986, Hat Trick Productions is a highly regarded independent production company in the UK, behind iconic shows like 'Have I Got News for You', 'Derry Girls', and 'Father Ted'. Led by visionary founder Jimmy Mulville, the Hat Trick Group continues to innovate through Hat Trick Lab, exploring new technologies and mediums beyond linear television.
What’s their project?
The team is developing a proprietary AI-assisted hybrid 2D/3D workflow that will enable them to create distinctive, creator-led animated content at a scale aligned with UK broadcaster economics. Their aim is to address some of the most expensive and resource-intensive elements of the animation pipeline; a challenge facing many across the wider sector.
The goal is not to lower creative ambition, but to make high-quality animation genuinely accessible within the UK commissioning landscape. With access to CoSTAR's AI compute facilities and research expertise, they will build a formalised method for an ‘illustrator-to-generation’ finetuning process using LoRAs; importantly, this includes a legal framework to define clear and respectful ownership of the resulting model.
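The core idea behind a LoRA (low-rank adaptation) finetune is that, rather than updating every weight of a pretrained image model on an illustrator's artwork, only two small low-rank matrices are trained and added to the frozen base weights. A minimal sketch of the underlying maths in plain NumPy (not any specific framework; the dimensions and the training loop are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen base weight of one layer in a pretrained image model.
d_out, d_in = 512, 512
W = rng.standard_normal((d_out, d_in))

# LoRA: learn a low-rank update B @ A instead of a full d_out x d_in matrix.
rank, alpha = 8, 16
A = rng.standard_normal((rank, d_in)) * 0.01  # trainable
B = np.zeros((d_out, rank))                   # trainable, zero-initialised

def forward(x):
    # Base output plus a scaled low-rank correction that would be
    # learned from the illustrator's artwork (training loop omitted).
    return W @ x + (alpha / rank) * (B @ (A @ x))

x = rng.standard_normal(d_in)
y = forward(x)  # identical to W @ x until B is trained away from zero

full_params = W.size
lora_params = A.size + B.size
print(f"trainable params: {lora_params} vs full finetune: {full_params}")
```

Because only `A` and `B` are trained (here 8,192 values against 262,144 for the full layer), the resulting adapter is small enough to be stored, licensed and owned separately from the base model, which is what makes the legal-framework element of the project practical.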
Meet the team
L-R: Liam Stout, Sam Shurberg, Daniel Efergan, Jonny McCausland
Team:
- Liam Stout – Producer/Project Lead
- Jonny McCausland – Development Lead
- Daniel Efergan – Technical Lead
- Sam Shurberg – AI Training and Workflow Development
- Dog Ears – Animation Studio/Art Direction
Executive Producers:
- Jimmy Mulville – Hat Trick Managing Director
Descriptions:
Liam Stout – Producer/Project Lead
Liam is Creative AI Lead at Hat Trick Productions and founder of the animation studio ScreenPoint. He was part of the founding team behind an AI-enabled graphic novels platform and specialises in generative AI for the film and TV development process. Liam recently oversaw the production of the Hat Trick AI Jam, a first-of-its-kind AI production experiment.
Jonny McCausland – Development Lead
Jonny founded the Hat Trick Innovation department in 2023 to create new monetisation and distribution opportunities and now provides strategic oversight across expansion beyond TV. He currently leads the NFTS AI for Filmmaking certificate and works with CoSTAR’s AI Copyright working group.
Daniel Efergan – Technical Lead
Since leaving Aardman, where he acted as Executive Creative Director for Innovation, BAFTA-winner Daniel has spent his time working with animation and media production companies investigating the use of AI within production workflows, with a focus on supporting and enabling human creativity as a core tenet of these workflows.
Sam Shurberg (ArtOfficial) – AI Training and Workflow Development
Sam has been working with generative diffusion-based AI workflows for several years. He shares his knowledge publicly under the brand ArtOfficial on his popular YouTube channel. Sam and Daniel have worked together on a previous project focusing on LoRA training.
Jimmy Mulville – Executive Producer
Managing Director and co-founder of Hat Trick Productions, and widely regarded as one of Britain’s most respected and visionary comedy producers. For four decades he has overseen the creation of distinctive, high-impact series that have shaped the modern British comedy landscape, building enduring relationships with leading on-screen talent, writers and global platforms. Jimmy is the recipient of the BAFTA Award for Outstanding Creative Contribution to Television, alongside numerous UK & international honours. He oversees all creative standards at Hat Trick, combining editorial ambition with the organisational rigour and strategic relationships required to develop, finance and deliver globally successful projects at scale.

All Your Tomorrows is a UK Creative R&D studio working at the frontier of AI and immersive technology. They develop experimental tools and production pipelines for interactive and spatial media, with a focus on procedural generation, AI-assisted design, and VR/XR systems.
What's the project?
The product they’re developing, NodeNinja, is an LLM (large language model) that drives procedural content generation in Unreal Engine; their aim is to accelerate the process of environment generation for games, lower barriers to entry and democratise access to game design. With the help of CoSTAR’s research expertise and compute facilities, the team are planning to develop a LoRA encoding implicit level design principles (proportion, spatial flow and narrative beats) so users don't need to specify these explicitly. Additionally, they intend to conduct inclusive design testing sessions with participants who face barriers to creative practice, including those with disabilities or limited technical backgrounds, to validate whether LLM-driven level creation genuinely democratises access.
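The kind of output an LLM-driven tool in this vein might produce can be illustrated with a toy stand-in: a function that turns a plain-English brief into a structured level specification a procedural system (such as an Unreal Engine PCG graph) could consume. The schema and keyword rules below are invented purely for illustration; NodeNinja itself would use a fine-tuned LLM, not keyword matching:

```python
import json

# Toy stand-in for an LLM that maps a natural-language brief to a
# structured level spec. Schema and rules are hypothetical.
STYLE_HINTS = {
    "forest": {"biome": "forest", "density": 0.8},
    "desert": {"biome": "desert", "density": 0.2},
    "ruins":  {"biome": "ruins",  "density": 0.5},
}

def brief_to_spec(brief: str) -> dict:
    brief_lower = brief.lower()
    spec = {"biome": "plain", "density": 0.4, "narrative_beats": []}
    # Pick up environment style from the brief.
    for keyword, hints in STYLE_HINTS.items():
        if keyword in brief_lower:
            spec.update(hints)
    # Encode implicit narrative beats the user never spelled out.
    if "boss" in brief_lower:
        spec["narrative_beats"].append({"type": "boss_arena", "position": "end"})
    if "quiet" in brief_lower or "calm" in brief_lower:
        spec["narrative_beats"].append({"type": "rest_area", "position": "mid"})
    return spec

spec = brief_to_spec("A quiet forest path leading to a boss fight")
print(json.dumps(spec, indent=2))
```

The point of the sketch is the interface, not the logic: the user supplies intent in natural language, and the system fills in design decisions (density, pacing, beat placement) they would otherwise need technical knowledge to express.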
Meet the team
L-R: Robbie Cooper, Oluwaseyi Jesusanmi, Hannah Cooper, Matthew Thompson
NodeNinja is developed by a multidisciplinary team spanning AI research, Unreal Engine development, executive production, and inclusive design. Robbie Cooper leads technical development with 12 years across AAA games and VR; Matthew Thompson brings 30 years of production experience across games, film and TV; Oluwaseyi Jesusanmi is assisting with AI development; and Hannah Cooper leads social impact from her work at Outside In.

Ten24 is a Sheffield-based technical art studio specialising in high-resolution 3D human scanning, photogrammetry and digital human production. Founded in 2011, the company’s work has supported major games and screen projects including Baldur’s Gate 3, Final Fantasy, Hellblade II and hundreds of other AAA productions. Alongside bespoke scanning services, Ten24 operates 3D Scan Store, the world’s largest library of high-resolution, production-ready 3D human assets. Ten24 is also developing SP-6M, a rights-clear 3D facial dataset containing over 80,000 scans, designed to support next-generation digital human and AI research.
What’s their project?
With the support of CoSTAR’s AI compute facilities and research expertise, Ten24 will develop a proprietary UK-owned AI model that reconstructs high-fidelity 3D human geometry and texture from a single photograph, trained on a subset of its proprietary dataset of over 80,000 high-fidelity scans.
The project aims to explore how slow, manual and hardware-dependent character creation could be replaced by a faster, more unified inference process. By training on rights-clear, high-resolution scan data, Ten24 will explore how AI can generate digital humans that retain the morphological diversity and high-frequency skin detail required for premium games, VFX and virtual production workflows.
The project will deliver a trained prototype model, a technical white paper documenting what was learned during model training, and an internal demonstration tool that accepts a single photograph and displays the generated 3D head in real time. Together, these outputs will give Ten24 a clear way to test the technology, share results with partners and clients, and understand how the system can be scaled toward full dataset training.
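In outline, the demonstration tool described above reduces to a short pipeline: accept one photograph, run a single forward pass of the trained model, and hand the resulting geometry and texture to a real-time viewer. A schematic sketch with a stub in place of the real model; every function name, array shape and resolution here is hypothetical:

```python
import numpy as np

def reconstruct_head(photo: np.ndarray) -> dict:
    """Stub for the trained model: one photo in, 3D head out.
    A real implementation would run a single network forward pass
    producing mesh geometry plus a skin texture map."""
    n_vertices = 5000  # placeholder mesh resolution
    return {
        "vertices": np.zeros((n_vertices, 3), dtype=np.float32),
        "faces": np.zeros((n_vertices * 2, 3), dtype=np.int32),
        "texture": np.zeros((1024, 1024, 3), dtype=np.uint8),
    }

def demo(photo_path: str) -> dict:
    # Load the single input photograph (stubbed as a blank image here),
    # run inference, and return the mesh for a real-time viewer.
    photo = np.zeros((512, 512, 3), dtype=np.uint8)
    return reconstruct_head(photo)

mesh = demo("subject.jpg")
print(mesh["vertices"].shape, mesh["texture"].shape)
```

The value of such a tool for the project is that it exercises the whole chain end to end, which is what lets Ten24 test the model against real inputs and show results to partners before committing to full-dataset training.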
Meet the team
Ten24 team l-r: Roxanne Marshall-Porter, Rose Timperley, James Busby, Chris Rawlinson, Ben Carlin
Ten24’s CoSTAR project team brings together expertise in 3D scanning, technical art, AI/ML, data engineering and production.
Roxanne Marshall-Porter, Lead Developer / AI Researcher
Roxanne leads machine learning development at Ten24, owning the training pipeline, model architecture and inference systems. With a First-class BSc in Computer Games from Solent University and an MA in Game Design with Distinction from UCA Farnham, she combines games technology, AI research and production-focused tool development.
Rose Timperley, Senior Technical Artist
Rose manages scan processing, retopology, mesh alignment and ML-ready data preparation across Ten24’s dataset. With First-class degrees in 3D Digital Animation and Model Making, and Game Art, she brings a strong technical art background to the project, ensuring geometry, topology, UV mapping and PBR texture quality meet production standards.
James Busby, Director and Domain Expert
James co-founded Ten24 in 2011 and has over 18 years’ experience in photogrammetry, scanning and digital human production. His work has supported AAA games and film pipelines including Death Stranding, Hellblade II, Baldur’s Gate 3, Call of Duty, Final Fantasy Kingsglaive, Alien: Isolation and Halo 4. James leads dataset curation, production benchmark definition and validation against real studio requirements.
Chris Rawlinson, Director and QA Lead
Chris co-founded Ten24 in 2011, having previously worked as lead character artist at Sumo Digital after beginning his career at Argonaut Games. He co-built Ten24’s photogrammetry and QA pipeline and oversees production-standard preparation of scan data, ensuring ground-truth geometry, topology and texture consistency meet the requirements for supervised AI training.
Ben Carlin, Producer
Ben leads project delivery, partnerships, governance and commercial strategy for Ten24’s dataset and AI work. He previously co-founded Megaverse, where he led immersive and real-time technology projects across theatre, games and interactive media, and now supports Ten24’s R&D strategy and grant delivery.

A globally recognised technology driver, Visualskies is a hybrid studio bridging the gap between physical reality and digital worlds. They deliver an end-to-end 3D scanning solution for film/TV, heritage and commercial clients, integrating multi-scale volumetric capture, custom processing pipelines, and Unreal Engine 5 workflows. From foundational data to real-time In-Camera VFX, they empower storytellers to define the digital future through their three core pillars: CAPTURE, CREATE, and EXPERIENCE.
What’s their project?
The Visualskies team seek to address pipeline bottlenecks during retopology and real-time lighting that prevent high-quality 3D scans from being used efficiently in virtual production. Supported by CoSTAR’s compute facilities, LED virtual production stage, and research expertise, they will develop a hybrid environment-capture and reconstruction pipeline that integrates traditional photogrammetry/LiDAR with physically based 3D Gaussian Splatting for inverse rendering (3DGS-IR).
Over the course of the programme, they will:
- Develop a documented capture protocol tailored for hybrid Gaussian–mesh workflows.
- Integrate 3DGS-IR techniques into a production-oriented pipeline.
- Capture and process a complex natural environment as an end-to-end test case.
- Validate the results on a VP stage with controlled lighting changes.
- Produce a publicly available whitepaper outlining methodology and recommended standards.
Meet the team
Visualskies team l-r: Joseph Steel, Lydia Fauser, Will Jackson, Ross Dannmayr, Lara Williamson
Key team members include Joseph Steel (project lead), co-founder of Visualskies, who has developed groundbreaking technology solutions for clients like MARV, Disney, Apple, Marvel and Warner Brothers. Joe will be responsible for delivering the project.
Other team members include:
- Lydia Fauser (Head of Processing) has played a key part in defining Visualskies' delivery pipeline over the last five years and has delivered real-time projects for clients like Channel 4 and Hogarth.
- William Jackson (Technical Lead) has a master's degree in electronic engineering and a varied hardware and software development background. Will is responsible for software engineering and has delivered projects for The Who, among others.
- Ross Dannmayr (Commercial Lead) has successfully delivered commercial profitability for Visualskies over the past decade and has previously secured over £500k in R&D grant funding. Ross will be responsible for testing and feedback as well as ensuring project goals are met.
- Lara Williamson (Senior 3D Artist) has a strong interest in using 3D for storytelling and at Visualskies leads 3D character modelling, including work for Netflix's Bridgerton and the latest Hunger Games film.