University of Bristol Partners with Lux Machina to Bring Cutting-Edge Virtual Production Technology to New Creative Facility

The University of Bristol has partnered with industry leader Lux Machina Consulting to install a state-of-the-art virtual production stage and Smart Cinema in its new £30 million MyWorld creative technologies hub.

ABOVE: “House of the Dragon” made use of virtual production screen technology from Lux Machina Consulting

The facility, set to open in Autumn 2024, will allow businesses and researchers to collaborate on innovative film, TV, and gaming projects.

What is Virtual Production?

Virtual production is a filmmaking technique that utilizes computer-generated imagery (CGI) and real-time rendering to create immersive, dynamic environments. Instead of traditional green screens, actors perform in front of LED backdrops that display the virtual world, allowing for more natural interactions and lighting.

Lux Machina’s groundbreaking virtual production technology has been used in high-profile projects such as “House of the Dragon,” the acclaimed “Game of Thrones” prequel. Their expertise will be crucial in establishing the University of Bristol’s facility as a leader in the field.

The MyWorld programme, funded by a £30 million grant from UK Research and Innovation’s (UKRI) Strength in Places Fund (SIPF), aims to capitalize on the West of England’s strengths in production, technology, and research. By providing access to cutting-edge facilities, skills development, and innovation funding, MyWorld seeks to drive growth in the region’s creative industries. The project will feature:

  • £1.2 million virtual production stage
  • 35-seat Smart Cinema equipped with audience monitoring technology
  • Motion capture and volumetric capture capabilities
  • Soundstage for live TV broadcasts
  • Audio-visual galleries, edit suites, and training rooms

The experimental studio and Smart Cinema will enable partners, from global giants such as Netflix and Amazon to local independent filmmakers, to access the tools and data needed to develop new technologies and create innovative content.

The facility aims to foster collaboration between industry professionals and world-leading academics, driving research into in-camera visual effects (ICVFX), virtual and augmented reality, and audience engagement and sentiment analysis.

How does a virtual production stage work?

Virtual production stages, also known as LED volumes or virtual sets, represent a significant advancement in film and television production technology. By combining large LED screens, real-time rendering engines, and camera tracking systems, virtual production stages allow filmmakers to create immersive, photorealistic environments that seamlessly blend practical and digital elements. The following provides a technical overview of virtual production stages and their applications for students in film, media, and related fields.
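The combination described above boils down to a simple per-frame loop: read the physical camera's tracked pose, apply it to the virtual camera, and render the result to the LED wall. The sketch below illustrates that loop in minimal Python; every class and function name here is hypothetical (real stages use vendor SDKs and engine integrations rather than code like this).

```python
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    """Six degrees of freedom: position (x, y, z) and rotation (pan, tilt, roll)."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float

class VirtualCamera:
    """Stand-in for the render engine's camera object (hypothetical)."""
    def __init__(self) -> None:
        self.pose = Pose6DOF(0, 0, 0, 0, 0, 0)

    def update(self, pose: Pose6DOF) -> None:
        # In a real engine this would also update the projection frustum
        # so the wall shows correct parallax for the physical camera.
        self.pose = pose

def run_frame(tracked_pose: Pose6DOF, camera: VirtualCamera) -> Pose6DOF:
    """One frame: apply the tracked physical-camera pose to the virtual camera."""
    camera.update(tracked_pose)
    # render_to_led_wall(camera)  # in practice, handled by the engine and LED processor
    return camera.pose
```

The key point the sketch captures is that the rendered image is a function of the *physical* camera's pose, which is why the tracking data (covered below) must arrive with very low latency.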

Components of a Virtual Production Stage

A typical virtual production stage consists of several key components:

  1. LED Screens: The primary visual element of a virtual production stage is a large, high-resolution LED screen or a series of connected screens forming a curved or angled wall. These screens display real-time rendered environments, visual effects, or pre-rendered content. Common LED panel types used in virtual production include:
    • ROE Visual Diamond Series (2.6mm pitch)
    • ROE Visual Carbon Series (5.7mm pitch)
    • Absen Aries Series (2.9mm pitch)
  2. LED Processors: LED processors, such as those manufactured by Brompton Technology, handle the LED screens’ video processing and colour management. These processors ensure accurate colour reproduction, low latency, and smooth playback of high-resolution content.
  3. Real-time Rendering Engines: Virtual production stages rely on powerful real-time rendering engines, such as Unreal Engine or Unity, to generate and display photorealistic environments on the LED screens. These engines allow for the creation of complex 3D scenes, lighting, and visual effects that respond in real time to changes in camera position and movement.
  4. Camera Tracking Systems: To maintain the illusion of a seamless virtual environment, the camera’s position and movement must be precisely tracked and synchronized with the rendered content on the LED screens. Camera tracking systems, such as those developed by Mo-Sys, Ncam, or Stype, use a combination of optical sensors, inertial measurement units (IMUs), and encoders to capture the camera’s six degrees of freedom (6DOF) data in real time.
  5. Workflow Integration: Virtual production stages require a tight integration of various software and hardware components to ensure a smooth and efficient workflow. This may include:
    • Virtual production control software (e.g., disguise, Pixera)
    • Real-time compositing and colour grading tools (e.g., Unreal Engine Composure, Pomfort Livegrade)
    • On-set previsualization and virtual scouting tools (e.g., Unreal Engine nDisplay, Unity MARS)
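The pixel pitches listed above (the distance between LED centres, in millimetres) determine both a wall's native resolution and how close the camera can get before individual pixels or moiré become visible. A common industry rule of thumb puts the minimum viewing distance in metres at roughly the pixel pitch in millimetres. The sketch below applies both ideas to the panel types named earlier; the 10 m × 5 m wall dimensions are illustrative, not from the Bristol facility.

```python
def wall_resolution(width_m: float, height_m: float, pitch_mm: float) -> tuple[int, int]:
    """Approximate native resolution of an LED wall: one pixel per pitch interval."""
    pitch_m = pitch_mm / 1000.0
    return int(width_m / pitch_m), int(height_m / pitch_m)

def min_viewing_distance_m(pitch_mm: float) -> float:
    """Rule of thumb: minimum viewing distance in metres ~ pixel pitch in millimetres."""
    return pitch_mm

# Illustrative 10 m x 5 m wall built from the panel types listed above
for name, pitch in [("Diamond 2.6", 2.6), ("Carbon 5.7", 5.7), ("Aries 2.9", 2.9)]:
    w, h = wall_resolution(10.0, 5.0, pitch)
    print(f"{name}: {w} x {h} px, keep camera at least {min_viewing_distance_m(pitch):.1f} m away")
```

This is why fine-pitch panels (2.6 mm) are favoured for the main camera-facing wall, while coarser panels (5.7 mm) are often reserved for ceilings and side walls used mainly for lighting and reflections.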

Virtual production stages offer numerous benefits and applications for film and television production:

  1. Increased Creative Control: Virtual production stages enable greater creative control over the final image by allowing filmmakers to visualize and interact with digital environments in real time. Directors, cinematographers, and VFX supervisors can immediately adjust the virtual set, lighting, and camera angles without extensive post-production work.
  2. Enhanced Performances: Actors can perform within a more immersive and interactive environment, as they can see and respond to the virtual elements surrounding them. This can lead to more natural and authentic performances than acting against a blank green screen typically allows.
  3. Cost and Time Savings: Virtual production stages can reduce the need for location shoots, set construction, and extensive post-production visual effects work. By digitally creating photorealistic environments, productions can save on travel, lodging, and other associated costs while streamlining the production timeline.
  4. Improved Collaboration: Virtual production stages foster closer collaboration between different departments, including direction, cinematography, production design, and visual effects. Working together in a shared virtual space allows creative teams to make informed decisions and solve problems more efficiently.
  5. Sustainability: By reducing the need for physical set construction and location travel, virtual production stages can contribute to more environmentally friendly and sustainable production practices.

Several high-profile film and television productions have successfully utilized virtual production stages, demonstrating the technology’s potential:

  1. The Mandalorian (Disney+): This Star Wars series extensively used a large LED volume powered by Unreal Engine to create photorealistic alien landscapes and environments.
  2. The Batman (2022): The film employed a virtual production stage to create Gotham City environments and enhance the film’s visual style.
  3. The Lion King (2019): While not using an LED volume, the film’s production relied heavily on virtual production techniques, with the filmmakers shooting the entire film in a VR environment built on the Unity engine.
  4. Westworld (HBO): The series used a virtual production stage to create complex, immersive environments that blended practical and digital elements seamlessly.

Virtual production stages represent a significant shift in the way films and television shows are produced, offering filmmakers unprecedented creative control, flexibility, and efficiency. As the technology continues to evolve and become more accessible, it is likely that virtual production will become an increasingly essential tool for the next generation of filmmakers and content creators. By understanding the technical foundations and applications of virtual production stages, undergraduate students can position themselves at the forefront of this exciting and transformative field.

The Coal Shed

By co-locating research, R&D, and training, the Coal Shed, the building that will house the new facility, aims to drive innovation while addressing skills gaps and promoting inclusivity in the creative sector.
MyWorld estimates the programme will boost the regional economy by £223 million GVA (Gross Value Added) by 2030. The creative industries contribute £124 billion to the UK economy annually, highlighting the significance of investments like the MyWorld programme.

Bristol’s recent recognition as a UNESCO City of Film during the Cannes Film Festival further underscores the city’s growing reputation as a hub for creative innovation.

As virtual production continues to transform the entertainment industry, facilities like the University of Bristol’s Coal Shed should play a vital role in the future of storytelling. By making cutting-edge technologies accessible to a wide range of creators, we hope the project can unlock new possibilities and nurture the next generation of technical and creative talent.

– University of Bristol partners with Lux Machina to install a virtual production stage and Smart Cinema

– £30 million MyWorld programme funds state-of-the-art creative technologies hub

– Facility to drive innovation, address skills gaps, and promote inclusivity in creative industries

– Collaborative research and development between industry and academia

– Economic impact estimated at £223 million GVA by 2030
