On-Demand Webcasts

Live, interactive educational webcasts covering hot-topic technologies, issues and developments.

Webcast sessions are available as a SMPTE member benefit free of charge. Non-members may attend webcasts for US$49.00.

Know a topic that needs to be covered? Send your suggestions to Joel Welch.

Previous Webcasts

Why am I still affected by signal blanking? (and why should I care?)

Original Airdate, 23 January 2020

This webcast is aimed at young media professionals who may wonder, “Why should I listen to this webcast if I only watch content on apps?” Those apps may be driven by data carried in what is called the “ancillary data space” and authored to be delivered to the viewer with frame accuracy.
Young system designers are often unaware of the need to properly handle ST 2110-40 data along with the video and audio data carried by IP-based media systems. Those using SDI-based technologies may also wonder, “Why is so much data space used for this?”

How many new compression standards? AV1, VVC, LCEVC, HEVC and AVC compared

Original Airdate, 21 November 2019

Streaming Model for Field of Light Displays

Original Airdate, 14 November 2019

The long-awaited glasses-free display of three-dimensional imagery will soon be a reality with the introduction of Field of Light Displays (FoLD), also known as Light Field (LF) displays. Light Field displays attempt to recreate the light rays passing through a volume of space. The light rays pass through the 3D points, either real or virtual, that would exist in an actual 3D scene and provide true parallax and stereoscopic vision (each eye sees a different view, and the view changes smoothly as the viewer moves). The development of a standardized method for streaming 3D content will facilitate the broad adoption of FoLD systems. Third Dimension Technologies (TDT), with sponsorship from the Air Force Research Laboratory (AFRL), is collaborating with Oak Ridge National Laboratory (ORNL) and Insight Media (IM) to develop a display-agnostic standard for streaming 3D visuals to FoLD displays.

Breakdown of using the ZFS file system in a media production environment

Original Airdate, 05 September 2019

This SMPTE Webcast will provide a deep dive into data optimization, retention/replication and storage virtualization for media production environments and the ZFS file system.

Powering the multi-platform distribution media factory: OTT and Streaming End-to-End

Original Airdate, 28 August 2019

Over-the-Top (OTT) services and content streaming are arguably at the top of the content-consumption food chain right now. The doom and gloom of naysayers from years past has been replaced by optimism and opportunity for those who embraced the evolutionary format. During this SMPTE Technology webcast you will become much more conversant in the technologies, platforms, and considerations relating to providing OTT and streaming services. You will learn about topics such as streamlining content for OTT distribution through workflow orchestration and automation. You will also gain knowledge about the specifics of latency and sync challenges, guaranteed capacity, predictive scaling and anomaly detection, plus intelligent and stateful Adaptive Bit Rate (ABR) streaming, and much more.

Join guest speakers Bea Alonso and Wauter De Bruyne of Dalet, along with Filippa Hasselström and Ted Olsson of Net Insight, to learn more about end-to-end OTT and streaming.
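As a taste of the ABR topic, the sketch below shows the simplest possible throughput-based rendition picker; the bitrate ladder and safety margin are illustrative assumptions, not anything specific to the speakers' products:

```python
# A minimal throughput-based ABR rendition picker. Real "intelligent and
# stateful" ABR also smooths throughput estimates and tracks buffer level;
# this shows only the core selection step. Ladder rungs are in kbit/s.
LADDER = [400, 1200, 2500, 5000, 8000]

def pick_rendition(measured_kbps, safety=0.8):
    """Pick the highest rung that fits within a safety margin of throughput."""
    usable = measured_kbps * safety
    candidates = [r for r in LADDER if r <= usable]
    return candidates[-1] if candidates else LADDER[0]

print(pick_rendition(4000))   # 2500: 4000 * 0.8 = 3200, so 5000 won't fit
print(pick_rendition(300))    # 400: fall back to the lowest rung
```

The safety margin is the design choice that matters: selecting at full measured throughput causes rebuffering the moment conditions dip.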

Content Management

Original Airdate, 15 August 2019

During this SMPTE Technology webcast, you will not hear about the fundamentals of content management, but you will learn about recent changes and challenges. In addition, you will learn about considerations for modernizing content management systems, plus applications in decentralized and very demanding environments.

PTP Timing – Everything you need to know

Original Airdate, 30 May 2019

Next-generation IP infrastructure deployments have benefitted enormously from the inherent flexibility of IP networking technology. In addition to greater flexibility in the routing of video, audio, and data essence streams, the transition to IP has enabled the transformation of facility timing infrastructures with the industry adoption of the Precision Time Protocol (PTP). This webinar will delve into the key aspects of PTP, including an in-depth examination of the benefits of PTP, system design, configuration, operational considerations, and monitoring. In addition, specific details of the configuration and operation of PTP grandmasters, including redundancy and GPS locking, will be discussed. Slave device behavior will also be examined, along with how gateway devices provide synchronous operation. Timing considerations for hybrid architectures incorporating both IP and SDI signals will also be detailed. Lastly, the importance of monitoring and understanding what is occurring in a PTP system will be presented.
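The core of PTP's clock alignment is a simple four-timestamp exchange between master and slave. The sketch below is a minimal illustration of that arithmetic (the timestamp values are invented for the example), not a full PTP implementation:

```python
# PTP offset estimation from one Sync / Delay_Req exchange.
# t1: master sends Sync        t2: slave receives Sync
# t3: slave sends Delay_Req    t4: master receives Delay_Req
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Assumes a symmetric network path, as PTP itself does."""
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2    # mean one-way path delay
    return offset, delay

# Illustrative timestamps (seconds): slave clock runs 50 µs ahead,
# one-way path delay is 100 µs.
t1 = 0.0
t2 = t1 + 100e-6 + 50e-6   # delay + offset
t3 = 300e-6
t4 = t3 + 100e-6 - 50e-6   # delay - offset
offset, delay = ptp_offset_and_delay(t1, t2, t3, t4)
print(offset, delay)        # ~5e-05 s offset, ~1e-04 s delay
```

The slave then steers its clock by the recovered offset; asymmetric paths violate the assumption and show up directly as offset error.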

Low-Latency Live Streaming

Original Airdate, 16 May 2019

Live video streaming over the internet is a big growth area for broadcasters and OTT services alike. Broadcasters are expected to deliver live streams at a latency similar to their broadcasts, and pure-streaming services see matching the latency of over-the-air broadcasters as another way to prove their legitimacy and further shift viewing to the internet. Work continues apace on technologies that can deliver ultra-low latency; we look at how it can be achieved, the status of those projects, and how well they stack up in real life.

ATSC 3.0 Webinar Part 2: ATSC 3.0 – a Deeper Dive

Original Airdate, 9 May 2019

Building on the first ATSC 3.0 webcast, this webcast will explore more deeply the technical aspects of ATSC 3.0 including system requirements and configuration, inter-layer protocols, STL transport protocol, and security.

New criteria to measure HDR image quality

Original Airdate, 25 April 2019

The High Dynamic Range image container is much wider than classic HDTV and DCI containers, allowing content creators to display much more luminance information than was previously possible. Does content take advantage of this new format? It depends on many factors, including the conditions in which it is shot. In this SMPTE Technology Series webcast, image specialist Pierre (Pete) Routhier suggests four criteria by which one may judge whether images take full advantage of HDR, and applies those criteria to a variety of asset types (e.g., talk shows, dramatic series, and movies) to help assess their potential to amaze viewers with HDR.

ATSC 3.0 Webinar Part 1: Introduction to ATSC 3.0

Original Airdate: 21 March 2019

This webinar will provide an overview of ATSC 3.0 system capabilities, including the physical layer, signaling, audio, video, captions, interactivity, and advanced emergency messaging. You will receive an introduction to the suite of Standards and Recommended Practices documents, with a concentration on stream essence. Attendees will be asked to provide input and thoughts on areas of ATSC 3.0 for further exploration in future SMPTE webinars on the topic.

Adventures in Volumetric Cinema

Original Airdate: 7 March 2019

A colleague and I had access to volumetric 3D displays and decided to try to make a motion picture to be exhibited in a public setting on such a device. We hosted a month-long workshop with students from several universities to try out storytelling and visual ideas and to experiment with production and post-production processes. In this talk I’ll describe what we’ve learned so far, as we head into the final production phase later this year.

Machine Learning in M&E: Reality Versus Hype

Original Airdate: 24 January 2019

AI is here. From autonomous vehicles to voice-controlled agents, more and more machine intelligence research is crawling from the lab into real life. All signs point to this trend accelerating. More data, less-costly computing, and much greater investment are all indicators that AI, this time, is here to stay. The label has been used to death by unscrupulous marketers, reporters, and even some researchers who still confuse learning with actual intelligence, but true AI is still very rare. This is because real AI is hard — especially in the media industry, where data is small and often fractured, and the questions posed to AI are extraordinarily complex. ETC AI researcher Yves Bergquist will present a detailed, case-study-driven webinar that explores the nature of AI, where it is going, and what it means for the media industry.

5G Opportunities for Broadcasters

Original Airdate: 15 November 2018

5G is a new broadband technology with superior technical capabilities compared with earlier mobile communications systems. Boasting very low latency, very high reliability, and improved spectrum utilization and energy efficiency, it aims to enable provision of new services to consumers and business users in different social and economic sectors.

Is Virtual Reality Still The Future?

Original Airdate: 8 November 2018

The trough of disillusionment is an oft-cited hype cycle bracket during which, as Gartner defines it, “interest in a technology wanes as experiments and implementations fail to deliver.” Until now, that’s been the case for virtual reality (VR). Consumers, pundits, and investors thought of VR as a clunky box strapped to a person’s face. Augmented reality (AR) was considered by some to be science fiction involving glasses that may or may not ever materialize. Today, however, immersive entertainment refers to a new type of audience engagement that blurs the lines of what is real and what is synthetic.

Mark Schubin's "Six Centuries of Opera and Media Technology in New York"
Original Airdate: 1 November 2018

Electronic home entertainment was invented in New York City for opera. The first compatible-color television program seen at home — and the first bootleg recording — featured opera in New York. New York’s media technologies for opera date back to the 16th century. Today, in the 21st century, they include dynamic video warping with depth-plane selection and multilanguage live cinema transmissions to all seven continents.

A 200-ton music synthesizer broadcasting opera music in New York in 1907? An opera lighting dimmer in 1638? Opera for military communications tests?

It might be difficult to believe, but it’s all true!

Blockchain Revealed – Why So Mysterious?

Original Airdate: 25 September 2018

Learn what makes up the fundamentals of blockchain, and what the differences are between cryptocurrencies and other blockchain implementations.

You’ll discover how blockchain can be useful, and where it may not be. Is it a good security mechanism? Is it the best way to solve a number of challenges in Media and Entertainment? What are the current standards and should we follow them?

Why speed is not the most important thing to look for in a RAID system

Original Airdate: 2 August 2018

During this SMPTE Technology Webcast, guest speaker Tim Standing of SoftRAID provides the information you need to pick a fast and reliable RAID system for media storage that is directly attached to your computer. He will describe the common RAID levels, and a few of the less common ones as well. For each RAID level, he will explain where the file data gets stored and what the "extra" disks are used for. Topics also include optimizations for getting the fastest performance from your RAID system, what to look out for, and how to avoid common problems.
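The role of those "extra" disks can be sketched in a few lines. The example below is a minimal, illustrative model of RAID 5 parity (the block sizes and disk count are arbitrary), not SoftRAID's implementation:

```python
# RAID 5 stores one parity block per stripe: parity = XOR of the data blocks.
# If any single disk fails, its block is recoverable by XOR-ing the survivors.
def xor_blocks(blocks):
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

data = [b"AAAA", b"BBBB", b"CCCC"]      # data blocks on three disks
parity = xor_blocks(data)               # "extra" block on the fourth disk

# Simulate losing disk 1 and rebuilding its block from the survivors.
rebuilt = xor_blocks([data[0], data[2], parity])
print(rebuilt == data[1])               # True
```

The same XOR property explains the RAID 5 write penalty: updating one data block also means recomputing and rewriting the stripe's parity block.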

Exploring the Role of NMOS: Discovery and Connection in an IP World

Original Airdate: 17 May 2018

SMPTE ST 2110 specifies how to stream video, audio, and data between devices for professional applications using IP networks. How best to discover, connect, and monitor those devices? That is another matter — and one addressed by the Advanced Media Workflow Association (AMWA) Networked Media Open Specifications (NMOS).

Beyond the Skynet Myth: Discovering the Reality of How AI is Used Today in Content Creation & Distribution, and the Possibilities for Tomorrow

Original Airdate: 4 March 2018

Artificial intelligence (AI) is not just a concept for the future. Machine learning (the main technique underpinning AI) is already being used in the media industry to transform content creation and delivery.

Join Jason Brahms, CEO of VideoGorillas, which uses machine learning in its innovative media software, and Lydia Gregory, co-founder of FeedForward, which brings together the cutting edge of machine learning research and business, as they explore the current AI media landscape and discuss where it is heading.
Keeping Time with Precision Time Protocol (PTP)

Original Airdate: 14 December 2017

Throughout the ages, quantifying time has been a critical pursuit. Early sundials, water clocks, and mechanical pendulums once marked the passage of time. Today, quartz watches and atomic clocks measure time with far greater precision.

Time plays a critical role in a broadcast facility, ensuring accurate switching of program material. The Global Positioning System (GPS) provides precise timing information across the world, and both timecode and video synchronization can be derived from that data. Video synchronization is typically achieved either with analog black burst or with tri-level sync signals carried as a reference signal throughout the facility, locked from a GPS reference to maintain synchronization from facility to facility worldwide.

IP Monitoring and Measurement

Original Airdate: 9 November 2017

The broadcast industry’s ongoing transition to internet protocol (IP)-based transport for video, audio, and data is being enabled by the development of standards such as SMPTE ST 2022-6 and the SMPTE ST 2110 Professional Media Over Managed IP Networks standards suite. These standards provide the interconnection framework for an all-IP infrastructure within a facility.

VR from Shoot to Delivery

Original Airdate: 21 September 2017

Richard Mills, technical director of Sky VR Studios, will provide a detailed behind-the-scenes view of Sky VR Studios' creative and production processes. Using examples of the studio's work, he will explain the planning, commissioning, and technical processes from shoot through to delivery. Topics will include technical guidelines, production planning, shooting techniques, and the postproduction and app delivery workflows deployed by Sky VR Studios.

Beyond Virtual Reality: Light Field, Holographic Display, and the Roadmap to the Holodeck

Original Airdate: 24 August 2017

Jon Karafin, CEO of Light Field Lab Inc., will detail the very latest developments in light field and holographic display technologies, including insights into the creative and technical implications for content creation. Karafin also will analyze the data requirements and solutions for streaming holographic media, and then provide a glimpse into Light Field Lab's holographic technologies.

The Art and Technology of Spherical Storytelling: Adventures in Virtual Reality (VR) Production

Original Airdate: 10 August 2017

By now, almost everyone has learned something about the technology associated with virtual reality (VR), but what about storytelling in the spherical environment related to VR? How is VR storytelling different from 2D storytelling? During this SMPTE Educational Webcast, Andrew MacDonald, creative director at Cream VR/AR, will present concepts of 360-degree cinematography, including shooting nodal point 360, shooting stereo in 360, stitching 360 footage, using screen direction methodology, storyboarding in 360, moving the camera, and integrating CGI in a 360 spherical environment. This webcast looks at VR from an entirely different perspective.

Resolving Storage Pain

Original Airdate: 8 June 2017

Today’s artists, architects, administrators, and end users working in the media and entertainment industry may experience pain points when managing media storage. Workloads are expanding as postproduction businesses try to squeeze more from less. Clients demand ever-higher project resolutions and frame rates with faster turnaround times. As technology evolves, postproduction must minimize the challenges presented by the storage environment. This webcast will discuss how to handle media storage for challenging workloads in a fast, efficient, and scalable manner.

HDR as a Dramatic Imaging Tool, Examined from a Cinematographer's Point of View

Original Airdate: 6 April 2017

High dynamic range (HDR) is most often discussed from an audience member's point of view, and there is an implication that more dynamic range on the theater screen will automatically deliver a more impressive, and immersive, movie-going experience. The fact is that an appropriate amount of dynamic range for the scene's content, the story, and the cinematographer's artistic intent is the goal — as it always is — in filmmaking. To a cinematographer, HDR is important as a visual option and can be used to create impressive images when it is a precisely calibrated and predictable tool.

CES 2017: Innovations, Advancements, and Disappointments

Original Airdate: 23 March 2017

The dust has settled on CES 2017. Now that the industry has had a chance to reflect on the massive exhibition, it is time to take a good look at the highlights of the show. Moderator Mark Schubin of Schubin Café will be joined by guest speakers Peter Putman, president of ROAM Consulting, and V. Michael Bove of MIT for a discussion of their impressions of the technology that promised to "change things," as well as what was hot and what was, in some cases, surprisingly not.

JPEG-XS - The Next Generation Compression Standard for Video Over IP

Original Airdate: 16 March 2017

Adding mezzanine compression into live broadcast production workflows is motivated by the need for higher infrastructure bandwidths in the production of UHD content, as well as the desire for more infrastructure flexibility by using internet protocols (IP). However, the majority of existing solutions are proprietary. To respond to these needs and challenges, the Joint Photographic Experts Group (JPEG) decided to begin a new work plan called JPEG-XS to develop a new interoperable, standardized mezzanine codec for such applications. Please join Fraunhofer's Siegfried Foessel and intoPIX's Jean-Baptiste Lorent, who will discuss the specific requirements for such a codec, the evaluation criteria, and the outlook for expected evaluation results.

Center-Of-Interest (COI) Motion Analysis and Prediction: Do Movies Benefit More from 4K, HFR, or Both?

Original Airdate: 2 February 2017

As the industry relentlessly moves toward higher spatial and temporal resolutions, it's important to take a step back and look at how those technologies fare with the creative side of filmmaking.

During this webcast, you’ll discover the surprising results of applying Center-of-Interest (COI) motion analysis to several recent box-office hits (exhibited at 2K, 24 fps). Find out the specific impact that increasing frame rate has on image detail, while hearing how image performance is affected by today's fast-paced cinematography. Finally, possible ways to improve the viewer experience while reducing costs, without compromising creative intent, will be explored.

HDR: PQ and HLG – Presented by the BBC

Original Airdate: 19 January 2017

High dynamic range (HDR) promises to significantly enhance the viewing experience of audiences around the globe. The two HDR technologies supporting this promise are Hybrid Log-Gamma (HLG) and Perceptual Quantizer (PQ). How are they similar, and how do they differ? In this webcast, guest speakers Tim Borer and Andrew Cotton, both of the BBC, will present an objective summary of the two technologies in terms of technical parameters.
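A concrete way to see how the two transfer functions differ is to compute them. The sketch below uses the published PQ (SMPTE ST 2084 / ITU-R BT.2100) and HLG (ITU-R BT.2100) constants; it is an illustration of the curves, not a summary of the webcast:

```python
import math

# PQ EOTF: non-linear signal E' in [0, 1] -> absolute luminance in cd/m².
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e):
    ep = e ** (1 / M2)
    return 10000.0 * (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)

# HLG OETF: linear scene light in [0, 1] -> non-linear signal (relative).
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(e):
    return math.sqrt(3 * e) if e <= 1 / 12 else A * math.log(12 * e - B) + C

print(round(pq_eotf(1.0)))   # 10000: PQ code value 1.0 maps to 10,000 cd/m²
print(hlg_oetf(1 / 12))      # 0.5: the HLG curve's gamma/log crossover point
```

The contrast is visible in the code itself: PQ is absolute (its output is in cd/m²), while HLG is a relative, scene-referred curve with backward compatibility to conventional gamma at low levels.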

The Future of Media and Entertainment and Its Impact on Computer Storage (and Vice Versa)

Original Airdate: 15 December 2016

Hollywood has been innovating since its inception more than 100 years ago, and storage has been along for the ride.

With the onset of the digitization of film, as well as the move from HD to 8K and eventually 16K movies, file sizes are soaring! A title that once generated 112GB an hour may soon reach 86TB. Storage has therefore become a critical factor in how movies are archived, how streams are stored, and how broadcasts and movies are budgeted. From the set, through the camera, to the big or little screen, storage has earned its place in the closing credits.
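The arithmetic behind those soaring file sizes is straightforward. The sketch below estimates uncompressed storage per hour for a few formats; the bit depths and frame rates are illustrative assumptions, not figures from the webcast:

```python
# Uncompressed video storage per hour, to show why capacity needs balloon
# with resolution. 30 bits/pixel approximates 10-bit 4:2:2 sampling.
def gb_per_hour(width, height, bits_per_pixel, fps):
    bytes_per_frame = width * height * bits_per_pixel / 8
    return bytes_per_frame * fps * 3600 / 1e9

formats = {
    "HD 1080p": (1920, 1080, 30, 24),
    "UHD 8K":   (7680, 4320, 30, 24),
    "16K":      (15360, 8640, 30, 24),
}
for name, args in formats.items():
    print(f"{name}: {gb_per_hour(*args):,.0f} GB/hour")
```

Each doubling of both dimensions quadruples the footprint, so 16K carries 64x the data of 1080p at the same bit depth and frame rate; production compression narrows, but does not close, that gap.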

ACES 1.0: Theory and Practice

Original Airdate: 17 November 2016

The Academy Color Encoding System (ACES) is a complete framework for digital color representation, reproduction, processing, distribution, and archival, aimed mainly at theatrical, TV, and animation features. The current major version, 1.0, was recently released and introduces new concepts and methodologies.

From 4D Cinema to Haptic Cinematography: Challenges & Issues

Original Airdate: 20 October 2016

Does the cinema industry need to add another dimension to traditional audio-visual projection in order to improve the audience experience?

The 3D model was attempted, with varying degrees of success. More promising may be the emerging 4D approach, where dedicated rendering platforms employ physical or sensory feedback. The goal of this webcast is to address the different aspects of such a haptic cinematography workflow, including production/creation by the haptographer, rendering and associated quality of experience, and the definition of suitable formats for the representation and distribution of the resultant additional data.

Uber Master Creation Using HDR Movie Projector

Original Airdate: 6 October 2016

There are multiple formats for cinema, each with different characteristics, including dynamic range, maximum brightness, and color gamut. How can we best create content that looks as good as possible on all these projectors? Our experience has been to grade first on an exceptionally high-dynamic-range wide-color-gamut projector, and then map down to the other projectors. This has been an extremely fast, accurate, and pleasing process. Every version looks great and maintains the intent of the uber-master as well as possible. The techniques will be described and the results discussed. Thad Beier believes that this could be the future of color grading.

Build your Base, Rent your Spike – An Intro to Web-Scale Open Infrastructure on OpenStack, Microservices, and Kubernetes

Original Airdate: 28 July 2016

Infrastructure technology is evolving at an accelerated pace, as the space moves away from silos and toward the more diverse skill set that is needed to operate corporate infrastructure. 

Cloud computing can accomplish this. Cloud is more than technology: It brings together people and process. Transitioning corporate infrastructure to the cloud and being able to operate in a true IaaS model behind a firewall is key to meeting long-term business needs. 

Solinea will share some of its experience in working with the media and entertainment industry to make this transition.

Content Acquisition Using Light Field Technology

Original Airdate: Thursday, 21 July 2016

Traditional cameras capture an image as a two-dimensional plane. But what if you can capture each ray of light, including its intensity, color value, and direction? This is exactly what the new generation of light field cameras is capable of.

By capturing data on the full light field, the content creator is given unprecedented control in post-production, including focus, stereoscopic imaging, aperture, and shutter angle. In this webinar, you will learn the basics of how light field cameras work and hear about the new Lytro Cinema camera, which captures 755 Megapixels of data at up to 300 frames per second.

The Ins and Outs of ATSC 3.0

Original Airdate: 16 June 2016

ATSC 3.0 will change the broadcast television environment in many positive ways. This SMPTE webcast provides an overview of the new standard, which promises to transform the television distribution ecosystem. With an all-IP-based distribution platform, broadcasters will have numerous opportunities to improve the consumer experience with better-quality audio and video, new service offerings, and delivery to mobile devices, while also enabling new digital services.

Immersive Audio Systems and the Management of Consequent Sounds

Original Airdate: 12 May 2016

Immersive audio is appearing in modern cinematic storytelling more frequently. In traditional sound mixing, a first sound can have a tight semantic coupling to a second sound, such as a gunshot and ricochet, or a handclap and its reverberation. Immersive sound systems can direct these precedent and consequent sounds to different locations so that they envelop the audience. When consequent sounds are not managed, the psychoacoustic principle known as the “Haas Effect” can result in portions of an audience misunderstanding the placement of precedent sounds, momentarily disrupting their experience.

The Psychology of Immersive Storytelling

Original Airdate: 28 April 2016

Suspension of disbelief is a fundamental goal of telling any story. Eliciting the perception that the story could be real requires overcoming the complexities of the human mind. Join Albert "Skip" Rizzo, PhD, director of the Medical Virtual Reality Lab at the USC Institute for Creative Technologies, and learn why certain approaches may help, or hinder, virtual reality (VR) storytellers in creating the ultimate immersive environment within which a compelling story can be told. The presentation will begin with a brief description of Dr. Rizzo’s work in clinical applications of VR. From that vantage point, he will discuss his work in using VR as a tool for treating PTSD and for preventing it by training resilience skills in service members before deployment, within immersive, interactive narrative VR episodes that put the user in challenging emotional situations similar to those they may experience in a combat zone.

CES 2016 - Necessity Is The Mother Of Reinvention

Original Airdate: 17 March 2016

8K, HDR, OLEDs, flexible displays - they were all in evidence at the 2016 International CES, as were drones, connected appliances, and super-fast 60 GHz wireless. This hour-long seminar will talk about all of these trends, along with the free-fall in television prices, why your current display interface isn't fast enough, and what the Internet of Things means for video and audio signal switching, distribution, and control. (Oh, and you'll hear about the usual "only at CES" products, too!)

UHD Color-Conversion Challenges

Original Airdate: 28 January 2016

The introduction of a wide color gamut (WCG) color space in ultra-high-definition (UHD) creates a need to match colors produced for WCG UHD displays with colors for conventional HDTV displays.

Discover if color-conversion methods mandated by the current television standards produce a good color match when converting colors from the HD to the UHD color space.
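For the simpler direction, converting linear-light BT.709 RGB into the BT.2020 container, a single 3x3 matrix suffices; the sketch below uses the commonly published ITU-R BT.2087 coefficients. The reverse direction, squeezing UHD colors into the smaller HD gamut, is the harder problem the webcast examines:

```python
# Linear-light BT.709 -> BT.2020 primary conversion (ITU-R BT.2087 matrix).
# Inputs and outputs are linear RGB; transfer functions are applied
# before/after this step and are omitted here.
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def convert_709_to_2020(rgb709):
    return [sum(m * c for m, c in zip(row, rgb709)) for row in M_709_TO_2020]

print(convert_709_to_2020([1.0, 1.0, 1.0]))  # ≈ [1, 1, 1]: white stays white
print(convert_709_to_2020([1.0, 0.0, 0.0]))  # BT.709 red, inside the 2020 gamut
```

Every matrix row sums to 1, which is why white is preserved; the converted BT.709 red lands strictly inside the BT.2020 gamut, showing that HD content can never exercise the full WCG container.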

UHD in a hybrid SDI/IP World

Original Airdate: 24 November 2015

The last year has seen a lot of excitement around the transition to IP in the broadcast environment, but SDI still plays a vital role.

The hybrid SDI/IP models that will be adopted by most operations still need to consider the role that SDI plays for HD, but especially important is the role of SDI as it pertains to the introduction of UHD-1 (commonly referred to as 4K).

Evolved Content Security Framework: Supporting the Modern Needs of MVPD

Original Airdate: 17 November 2015

Every aspect of the industry is rapidly evolving, from consumer behavior to video packaging, encoding, and platform shifts. To keep up with the ever-changing landscape, the framework for modern MVPD content security needs to evolve as well. Several advances in video packaging, encoding, and transport, such as Adaptive Bit Rate (ABR) streaming, have contributed to this evolution. However, certain key aspects, such as consumer authentication and access control, have lagged behind.

Clarifying High Dynamic Range (HDR)

Original Airdate: 15 October 2015

HDR is a new and exciting technology that is gaining traction in both the consumer and professional aspects of motion pictures. There are differences of opinion even in its definition and ways to approach it.

Camera-Design Philosophies for Beyond-HDTV Resolution (UHD)

Original Airdate: 24 September 2015

Many motion-imaging cameras are said to be "4K" or "UHD," but wildly differing design philosophies are used, ranging from large-format single sensors to small-format three-sensor prisms to four sensors and more. Learn the characteristics of each philosophy. Different applications might call for different designs.

Lens Considerations for 4K Digital Cinematography and UHD Television Production

Original Airdate: 16 July 2015

4K UHD was hugely visible at this year’s CES and NAB conventions. Associated technologies for production, postproduction, workflows, and infrastructures are rapidly advancing on a global basis. 4K was born in the realm of digital cinematography and in that context has been largely based upon the Super 35mm (S35mm) image format size. 4K digital camera developments have become prolific, and allied development in 4K S35mm zoom and prime lenses continues to grow apace. Size and weight considerations constrain the focal ranges available – but that is a compromise long accepted in the world of digital motion imaging. An increasing number of television program genres have adopted the cinematic imagery offered by this larger image format size.

Creating and Keeping Better Pixels – How to Implement and Verify Efficient High Quality HDR and Wide Color Gamut Imaging Systems

Original Airdate: 18 June 2015

High Dynamic Range (HDR) and wide color gamut (WCG) imaging are beginning to appear in both the professional and consumer marketplace, forming imaging pipelines that will produce previously unseen brightness levels, contrasts, deep blacks, and intense colors, all while maintaining incredible detail. However, the technological approaches involved need to work together harmoniously throughout the imaging pipeline to provide the highest level of fidelity possible. This becomes especially important when both technical and economic constraints have to be taken into account, as is the case with consumer TVs or mobile devices.

Why a Standards Decision from 1953 Impacts Today's Broadcast Software

Original Airdate: 21 May 2015

Fractional frame rates: you might love them or hate them. Either way, they are destined to be with us long into the future. This SMPTE Educational Webcast expands on an informative, amusing, and educational presentation given at this year's HPA Tech Retreat. It will explore the impact of nudging the frame rate of US video by 1 part in 1000 on the systems, devices, and applications that permeate our industry. As we look to the future and to UHD with high frame rates, the webcast will also cover the technology and operational areas needing attention when multiple frame rates with multiple timecode styles become more prevalent in the working environment.
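The size of the nudge is easy to quantify. The sketch below computes how far 30 fps timecode drifts from wall-clock time when the video actually runs at the NTSC-color rate of 30000/1001 fps:

```python
# The 1953 NTSC color decision moved the US frame rate from 30 to
# 30000/1001 (~29.97) fps. Non-drop-frame timecode still labels frames
# as if 30 of them fill each second, so it drifts behind real time.
from fractions import Fraction

fps = Fraction(30000, 1001)
frames_per_hour = fps * 3600             # frames shown in one wall-clock hour
timecode_seconds = frames_per_hour / 30  # what 30 fps timecode calls that hour
drift = 3600 - timecode_seconds          # seconds timecode lags per hour
print(float(drift))                      # ≈ 3.596 s/hour
```

That roughly 3.6-second-per-hour lag is exactly what drop-frame timecode compensates for, by skipping 108 frame labels per hour.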

High Dynamic Range Intermediate: Challenges and Considerations

Original Airdate: 23 April 2015

High Dynamic Range (HDR) content promises a significant, aesthetically pleasing enhancement of the viewing experience. It also requires that several key questions be answered. For instance: How can an HDR master be created? How do we approach the new perceptual challenges of HDR? What information will be known only at the final display?

The Future of JPEG

Original Airdate: 7 April 2015

The JPEG standardization committee has played an important role in the digital revolution in the past quarter-century. However, the ever-changing requirements in multimedia applications have created new challenges in imaging for which solutions should be found. This Emerging Technologies webcast provides an overview of several new solutions, including a recently developed image format called JPEG XT that is intended to deal with high dynamic range (HDR) content. In addition, JPEG PLENO, a recent initiative by the JPEG committee to address an emerging modality known as plenoptic imaging, will be explained. Finally, we will introduce JPEG AIC (Advanced Image Coding), a potential initiative that aims to create a new image compression standard that would not only offer superior compression efficiency when compared to JPEG and JPEG 2000, but also would provide other features attractive for multimedia applications of tomorrow.

UHD-SDI - Enabling the Transport of UHD/4K Over Existing In-Plant Infrastructure

Original Airdate: 9 March 2015

Today’s studio infrastructure uses HD-SDI coaxial cable to carry a single uncompressed baseband signal of up to 3 Gb/s for 1080p60 image formats. With the advent of UHD/4K production, a 4x to 8x increase in overall bandwidth is required to realize the substantial improvements in image quality that come with the higher resolution, higher frame rate, wider color gamut and higher dynamic range of UHDTV compared to today’s HDTV production.
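A rough illustration of where the 4x figure comes from (this sketch is ours, not the webcast's): the serial rate of an SDI link can be estimated from the total raster, the 4:2:2 sample structure and the bit depth.

```python
# Rough SDI bandwidth math (an assumed sketch, not from the webcast).
# Serial rate = total raster (active picture + blanking) x samples per pixel
# (4:2:2 carries luma plus alternating color-difference samples) x bits x fps.
def sdi_rate_gbps(total_w, total_h, fps, bits=10, samples_per_pixel=2):
    return total_w * total_h * samples_per_pixel * bits * fps / 1e9

hd  = sdi_rate_gbps(2200, 1125, 60)   # 1080p60 raster -> 2.97 Gb/s (3G-SDI)
uhd = sdi_rate_gbps(4400, 2250, 60)   # 2160p60 raster -> 11.88 Gb/s, i.e. 4x
# The 8x end of the range comes from stacking a higher frame rate (120 fps)
# or deeper sampling on top of the resolution jump.
```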

Web Application Security: The Devil is in the Details - Part II of III

Speaker: Chase Schultz, Senior Security Consultant, Independent Security Evaluators (ISE)

Part I of this three part webcast series introduced the Open Web Application Security Project (OWASP) and discussed topics such as Injection, Broken Authentication and Session Management, and Cross Site Scripting (XSS). These are a few of OWASP’s top 10 commonly misunderstood security flaws.

Part II in the series continues to countdown OWASP’s top 10 list and focuses on Insecure Direct Object Reference, Security Misconfiguration and Sensitive Data Exposure.

2015 CES Round-Up

Speakers: V. Michael Bove, MIT Media Laboratory, Michael DeValue, Walt Disney Studios, Pete Putman, ROAM Consulting and Kramer Electronics

Original Airdate:  22 January 2015

The Consumer Electronics Show (CES) is one of the most highly anticipated events where manufacturers announce new, wild and fantastic advancements in electronic equipment. Gadgets and gizmos which will ultimately display motion imaging content can be found around every corner at the show. Presenters in this SMPTE webcast will explain what caught their eyes and what, just maybe, may be on the technology horizon for SMPTE Members. 

HDBaseT: Is There a Role for It In the Broadcasting Industry?

Speaker: Eyran Lida, Chair of the HDBaseT Alliance’s Technical Committee and Chief Technology Officer and co-founder of Valens

Original Airdate: 18 December 2014

The HDBaseT standard includes an impressive set of supported features for the delivery of uncompressed high-definition digital video (including 4K) — such as HD video, audio, Ethernet, controls, power, USB, multistream, and multipoint, among others — over a simple LAN/Ethernet cable (Cat5e/6) for up to 100 meters/328 feet. In addition, HDBaseT is a technology with zero latency and high resistance to electromagnetic noise in order to maintain the highest quality in uncompressed high-definition video.

H.264 versus HEVC versus VP9

Speaker: Ian Trow, Senior Director Emerging Technology and Strategy, Harmonic

Original Airdate: 20 November 2014

AVC (H.264) addressed many of the shortcomings of predecessor compression standards such as MPEG-2, which were predominantly aimed at linear scheduled broadcast. This enabled the needs of HD and streaming applications to be efficiently addressed with the necessary functionality and, crucially, within the tight bandwidth constraints dictated by their respective distribution media. Standards bodies then demanded further innovation to improve on the compression efficiency of existing standards by as much as 50%, to facilitate the introduction of 4K / Ultra HD as well as to increase the reach of Over The Top (OTT) services. All this activity surrounding compression formats begs the questions: What is the technology behind these standards? What are their target markets? How are they related? And, lastly, are they gaining market traction?

Next Generation Display Interfaces

Speaker: Pete Putman

Original Airdate: 30 October 2014

As the transition from analog to digital signal interfacing runs its course, there are still a few laggards like the 27-year-old VGA display connector. But it’s very much on the endangered species list as HDMI and DisplayPort become more entrenched. That’s not the whole story, though: The next generation of display interfaces is faster, denser, and smaller. They can carry multiple signals (video, audio, control, Ethernet) and some versions accomplish this with just five pins!

With UHDTV looming, the display interface is a critical part of the chain – and perhaps the weakest link. This seminar will discuss the updates to HDMI and DisplayPort and also take a closer look at “micro” versions of each connector and the different signal formats they support. We’ll also run a few calculations to see if HDMI and DisplayPort are really fast enough to support more pixels, faster clock rates, and increased bit depths; all key to implementation of UHD-1 and UHD-2 display systems. And we’ll wrap things up with a discussion of display/audio/control signal multiplexing over structured wire systems.
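A back-of-the-envelope version of those calculations (our sketch using assumed CTA-861 timing, not the presenter's figures) checks whether an interface has the headroom for a given format:

```python
# A sketch of the "is the pipe fast enough" check. TMDS-style links carry
# 10 bits on the wire for every 8 bits of pixel data (8b/10b-like overhead).
def tmds_rate_gbps(h_total, v_total, fps, bpc=8):
    pixel_clock = h_total * v_total * fps            # Hz, full raster incl. blanking
    return pixel_clock * bpc * 3 * 10 / 8 / 1e9      # RGB: 3 components per pixel

# 3840x2160p60 uses a 4400x2250 total raster (a 594 MHz pixel clock):
uhd1_60 = tmds_rate_gbps(4400, 2250, 60)  # ~17.82 Gb/s, just under HDMI 2.0's 18
```

Raising the bit depth to 10 or 12 bpc, or the frame rate to 120, pushes the same math past the 18 Gb/s ceiling, which is exactly the kind of headroom question the webcast examines.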

Laser Science and Laser Illuminated Projection

Speaker: Bill Beck "The Laser Guy," Barco

Original Airdate: 7 October 2014

After years of anticipation and many impressive demonstrations, commercial laser-illuminated projectors are being sold and installed in commercial movie theaters and premium large format (PLF) theaters. Systems with brightness levels ranging from 6000 cinema lumens to over 100,000 lumens per screen (dual-projector) have been installed. This webcast steps through key terms and definitions; current commercial offerings; major system architectures; the impact of primary selection; and RGB laser versus blue laser-pumped phosphor (BPP) designs. It also includes a tutorial on the key "figures of merit" (FoM) for evaluating laser-illuminated projectors, such as brightness, brightness roll-off, lifetime, wall-plug efficiency, speckle contrast ratio, dynamic range, contrast ratio, color gamut and impact on 3D. Six-primary 3D systems and other laser-related topics will also be discussed.

Web Application Security: The Devil is in the Details

Speaker: Justin 'JD' Nir - Consultant, Independent Security Evaluators

Original Airdate: 18 September 2014

A three-part SMPTE webinar series analyzing the Open Web Application Security Project (OWASP) Top 10

The Open Web Application Security Project (OWASP) Top 10 is a guideline commonly relied upon in the Media & Entertainment industry as a resource for securing web applications. However, misunderstandings about certain nuances commonly result in improper implementations which lead to systems that fail against modern adversaries. This 3-part SMPTE webcast series will analyze the security flaws identified by the OWASP Top 10.

3Gb/s SDI for Transport of 1080p50/60, 3D, UHDTV-1 / 4K and Beyond
Part III – Physical Interface: Optical

Speaker: John Hudson, Semtech

Original Airdate: 21 August 2014

Since its arrival in 2006, the latest generation of the Serial Digital Interface, 3Gb/s SDI, or "3G," has achieved widespread adoption, rapidly becoming the real-time streaming media interface of choice. What exactly is 3G SDI and how can it be used to create a reliable real-time streaming infrastructure?

In this final installment of a three-part Webcast, you will learn about 3G-SDI physical interface requirements and will also take away practical advice on designing, installing and operating reliable optical 3G SDI infrastructure and networks.

From Quad HD to UHDTV: Making the Difference

Speakers: Hans Hoffmann, EBU; Howard Lukk, Pannon Entertainment

Original Airdate: 15 July 2014

Whilst the Consumer Electronics industry pushes 4K screens onto the shelf, a technical debate has started on how to realise the full parameter set of UHDTV.

Please join Hans Hoffmann, EBU, and Howard Lukk as they discuss current and emerging trends in UHDTV, and reveal the research work underway on HFR and HDR. In addition, our presenters will bring context to the question of gaps in the UHDTV chain. Do not miss this very important webcast!

Mesclado Webcast: New SOA Architectures (No Membership Required)

Original Airdate: 10 July 2014

Second Screen, Interactivity, Targeted Advertising, Metadata Repurposing, Big Data… Tech Innovation Success Stories

How can we add value to media programmes? Technology can help, especially in reaching a younger audience that is more connected than ever. Drawing on Mesclado’s own independent research lab and SMPTE's standardization effort, this jointly sponsored webcast will offer hard facts rather than the typical speculation and marketing-focused perspective. Presenters will provide a 360° vision covering production to distribution (whether traditional or online).

SDN and Virtualization for Media-Centric Infrastructures

Speakers: Ron Hromoko, Tom Ohanian,  Cisco Systems

Original Airdate: 1 July 2014

Media and broadcast companies are beginning to accelerate adoption of IT-based facilities and methods for content creation and delivery. New efficiencies and cost-models emerge when the adoption of IP and Ethernet transport reaches scale. Virtualization and Software-Defined Networking (SDN) are keys to the next phase of this transformation. This webcast will introduce concepts in Virtualization and SDN, as well as their applicability to Production Workflows and Digital Delivery Models.

Tackling Cyber Resiliency – Common Sense Methods to Reduce Your Risk

Speaker: Frank Artes, NSS Labs

Original Airdate: 26 June 2014

The Engineering team often ends up as the voice of reason between the technology requirements of production, post-production and distribution. This SMPTE Education webcast will discuss the overall strategy and day-to-day steps to leverage cyber resiliency and common sense approaches to reduce your risk from cyber attack, system breaches, disruption of business, and loss of Intellectual Property. Please join guest speaker Francisco Artes, Chief Technology Architect at NSS Labs, and SMPTE for this very important educational opportunity.

3Gb/s SDI for Transport of 1080p50/60, 3D, UHDTV1 / 4k and Beyond
Part 2 – Physical Interface: Electrical

Speaker: John Hudson, Semtech

Original Airdate: 22 May 2014

Since its arrival in 2006, the latest generation of the Serial Digital Interface, 3Gb/s SDI, or "3G", has achieved widespread adoption, rapidly becoming the real-time streaming media interface of choice. But what exactly is 3G SDI and how can it be used to create a reliable real-time streaming infrastructure?

In this second installment of a 3 part Webcast series (Part 1 covered 3G SDI standards), participants will learn about 3G SDI physical interface requirements and will also hear practical advice on designing, installing and operating reliable coaxial cable based 3G SDI infrastructure and networks.

Disruptive Weather Conditions: Clouds in the Forecast

Speaker: Richard Welsh, Sundog Media Toolkit Ltd.

Original Airdate: 17 April 2014

What does the term “cloud”  really mean for the media industry? Is it just a buzzword or a genuinely useful and game changing technology?  Where does it work and where does it not? What are the advantages versus challenges now and in the future? This SMPTE Educational Webcast  explores the architectures, implementations and applications of cloud computing in practical terms.

Quantum Dot Color for Motion Pictures and Television

Speaker: Seth Coe-Sullivan, QD VISION, INC.

Original Airdate: 1 April 2014

Quantum dots (QDs) are a new material that is already impacting the display industry, appearing in 2013 display products from 7” to 65”, and from tablets to televisions. These QD products are differentiated in their color gamut and color accuracy, two of the most critical performance characteristics for consumers and the SMPTE community. This webcast will cover the basics of what quantum dots are, the two methods by which they are integrated into display products, and how their characteristics directly influence display performance. We will then explore the relationships between color gamut and color accuracy, the issue of full gamut content delivery, and the role of standards in ensuring the best utilization of this new hardware innovation.

Lessons in Light: From Reality via Display to the Eye

Speakers: Timo Kunkel and Scott Daly, Dolby Laboratories

Original Airdate: 20 March 2014

Light in the real world around us appears in a multitude of intensities and wavelength combinations. The human visual system (HVS) has evolved to sense and interpret this subset of the electromagnetic spectrum we call light to create the appearance of the real world with a wide palette of colors and large contrasts between light and dark. 
In this presentation, we will discuss key display capabilities and trends, such as dynamic range and screen reflectance (from anorexic mirrors to curvy moth eyes), ambient light effects on displays in transition as well as on viewers, providing ways of creating better pixels.

Technical Differences Between Professional Monitors, and How to Choose the Right One for the Job

Speakers: Bill Admans - Postproduction Professional and Technology Marketing Executive

Original Airdate: 20 February 2014

Since the demise of the CRT, the monitor landscape has changed dramatically. Where there was once one technology and few choices, there are now many different technologies and hundreds of choices. Monitors are used at every stage of the production and postproduction workflow, from on-the-set to final distribution. Choosing the right monitor to meet your workflow requirements is important. This webcast will explore the differences between monitor technologies and what they mean for you.

The All IT Media Facility: Enabling Technologies

Speaker: Al Kovalick, Media Systems Consulting

Original Airdate: 19 December 2013

File-based (IT) production and broadcast workflows are now the norm across a large percentage of our industry. Though the transition to file-based systems is a relatively recent evolutionary milestone, there are already a number of disruptive technologies poised to move real-time AV workflows several giant steps forward. No doubt you’ve already heard about Software Defined Networks/Storage, 10G/40G Ethernet, Precision Time Protocol, fast network switching, compute virtualization methods and widely available web apps (SaaS). These are the foundation upon which the “all IT facility” will be built. SMPTE’s guest speaker will discuss these catalytic technologies and why they are essential to the move to all IT. He will also describe the associated technical challenges and potential means to overcome them.

Speaker: John Zubrzycki, Principal Technologist at BBC Research
Original Airdate: 7 November 2013
Handling video and audio content as digital files brings tremendous advantages to the broadcast and media industry. Converting operations from familiar broadcast technologies to IT technologies promises to provide flexibility and savings, but the wrong choices could put your content at risk.  The presentation will cover the basic steps needed to maintain the quality, safety and integrity in a digital production and archive workflow. It will include advice on handling coded content, on storage media, on using the cloud and explain the differences between a digital library and an archive. Ways of working with digital content in an IP infrastructure are still developing in this relatively new area for our industry and so pointers to sources of further help in SMPTE and elsewhere will be provided.

An Introduction to Holographic Television

Speaker: V. Michael Bove, Jr., MIT Media Lab

Original Airdate:  31 October 2013 

Widespread recognition of some shortcomings of "traditional" 3D TV, and some recent technological advances, create an opportunity for holographic 3D TV for entertainment, telepresence, and teleoperation. In this webcast I'll review perceptual considerations for 3D TV, explain what true holographic television is (and how to distinguish it from things that marketers call holographic), as well as describe how recent developments in image capture, standardization, and computation may bring holographic television to market affordably sooner than many have predicted, co-existing with other kinds of 3D TV. I'll also explain the range of light-modulation technologies that various research groups (including mine) are exploring as part of developing holographic video display systems.


Speakers: David Wood and Greg DePriest
Original Airdate: 26 September 2013
During this SMPTE Monthly Webcast, we will jump feet first into UHD and how it will likely impact the broadcasting industry. Guest speakers David Wood and Greg DePriest will discuss the fundamental basis of UHD, the global status of UHD standards, and how UHD will likely progress in terms of frame rate, HDR and color bit depth. Additional topics of interest include the London Olympic Games experience, the rollout timeline for 4K/8K in Asia and public information about who is doing what with 4K in select locations. Don’t miss this important discussion on a topic which will impact workflows over the near, mid and long terms.
Speaker: Joseph Slomka, FotoKem
Original Airdate: 22 August 2013
Related Resources in the Library: ACES
This SMPTE Educational Webcast provides insight into the implementation of ACES in production. Our special guest speaker, Joseph Slomka, Vice President and Principal Color Scientist at FotoKem, will discuss the areas of production at FotoKem impacted by ACES and how ACES will likely affect feature motion picture production, animation and VFX.

High Frame Rate Cinema – A New Tool for Storytellers

Speakers: David Stump, ASC and Andrew Watson, NASA

Original Airdate: 25 April 2013

Webcast Series: Digital Cinema

Related Resources in the Library: High Frame Rates

With the recent release of films produced and exhibited at frame rates higher than the traditional 24 frames per second (fps), many questions have arisen: What exactly is high frame rate in cinema? From where did the concept come? Why is it important today? What are the human psychophysical considerations? 

Guest speakers David Stump, ASC, and Andrew Watson, NASA, explore this newly emerged storytelling tool.

Speaker: Jim Houston, Principal, Starwatcher Digital; Co-chair, ACES Project Committee

Original Airdate: 28 March 2013

Webcast Series: File-Based Workflows
Related Resources in the Library: ACES
As digital cameras and displays grow toward higher dynamic range and wider color gamuts, what tools are in place for the new digital workflows they demand? The Academy of Motion Picture Arts and Sciences has been working on this question through its Science and Technology Council.

BXF and Advertising Workflows

Speakers: Chris Lennon, SMPTE Engineering Director/President, MediAnswers, Harold Geller, Chief Growth Officer, Ad-ID

Original Airdate: 28 February 2013

Webcast Series: File-Based Workflows

Related Resources in the Library: BXF

2013 finds us at the intersection of several tools that enable the advertising workflows the industry has been seeking for years. New developments in SMPTE's Broadcast eXchange Format (BXF), along with Ad-ID, AS-12 and other areas, mean that we can now automate the flow of data throughout the advertising chain – from creation to airing and billing of a commercial.

Speaker: Jason Livingston, CPC Closed Captioning
Original Airdate: 24 January 2013
Related Resources in the Library: Closed Captions IP Delivery
On 30 September, an FCC regulation came into effect requiring TV broadcasters to implement captions for prerecorded programming that is not edited for Internet distribution. Livingston discusses best practices for workflows involving CEA-608 and CEA-708 broadcast closed captions data and translation into SMPTE 2052. (Free to All)

High Frame Rates: A Technical Discussion on the
Impact it Will Have on Motion Imaging Workflows

Speaker: Jim Whittlesey, Deluxe

Original Airdate: 13 September 2012

Webcast Series: Digital Cinema

Related Resources in the Library: High Frame Rates

With an eye on the ever-evolving motion imaging technology horizon, high frame rate (HFR) content is the next challenge to impact the entire theatrical workflow. So what are the benefits of HFR technology? How will HFR affect the media workflow?
