On-Demand Webcasts

Live, interactive educational webcasts covering hot-topic technologies, issues and developments.

Webcast sessions are available as a SMPTE member benefit free of charge. Non-members may attend webcasts for USD 49.00.

Know a topic that needs to be covered? Send your suggestions to Joel Welch.

Previous Webcasts


Remote Collaboration Audio Workflow

Original Airdate: 12 May 2020

What crazy times! Who could ever have imagined audio production and post-production delivered 100% from home? How do you do ADR and Foley when social distancing is required? How do you have sessions reviewed by clients remotely? What is the "proper" home environment that enables smooth operations? What about security? Learn from an all-star lineup about their remote audio workflows and setups, and hear their advice.



Two Blockchain Case Studies for M&E

Original Airdate: 7 May 2020

Before it was well understood, blockchain was touted as a revolutionary technology that would change the world. Outside of cryptocurrency, blockchain has proven to be less a revolution than a valuable set of tools and techniques for very specific use cases. This webcast will examine two such use cases as they apply to Media & Entertainment: smart contracts and secure high-performance content distribution.



News in the Age of the Pandemic

Original Airdate: 16 April 2020

In this webcast, production media architect and SMPTE instructor Pierre (Pete) Routhier, Eng. M.Eng., looks at how news organizations can leverage off-the-shelf social media and cloud-based solutions to keep delivering quality content in a timely fashion to their audiences without taxing their already heavily burdened infrastructure.



AI and Machine Learning: More Relevant in the Brave New World

Original Airdate: 14 April 2020

As unprecedented and far-reaching as the crisis is, a rough sketch of the post-COVID world is starting to emerge. And a lot about this New Normal feels like an accelerated version of the planning done in the pre-COVID world: leaner, cleaner, and more decentralized production and post-production pipelines, a big emphasis on security, and of course AI to intelligently integrate and optimize it all. But while the future isn't totally out of focus, it's arriving much faster than anybody could have anticipated. This is a problem on many levels, first and foremost because the current "rush to the cloud" naturally forces industry players into decisions driven by emergency, and into the arms of large cloud providers all too happy to manage petabytes more data behind their high, comfortable, and outrageously expensive walls. This is where even simple machine learning and artificial intelligence solutions can optimize many areas of the New Normal, especially cloud deployments, pipelines, and security. This webcast will give an overview of the post-COVID cloud infrastructure landscape and lay out concrete avenues for artificial intelligence to play a central role in the New Normal.


OTT Technical Challenges

Original Airdate: 9 April 2020

This webinar will outline the key solutions and technologies that meet the demanding commercial constraints and flexibility requirements of common OTT workflows.


Why am I still affected by signal blanking? (and why should I care?)

Original Airdate: 23 January 2020

This webcast is aimed at young media professionals who may wonder, "Why should I listen to this webcast if I only watch content on apps?" Those apps may be driven by data carried in what is called the "ancillary data space" and authored to be delivered to the viewer with frame accuracy.
Young system designers are often aware of the need to properly handle ST 2110-40 data along with the video and audio data carried by IP-based media systems. Those using SDI-based technologies may also wonder, "Why is so much data space used for this?"
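
For readers new to the ancillary data space, a simplified sketch of the SMPTE ST 291 packet layout helps make the discussion concrete. Field names follow the standard; the example payload values are illustrative only, and the checksum here sums whole word values rather than per-word 9-bit fields (equivalent for the small values used):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AncPacket:
    """Simplified SMPTE ST 291 ancillary data packet (10-bit words)."""
    did: int             # Data ID: identifies the payload type
    sdid: int            # Secondary Data ID (used when DID < 0x80)
    udw: List[int] = field(default_factory=list)  # user data words (max 255)

    @property
    def data_count(self) -> int:
        return len(self.udw)

    def checksum(self) -> int:
        # Keep the 9 LSBs of the sum of DID, SDID, DC, and UDW;
        # bit 9 of the checksum word is the inverse of bit 8.
        s = (self.did + self.sdid + self.data_count + sum(self.udw)) & 0x1FF
        return s | ((~s & 0x100) << 1)

# Illustrative only: a caption-style packet with dummy payload words.
pkt = AncPacket(did=0x61, sdid=0x01, udw=[0x96, 0x69, 0x55])
print(f"DC={pkt.data_count}, checksum=0x{pkt.checksum():03X}")
```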


How many new compression standards? AV1, VVC, LCEVC, HEVC and AVC compared

Original Airdate: 21 November 2019

This webcast compares the emerging AV1, VVC, and LCEVC compression standards with the established HEVC and AVC.


Streaming Model for Field of Light Displays

Original Airdate: 14 November 2019

The long-awaited glasses-free display of three-dimensional imagery will soon be a reality with the introduction of Field of Light Displays (FoLD), also known as Light Field (LF) Displays. Light Field displays attempt to recreate the light rays passing through a volume of space. The light rays pass through the 3D point, either real or virtual, that would exist in an actual 3D scene and provide true parallax and stereoscopic vision (each eye sees a different view, and the view changes smoothly as the viewer moves). The development of a standardized method for streaming 3D content will facilitate the broad adoption of FoLD systems. Third Dimension Technologies (TDT), with sponsorship from the Air Force Research Laboratory (AFRL), is collaborating with Oak Ridge National Laboratory (ORNL) and Insight Media (IM) to develop a display-agnostic standard for streaming 3D visuals to FoLD displays.


Breakdown of using the ZFS file system in a media production environment

Original Airdate: 5 September 2019

This SMPTE Webcast will provide a deep dive into data optimization, retention/replication and storage virtualization for media production environments and the ZFS file system.

Powering the multi-platform distribution media factory: OTT and Streaming End-to-End

Original Airdate: 28 August 2019

Over-the-Top (OTT) services and content streaming are arguably at the top of the content-consumption food chain right now. The doom and gloom of naysayers from years past has been replaced by optimism and opportunity for those who embraced the evolutionary format. During this SMPTE Technology webcast you will become much more conversant in the technologies, platforms, and considerations relating to providing OTT and streaming services. You will learn about topics such as streamlining content for OTT distribution through workflow orchestration and automation. You will also gain knowledge about the specifics of latency and sync challenges, guaranteed capacity, predictive scaling and anomaly detection, plus intelligent and stateful Adaptive Bit Rate (ABR), and so much more.

Join guest speakers Bea Alonso and Wauter De Bruyne of Dalet, plus Filippa Hasselström and Ted Olsson of Net Insight, to learn more about end-to-end OTT and streaming!
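
To make the ABR reference concrete, here is a toy throughput-based rate-selection loop (a minimal sketch with a hypothetical bitrate ladder; production players add buffer-aware logic, throughput smoothing, and segment abandonment):

```python
# Toy throughput-based ABR: pick the highest rendition whose bitrate
# fits within a safety margin of the measured network throughput.
LADDER_KBPS = [400, 1200, 2500, 5000, 8000]  # hypothetical bitrate ladder

def select_rendition(measured_throughput_kbps: float, safety: float = 0.8) -> int:
    budget = measured_throughput_kbps * safety
    candidates = [r for r in LADDER_KBPS if r <= budget]
    return max(candidates) if candidates else LADDER_KBPS[0]

for throughput in (600, 3200, 12000):
    print(throughput, "kbps measured ->", select_rendition(throughput), "kbps selected")
```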

Content Management

Original Airdate: 15 August 2019

During this SMPTE Technology webcast, you will not hear about the fundamentals of content management, but you will learn about recent changes and challenges. In addition, you will learn about considerations for modernizing content management systems, plus applications in decentralized and very demanding environments.

PTP Timing – Everything you need to know

Original Airdate: 30 May 2019

Next-generation IP infrastructure deployments have benefitted enormously from the inherent flexibility of IP networking technology. In addition to greater flexibility in the routing of video, audio, and data essence streams, the transition to IP has enabled the transformation of facility timing infrastructures with the industry adoption of the Precision Time Protocol (PTP). This webinar will delve into the key aspects of PTP, including an in-depth examination of its benefits, system design, configuration, operational considerations, and monitoring. In addition, specific details of the configuration and operation of PTP grandmasters, including redundancy and GPS locking, will be discussed. Slave device behavior will also be examined, and how gateway devices provide synchronous operation will be explained. Timing considerations for hybrid architectures incorporating both IP and SDI signals will also be detailed. Lastly, the importance of monitoring and understanding what is occurring in a PTP system will be presented.
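
At the heart of PTP is a four-timestamp exchange between master and slave. A minimal sketch of the standard offset and mean-path-delay arithmetic (it assumes a symmetric network path, which real deployments must engineer for):

```python
def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """t1: Sync sent by master; t2: Sync received by slave;
    t3: Delay_Req sent by slave; t4: Delay_Req received by master.
    Assumes a symmetric network path."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0          # slave clock minus master clock
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2.0  # one-way delay estimate
    return offset, mean_path_delay

# Example: slave clock runs 1.5 us fast, one-way delay is 10 us.
offset, delay = ptp_offset_and_delay(t1=0.0, t2=11.5e-6, t3=20.0e-6, t4=28.5e-6)
print(f"offset={offset*1e6:.1f} us, delay={delay*1e6:.1f} us")
```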

Low-Latency Live Streaming

Original Airdate: 16 May 2019

Live video streaming over the internet is a big growth area for broadcasters and OTT services alike. Broadcasters are expected to deliver live streams at a latency similar to their broadcasts, and pure-streaming services see matching the latency of over-the-air broadcasters as another way to prove their legitimacy and further swing viewing over to the internet. Work continues apace on technologies that can deliver ultra-low latency; this webcast looks at how it can be achieved, the status of those projects, and how well they stack up in real life.
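
As a back-of-the-envelope illustration of why segmented delivery lags broadcast, consider the buffer arithmetic below (the function and its default figures are illustrative assumptions, not from the webcast):

```python
# Rough glass-to-glass latency estimate for segmented streaming (HLS/DASH).
def estimated_latency_s(segment_s: float, buffered_segments: int,
                        encode_s: float = 2.0, cdn_s: float = 1.0) -> float:
    # Players typically buffer several full segments before starting playback.
    return encode_s + cdn_s + buffered_segments * segment_s

print(estimated_latency_s(segment_s=6, buffered_segments=3))  # classic HLS: ~21 s
print(estimated_latency_s(segment_s=1, buffered_segments=2))  # short segments: ~5 s
```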

ATSC 3.0 Webinar Part 2: ATSC 3.0 – a Deeper Dive

Original Airdate: 9 May 2019

Building on the first ATSC 3.0 webcast, this webcast will explore more deeply the technical aspects of ATSC 3.0 including system requirements and configuration, inter-layer protocols, STL transport protocol, and security.

New criteria to measure HDR image quality

Original Airdate: 25 April 2019

The High Dynamic Range image container is much wider than classic HDTV and DCI containers, allowing content creators to display much more luminance information than was previously possible. Does content take advantage of this new format? It depends on many factors, including the conditions in which it is shot. In this SMPTE Technology Series webcast, image specialist Pierre (Pete) Routhier suggests four criteria by which one may judge whether images take full advantage of HDR and applies those criteria to a variety of asset types (talk shows, dramatic series, movies, etc.) to help assess their potential to amaze viewers with HDR.

ATSC 3.0 Webinar Part 1: Introduction to ATSC 3.0

Original Airdate: 21 March 2019

This webinar will provide an overview of the ATSC 3.0 system capabilities, including the physical layer, signaling, audio, video, captions, interactivity, and advanced emergency messaging. You will receive an introduction to the suite of Standards and Recommended Practices documents with a concentration on stream essence. Attendees will be asked to provide input and thoughts on areas of ATSC 3.0 for further exploration in future SMPTE webinars on the topic.

Adventures in Volumetric Cinema

Original Airdate: 7 March 2019

A colleague and I had access to volumetric 3D displays and decided to try to make a motion picture to be exhibited in a public setting on such a device. We hosted a month-long workshop with students from several universities to try out storytelling and visual ideas and to experiment with production and post-production processes. In this talk I'll describe what we've learned so far, as we head into the final production phase later this year.

Machine Learning in M&E: Reality Versus Hype

Original Airdate: 24 January 2019

AI is here. From autonomous vehicles to voice-controlled agents, more and more machine intelligence research is crawling from the lab into real life. All signs point to this trend accelerating. More data, less-costly computing, and much greater investment are all indicators that AI, this time, is here to stay. The label has been used to death by unscrupulous marketers, reporters, and even some researchers who still confuse learning with actual intelligence, but true AI is still very rare. This is because real AI is hard — especially in the media industry, where data is small and often fractured, and the questions posed to AI are extraordinarily complex. ETC AI researcher Yves Bergquist will present a detailed, case-study-driven webinar that explores the nature of AI, where it is going, and what it means for the media industry.

5G Opportunities for Broadcasters

Original Airdate: 15 November 2018

5G is a new broadband technology with superior technical capabilities compared with earlier mobile communications systems. Boasting very low latency, very high reliability, and improved spectrum utilization and energy efficiency, it aims to enable the provision of new services to consumers and business users in different social and economic sectors.

Is Virtual Reality Still The Future?

Original Airdate: 8 November 2018

The trough of disillusionment is an oft-cited hype cycle bracket during which, as Gartner defines it, “interest in a technology wanes as experiments and implementations fail to deliver.” Until now, that’s been the case for virtual reality (VR). Consumers, pundits, and investors thought of VR as a clunky box strapped to a person’s face. Augmented reality (AR) was considered by some to be science fiction involving glasses that may or may not ever materialize. Today, however, immersive entertainment refers to a new type of audience engagement that blurs the lines of what is real and what is synthetic.

Mark Schubin's "Six Centuries of Opera and Media Technology in New York"

SPECIAL WEBCAST - FREE FOR EVERYONE TO VIEW

Original Airdate: 1 November 2018

Electronic home entertainment was invented in New York City for opera. The first compatible-color television program seen at home — and the first bootleg recording — featured opera in New York. New York’s media technologies for opera date back to the 16th century. Today, in the 21st century, they include dynamic video warping with depth-plane selection and multilanguage live cinema transmissions to all seven continents.

A 200-ton music synthesizer broadcasting opera music in New York in 1907? An opera lighting dimmer in 1638? Opera for military communications tests?

It might be difficult to believe, but it’s all true!

Blockchain Revealed – Why So Mysterious?

Original Airdate: 25 September 2018

Learn what makes up the fundamentals of blockchain, and what the differences are between cryptocurrencies and other blockchain implementations.

You’ll discover how blockchain can be useful, and where it may not be. Is it a good security mechanism? Is it the best way to solve a number of challenges in Media and Entertainment? What are the current standards and should we follow them?
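
As a companion to the fundamentals, here is a toy hash chain showing the tamper-evidence property that underlies every blockchain (a teaching sketch with made-up M&E-flavored records, not a production design):

```python
import hashlib, json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    return {"data": data, "prev_hash": prev_hash}

# Build a tiny chain; each block commits to its predecessor's hash.
genesis = make_block("genesis", "0" * 64)
b1 = make_block("rights transfer: title A -> distributor B", block_hash(genesis))
b2 = make_block("royalty payment: B -> A", block_hash(b1))

# Tampering with the first block breaks every later link.
genesis["data"] = "tampered"
print(b1["prev_hash"] == block_hash(genesis))  # False: the chain detects the edit
```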

Why speed is not the most important thing to look for in a RAID system

Original Airdate: 2 August 2018

During this SMPTE Technology Webcast, guest speaker Tim Standing of SoftRAID provides you with the information you need to pick a fast and reliable RAID system for media storage that is directly attached to your computer. He will describe the common RAID levels, and a few of the less common ones as well. For each RAID level, he will explain where the file data gets stored and what the "extra" disks are used for. In addition, topics include optimizations you can use to get the fastest performance from your RAID system, what to look out for, and how to avoid common problems.
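
A compact way to see the trade-off behind what the "extra" disks are used for is to tabulate usable capacity against fault tolerance (a simplified sketch assuming equal-size disks, ignoring hot spares and rebuild behavior):

```python
# Usable capacity and fault tolerance for common RAID levels,
# assuming n equal disks of `size_tb` terabytes each.
def raid_summary(level: str, n: int, size_tb: float):
    table = {
        "RAID0":  (n * size_tb,       0),      # striping, no redundancy
        "RAID1":  (size_tb,           n - 1),  # mirroring
        "RAID5":  ((n - 1) * size_tb, 1),      # single parity
        "RAID6":  ((n - 2) * size_tb, 2),      # double parity
        "RAID10": (n / 2 * size_tb,   1),      # at least 1; layout-dependent
    }
    return table[level]

for level in ("RAID0", "RAID5", "RAID6", "RAID10"):
    usable, tol = raid_summary(level, n=8, size_tb=4)
    print(f"{level}: {usable:.0f} TB usable, survives {tol} disk failure(s)")
```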

Exploring the Role of NMOS: Discovery and Connection in an IP World

Original Airdate: 17 May 2018

SMPTE ST 2110 specifies how to stream video, audio, and data between devices for professional applications using IP networks. How best to discover, connect, and monitor those devices? That is another matter — and one addressed by the Advanced Media Workflow Association (AMWA) Networked Media Open Specifications (NMOS).
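
For a taste of what NMOS discovery looks like in practice, here is a minimal sketch of querying an IS-04 registry for its list of senders. The registry address is a made-up placeholder and the API version is assumed to be v1.3; the resource paths follow the published IS-04 conventions:

```python
import requests  # third-party; pip install requests

# Hypothetical registry address; the IS-04 Query API exposes resources such as
# nodes, devices, sources, flows, senders, and receivers.
REGISTRY = "http://registry.example.com:8080"
QUERY_BASE = f"{REGISTRY}/x-nmos/query/v1.3"

def list_senders():
    resp = requests.get(f"{QUERY_BASE}/senders", timeout=5)
    resp.raise_for_status()
    for sender in resp.json():
        print(sender.get("id"), sender.get("label"))

if __name__ == "__main__":
    list_senders()
```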

Beyond the Skynet Myth: Discovering the Reality of How AI is Used Today in Content Creation & Distribution, and the Possibilities for Tomorrow

Original Airdate: 4 March 2018

Artificial intelligence (AI) is not just a concept for the future. Machine learning (the main technique underpinning AI) is already being used in the media industry to transform content creation and delivery.

Join Jason Brahms, CEO of VideoGorillas, which uses machine learning in its innovative media software, and Lydia Gregory, co-founder of FeedForward, which brings together the cutting edge of machine learning research and business, as they explore the current AI media landscape and discuss where it is heading.

 

Keeping Time with Precision Time Protocol (PTP)

Original Airdate: 14 December 2017

Throughout the ages, time has been a critical quantity to measure. Early sundials, water clocks, and mechanical pendulums once marked the passage of time. Today, quartz watches and the atomic clock measure time with far better precision.

Time plays a critical role in a broadcast facility, ensuring accurate switching of program material. The Global Positioning System (GPS) provides precise timing information across the world, and both timecode and video synchronization can be derived from that data. Video synchronization is typically achieved either with analog black burst or with tri-level sync signals carried as a reference signal throughout the facility, locked from a GPS reference to maintain synchronization from facility to facility worldwide.

IP Monitoring and Measurement

Original Airdate: 9 November 2017

The broadcast industry’s ongoing transition to internet protocol (IP)-based transport for video, audio, and data is being enabled by the development of standards such as SMPTE ST 2022-6 and the SMPTE ST 2110 Professional Media Over Managed IP Networks standards suite. These standards provide the interconnection framework for an all-IP infrastructure within a facility.

VR from Shoot to Delivery

Original Airdate: 21 September 2017

Richard Mills, technical director of Sky VR Studios, will provide a detailed behind-the-scenes view of Sky VR Studios' creative and production processes. Using examples of the studio's work, he will explain the planning, commissioning, and technical processes from shoot through to delivery. Topics will include technical guidelines, production planning, shooting techniques, and the postproduction and app delivery workflows deployed by Sky VR Studios.

Beyond Virtual Reality: Light Field, Holographic Display, and the Roadmap to the Holodeck

Original Airdate: 24 August 2017

Jon Karafin, CEO of Light Field Lab Inc., will detail the very latest developments in light field and holographic display technologies, including insights into the creative and technical implications for content creation. Karafin also will analyze the data requirements and solutions for streaming holographic media, and then provide a glimpse into Light Field Lab's holographic technologies.

The Art and Technology of Spherical Storytelling: Adventures in Virtual Reality (VR) Production

Original Airdate: 10 August 2017

By now, almost everyone has learned something about the technology associated with virtual reality (VR), but what about storytelling in the spherical environment related to VR? How is VR storytelling different from 2D storytelling? During this SMPTE Educational Webcast, Andrew MacDonald, creative director at Cream VR/AR, will present concepts of 360-degree cinematography, including shooting nodal point 360, shooting stereo in 360, stitching 360 footage, using screen direction methodology, storyboarding in 360, moving the camera, and integrating CGI in a 360 spherical environment. This webcast looks at VR from an entirely different perspective.

Resolving Storage Pain

Original Airdate: 8 June 2017

Today’s artists, architects, administrators, and end users working in the media and entertainment industry may experience pain points when managing media storage. Workloads are expanding as postproduction businesses try to squeeze more from less. Clients demand ever higher project resolution and frame rates with faster turnaround times. As technology evolves, postproduction must minimize the challenges presented by the storage environment. This webcast will discuss how to handle media storage for challenging workloads in a fast, efficient, and scalable manner.

HDR as a Dramatic Imaging Tool, Examined from a Cinematographer's Point of View

Original Airdate: 6 April 2017

High dynamic range (HDR) is most often discussed from an audience member's point of view, and there is an implication that more dynamic range on the theater screen will automatically deliver a more impressive, and immersive, movie-going experience. The fact is that an appropriate amount of dynamic range for the scene's content, the story, and the cinematographer's artistic intent is the goal — as it always is — in filmmaking. To a cinematographer, HDR is important as a visual option and can be used to create impressive images when it is a precisely calibrated and predictable tool.

CES 2017: Innovations, Advancements, and Disappointments

Original Airdate: 23 March 2017

The dust has settled on CES 2017. Now that the industry has had a chance to reflect on the massive exhibition, it is time to take a good look at the highlights of the show. Moderator Mark Schubin of Schubin Café will be joined by guest speakers Peter Putman, president of ROAM Consulting, and V. Michael Bove of MIT for a discussion of their impressions of the technology that promised to "change things," as well as what was hot and what was, in some cases, surprisingly not.

JPEG-XS - The Next Generation Compression Standard for Video Over IP

Original Airdate: 16 March 2017

Adding mezzanine compression into live broadcast production workflows is motivated by the need for higher infrastructure bandwidths in the production of UHD content, as well as the desire for more infrastructure flexibility by using internet protocols (IP). However, the majority of existing solutions are proprietary. To respond to these needs and challenges, the Joint Photographic Experts Group (JPEG) decided to begin a new work plan called JPEG-XS to develop a new interoperable, standardized mezzanine codec for such applications. Please join Fraunhofer's Siegfried Foessel and intoPIX's Jean-Baptiste Lorent, who will discuss the specific requirements for such a codec, the evaluation criteria, and the outlook for expected evaluation results.

Center-Of-Interest (COI) Motion Analysis and Prediction: Do Movies Benefit More from 4K, HFR, or Both?

Original Airdate: 2 February 2017

As the industry relentlessly moves toward higher spatial and temporal resolutions, it's important to take a step back and look at how those technologies fare with the creative side of filmmaking.

During this webcast, you’ll discover the surprising results of applying center-of-interest motion analysis and prediction to several recent box-office hits (exhibited in 2K at 24 fps). Find out the specific impact that increasing frame rate has on image detail, while hearing how image performance is affected by today's fast-paced cinematography. Finally, possible ways to improve the viewer experience while reducing costs, without compromising creative intent, will be explored.

HDR: PQ and HLG – Presented by the BBC

Original Airdate: 19 January 2017

High dynamic range (HDR) promises to significantly enhance the viewing experience of audiences around the globe. The two HDR technologies supporting this promise are Hybrid Log-Gamma (HLG) and Perceptual Quantizer (PQ). How are they similar, and how do they differ? In this webcast, guest speakers Tim Borer and Andrew Cotton, both of the BBC, will present an objective summary of the two technologies in terms of technical parameters.
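
Of the two, PQ lends itself especially well to a compact illustration, since SMPTE ST 2084 defines it analytically. A minimal sketch of the PQ EOTF, using the constants published in ST 2084 (HLG's scene-referred OETF/OOTF split is omitted here):

```python
# PQ (SMPTE ST 2084) EOTF: non-linear signal in [0,1] -> luminance in cd/m^2.
M1 = 2610 / 16384        # 0.1593...
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    v = signal ** (1 / M2)
    return 10000.0 * (max(v - C1, 0.0) / (C2 - C3 * v)) ** (1 / M1)

for code in (0.0, 0.5, 0.75, 1.0):
    print(f"signal {code:.2f} -> {pq_eotf(code):9.2f} cd/m^2")  # 1.0 -> 10000 cd/m^2
```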

The Future of Media and Entertainment and Its Impact on Computer Storage (and Vice Versa)

Original Airdate: 15 December 2016

Hollywood has been innovating since its inception more than 100 years ago, and storage has been along for the ride.

With the onset of the digitization of film, as well as the move from HD to 8K and eventually 16K movies, file sizes are soaring! A title that once generated 112GB an hour may soon reach 86TB. Therefore, storage has become a critical factor in how movies are archived, streams are stored, and how much broadcasts and movies are budgeted. From the set, through the camera, to the big or little screen, storage has earned its place in the closing credits.
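
Reading the blurb's figures as per-hour rates, the jump from gigabytes to tens of terabytes translates into sustained bitrates as follows (simple arithmetic, not from the webcast):

```python
# Convert per-hour storage figures into sustained bitrates.
def gb_per_hour_to_mbps(gb_per_hour: float) -> float:
    return gb_per_hour * 8e9 / 3600 / 1e6  # decimal GB -> megabits per second

print(f"112 GB/hour ~= {gb_per_hour_to_mbps(112):,.0f} Mb/s")          # ~249 Mb/s
print(f"86 TB/hour  ~= {gb_per_hour_to_mbps(86_000)/1000:,.1f} Gb/s")  # ~191 Gb/s
```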

ACES 1.0: Theory and Practice

Original Airdate: 17 November 2016

The Academy Color Encoding System (ACES) is a complete framework for digital color representation, reproduction, processing, distribution, and archival, aimed mainly at theatrical, TV, and animation features. The current major version, 1.0, was recently released and introduces new concepts and methodologies.

From 4D Cinema to Haptic Cinematography: Challenges & Issues

Original Airdate: 20 October 2016

Does the cinema industry need to add another dimension to traditional audio-visual projection in order to improve the audience experience?

The 3D model was attempted, with varying degrees of success. More promising may be the emerging 4D approach, where dedicated rendering platforms employ physical or sensory feedback. The goal of this webcast is to address the different aspects of such a haptic cinematography workflow, including production/creation by the haptographer, rendering and associated quality of experience, and the definition of suitable formats for the representation and distribution of the resultant additional data.

Uber Master Creation Using HDR Movie Projector

Original Airdate: 6 October 2016

There are multiple formats for cinema, each with different characteristics, including dynamic range, maximum brightness, and color gamut. How can we best create content that looks as good as possible on all these projectors? Our experience has been to grade first on an exceptionally high-dynamic-range wide-color-gamut projector, and then map down to the other projectors. This has been an extremely fast, accurate, and pleasing process. Every version looks great and maintains the intent of the uber-master as well as possible. The techniques will be described and the results discussed. Thad Beier believes that this could be the future of color grading.

Build your Base, Rent your Spike – An Intro to Web-Scale Open Infrastructure on OpenStack, Microservices, and Kubernetes

Original Airdate: 28 July 2016

Infrastructure technology is evolving at an accelerated pace, as the space moves away from silos and toward the more diverse skill set that is needed to operate corporate infrastructure. 

Cloud computing can accomplish this. Cloud is more than technology: It brings together people and process. Transitioning corporate infrastructure to the cloud and being able to operate in a true IaaS model behind a firewall is key to meeting long-term business needs. 

Solinea will share some of its experience in working with the media and entertainment industry to make this transition.

Content Acquisition Using Light Field Technology

Original Airdate: 21 July 2016

Traditional cameras capture an image as a two-dimensional plane. But what if you could capture each ray of light, including its intensity, color value, and direction? This is exactly what the new generation of light field cameras is capable of.

By capturing data on the full light field, the content creator is given unprecedented control in post-production, including focus, stereoscopic imaging, aperture, and shutter angle. In this webinar, you will learn the basics of how light field cameras work and hear about the new Lytro Cinema camera, which captures 755 Megapixels of data at up to 300 frames per second.

The Ins and Outs of ATSC 3.0

Original Airdate: 16 June 2016

ATSC 3.0 will change the broadcast television environment in many positive ways. This SMPTE webcast provides an overview of this new standard, which promises to transform the television distribution ecosystem. With an all-IP-based distribution platform, broadcasters will have numerous opportunities to improve the consumer experience with better-quality audio and video, new service offerings, and delivery to mobile devices while also enabling new digital services.

Immersive Audio Systems and the Management of Consequent Sounds

Original Airdate: 12 May 2016

Immersive audio is appearing in modern cinematic storytelling more frequently. In traditional sound mixing, a first sound can have a tight semantic coupling to a second sound, such as a gunshot and ricochet, or a handclap and its reverberation. Immersive sound systems can direct these precedent and consequent sounds to different locations so that they envelop the audience. When consequent sounds are not managed, the psychoacoustic principle known as the “Haas Effect” can result in portions of an audience misunderstanding the placement of precedent sounds, momentarily disrupting their experience.
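
The precedence ("Haas") effect described here can be reasoned about numerically. A minimal sketch, using the speed of sound and commonly cited approximate perceptual thresholds, shows how extra path length becomes delay and which perceptual region that delay falls in:

```python
SPEED_OF_SOUND = 343.0  # m/s at room temperature

def arrival_delay_ms(extra_path_m: float) -> float:
    """Extra milliseconds of delay per additional meter of path length."""
    return extra_path_m / SPEED_OF_SOUND * 1000.0

def haas_region(delay_ms: float) -> str:
    # Commonly cited approximations for the precedence effect.
    if delay_ms < 1.0:
        return "summing: heard as one fused sound"
    if delay_ms <= 30.0:
        return "precedence: localized to the earlier arrival"
    return "echo: heard as a separate, delayed sound"

for meters in (0.2, 3.0, 15.0):
    d = arrival_delay_ms(meters)
    print(f"{meters:5.1f} m -> {d:5.1f} ms: {haas_region(d)}")
```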

The Psychology of Immersive Storytelling

Original Airdate: 28 April 2016

Suspension of disbelief is a fundamental goal of telling any story. Eliciting the perception that the story could be real requires overcoming the complexities of the human mind. Join Albert "Skip" Rizzo, PhD, director of the Medical Virtual Reality Lab at the USC Institute for Creative Technologies, and learn why certain approaches may help, or hinder, virtual reality (VR) storytellers in creating the ultimate immersive environment within which a compelling story can be told. The presentation will begin with a brief description of Dr. Rizzo’s work in the area of clinical applications of VR. From that vantage point, he will discuss his work in using VR as a tool for treating PTSD, and for preventing it by training resilience skills in service members before deployment, using immersive, interactive narrative VR episodes that put the user in challenging emotional situations similar to those they may experience in a combat zone.

CES 2016 - Necessity Is The Mother Of Reinvention

Original Airdate: 17 March 2016

8K, HDR, OLEDs, flexible displays - they were all in evidence at the 2016 International CES, as were drones, connected appliances, and super-fast 60 GHz wireless. This hour-long seminar will talk about all of these trends, along with the free-fall in television prices, why your current display interface isn't fast enough, and what the Internet of Things means for video and audio signal switching, distribution, and control. (Oh, and you'll hear about the usual "only at CES" products, too!)


UHD Color-Conversion Challenges

Original Airdate: 28 January 2016

The introduction of a wide color gamut (WCG) color space in ultra-high-definition (UHD) creates a need to match colors produced for WCG UHD displays with colors for conventional HDTV displays.

Discover whether color-conversion methods mandated by the current television standards produce a good color match when converting colors from the HD to the UHD color space.
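
One well-defined building block of this conversion is the linear-light RGB primary transform between the two color spaces. A minimal sketch using the BT.709-to-BT.2020 matrix given in ITU-R BT.2087 (transfer-function handling and gamut mapping back to HD are separate steps omitted here):

```python
# Linear-light BT.709 RGB -> BT.2020 RGB (matrix per ITU-R BT.2087).
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rgb709_to_rgb2020(rgb):
    return [sum(M_709_TO_2020[r][c] * rgb[c] for c in range(3)) for r in range(3)]

# Pure 709 red lands well inside the 2020 gamut (all components positive).
print(rgb709_to_rgb2020([1.0, 0.0, 0.0]))  # ~[0.6274, 0.0691, 0.0164]
```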


UHD in a hybrid SDI/IP World

Original Airdate: 24 November 2015

The last year has seen a lot of excitement around the transition to IP in the broadcast environment, but SDI still plays a vital role.

The hybrid SDI/IP models that will be adopted by most operations still need to consider the role that SDI plays for HD, but especially important is the role of SDI as it pertains to the introduction of UHD-1 (commonly referred to as 4K).


Evolved Content Security Framework: Supporting the Modern Needs of MVPD

Original Airdate: 17 November 2015

Every aspect of the industry is rapidly evolving, from consumer behavior to video packaging, encoding, and platform shifts. In order to keep up with the ever-changing landscape, the framework for modern MVPD content security needs to evolve as well. Several advances in video packaging, encoding, and transport, such as Adaptive Bit Rate (ABR), have contributed to this evolution. However, certain key aspects, such as consumer authentication and access control, have lagged behind.


Clarifying High Dynamic Range (HDR)

Original Airdate: 15 October 2015

HDR is a new and exciting technology that is gaining traction in both the consumer and professional sides of motion pictures. There are differences of opinion even about its definition and the ways to approach it.


Camera-Design Philosophies for Beyond-HDTV Resolution (UHD)

Original Airdate: 24 September 2015

Many motion-imaging cameras are said to be "4K" or "UHD," but wildly differing design philosophies are used, ranging from large-format single sensors to small-format three-sensor prisms to four sensors and more. Learn the characteristics of each philosophy. Different applications might call for different designs.


Lens Considerations for 4K Digital Cinematography and UHD Television Production

Original Airdate: 16 July 2015

4K UHD was hugely visible at this year’s CES and NAB conventions. Associated technologies for production, postproduction, workflows, and infrastructures are rapidly advancing on a global basis. 4K was born in the realm of digital cinematography and in that context has been largely based upon the Super 35mm (S35mm) image format size. 4K digital camera developments have become prolific. Allied development in 4K S35mm zoom and prime lenses continues to grow apace. Size and weight considerations constrain the focal ranges available – but that is a compromise long accepted in the world of digital motion imaging. An increasing number of television program genres have adopted the cinematic imagery offered by this larger image format size.


Creating and Keeping Better Pixels – How to Implement and Verify Efficient High Quality HDR and Wide Color Gamut Imaging Systems

Original Airdate: 18 June 2015

High Dynamic Range (HDR) and wide color gamut (WCG) imaging are beginning to appear in both the professional and consumer marketplace, forming imaging pipelines that will produce previously unseen brightness levels, contrasts, deep blacks, and intense colors, all while maintaining incredible detail. However, the technological approaches involved need to work together harmoniously throughout the imaging pipeline to provide the highest level of fidelity possible. This becomes especially important when both technical and economic constraints have to be taken into account, as is the case with consumer TVs or mobile devices.


Why a Standards Decision from 1953 Impacts Today's Broadcast Software

Original Airdate: 21 May 2015

Fractional frame rates - you might love them or hate them. Either way they are destined to be with us long into the future. This SMPTE Educational Webcast expands on an informative, amusing and educational presentation given at this year's HPA Tech Retreat. It will explore the impact of nudging the frame rate of US video by 1 part in 1000 on systems, devices and applications that permeate our industry. As we look to the future and UHD with high frame rates, the webcast will also cover the technology and operational areas needing attention when multiple frame rates with multiple timecode styles become more prevalent in the working environment.
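
The "1 part in 1000" nudge has concrete arithmetic behind it, which a few lines make visible; a sketch of the hourly drift and the drop-frame bookkeeping that compensates for it:

```python
from fractions import Fraction

NTSC = Fraction(30000, 1001)          # ~29.97 fps, the 1953 color compromise

frames_per_hour = NTSC * 3600          # actual frames in one wall-clock hour
shortfall = 30 * 3600 - frames_per_hour
print(float(frames_per_hour))          # ~107892.1 frames
print(float(shortfall))                # ~107.9 frames short of 30 fps labeling

# Drop-frame timecode compensates by skipping 2 frame *numbers* each minute,
# except minutes divisible by 10: 2 * (60 - 6) = 108 labels dropped per hour.
print(2 * (60 - 6))                    # 108

# Equivalent drift: non-drop timecode falls ~3.6 s behind the clock each hour.
print(float(shortfall / 30))           # ~3.6 seconds
```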


High Dynamic Range Intermediate: Challenges and Considerations

Original Airdate: 23 April 2015

High Dynamic Range (HDR) content promises a significant, aesthetically pleasing enhancement of the viewing experience. It also requires that several key questions be answered. For instance: How can an HDR master be created? How do we approach the new perceptual challenges of HDR? What information will be known only at the final display?


The Future of JPEG

Original Airdate: 7 April 2015

The JPEG standardization committee has played an important role in the digital revolution in the past quarter-century. However, the ever-changing requirements in multimedia applications have created new challenges in imaging for which solutions should be found. This Emerging Technologies webcast provides an overview of several new solutions, including a recently developed image format called JPEG XT that is intended to deal with high dynamic range (HDR) content. In addition, JPEG PLENO, a recent initiative by the JPEG committee to address an emerging modality known as plenoptic imaging, will be explained. Finally, we will introduce JPEG AIC (Advanced Image Coding), a potential initiative that aims to create a new image compression standard that would not only offer superior compression efficiency when compared to JPEG and JPEG 2000, but also would provide other features attractive for multimedia applications of tomorrow.


UHD-SDI - Enabling the Transport of UHD/4K Over Existing In-Plant Infrastructure

Original Airdate: 9 March 2015

Today’s studio infrastructure uses HD-SDI coaxial cable to carry a single uncompressed baseband signal of up to 3 Gb/s for 1080p60 image formats. With the advent of UHD/4K production, a 4x to 8x increase in overall bandwidth is required to realize the substantial improvements in image quality that come with the higher resolution, higher frame rate, wider color gamut, and higher dynamic range of UHDTV compared with today’s HDTV production.
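
The 4x figure falls straight out of the pixel arithmetic. A rough sketch of the active-picture payload for 10-bit 4:2:2 video (blanking and transport overhead ignored):

```python
# Approximate active-picture payload for 10-bit 4:2:2 video.
def payload_gbps(width: int, height: int, fps: float,
                 bits_per_pixel: float = 20.0) -> float:  # 10-bit 4:2:2 ~ 20 bpp
    return width * height * fps * bits_per_pixel / 1e9

hd  = payload_gbps(1920, 1080, 60)   # ~2.5 Gb/s -> fits 3G-SDI
uhd = payload_gbps(3840, 2160, 60)   # ~10 Gb/s  -> needs 12G-SDI or 4x 3G links
print(f"1080p60: {hd:.2f} Gb/s, 2160p60: {uhd:.2f} Gb/s ({uhd/hd:.0f}x)")
```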


Web Application Security: The Devil is in the Details - Part II of III

Speaker: Chase Schultz, Senior Security Consultant, Independent Security Evaluators (ISE)

Part I of this three-part webcast series introduced the Open Web Application Security Project (OWASP) and discussed topics such as Injection, Broken Authentication and Session Management, and Cross-Site Scripting (XSS). These are a few of OWASP’s top 10 commonly misunderstood security flaws.

Part II in the series continues the countdown of OWASP’s top 10 list and focuses on Insecure Direct Object Reference, Security Misconfiguration, and Sensitive Data Exposure.


2015 CES Round-Up

Speakers: V. Michael Bove, MIT Media Laboratory; Michael DeValue, Walt Disney Studios; Pete Putman, ROAM Consulting and Kramer Electronics

Original Airdate: 22 January 2015

The Consumer Electronics Show (CES) is one of the most highly anticipated events, where manufacturers announce new, wild, and fantastic advancements in electronic equipment. Gadgets and gizmos that will ultimately display motion-imaging content can be found around every corner at the show. Presenters in this SMPTE webcast will explain what caught their eyes and what, just maybe, might be on the technology horizon for SMPTE Members.


  