Live, interactive educational webcasts covering hot-topic technologies, issues and developments.
Webcast sessions are available as a SMPTE member benefit free of charge. Non-members may attend webcasts for US$49.00.
Original Airdate: 6 October 2016
There are multiple formats for cinema, each with different characteristics, including dynamic range, maximum brightness, and color gamut. How can we best create content that looks as good as possible on all these projectors? Our experience has been to grade first on an exceptionally high-dynamic-range wide-color-gamut projector, and then map down to the other projectors. This has been an extremely fast, accurate, and pleasing process. Every version looks great and maintains the intent of the uber-master as well as possible. The techniques will be described and the results discussed. Thad Beier believes that this could be the future of color grading.
Original Airdate: 28 July 2016
Infrastructure technology is evolving at an accelerated pace, as the space moves away from silos and toward the more diverse skill set that is needed to operate corporate infrastructure.
Cloud computing can accomplish this. Cloud is more than technology: It brings together people and process. Transitioning corporate infrastructure to the cloud and being able to operate in a true IaaS model behind a firewall is key to meeting long-term business needs.
Solinea will share some of our experience in working with the media and entertainment industry to make this transition.
Original Airdate: Thursday, 21 July 2016
Traditional cameras capture an image as a two-dimensional plane. But what if you can capture each ray of light, including its intensity, color value, and direction? This is exactly what the new generation of light field cameras is capable of.
By capturing data on the full light field, the content creator is given unprecedented control in post-production, including focus, stereoscopic imaging, aperture, and shutter angle. In this webinar, you will learn the basics of how light field cameras work and hear about the new Lytro Cinema camera, which captures 755 Megapixels of data at up to 300 frames per second.
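To put that capture rate in perspective, a back-of-envelope calculation gives the raw sensor data rate. The 10-bit sample depth below is an illustrative assumption, not a Lytro specification:

```python
# Back-of-envelope raw data rate for a light field capture.
# Assumption (illustrative, not from the Lytro spec): 10 bits per sample,
# one sample per pixel before any demosaicing or compression.

def raw_data_rate_gbps(megapixels: float, fps: float, bits_per_sample: int) -> float:
    """Raw sensor data rate in gigabits per second."""
    return megapixels * 1e6 * fps * bits_per_sample / 1e9

print(f"{raw_data_rate_gbps(755, 300, 10):,.0f} Gb/s")  # 2,265 Gb/s, i.e. ~2.3 Tb/s
```

Even under these conservative assumptions, the camera generates on the order of terabits of data per second, which is why light field workflows lean heavily on specialized storage and processing.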
Original Airdate: 16 June 2016
ATSC 3.0 will change the broadcast television environment in many positive ways. This SMPTE webcast provides an overview of the new standard, which promises to transform the television distribution ecosystem. With an all-IP-based distribution platform, broadcasters will have numerous opportunities to improve the consumer experience with better-quality audio and video, new service offerings, and delivery to mobile devices, while also enabling new digital services.
Original Airdate: 12 May 2016
Immersive audio is appearing in modern cinematic storytelling more frequently. In traditional sound mixing, a first sound can have a tight semantic coupling to a second sound, such as a gunshot and ricochet, or a handclap and its reverberation. Immersive sound systems can direct these precedent and consequent sounds to different locations so that they envelop the audience. When consequent sounds are not managed, the psychoacoustic principle known as the “Haas Effect” can result in portions of an audience misunderstanding the placement of precedent sounds, momentarily disrupting their experience.
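The timing behind that effect is simple geometry: the delay between precedent and consequent arrivals grows with the extra path the second sound travels, and the fusion window is commonly placed on the order of a few tens of milliseconds. The distances in this sketch are illustrative assumptions, not figures from the webcast:

```python
# Arrival delay of a consequent sound that travels farther than the
# precedent sound. Delays within roughly the first few tens of
# milliseconds tend to fuse, with localization to the first arrival
# (the precedence, or "Haas," effect). Distances here are illustrative.

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def arrival_delay_ms(extra_path_m: float) -> float:
    """Extra arrival time in milliseconds for extra_path_m of added path."""
    return extra_path_m / SPEED_OF_SOUND * 1000.0

for extra in (1.0, 5.0, 15.0):
    print(f"{extra:4.1f} m extra path -> {arrival_delay_ms(extra):5.1f} ms")
```

A path difference of a few meters stays comfortably inside the fusion window, while tens of meters across a large auditorium can push some seats outside it, which is when the localization errors described above appear.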
Original Airdate: 28 April 2016
Suspension of disbelief is a fundamental goal of telling any story. Eliciting the perception that the story could be real requires overcoming the complexities of the human mind. Join Dr. Albert "Skip" Rizzo, PhD, Director – Medical Virtual Reality Lab, at the USC Institute for Creative Technologies, and learn why certain approaches may help, or hinder, virtual reality (VR) storytellers in creating the ultimate immersive environment within which a compelling story can be told. The presentation will begin with a brief description of Dr. Rizzo’s work in the area of Clinical Applications of VR. From that vantage point, he will discuss his work in utilizing VR as a tool for the treatment of PTSD and for preventing it by training resilience skills in service members before a deployment within immersive-interactive narrative VR episodes that put the user in challenging emotional situations similar to what they may experience in a combat zone.
Original Airdate: 17 March 2016
8K, HDR, OLEDs, flexible displays - they were all in evidence at the 2016 International CES, as were drones, connected appliances, and super-fast 60 GHz wireless. This hour-long seminar will talk about all of these trends, along with the free-fall in television prices, why your current display interface isn't fast enough, and what the Internet of Things means for video and audio signal switching, distribution, and control. (Oh, and you'll hear about the usual "only at CES" products, too!)
Original Airdate: 28 January 2016
The introduction of a wide color gamut (WCG) color space in ultra-high-definition (UHD) creates a need to match colors produced for WCG UHD displays with colors for conventional HDTV displays.
Discover if color-conversion methods mandated by the current television standards produce a good color match when converting colors from the HD to the UHD color space.
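One standardized approach to this conversion is a 3x3 matrix applied to linear-light RGB, as specified in Recommendation ITU-R BT.2087 for BT.709-to-BT.2020 mapping. The sketch below shows only the matrix step; linearizing the source signal and re-applying a transfer function for the target display are omitted:

```python
# BT.709 -> BT.2020 primary conversion on linear-light RGB using the
# 3x3 matrix from Recommendation ITU-R BT.2087. Transfer-function
# handling (linearization and re-encoding) is omitted from this sketch.

M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def bt709_to_bt2020(rgb):
    """Map one linear BT.709 RGB triple into BT.2020 primaries."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in M_709_TO_2020]

# Reference white survives unchanged (each matrix row sums to 1), while
# a saturated BT.709 red lands inside the larger BT.2020 gamut.
print(bt709_to_bt2020([1.0, 1.0, 1.0]))
print(bt709_to_bt2020([1.0, 0.0, 0.0]))
```

Because BT.709 is a subset of the BT.2020 gamut, this direction of conversion never clips; it is the reverse, UHD-to-HD, direction that forces gamut-mapping decisions.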
Original Airdate: 24 November 2015
The last year has seen a lot of excitement around the transition to IP in the broadcast environment, but SDI still plays a vital role.
The hybrid SDI/IP models that will be adopted by most operations still need to consider the role that SDI plays for HD, but especially important is the role of SDI as it pertains to the introduction of UHD-1 (commonly referred to as 4K).
Original Airdate: 17 November 2015
Every aspect of the industry is rapidly evolving, from consumer behavior to video packaging, encoding, and platform shifts. To keep up with the ever-changing landscape, the framework for modern MVPD content security needs to evolve as well. Several advances in video packaging, encoding, and transport, such as Adaptive Bit Rate (ABR) streaming, have contributed to this evolution. However, certain key aspects, such as consumer authentication and access control, have lagged behind.
Original Airdate: 15 October 2015
HDR is a new and exciting technology that is gaining traction in both the consumer and professional aspects of motion pictures. There are differences of opinion even in its definition and ways to approach it.
Original Airdate: 24 September 2015
Many motion-imaging cameras are said to be "4K" or "UHD," but wildly differing design philosophies are used, ranging from large-format single sensors to small-format three-sensor prisms to four sensors and more. Learn the characteristics of each philosophy. Different applications might call for different designs.
Original Airdate: 16 July 2015
4K UHD was hugely visible at this year’s CES and NAB conventions. Associated technologies for production, postproduction, workflows, and infrastructures are rapidly advancing on a global basis. 4K was born in the realm of digital cinematography and in that context has been largely based on the Super 35mm (S35mm) image format size. 4K digital camera developments have become prolific, and allied development in 4K S35mm zoom and prime lenses continues to grow apace. Size and weight considerations constrain the focal ranges available, but that is a compromise long accepted in the world of digital motion imaging. An increasing number of television program genres have adopted the cinematic imagery offered by this larger image format size.
Original Airdate: 18 June 2015
High Dynamic Range (HDR) and wide color gamut (WCG) imaging are beginning to appear in both the professional and consumer marketplaces, forming imaging pipelines that will produce previously unseen brightness levels, contrasts, deep blacks, and intense colors, all while maintaining incredible detail. However, the technological approaches involved need to work together harmoniously throughout the imaging pipeline to provide the highest level of fidelity possible. This becomes especially important when both technical and economic constraints have to be taken into account, as is the case with consumer TVs and mobile devices.
Original Airdate: 21 May 2015
Fractional frame rates - you might love them or hate them. Either way they are destined to be with us long into the future. This SMPTE Educational Webcast expands on an informative, amusing and educational presentation given at this year's HPA Tech Retreat. It will explore the impact of nudging the frame rate of US video by 1 part in 1000 on systems, devices and applications that permeate our industry. As we look to the future and UHD with high frame rates, the webcast will also cover the technology and operational areas needing attention when multiple frame rates with multiple timecode styles become more prevalent in the working environment.
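The practical consequence of that 1-part-in-1000 nudge is easy to quantify: non-drop-frame timecode counted at the actual 30000/1001 fps rate drifts away from wall-clock time, which is why drop-frame counting exists. A minimal sketch:

```python
from fractions import Fraction

# Non-drop-frame timecode labels frames as if they arrived at exactly
# 30 fps, but the NTSC-derived rate is 30 fps nudged down by 1/1000,
# so the labels fall behind the wall clock.

ACTUAL_FPS = Fraction(30000, 1001)   # ~29.97 fps, the rate frames really arrive
NOMINAL_FPS = Fraction(30, 1)        # the rate the timecode labels assume

def timecode_drift_seconds(elapsed_seconds: int) -> float:
    """Seconds by which non-drop timecode lags wall-clock time."""
    frames_captured = ACTUAL_FPS * elapsed_seconds
    labeled_seconds = frames_captured / NOMINAL_FPS
    return float(elapsed_seconds - labeled_seconds)

print(f"{timecode_drift_seconds(3600):.2f} s per hour")  # ~3.60 s per hour
```

That roughly 3.6 seconds per hour, about 108 frames, is exactly what drop-frame timecode compensates for by skipping two frame numbers each minute except every tenth minute.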
Original Airdate: 23 April 2015
High Dynamic Range (HDR) content promises a significant, aesthetically pleasing enhancement of the viewing experience. It also requires that several key questions be answered. For instance: How can an HDR master be created? How do we approach the new perceptual challenges of HDR? What information will be known only at the final display?
Original Airdate: 7 April 2015
The JPEG standardization committee has played an important role in the digital revolution in the past quarter-century. However, the ever-changing requirements in multimedia applications have created new challenges in imaging for which solutions should be found. This Emerging Technologies webcast provides an overview of several new solutions, including a recently developed image format called JPEG XT that is intended to deal with high dynamic range (HDR) content. In addition, JPEG PLENO, a recent initiative by the JPEG committee to address an emerging modality known as plenoptic imaging, will be explained. Finally, we will introduce JPEG AIC (Advanced Image Coding), a potential initiative that aims to create a new image compression standard that would not only offer superior compression efficiency when compared to JPEG and JPEG 2000, but also would provide other features attractive for multimedia applications of tomorrow.
Original Airdate: 9 March 2015
Today’s studio infrastructure uses an HD-SDI coaxial cable to carry a single uncompressed baseband signal of up to 3 Gb/s for 1080p60 image formats. With the advent of UHD/4K production, a 4x to 8x increase in overall bandwidth is required to realize the substantial improvements in image quality that come with the higher resolution, higher frame rate, wider color gamut, and higher dynamic range of UHDTV compared to today’s HDTV production.
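The 4x figure follows directly from the pixel count, and the 8x end of the range comes from adding a higher frame rate or deeper bit depth on top of it. A rough uncompressed-payload comparison, ignoring SDI blanking and link overhead and assuming 10-bit 4:2:2 sampling (two samples per pixel):

```python
# Uncompressed active-video payload, ignoring SDI blanking and overhead.
# 4:2:2 sampling carries two samples per pixel (Y plus alternating Cb/Cr).

def video_rate_gbps(width: int, height: int, fps: int, bits: int,
                    samples_per_pixel: int = 2) -> float:
    return width * height * fps * bits * samples_per_pixel / 1e9

hd = video_rate_gbps(1920, 1080, 60, 10)    # ~2.49 Gb/s, fits 3G-SDI's ~3 Gb/s
uhd = video_rate_gbps(3840, 2160, 60, 10)   # ~9.95 Gb/s, four times the HD payload
print(f"HD {hd:.2f} Gb/s vs UHD {uhd:.2f} Gb/s ({uhd / hd:.0f}x)")
```

Doubling the frame rate to 120 fps, or moving from 10-bit to higher bit depths, pushes the ratio toward the 8x end of the range quoted above.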
Speaker: Chase Schultz, Senior Security Consultant, Independent Security Evaluators (ISE)
Part I of this three-part webcast series introduced the Open Web Application Security Project (OWASP) and discussed topics such as Injection, Broken Authentication and Session Management, and Cross-Site Scripting (XSS). These are a few of OWASP’s top 10 commonly misunderstood security flaws.
Part II in the series continues to count down OWASP’s top 10 list and focuses on Insecure Direct Object Reference, Security Misconfiguration, and Sensitive Data Exposure.
Speakers: V. Michael Bove, MIT Media Laboratory, Michael DeValue, Walt Disney Studios, Pete Putman, ROAM Consulting and Kramer Electronics
Original Airdate: 22 January 2015
The Consumer Electronics Show (CES) is one of the most highly anticipated events at which manufacturers announce new, wild, and fantastic advancements in electronic equipment. Gadgets and gizmos that will ultimately display motion imaging content can be found around every corner at the show. Presenters in this SMPTE webcast will explain what caught their eyes and what just may be on the technology horizon for SMPTE members.
Speaker: Eyran Lida, Chair of the HDBaseT Alliance’s Technical Committee and Chief Technology Officer and co-founder of Valens
Original Airdate: 18 December 2014
The HDBaseT standard includes an impressive set of supported features for the delivery of uncompressed high-definition digital video (including 4K) — such as HD video, audio, Ethernet, controls, power, USB, multistream, and multipoint, among others — over a simple LAN/Ethernet cable (Cat5e/6) for up to 100 meters/328 feet. In addition, HDBaseT is a technology with zero latency and high resistance to electromagnetic noise in order to maintain the highest quality in uncompressed high-definition video.
Speaker: Ian Trow, Senior Director Emerging Technology and Strategy, Harmonic
Original Airdate: 20 November 2014
AVC (H.264) addressed many of the shortcomings of predecessor compression standards like MPEG-2 that were predominantly aimed at linear scheduled broadcast. This enabled the needs of HD and streaming applications to be addressed efficiently, with the necessary functionality and, crucially, within the tight bandwidth constraints dictated by their respective distribution media. The standards bodies then demanded innovation to improve the compression efficiency of the existing standards by as much as 50%, to facilitate the introduction of 4K/Ultra HD and to increase the reach of over-the-top (OTT) services. All this activity surrounding compression formats raises several questions: What is the technology behind these standards? What are their target markets? How are they related? And are they gaining market traction?
Speaker: Pete Putman
Original Airdate: 30 October 2014
As the transition from analog to digital signal interfacing runs its course, there are still a few laggards like the 27-year-old VGA display connector. But it’s very much on the endangered species list as HDMI and DisplayPort become more entrenched. That’s not the whole story, though: The next generation of display interfaces is faster, denser, and smaller. They can carry multiple signals (video, audio, control, Ethernet) and some versions accomplish this with just five pins!
With UHDTV looming, the display interface is a critical part of the chain – and perhaps the weakest link. This seminar will discuss the updates to HDMI and DisplayPort and also take a closer look at “micro” versions of each connector and the different signal formats they support. We’ll also run a few calculations to see if HDMI and DisplayPort are really fast enough to support more pixels, faster clock rates, and increased bit depths; all key to implementation of UHD-1 and UHD-2 display systems. And we’ll wrap things up with a discussion of display/audio/control signal multiplexing over structured wire systems.
Speaker: Bill Beck "The Laser Guy," Barco
Original Airdate: 7 October 2014
After years of anticipation and many impressive demonstrations, commercial laser-illuminated projectors are being sold and installed in commercial movie theaters and premium large format (PLF) theaters. Systems with brightness levels ranging from 6,000 cinema lumens to over 100,000 lumens per screen (dual-projector) have been installed. This webcast steps through key terms and definitions, current commercial offerings, major system architectures, the impact of primary selection, and RGB laser versus blue-laser-pumped phosphor (BPP) designs. It also provides a tutorial on key figures of merit (FoM) for evaluating laser-illuminated projectors, such as brightness, brightness roll-off, lifetime, wall-plug efficiency, speckle contrast ratio, dynamic range, contrast ratio, and color gamut. The impact on 3D, six-primary 3D systems, and other laser-related topics will also be discussed.
Speaker: Justin 'JD' Nir - Consultant, Independent Security Evaluators
Original Airdate: 18 September 2014
A three-part SMPTE webinar series analyzing the Open Web Application Security Project (OWASP) Top 10
The Open Web Application Security Project (OWASP) Top 10 is a guideline commonly relied upon in the media and entertainment industry as a resource for securing web applications. However, misunderstandings about certain nuances commonly result in improper implementations that leave systems failing against modern adversaries. This three-part SMPTE webcast series will analyze the security flaws identified by the OWASP Top 10.
Speaker: John Hudson, Semtech
Original Airdate: 21 August 2014
Since its arrival in 2006, the latest generation of the Serial Digital Interface, 3Gb/s SDI, or "3G," has achieved widespread adoption, rapidly becoming the real-time streaming media interface of choice. What exactly is 3G SDI and how can it be used to create a reliable real-time streaming infrastructure?
In this final installment of a three-part Webcast, you will learn about 3G-SDI physical interface requirements and will also take away practical advice on designing, installing and operating reliable optical 3G SDI infrastructure and networks.
Speakers: Hans Hoffmann, EBU; Howard Lukk, Pannon Entertainment
Original Airdate: 15 July 2014
Whilst the consumer electronics industry pushes 4K screens onto the shelves, a technical debate has started on how to realise the full parameter set of UHDTV.
Please join Hans Hoffmann, EBU, and Howard Lukk as they discuss current and emerging trends in UHDTV and reveal the research work underway on HFR and HDR. In addition, our presenters will bring context to the question of gaps in the UHDTV chain. Do not miss this very important webcast!
Mesclado Webcast: New SOA Architectures (No Membership Required)
Original Airdate: 10 July 2014
Second Screen, Interactivity, Targeted Advertising, Metadata Repurposing, Big Data… Tech’ Innovation Success Stories
How can we add value to media programmes? Technology can help, especially in reaching a younger audience that is more connected than ever. Drawing on Mesclado’s own independent research lab and SMPTE's standardization effort, this jointly sponsored webcast will offer hard facts rather than the typical speculation and marketing-focused perspective. Presenters will provide a 360° vision covering production to distribution, whether traditional or online.
Speakers: Ron Hromoko, Tom Ohanian, Cisco Systems
Original Airdate: 1 July 2014
Media and broadcast companies are beginning to accelerate adoption of IT-based facilities and methods for content creation and delivery. New efficiencies and cost-models emerge when the adoption of IP and Ethernet transport reaches scale. Virtualization and Software-Defined Networking (SDN) are keys to the next phase of this transformation. This webcast will introduce concepts in Virtualization and SDN, as well as their applicability to Production Workflows and Digital Delivery Models.
Speaker: Frank Artes, NSS Labs
Original Airdate: 26 June 2014
The Engineering team often ends up as the voice of reason between the technology requirements of production, post-production and distribution. This SMPTE Education webcast will discuss the overall strategy and day-to-day steps to leverage cyber resiliency and common sense approaches to reduce your risk from cyber attack, system breaches, disruption of business, and loss of Intellectual Property. Please join guest speaker Francisco Artes, Chief Technology Architect at NSS Labs, and SMPTE for this very important educational opportunity.
Speaker: John Hudson, Semtech
Original Airdate: 22 May 2014
Since its arrival in 2006, the latest generation of the Serial Digital Interface, 3Gb/s SDI, or "3G," has achieved widespread adoption, rapidly becoming the real-time streaming media interface of choice. But what exactly is 3G SDI, and how can it be used to create a reliable real-time streaming infrastructure?
In this second installment of a three-part webcast series (Part 1 covered 3G SDI standards), participants will learn about 3G SDI physical interface requirements and will also hear practical advice on designing, installing, and operating reliable coaxial-cable-based 3G SDI infrastructure and networks.
Speaker: Richard Welsh, Sundog Media Toolkit Ltd.
Original Airdate: 17 April 2014
What does the term “cloud” really mean for the media industry? Is it just a buzzword or a genuinely useful and game changing technology? Where does it work and where does it not? What are the advantages versus challenges now and in the future? This SMPTE Educational Webcast explores the architectures, implementations and applications of cloud computing in practical terms.
Speaker: Seth Coe-Sullivan, QD VISION, INC.
Original Airdate: 1 April 2014
Quantum dots (QDs) are a new material that is already impacting the display industry, appearing in 2013 display products from 7” to 65”, and from tablets to televisions. These QD products are differentiated in their color gamut and color accuracy, two of the most critical performance characteristics for consumers and the SMPTE community. This webcast will cover the basics of what quantum dots are, the two methods by which they are integrated into display products, and how their characteristics directly influence display performance. We will then explore the relationships between color gamut and color accuracy, the issue of full gamut content delivery, and the role of standards in ensuring the best utilization of this new hardware innovation.
Speakers: Timo Kunkel and Scott Daly, Dolby Laboratories
Original Airdate: 20 March 2014
Light in the real world around us appears in a multitude of intensities and wavelength combinations. The human visual system (HVS) has evolved to sense and interpret this subset of the electromagnetic spectrum we call light to create the appearance of the real world with a wide palette of colors and large contrasts between light and dark.
In this presentation, we will discuss key display capabilities and trends, such as dynamic range and screen reflectance (from anorexic mirrors to curvy moth eyes), ambient light effects on displays in transition as well as on viewers, providing ways of creating better pixels.
Speakers: Bill Admans - Postproduction Professional and Technology Marketing Executive
Original Airdate: 20 February 2014
Since the demise of the CRT, the monitor landscape has changed dramatically. Where there was once one technology and few choices, there are now many different technologies and hundreds of choices. Monitors are used at every stage of the production and postproduction workflow, from on-set to final distribution. Choosing the right monitor to meet your workflow requirements is important. This webcast will explore the differences among monitor technologies and what they mean for you.
Speaker: Al Kovalick, Media Systems Consulting
Original Airdate: 19 December 2013
File-based (IT) production and broadcast workflows are now the norm across a large percentage of our industry. Though the transition to file-based systems is a relatively recent evolutionary milestone, a number of disruptive technologies are already poised to move real-time AV workflows several giant steps forward. No doubt you’ve already heard about Software-Defined Networks/Storage, 10G/40G Ethernet, Precision Time Protocol, fast network switching, compute virtualization methods, and widely available web apps (SaaS). These are the foundation upon which the “all IT facility” will be built. SMPTE’s guest speaker will discuss these catalytic technologies and why they are essential to the move to all IT. He will also describe the associated technical challenges and potential means to overcome them.
Speaker: John Zubrzycki, Principal Technologist at BBC Research
Original Airdate: 7 November 2013
Handling video and audio content as digital files brings tremendous advantages to the broadcast and media industry. Converting operations from familiar broadcast technologies to IT technologies promises to provide flexibility and savings, but the wrong choices could put your content at risk. The presentation will cover the basic steps needed to maintain the quality, safety and integrity in a digital production and archive workflow. It will include advice on handling coded content, on storage media, on using the cloud and explain the differences between a digital library and an archive. Ways of working with digital content in an IP infrastructure are still developing in this relatively new area for our industry and so pointers to sources of further help in SMPTE and elsewhere will be provided.
An Introduction to Holographic Television
Speaker: V. Michael Bove, Jr., MIT Media Lab
Original Airdate: 31 October 2013
Widespread recognition of some shortcomings of "traditional" 3D TV, and some recent technological advances, create an opportunity for holographic 3D TV for entertainment, telepresence, and teleoperation. In this webcast I'll review perceptual considerations for 3D TV, explain what true holographic television is (and how to distinguish it from things that marketers call holographic), as well as describe how recent developments in image capture, standardization, and computation may bring holographic television to market affordably sooner than many have predicted, co-existing with other kinds of 3D TV. I'll also explain the range of light-modulation technologies that various research groups (including mine) are exploring as part of developing holographic video display systems.
Speakers: David Wood and Greg DePriest
Original Airdate: 26 September 2013
During this SMPTE Monthly Webcast, we will jump feet first into UHD and how it will likely impact the broadcasting industry. Guest speakers David Wood and Greg DePriest will take us from today, discussing the fundamental basis of UHD, the global status of UHD standards, and how UHD will likely progress in terms of frame rate, HDR, and color bit depth. Additional topics of interest include the London Olympic Games experience, the rollout timeline for 4K/8K in Asia, and public information about who is doing what with 4K in select locations. Don’t miss this important discussion on a topic that will impact workflows over the near, mid, and long terms.
Speaker: Joseph Slomka, FotoKem
Original Airdate: 22 August 2013
Related Resources in the Library: ACES
This SMPTE Educational Webcast provides insight into the implementation of ACES in production. Our special guest speaker, Joseph Slomka, Vice President and Principal Color Scientist at FotoKem, will discuss the areas of production at FotoKem impacted by ACES and how ACES will likely affect feature motion picture production, animation, and VFX.
Speakers: David Stump, ASC and Andrew Watson, NASA
Original Airdate: 25 April 2013
Webcast Series: Digital Cinema
Related Resources in the Library: High Frame Rates
With the recent release of films produced and exhibited at frame rates higher than the traditional 24 frames per second (fps), many questions have arisen: What exactly is high frame rate in cinema? From where did the concept come? Why is it important today? What are the human psychophysical considerations?
Guest speakers David Stump, ASC, and Andrew Watson, NASA, explore this newly emerged storytelling tool.
Speaker: Jim Houston, Principal, Starwatcher Digital; Co-chair, ACES Project Committee
Original Airdate: 28 March 2013
Webcast Series: File-Based Workflows
Related Resources in the Library: ACES
As digital cameras and displays grow toward higher dynamic range and wider color gamuts, what tools are in place for the new digital workflows they demand? The Academy of Motion Pictures Arts and Sciences has been working on this question through its Science and Technology Council.
Speakers: Chris Lennon, SMPTE Engineering Director/President, MediAnswers, Harold Geller, Chief Growth Officer, Ad-ID
Original Airdate: 28 February 2013
Webcast Series: File-Based Workflows
Related Resources in the Library: BXF
2013 finds us at the intersection of several tools that enable the advertising workflows the industry has been seeking for years. New developments in SMPTE's Broadcast eXchange Format (BXF), along with Ad-ID, AS-12, and other areas, mean that we can now automate the flow of data throughout the advertising chain, from creation to airing and billing of a commercial.
Speaker: Jason Livingston, CPC Closed Captioning
Original Airdate: 24 January 2013
Related Resources in the Library: Closed Captions IP Delivery
On 30 September, an FCC regulation came into effect requiring TV broadcasters to implement captions for prerecorded programming that is not edited for Internet distribution. Livingston discusses best practices for workflows involving CEA-608 and CEA-708 broadcast closed captions data and translation into SMPTE 2052. (Free to All)
Speaker: Jim Whittlesey, Deluxe
Original Airdate: 13 September 2012
Webcast Series: Digital Cinema
Related Resources in the Library: High Frame Rates
With an eye on the ever evolving motion imaging technology horizon, high frame rate (HFR) content is the next challenge to impact the entire theatrical workflow. So what are the benefits of HFR technology? How will HFR affect the media workflow?