SMPTE was founded in 1916 to address the lack of standards in the emerging movie industry, where there was no agreement on film width, image format, perforations, etc., and only a small chance that a specific film could be displayed on a given projector. Today, SMPTE maintains a multitude of standards for film gauges from 8mm to 70mm, covering all these parameters plus many others such as edge coding and analog and digital sound. SMPTE standards also include lens mounts, spools, film processing and storage, cinema sound, and a plethora of other areas needed to ensure interoperability in a complex, global industry. The use of film is declining as it is supplanted by electronic imagery in both acquisition and display. However, SMPTE film standards continue to form the foundation for the industry and set a level of performance that is the benchmark for digital cinema.
Digital Cinema has been SMPTE’s ongoing opportunity to play a vital role in the reinvention of a 100-year-old industry. Early digital projectors proved the concept while revealing a tremendous gap between the norms of existing electronic imagery and the demands of cinematographers. Lengthy investigations and tests sponsored by the Entertainment Technology Center at the University of Southern California (ETC@USC) and Digital Cinema Initiatives (DCI) supported developments in the SMPTE D-Cinema Technology Committee. This committee represented all industry sectors, had participants from more than 20 countries, and included experts on all aspects of the associated arts and sciences.
The result is a suite of more than 30 SMPTE standards and engineering guidelines that have enabled a rapid and successful deployment of digital cinema, and a more engaging cinema experience. When interest in 3-D peaked, SMPTE D-Cinema standards were ready to support it. The standards have subsequently been enhanced to support higher frame rates (HFR), as used in the Hobbit movies, and work is proceeding to add support for immersive sound systems.
The standards mentioned above are specific to D-Cinema; they rely on other SMPTE standards, such as MXF (below), that provide a common foundation for development and operation in diverse areas.
Cinema Sound Systems (CSS): SMPTE’s Cinema Sound Systems Technology Committee (TC-25CSS) is charged with the creation of SMPTE standards and recommended practices to address opportunities created by the many technical advances since cinema sound standards were last created, nearly 30 years ago. Through this work, the committee is striving to improve the quality and consistency of cinema sound, so that no matter which cinema you view a film in, the experience is as close as possible to that of the mixing stage.
Digital Picture Exchange (DPX) is a common file format for digital intermediate and visual effects (VFX) work. DPX provides great flexibility in storing color information, color spaces, and color planes for exchange between production facilities. Multiple forms of packing and alignment are possible. The DPX specification allows for a wide variety of metadata to further clarify the information stored within each file. DPX is the format of choice worldwide for still-frame storage in most digital intermediate postproduction facilities and film labs.
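As a sketch of how self-describing the format is, the first bytes of every DPX file carry a magic number that also signals byte order, followed by generic header fields such as the offset to image data, the version string, and the total file size. The field layout below follows the generic file header of the DPX specification; error handling is minimal for brevity.

```python
import struct

def read_dpx_header(path):
    """Check the DPX magic number and read a few generic header fields.

    The 4-byte magic at offset 0 is b'SDPX' for big-endian files or
    b'XPDS' for little-endian ones; the offset to the image data, an
    8-byte version string, and the total file size follow.
    """
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic == b"SDPX":
            endian = ">"   # big-endian file
        elif magic == b"XPDS":
            endian = "<"   # little-endian file
        else:
            raise ValueError("not a DPX file")
        image_offset, = struct.unpack(endian + "I", f.read(4))
        version = f.read(8).rstrip(b"\x00").decode("ascii", "replace")
        file_size, = struct.unpack(endian + "I", f.read(4))
        return {"endian": endian, "image_offset": image_offset,
                "version": version, "file_size": file_size}
```

Because the magic number encodes the byte order, a reader can exchange files between big-endian and little-endian systems without out-of-band information.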
SMPTE has created video standards for many years, initially for North America and other countries adopting similar standards. In the 1980s, following close cooperation with SMPTE on the development of the first international digital standards, the European Broadcasting Union (EBU) decided to continue the close collaboration and to rely on SMPTE to publish standards for all areas of the world. SMPTE has made advances in both analog and digital formats, and many of its standards have served as the basis for ITU Recommendations.
In the mid-2000s, Japan’s national broadcaster, NHK, asked SMPTE to standardize the necessary parameters of a family of Ultra High Definition Television (UHDTV) formats, to provide a consistent basis for those doing development work in the field.
Electro-Optical Transfer Function (EOTF) and High Dynamic Range (HDR): this work supports consistent production of HDR content for multiplatform distribution. It means viewers will see a wider range from the brightest whites to the darkest blacks, providing a substantial enhancement to HD or UHDTV pictures. It is a “transfer function” document that standardizes a way to represent HDR content, starting with the camera and carried through the workflow to postproduction and delivery. Two additional related standards are expected by October/November, including one that addresses color gamut. SMPTE is now embarking on the next step, looking at the entire UHD ecosystem, including delivery to consumer devices such as TVs.
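Assuming the transfer function referred to here is the Perceptual Quantizer (PQ) curve of SMPTE ST 2084, the mapping from a normalized code value to absolute display luminance can be sketched as follows; the constants are the published m1, m2, c1, c2, c3 values.

```python
def pq_eotf(n):
    """PQ EOTF (SMPTE ST 2084): map a normalized code value n in [0, 1]
    to absolute luminance in cd/m^2 (nits), up to 10,000 nits."""
    m1 = 2610 / 16384          # ~0.1593
    m2 = 2523 / 4096 * 128     # ~78.84
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32     # 18.6875
    p = n ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
```

Code value 0.0 maps to 0 nits and 1.0 maps to the full 10,000 nits, while a mid-scale code value of 0.5 lands near 92 nits, illustrating how PQ allocates most code values to the darker region where the eye is most sensitive.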
SMPTE Time Code© gives every frame of video its own unique identifying number, makes digital editing possible, and enables the association of other data to make audio and video even more meaningful, accurate, and repeatable, whether in postproduction for a major studio release, in hard-news environments, or in live sports production. It even synchronizes music and is often used to automate lighting, pyrotechnics, video, and other effects in live concert production.
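The per-frame address is conventionally written as HH:MM:SS:FF. As a minimal sketch (assuming a non-drop-frame count at an integer frame rate; drop-frame timecode at 29.97 fps involves additional frame-skipping rules not shown here), converting a running frame count to that label looks like this:

```python
def frames_to_timecode(frame, fps=24):
    """Convert a zero-based frame count to a non-drop-frame
    SMPTE-style timecode label HH:MM:SS:FF at an integer frame rate."""
    ff = frame % fps                    # frame within the second
    total_seconds = frame // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = (total_seconds // 3600) % 24   # timecode wraps at 24 hours
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"
```

Because every frame gets a unique, monotonically ordered address, edit decisions, audio sync points, and lighting cues can all reference the same clock.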
Material eXchange Format (MXF): The changing technology of television production and digital services to viewers means the ways of moving content - program video and audio - around studios are changing too. There is now far greater use of IT-related products, such as servers; greater reliance on automation; and expanded reuse and repurposing of materials. The development of MXF is a remarkable achievement: it establishes interoperability of content between the various applications used in the television production chain. Interoperability leads to operational efficiency and creative freedom through a unified, networked environment. MXF is a file transfer format that is openly available to all interested parties. It is not compression-scheme specific and simplifies the integration of systems using MPEG and digital video (DV) as well as future, as yet unspecified, compression strategies. Simplified integration means the transportation of these different files will be independent of content, not dictated by the use of any particular manufacturer's equipment. Any required processing can be achieved simply by automatically invoking the appropriate hardware or software codec. MXF is designed for operational use and for environments where all handling processes are seamless to the user. MXF has also been adopted as the foundation for D-Cinema distribution.
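Part of what makes MXF codec-agnostic is its packet structure: the file is a sequence of KLV triplets, each consisting of a 16-byte SMPTE Universal Label key, a BER-encoded length, and the value bytes, so a reader can skip over any payload it does not understand. A minimal sketch of reading one such triplet:

```python
def read_klv(buf, pos=0):
    """Read one KLV triplet from an MXF-style byte stream.

    Each packet is a 16-byte Universal Label key, a BER-encoded
    length, then the value. Returns (key, value, next_position).
    """
    key = buf[pos:pos + 16]
    pos += 16
    first = buf[pos]
    pos += 1
    if first < 0x80:                 # short-form BER: length in one byte
        length = first
    else:                            # long form: next (first & 0x7F) bytes
        n = first & 0x7F
        length = int.from_bytes(buf[pos:pos + n], "big")
        pos += n
    value = buf[pos:pos + length]
    return key, value, pos + length
```

Because the key identifies the payload and the length tells a parser exactly how far to skip, unfamiliar essence types pass through the chain without breaking downstream equipment.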
SMPTE Color Bars© television test patterns have set the consistent reference point for more than four decades, ensuring color is calibrated correctly on broadcast monitors, in programs, and on video cameras, and displayed beautifully for consumers. Using color bars allows video, RGB, LCD, and plasma displays, as well as duplication, television, and webcast facilities, to maintain the intended chroma and luminance levels.
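The familiar top row of the pattern is seven bars at 75% amplitude in a fixed order (white/gray, yellow, cyan, green, magenta, red, blue). As a rough sketch of generating that row as an 8-bit RGB raster (assuming full-range 8-bit values, where 75% of 255 rounds to 191; a broadcast-accurate pattern would use video-range Y'CbCr levels):

```python
def smpte_75_bars(width=700, height=100):
    """Generate the seven 75%-amplitude color bars as an 8-bit RGB
    raster, stored row-major as a list of rows of (R, G, B) tuples."""
    v = 191  # 75% of full scale in 8-bit, rounded
    bars = [(v, v, v), (v, v, 0), (0, v, v), (0, v, 0),
            (v, 0, v), (v, 0, 0), (0, 0, v)]
    bar_w = width // len(bars)
    row = [bars[min(x // bar_w, len(bars) - 1)] for x in range(width)]
    return [row[:] for _ in range(height)]
```

The bar order descends in luminance from left to right, which is what makes miscalibrated brightness or channel swaps immediately visible on a monitor.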
Serial Digital Interface (SDI and HD-SDI), a well-established standard in the broadcasting industry, is a family of digital video interfaces used for broadcast-grade video. High-Definition SDI (HD-SDI) is used to transfer uncompressed high-definition video. These standards are used for transmission of uncompressed, unencrypted digital video signals (optionally including embedded audio and time code) within television facilities. SMPTE was awarded an Emmy® statuette for HD-SDI in 2013. HD-SDI is a 1.5 Gb/s interface; SMPTE has already published a 3 Gb/s version, and the committees are close to completing work on the 6 Gb/s and 12 Gb/s versions needed for UHDTV and other advanced imaging applications.
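The nominal "1.5 Gb/s" figure falls directly out of the 1080-line raster parameters. As a sketch of the arithmetic (assuming the common 1080i total raster of 2200 samples per line and 1125 lines at 30 frames/s, with 10-bit luma plus 10-bit multiplexed chroma per sample):

```python
# Why HD-SDI is a "1.5 Gb/s" interface: total raster samples
# (active picture plus blanking) times bits per sample.
samples_per_line = 2200    # total samples per line, 1080-line raster
lines_per_frame = 1125     # total lines, including vertical blanking
frames_per_second = 30
bits_per_sample = 20       # 10-bit Y plus 10-bit interleaved Cb/Cr

bit_rate = (samples_per_line * lines_per_frame
            * frames_per_second * bits_per_sample)
print(bit_rate)            # 1485000000, i.e. 1.485 Gb/s
```

Doubling the payload gives the 3 Gb/s interface, and the 6 Gb/s and 12 Gb/s versions scale the same serial structure for UHDTV rasters.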
SMPTE has published standards for many video codecs to provide well-reviewed documentation and enhanced interoperability. The latest of these is the VC-5 standard family that provides documentation and reference software for the video compression used in GoPro systems and workflows. SMPTE also has a new project to document the Apple ProRes codec.
Coding of Tactile Essence: Want to feel the roar of the engine while watching a car race? Tactile essence will make this possible! Tactile/haptic or motion-enabled broadcasts and transmissions can be described as the end-to-end use of technology to capture, encode, transmit, decode, and convert the tactile or haptic "feeling" and "impact" of a live event, so that a remote viewer can receive and experience not only the audio and video but also the haptic or tactile "feeling" and "impact" of that event, regardless of the transmission means, whether cable, satellite, over-the-air, or Verizon FiOS®.
Interoperable Master Format (IMF)
Did you know that there are often more than 35,000 possible versions of a film, covering cinematic exhibition, home viewing, broadcast, cable, in-flight entertainment, multiple languages, varying aspect ratios, and Internet distribution? The Interoperable Master Format (IMF) solves the problem of multiple versions and is now being deployed using SMPTE standards documents. The concept is that, rather than storing a vast number of versions, all the individual assets (such as the various possible video elements, the different audio and subtitle tracks, etc.) are stored individually and represent the inventory required to produce any required version. For each version, an extensible markup language (XML) composition playlist (CPL) specifies how the appropriate segments of each asset should be assembled to create the required program version. Automated systems can invoke the CPL to assemble any version on demand. A new version may be created at any time by writing a new CPL.
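The CPL idea can be sketched as a small XML document that references stored assets by ID and splices their segments into one version. The element names below are illustrative only, not the conformant IMF CPL schema; the point is that a few kilobytes of XML stand in for an entire rendered version.

```python
import xml.etree.ElementTree as ET

def build_cpl(title, segments):
    """Build a simplified, hypothetical CPL-like playlist.

    segments is a list of (asset_id, duration_in_frames) pairs;
    each becomes a Segment element referencing a stored asset.
    """
    cpl = ET.Element("CompositionPlaylist")
    ET.SubElement(cpl, "ContentTitle").text = title
    seg_list = ET.SubElement(cpl, "SegmentList")
    for asset_id, duration in segments:
        seg = ET.SubElement(seg_list, "Segment")
        ET.SubElement(seg, "AssetId").text = asset_id
        ET.SubElement(seg, "Duration").text = str(duration)
    return ET.tostring(cpl, encoding="unicode")
```

An automated mastering system would resolve each AssetId against the stored inventory and render the segments in order, so creating a new language version means writing a new playlist rather than duplicating the picture essence.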
SMPTE standards are developed principally to meet the needs of the media industry. However, modern technology allows much wider utilities, as we have seen with SMPTE Time Code.
Archive Exchange Format (AXF) is an IT-centric file container that can encapsulate any number and type of files in an entirely self-contained and self-describing package. AXF supports interoperability among disparate data storage systems and ensures long-term availability of data, no matter how storage or file system technologies evolve. The nature of AXF makes it possible for equipment manufacturers and content owners to move content from their current archive systems into the AXF domain in a strategic way that does not require abandoning existing hardware unless or until they are ready to do so. By enabling the recovery of archived content in the absence of the systems that created the archives, AXF also offers a valuable means of protecting users’ investment in content. AXF has already been employed around the world to help businesses store, protect, preserve, and transport many petabytes of file-based content, and the format is proving fundamental to many of the cloud-based storage, preservation, and IP-based transport services available today. Participation by bodies such as the Library of Congress, and by major storage companies, has helped to ensure that AXF will provide a compelling solution for any critical archiving requirement.
Media Device Control over IP
Today’s media storage, playback, control, and effects devices lack a standardized means of exposing control functions to both operators and software applications. Standardized simple machine-control functions such as PLAY, STOP, PAUSE, LIST, SEARCH, and JOG, along with the ability to query storage devices, would allow users to choose components and applications from various manufacturers. These would easily work together to provide control similar to the capabilities provided by older serial and parallel control technologies.
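To make the idea concrete, a standardized command set carried over IP might look something like the hypothetical JSON messages below. The verbs mirror the PLAY/STOP/PAUSE/LIST/SEARCH/JOG functions named above, but the message format is purely illustrative, not a published SMPTE protocol.

```python
import json

# Hypothetical sketch: a vendor-neutral transport-control message,
# serialized as JSON for transmission over IP. The verb set and
# field names are illustrative assumptions, not a real standard.
VERBS = {"PLAY", "STOP", "PAUSE", "LIST", "SEARCH", "JOG"}

def make_command(verb, device_id, **params):
    """Serialize one control command addressed to a media device."""
    if verb not in VERBS:
        raise ValueError(f"unknown verb: {verb}")
    return json.dumps({"device": device_id, "command": verb,
                       "params": params})
```

With a shared message set like this, a playout server from one manufacturer and a control panel from another could interoperate the way serially controlled decks once did.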