Cathy Huis in ’t Veld-Esser chuckles at the question: are we approaching a day when electronic delivery of digital cinema product will be truly global and seamless everywhere?
“The studios would love that,” says Huis in ’t Veld-Esser, longtime CTO of European digital cinema logistics company Gofilex. “If they could have a button on their desk and release their new movie to all the cinemas in the world in a common standard so we don’t have any operational problems in the cinemas by pressing that button—that would be their dream. All cinemas would be able to ingest the package, and play it out without any problems. Some countries have gotten really far into this, but for the whole world? That remains a big wish.”
As previously discussed in Newswatch, major engineering breakthroughs in recent years have enabled manufacturers of image capture and display devices to produce stunning results in terms of imagery that incorporates better color depth, deeper dynamic range, faster frame rates, and other enhancements that the human visual system can process and comprehend. These developments sometimes get conflated with the quest to bring greater realism to moving images, when often they are actually used to enhance a creative distortion of reality. David Long, an associate professor at the Rochester Institute of Technology (RIT), director of RIT’s MAGIC Center, a media research facility, SMPTE Fellow, and contributor to the SMPTE Educational Advisory Board, suggests the difference is subtle but crucial. Such imaging technology breakthroughs, he says, can actually be aimed at either responding to or perturbing human expectations for viewing imagery under particular conditions, not always in terms of pushing the bounds of human consumption of media by somehow bringing imagery “closer” to what we can see and experience in the real world.
While compression might not be the sexiest topic, the entire world is growing increasingly dependent on the process daily now that video-over-IP concepts are surging to the front of the broadcast mindset. And that’s why, for those who pay attention to such things, it’s great news that “we are currently seeing more movement out of MPEG [the Moving Picture Experts Group] than we have in a very long time,” according to Russell Trafford-Jones, manager of support and services for UK-based video-over-IP specialist Techex and editor of the educational industry website The Broadcast Knowledge. “In fact, we have many new compression standards just about ready to come forward. It’s a vibrant area. I’m seeing from my writing at The Broadcast Knowledge that posts about compression are among the most popular. That’s because, the truth is, everybody needs to know about compression.”
When Tomasz Witkowski, a senior workflow engineer for UK-based Sundog Media Toolkit, Ltd., a company that specializes in media workflows built on Cloud platforms, and his Sundog colleague, Richard Welsh, the company’s co-founder, set to work to prepare a September 2019 SMPTE Journal article on “Media Cloud Fundamentals,” they realized the idea of cloud computing was no longer a new concept for business professionals generally. But, Witkowski says, the idea of converting media production and distribution processes “from the ground into the cloud” remains a confusing one across the media world.
The rise of immersive audio for cinema has entered an exciting new chapter of final testing and preparations for rolling out the format widely some five years after SMPTE’s TC-25CSS Cinema Sound Systems Working Group on Interoperability of Immersive Sound Systems in Digital Cinema first formed to standardize the delivery of immersive audio. Brian Vessa, Sony Pictures’ Executive Director of Audio Mastering, has been a big part of this evolution as the Founding Chair of the TC-25CSS committee and chair of the IMF Audio Essence Drafting Group in SMPTE’s TC-35PM Media Packaging and Interchange Technology Committee.
As media companies battle to keep up with the industry’s ongoing transition to IT-based workflows, the questions of what, exactly, a workflow is and means on this new landscape, and how best to implement the most efficient ones, sit at the root of the transition. That’s the view of workflow expert Bruce Devlin, SMPTE’s Standards Vice President and, through his company Mr. MXF, a provider of a wide range of services to companies seeking to improve workflows and file-based operations generally.
New methods for broadcast facilities to distribute video and audio data and metadata around their studios have been evolving for some time. Indeed, a revolution is currently underway in this arena thanks to the use of IP-based networks for professional AV media. This development has radically changed just about everything, and as discussed in a 2018 issue of Newswatch, it’s been radically enabled by the move away from SDI and toward incorporating the SMPTE ST 2110 suite of standards for Professional Media Over Managed IP networks.
When discussing the current state of immersive media in theatrical environments, Steve LLamb, VP of Media Technology Standards at Deluxe, co-chair of the SMPTE 21-DC Digital Cinema Technical Committee, and Chair of the 21DC Immersive Audio Drafting Group, emphasizes that providing strategic enhancements to theatrical viewing experiences may someday become crucial for content creators fighting it out in a multi-platform universe. After all, he says, “From the creative producer’s standpoint, if you are talking about immersive media, there is nothing more immersive than being able to take your content with you, wherever you go, whether at home or in your car or on a plane, on a mobile device.”
As recent media reports, a quick tour around the Internet, a journey through any major app store, or an exploration of the capabilities of your new Smart TV illustrate, one of the most profound changes sweeping the broadcast industry right now is the rapid proliferation of so-called over-the-top (OTT) streaming content services like Netflix, Amazon Prime, Hulu (which recently came under Disney's full operational control), YouTube Premium, and others. Indeed, an Adweek article late last year indicated that 2019 alone was slated to see new OTT content services launching from AT&T, Apple, Viacom, Discovery, and most notably, Disney’s much ballyhooed Disney+ service, which launches in November. A more recent article in Forbes discussed a consumer OTT user survey, which indicated that 61 percent of Americans already own a Smart TV and 52 percent use OTT services to one degree or another.
Understanding the current state of the ultrahigh-definition (UHD) ecosystem, and especially the prospects for UHD broadcast, requires “considering the relative ease or difficulty with which rapid technological advances can be accommodated across each sector in the ecosystem.”
August - A Foundation for Immersive Entertainment
The ongoing expansion of “immersive” or “inclusive” entertainment of various types marks one of the more unusual game-changing technology trends the entertainment industry has seen in recent years. That’s because the trend appears to be running on multiple tracks, all with enormous potential. The use of new tools to create virtual reality (VR), extended reality (XR), augmented reality (AR), and hyper-reality experiences, along with new forms of immersive theater and a few other things along the way, appears to be significant both for the challenge of transforming that content into meaningful consumer entertainment experiences and for the tools used to make that content in the first place.
June - Comprehensive Media Management
To put it simply, the big problem with managing media workflows these days is that there is virtually unlimited data to manage, dozens of platforms to deliver to, and a plethora of sub-categories within the greater content creation and delivery umbrella, according to Jesse Korosi, director of workflow services at Sim Digital and co-chair of the HPA’s Young Entertainment Professionals (YEP) Committee. Korosi emphasizes that a discussion of real solutions about how best to collect, manage, and distribute assets and metadata together all the way across the production and post-production chain needs to be comprehensive and global, starting at the camera head onward.
May - Hot Button Discussion: Content Security on a Decentralized Landscape
When asked what has changed the most in recent years when it comes to the subject of content security, Marc Zorn suggests it is the need to stand vigilant over what is now a decentralized data chain. Zorn has almost 30 years of experience in information security practices, heads up productions and content security for HBO, and is also on the Program Committee for the SMPTE 2018 Annual Technical Conference.
March - Hot Button Discussion: Evolving into HDR Workflows
The broadcast industry’s ongoing push toward incorporating 4K/high-dynamic-range (HDR) imaging capabilities into workflows has led to stunning advancements, giving content creators the ability to more efficiently create such imagery and consumers the ability to view it more easily.
February - Hot Button Discussion: Integrating Young Professionals into Production Engineering: The Millennial's Perspective
With the dizzying speed of next-generation technological breakthroughs in the media and entertainment world these days, the focus is frequently, and understandably, on the creative, cultural, and business implications of these developments. However, the next generation of actual humans now joining the production engineering workforce and increasingly influencing how these new technologies can and should be applied to the media food chain on an ever-changing landscape often gets overlooked.
January - Hot Button Discussion: SMPTE ST 2110: IP Revolution's Next Step
The quest to make possible the wide-ranging adoption of IP-based professional production networks has greatly accelerated in recent months, thanks to concerted standards work across the industry. That work has led to SMPTE’s recent approval and publication of the SMPTE ST 2110 suite of standards for Professional Media Over Managed IP Networks.
November - Hot Button Discussion: Preservation and Archiving: Next Generation
Andrea Kalas likes to joke that the No. 1 instruction she gives her staff at Paramount Studios, where she serves as vice president of Archives, is “don’t lose stuff.” In actuality, Kalas’ concerns are far wider ranging. She is the outgoing president of the Association of Moving Image Archivists (AMIA)—a position that ends this month, with Dennis Doros of Milestone Films recently elected to succeed her. Given what she does for a living, and as someone intimately involved with the world’s largest association of professional moving image media archivists, her concerns revolve around figuring out the best methods for archivists to preserve reams of analog and digital media from industries across the spectrum on a rapidly changing, IT-centric landscape.
October - Hot Button Discussion: SDI in an IT World
When he spoke with Newswatch in 2014 about the status of Serial Digital Interface (SDI) technology on an increasingly IT-centric landscape, John Hudson, Director of Strategic Technology and New Business Development for the Semtech Corporation and Chair of SMPTE’s 32NF-40 Working Group on SDI interfaces and the 32NF-70 6G SDI Drafting Group, stated that he believed SDI would remain relevant for the foreseeable future as a crucial element of a burgeoning hybrid industry infrastructure.
August - Hot Button Discussion: Making Immersive Audio Work for VR
As the virtual reality (VR) revolution marches onward, much of the technical discussion revolves around real-time rendering of visuals to make an immersive, interactive environment seem natural and intuitive for users.
July - Hot Button Discussion: AI Pushes VR Forward
Kevin Cornish, founder of the Moth + Flame VR content creation studio, is considered a leading director of cinematic virtual reality (VR) entertainment content, yet it was only three years ago that he had his very first experience creating VR product.
May - Hot Button Discussion: New Media Over IP Standard Foundation Set
Heading into the 2017 NAB Show, Al Kovalick, founder of Media Systems Consulting and a longtime technical strategist and designer in the field of hybrid AV/IT systems, predicted that tools and techniques geared toward IT-based broadcast production facilities would be “the buzz of the show.” Shortly after the show ended, Kovalick concluded his prediction had been correct, that “everyone was talking about IP,” and that “we didn’t have nearly that much interest last year.”
April - Hot Button Discussion: The Live Stream Rises
It wasn’t that long ago—early 2016—that a novelty online live video stream traveling through Twitter’s Periscope platform lured approximately 20,000 people to their computers to watch pedestrians splash around in a giant puddle somewhere in England. That was just a bit of good fun, yet although live video streaming has been around for a while, mainly as a niche or supplemental broadcasting alternative, milestones within the medium have been piling up in recent years. As such, it is starting to become apparent that the live-streaming phenomenon may well have the potential to fundamentally change the entire broadcasting industry over time.
February - Hot Button Discussion: Creative Control as the Palette Expands
Over the years, Newswatch has examined the development, improvement, and standardization of modern image display enhancements by the technical community. The January 2017 Newswatch discussed how consumers perceive new image enhancements in typical viewing scenarios. That issue, however, connects to how content creators are using these enhancements. From that perspective, how has the perception of content creators been factored into the evolution of the so-called “expanded palette” provided by the technical community? By expanded palette, we mean, in particular, the three big-ticket visual improvements—high dynamic range (HDR), wide color gamut (WCG), and the use of higher capture frame rates (HFR).
January - Hot Button Discussion: Human Image Perception
As engineers push ongoing image display technology enhancements, and content creators experiment with how they can utilize those enhancements to tell stories, a few questions naturally come up. How do either engineers or content creators know what average consumers are capable of seeing when they watch images in state-of-the-art display situations anyway? And what are the goals of enhancing the viewing experience? Are they merely to evolve, change, improve, or differentiate that experience from what it was previously? Or are the goals about trying to bring it closer to what human beings experience when they view the real world?
December - Hot Button Discussion: Inside the HDR Spec
As the complex ultra-high-definition (UHD), high dynamic range (HDR) ecosystem continues to evolve on the broadcast front, there are some simple concepts that should be considered as these issues are addressed, in the opinion of Tim Borer, Lead Engineer for Immersive and Interactive Content for the BBC’s Research and Development Division and a SMPTE Fellow. The first concept to understand is that what used to be known as “broadcast television” has evolved into “the Wild West when it comes to viewing environments,” meaning that broadcasters cannot possibly have any idea what viewing device or environment consumers will be using to watch their content at any given moment. Secondly, when addressing the issue of improving dynamic range and brightness values on consumer televisions and the content being viewed on those televisions, manufacturers and broadcasters would do well to keep the first concept in mind. They should avoid thinking in terms of a single, one-size-fits-all solution when it comes to distributing programming for the new generation of HDR displays.
November - Hot Button Discussion: Why Dynamic Metadata Matters
In the year and a half since he last spoke with SMPTE Newswatch about efforts to standardize ways to manage and convert content color information while mastering high dynamic range (HDR) content for a wide range of platforms and formats, Lars Borg, principal scientist in Adobe’s Digital Video and Audio Engineering Group and Chair of one important SMPTE workgroup in this arena, suggests the ball has not only moved forward, but also far and wide, particularly where the topic of dynamic metadata is concerned. Of course, he concedes, for many people, this is a difficult issue to understand in the sense that they don’t exactly know how to define dynamic metadata.
October - Hot Button Discussion: Using the Cloud for Production
It’s currently an exciting time for media professionals using Cloud-based computing services for content production and post-production work. That’s because Cloud tools and techniques for particular services have arrived at a point in history where they have become practical and affordable options for content creators. Simultaneously, the technology appears to be showing great potential for making certain new applications available to media professionals, as well. At least that’s the view of Richard Welsh, a longtime SMPTE standards contributor and participant and CEO of the UK’s Sundog Media Toolkit, a company that offers a variety of Cloud-based post-production software services for the feature film and broadcast industries.
September - Hot Button Discussion: Significance of SMPTE Standards
As part of SMPTE’s ongoing Centennial celebration, it’s useful to examine some of the major technology milestones directly influenced by the Society’s 100 years of work standardizing important breakthroughs within the motion imaging industry and engaging in a wide range of educational and outreach activities to increase awareness of these efforts and their importance. A recent SMPTE Webcast, for example, examined the top SMPTE standards that have fundamentally helped shape the industry over the last century.
August - Hot Button Discussion: Compression Down the Chain
The topic of compression has never been a sexy one, but inside the increasingly complex production ecosystem that serves the modern, multi-platform, high-resolution broadcasting industry, few subjects are more important. That’s the view of Jean-Baptiste Lorent, product and marketing manager of Belgium-based IntoPIX, an independent image technology company deeply involved in various initiatives related to new and evolving compression schemes. As Lorent points out, with more data and larger files involved with capturing, processing, and transmitting broadcast image files, “we have a lot of challenges to solve in the full chain—all the way from initial image capture to the final display of those images, plus storage, bandwidth, and interface issues.”
June - Hot Button Discussion: Acquisition HDR
As the broadcast industry pivots into 4K UHD broadcasting, much of the chatter revolves around how best to transmit and manifest 4K signals so that high dynamic range capabilities are maximized on the first generation of 4K displays, as discussed in the May issue of Newswatch. After all, HDR is all about increasing the range of brightness in images in order to boost contrast between the whitest and the blackest elements. However, the front end of the equation in terms of acquiring the image with the greatest dynamic range possible, to begin with, and then maintaining it all the way down the chain, is also evolving. Indeed, image capture for various types of broadcast applications, particularly live broadcasting, is being forced to factor in numerous changes and new considerations with the advent of UHD. Among them—how best to capture, utilize artistically, and then maximize the effect of greater dynamic range, and how best to combine higher dynamic range capabilities with other UHD improvements, including wider color gamut, higher frame rates, new scanning formats, and other considerations.
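The contrast improvement described above is usually quantified in stops, i.e., doublings of light between the display's black level and its peak white. As a rough illustration of that arithmetic, here is a minimal sketch; the luminance figures are illustrative assumptions, not values from the article.

```python
import math

def dynamic_range_stops(peak_nits, black_nits):
    """Dynamic range in stops (doublings of light) between peak white
    and the darkest reproducible black level."""
    return math.log2(peak_nits / black_nits)

# Assumed figures for illustration only:
# a conventional SDR display (~100 nits peak, ~0.1 nit black)
print(round(dynamic_range_stops(100, 0.1), 1))    # ~10 stops
# a hypothetical HDR display (~1000 nits peak, ~0.005 nit black)
print(round(dynamic_range_stops(1000, 0.005), 1)) # ~17.6 stops
```

The point of the sketch is simply that raising peak brightness while also lowering the black floor compounds: both ends of the range contribute stops.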
May - Hot Button Discussion: How Best to Interface HDR into 4K Displays
When recently asked to identify a pressing issue in the display-manufacturing world, veteran broadcast industry analyst Pete Putman pointed to the challenge of most efficiently and seamlessly incorporating high dynamic range capabilities into the next generation of consumer displays. As Putman, a member of SMPTE’s Annual Technical Conference Committee, discussed last year in a Newswatch article, the overall industry trend line remains steadfastly pointed in the direction of phasing out large-screen high-definition televisions and replacing them with Ultra HD (3840 x 2160) TVs. But his point was that this ongoing transition is still in its early stages, meaning that some image display improvements are not currently available, easily accessible, or available in their most efficient form in the first generation of large-screen UHD televisions for various reasons. And none of those improvements, he adds, are more important than the industry’s recent thrust into the world of high dynamic range (HDR).
April - Hot Button Discussion: IP for Content Creators
Much of the focus regarding media’s thrust into the world of IP-based data transmission has revolved around the issues of how to build Ethernet foundations for broadcast plants, and the various methods for delivering content efficiently to consumers using Internet Protocol (IP)-based methodologies and systems. But beyond broadcasters, content creation companies of all sizes and types are also currently in various stages of melding into the IP universe. For them, “there is a definite push to make the entire content industry a data-centric industry, and we are in that transition right now, but it’s not yet a done deal,” explains John Footen, Partner in the Information, Media and Entertainment Business Consulting Practice at Cognizant Technology Solutions. Footen, who has previously served as co-chair of SMPTE standards committees TC-32NF (Network/Facilities Architecture) and TC-34CS (Media Systems, Control, and Services), says “we are arguably at the beginning of that transition, not even at the halfway point yet.”
March - Hot Button Discussion: Light Field Cinema Capture Coming
Conceptually, the notion of capturing what has come to be known as “light field” imagery dates back to Leonardo da Vinci in the 16th century. Back then, da Vinci detailed theories about an “imaging device capturing every optical aspect [of] a scene.” In one manuscript, he talked about “radiant pyramids” being visible in images. Today, optical experts say by “radiant pyramids,” da Vinci meant “light rays,” and explain that da Vinci was describing what we now refer to as “light fields.”
February - Hot Button Discussion: Drone Implications
While so much of the news about unmanned aerial vehicles (UAVs) revolves around the many business, safety, and cultural implications of what people commonly refer to as drones, for the cinematography community, the technology’s rapid evolution has brought with it creative implications of potentially huge proportions. That’s the view of Michael Chambliss, a longtime director of photography/camera operator, and now a business representative for the International Cinematographers Guild (ICG), focused on the implementation of new on-set technologies. Chambliss, who also serves as the ICG representative on the ASC Technology Committee, the Virtual Production Committee, and on USC Entertainment Technology Center projects, suggests that drone-based camera work is leading to a potentially new and important style of cinematography—a development akin to the arrival of the Steadicam in the 1970s.
November #2 - Hot Button Discussion: Cinematically Immersive Environments
One of the less well-defined concepts percolating through the media industry right now is the notion of the so-called multi-screen or multi-view environment. The term can potentially be applied in different directions, depending on whether one is discussing home viewing environments, cinematic environments, or virtual environments, and whether one is discussing the use of multiple devices to view and digest content or supplement primary content with secondary content, or the use of multiple screens to create a single image for an immersive viewing experience in a cinematic setting.
November #1 - Hot Button Discussion: Implementing Assistive Technologies
Since SMPTE Newswatch last examined the topic of closed captioning and other accessibility technologies a couple of years ago, not much has changed in terms of governmental regulatory requirements on broadcasters to widen access to modern communication technologies. Indeed, the only major recent action taken by the FCC regarding accessibility related to the expansion of rules regarding how to get critical emergency information to consumers with visual impairments by making that information accessible on their so-called “second screen” personal assistive devices. However, since the Twenty-First Century Communications and Video Accessibility Act of 2010 was passed, the media industry has steadfastly been seeking ways to make captioning, video description, and other enhancements more consistently available with their content across all platforms. In fact, the action in this space right now appears to be focused mainly around how to most efficiently implement the FCC’s requirements across an industry that “broadcasts” content just about everywhere, to everyone, using both traditional and non-traditional methods, and delivery and viewing systems.
September - Hot Button Discussion: Precise Time and Synch on an Imprecise Landscape
Over the course of the past year, one consequence of the media industry’s initial strides toward networked, IT-based broadcast facilities has been for the industry to ramp up an examination of the steps needed to ensure that broadcasters can generate and transmit synchronous video signals across networks large and small, including on and between those that are state-of-the-art and those that will still dwell in a barely digital nether-region for the time being. In other words, the industry has been working on the problem of what is the most efficient way to generate, transmit, and manage such signals through a hybrid, or patchwork, minefield of systems and technologies. This question has led SMPTE and other industry bodies to advance work on how best to apply the industry standard specification for ensuring that synchronized time can be delivered from control stations to all kinds of slave devices—the IEEE’s 1588 Precision Time Protocol (PTP)—to professional broadcast applications specifically.
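At the heart of IEEE 1588 PTP is a simple timestamp exchange: the master and the receiving device swap four timestamps per round trip, from which the device computes its clock offset and the network path delay. The sketch below shows that core calculation; the timestamp values are illustrative assumptions, and real PTP implementations add filtering, hardware timestamping, and servo control on top of this.

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Core IEEE 1588 offset/delay math for one exchange.
    t1: master sends Sync;        t2: device receives Sync;
    t3: device sends Delay_Req;   t4: master receives Delay_Req.
    All in seconds; assumes a symmetric network path."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # device clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # one-way path delay
    return offset, delay

# Illustrative exchange: device clock runs 3 ms ahead of the master,
# with a 1 ms one-way path delay (assumed values).
offset, delay = ptp_offset_and_delay(100.000, 100.004, 100.010, 100.008)
print(offset, delay)  # offset ~0.003 s, delay ~0.001 s
```

The symmetric-path assumption is exactly what becomes fragile on the hybrid, patchwork networks described above, which is part of why profiling PTP for professional broadcast applications required dedicated standards work.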
August - Hot Button Discussion: Compression in Context
Since Newswatch reported a year ago on the ongoing rollout of the High-Efficiency Video Coding compression standard (HEVC, also known as H.265 and MPEG-H Part 2), the industry conversation about HEVC’s importance as the necessary linchpin that enables the transmission of ultra-high-definition (UHD) content to consumer homes has not changed much. Quite simply, “for UHD specifically, basically the only way to bring that content to the home efficiently is to use HEVC,” states Matthew Goldman, senior vice president of TV compression technology at Ericsson and a SMPTE Fellow, reiterating a point he made in that September 2014 issue of Newswatch. “We needed a much better compression scheme than what we were using previously for UHD, and HEVC was that scheme. So right now, that is where HEVC is gaining traction—in places where you can’t [compress a so-called 4K signal and transmit it to consumers] any other way. That is where we are now seeing HEVC implementations. But for more traditional television, such as linear [HD] broadcasting, HEVC’s rollout is more in the future—we don’t expect it to happen quickly. It will be ongoing.”
July - Hot Button Discussion: Broadcasting Live Events in UHD
Much of the conversation regarding the broadcast industry’s next great leap forward into the world of Ultra High Definition (UHD) has centered around how broadcasters will be building or rebuilding their wider infrastructures on IT-based foundations capable of handling the high-bandwidth data that the UHD broadcast paradigm requires. Less debated are the nuances of the front end of the UHD transition—image capture. This is largely due to the belief that ultra-high-resolution cameras have become so common that this shouldn’t be much of an issue. But that notion is simplistic, according to many broadcast professionals, because it references modern digital cinematography camera systems with high-resolution imaging sensors, none of which are particularly applicable to conversations involving the broadcast of live events, particularly where sports and action are concerned. But figuring out how to shoot and broadcast that kind of content specifically is crucial to broadcasters because it is live content—sporting events, concerts, breaking news, and the like—that modern, IT-based streaming services like Netflix are not addressing. That means such content remains the province of major broadcast entities as the UHD era dawns, and they need to shoot such events so that the images will translate well on UHD televisions configured for watching 4K resolution movies with a variety of other image characteristic improvements—greater dynamic range, higher frame rates, better color, among other things.
June - Hot Button Discussion: Pushing Cinema Sound Systems Into the Future
The “thickest” part of the work currently being conducted by the SMPTE Technology Committee on Cinema Sound, TC-25CSS, is the ongoing initiative on the interoperability of immersive sound systems for cinemas, according to Brian Vessa, chairman of the technology committee and also executive director of digital audio mastering at Sony Pictures Entertainment. That work is a reaction to an audio technology revolution from a small group of companies that eventually caused the industry to conclude, as Newswatch noted last year, “that a single specification for the packaging, distribution, and theatrical playback of D-Cinema-based audio tracks that pushes past what was initially described in the original Digital Cinema Initiative specification” was of crucial importance.
May - Hot Button Discussion: IMF's Growing Relevance
With the rollout of the Interoperable Master Format (IMF) ongoing, it’s instructive to examine the impact so far, and looming next steps for the flexible, new international standard format for file-based professional workflows. IMF is essentially an umbrella term for a linked family of standards that permit content publishers and distributors to exchange master files and linked metadata that make it more efficient to disseminate different versions of their material to all the world’s viewing platforms and territories, no matter what form those platforms may take today or in the foreseeable future.
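The efficiency IMF offers comes from its composition model: each deliverable version is described by a playlist referencing shared track files, so territorial or platform versions reuse common essence rather than duplicating full masters. The toy sketch below illustrates that idea only; the asset names and dictionary structure are illustrative assumptions, not the actual IMF XML schema (CPL/PKL/OPL).

```python
# Essence shared by every version of a hypothetical feature
# (file names are made up for illustration).
common_assets = {"picture": "feature_picture.mxf",
                 "music_effects": "feature_me_audio.mxf"}

# Each "version" only adds what differs: language dialogue, local titles.
versions = {
    "US":     {**common_assets, "dialogue": "dialogue_en.mxf"},
    "France": {**common_assets, "dialogue": "dialogue_fr.mxf",
               "inserts": "titles_fr.mxf"},
}

# The heavy picture and music/effects essence is referenced, not copied.
shared = set(versions["US"].values()) & set(versions["France"].values())
print(sorted(shared))  # ['feature_me_audio.mxf', 'feature_picture.mxf']
```

In real IMF packages this referencing is done by UUID from a Composition Playlist, but the economics are the same: only version-specific material travels with each new deliverable.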
April - Hot Button Discussion: Better Dynamic Range All Around
With terms like ultra-high-definition (UHD), 4K, and “next-generation television” flying furiously through broadcast industry conversations these days, an obvious question is often overlooked: What is the core image improvement that consumers are most likely to notice, pay for, and at the end of the day, care about? Generically, many people say that “better resolution” is what next-generation television is all about. But, as Lars Borg, principal scientist in Adobe’s Digital Video and Audio Engineering Group, emphasizes, that’s a subtle concept that depends on a host of factors related not only to how content is created and mastered, but also to how it is transmitted and viewed, on what device, in what room, and under what conditions.
March - Hot Button Discussion: UHD Televisions Advance
About a year ago, broadcast industry analyst Pete Putman suggested to Newswatch that while the broadcast industry's overall transition to a 4K Ultra High Definition (UHD) ecosystem would be a long and winding process, a foundational element was already well under way: the inexorable march by consumer display manufacturers toward phasing out large-screen, high-definition televisions and replacing them with Ultra HD panels (with a resolution of 3840 x 2160). A year later, Putman says the display industry's transition has picked up considerable speed, to the point where he expects the production of large-screen 1080p HDTVs to largely cease in the next few years.
February - Hot Button Discussion: The UHD Chain, One Link at a Time
As new technological developments make it increasingly clear that the broadcast industry's commitment to pushing past HD and into the Ultra High Definition (UHD) realm is continuing to pick up steam, it's also clear that this is a broad, open-ended, and somewhat fuzzy process. When the initiative gets boiled down to specific aspects of the broadcast chain, the industry remains far from having all the answers in terms of how to implement a full UHD broadcasting ecosystem any time in the near future.
January - What's the Best IP Video Path Forward?
With 2015's arrival, the work being done to drive the broadcast industry toward IP-based foundations for broadcast and studio facilities was accelerating, suggests longtime industry consultant Wes Simpson, who works with SMPTE to help produce its Annual Technical Conference & Exhibition and online courseware regarding IP video. Simpson argues that the industry has now arrived at a place where the IP video debate has been comfortably settled as it relates to the transition to IP for delivering video to consumers, now that the new ATSC 3.0 specification, which is based on IP, is close to being finalized.
December - New Tools, New Creative Possibilities
As 2015 arrives, it's appropriate to ponder the creative impact of some of the numerous technological advancements unleashed across the cinematic landscape that SMPTE Newswatch has been examining in recent months. By that, we mean: how have the technical capabilities provided by new and improved digital tools, workflows, and techniques advanced the creative community's use of image and sound attributes like 3D, immersive audio, higher frame rates, and higher dynamic range, among other things, to tell stories?
November - Achieving IP-Based Facilities for Content Creators
As content creators and broadcasters continue to build new IP-based foundations for their facilities, two factors are becoming increasingly clear. The first is that Ethernet technology is rising to the forefront of this transition as the industry's best and most reliable replacement for SDI for moving live video data streams over IP networks, as recommended by the SMPTE 32NF-60 Working Group and discussed in the November 2013 Newswatch. It brings with it a host of corresponding technological innovations that make it possible for broadcast plants, production companies, and studios to build IP-style plants. The second point is that all the cool technology in the world won't matter much if it can't be seamlessly designed, engineered, utilized, and integrated by such entities, which, for decades, have relied on analog equipment and physical, passive media inside the heart of their facilities.
October - Augmented Reality Light Fields
Of the many interactive digital technology platforms designed to offer modern consumers new entertainment experiences, none have the paradigm-changing potential of so-called virtual reality and augmented reality technology. This was clearly demonstrated earlier this year when Facebook announced a massive $2.3 billion acquisition of videogame start-up Oculus VR, creator of the Oculus Rift virtual reality headset. The acquisition was part of a Facebook strategy not only to move into the space generally, but also to explore whether virtual reality (VR) technology--originally developed for professional simulation applications in the military, medical, and industrial design worlds, and later ported over to the gaming universe--might eventually evolve into a platform for the ultimate interactive consumer shopping and entertainment playground of the future.
September - HEVC Slowly Rolls Out
Since SMPTE Newswatch last reported on the then-impending formal arrival of the High-Efficiency Video Coding compression standard (HEVC, also known as H.265 or MPEG-H, Part 2) in 2012, much has transpired in terms of the standard's direction and potential impact on a wide range of improvements regarding the efficient broadcasting of high-quality video content direct to consumers. The bottom line on one hand, suggests Matthew Goldman, senior vice president of TV compression technology at Ericsson, is that HEVC has clearly established itself as the eventual enabling compression standard for making the transmission of ultra high-definition television (UHDTV) content to consumers possible. On the other hand, he points out that the process of fully rolling the new spec into the professional and consumer hardware systems necessary to make all this work efficiently could take several more years as the horse, in essence, is only now leaving the starting gate.
August - Approaching an Immersive Audio Standard
While precise technical and timeline details about a single, internationally accepted standard for an interoperable, immersive audio format for digital cinema remain to be settled, one important aspect of this quest has crystallized in the past year or so, according to Peter Ludé, chair of the SMPTE Working Group on Immersive Sound (operating under the auspices of the SMPTE TC-25CSS Audio Technology Committee on Cinema Sound Systems). Ludé says the big breakthrough in the past year has not been technical in nature. Rather, it has been the fact that virtually everyone with a stake in the conversation has gotten publicly and enthusiastically on board with the industry's need for a single specification for the packaging, distribution, and theatrical playback of D-Cinema-based audio tracks that pushes past what was initially described in the original DCI (Digital Cinema Initiatives) specification.
July - HFR's Next Steps
Among the many digital cinema standardization processes currently in various stages of development, none is more dependent on input and interaction with the creative community than the issue of higher frame rates (HFR). That is especially true these days, as the industry's examination of how, where, and when HFR might, could, or should be woven into the digital cinema fabric on a formal basis is largely reliant on what it learns from ongoing experimental work by the creative community as part of a strategy to analyze HFR's technical impact on workflow and distribution, and its creative impact on viewers.
June - UHDTV: Which Flavor?
Since SMPTE Newswatch last spoke with David Wood about the status of ongoing efforts to develop an international framework of standards for ultra high-definition television (UHDTV) last fall, Wood suggests a key development on the UHDTV landscape involves a sharpening of several basic "world views" about how the next generation--or rather, generations--of broadcast viewing technology should evolve in the coming years.
May - The HDR Ecosystem
While many image quality improvements are easily definable and explainable as the digital broadcasting world pushes into the ultra-high-definition (UHDTV) era, the question of high dynamic range (HDR) is less well-defined. At this year's NAB show, HDR was a hot topic certainly, and both professional and consumer display manufacturers are now vigorously promoting different ways in which they are improving the quality of their pixel offerings.
April - The Business of 4K Displays
As NAB 2014 approaches, the broadcast industry's march into the world of 4K Ultra High Definition (UHDTV) understandably lies at the heart of many trend conversations about where the next generation of display monitors is heading. As SMPTE Newswatch reported in the October 2013 issue, both the technology and the standards work for UHDTV are moving along at a relatively fast clip, and hardware manufacturers are already well into their rollout of the first generation of UHD televisions. At the same time, however, questions permeate those conversations about how rapidly broadcasters can convert their infrastructures for UHD, how they will create meaningful 4K content to distribute widely, and whether UHDTV will, at the end of the day, catch on with consumers any faster or more permanently than 3D or Smart TV technology did.
March - Evolution of SDI
The startlingly rapid evolution of new broadcast television and digital cinema applications, such as 3D, UHDTV1, UHDTV2, and 4K D-Cinema production, all potentially at increased frame rates and bit depths, has created a "bandwidth disparity" in terms of real-time streaming media bandwidth capability, suggests John Hudson, director of product definition and broadcast technology at the Semtech Corporation. Hudson chairs SMPTE's 32NF40 Working Group on SDI interfaces and the 32NF70 6 Gbit/s SDI drafting group.
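To illustrate the bandwidth disparity Hudson describes, a rough back-of-the-envelope sketch (not from the article) can estimate the bit rate of an uncompressed stream. It counts only active-picture samples for 10-bit 4:2:2 video (two samples per pixel) and ignores blanking intervals and interface overhead:

```python
def payload_gbps(width, height, fps, bits_per_sample=10, samples_per_pixel=2):
    """Approximate active-picture bit rate in Gbit/s.

    10-bit 4:2:2 video carries two samples per pixel (Y plus alternating
    Cb/Cr), so each pixel costs bits_per_sample * samples_per_pixel bits.
    Blanking and interface overhead are ignored in this sketch.
    """
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

for name, w, h, fps in [("1080p60", 1920, 1080, 60),
                        ("UHDTV1 p60", 3840, 2160, 60)]:
    print(f"{name}: ~{payload_gbps(w, h, fps):.2f} Gbit/s")
```

Even before overhead, a 2160p60 stream at roughly 10 Gbit/s exceeds a single 6 Gbit/s SDI link, which is why both multi-link arrangements and faster single-link interfaces are on the working groups' agendas.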
February - AXF Advances, 8K, and Digital Workflows
It has been noted in recent issues of SMPTE Newswatch that 2014 is shaping up to be a year in which standardization efforts to address certain long-standing, crucial issues in the world of electronic media data transmission will bear major fruit. Among these is the issue of interoperability in data archive systems. To address this issue, in recent years, the SMPTE Working Group TC-31FS30 has been pushing to develop an open specification, the Archive eXchange Format (AXF), and codify it into an official industry standard. AXF addresses the need for interchange of archived data and is intended to end years of separate, proprietary approaches to archiving, in which systems from different manufacturers do not have the ability to simply and efficiently migrate data to and from systems of other manufacturers.
January - Lip-Sync Progress, Faster Satellite IP, Standardized Camera Reports, and more
In the world of technical broadcast standards, 2013 showed major progress, and 2014 promises to significantly extend it, particularly as it relates to a problem that began vexing content creators almost from the first day that sound was added to moving images, and then became far more complicated once the digital broadcast, multiplatform era took over. That longstanding problem involves how to seamlessly synchronize audio and video program signals from the moment of acquisition to when that content reaches viewers. This challenge, of course, is better known as the "lip-sync issue."
November - VoIP for Professional Media Networks
As the Video over Internet Protocol (VoIP) revolution marches on across the media content creation, distribution, and consumption sectors, it is helpful to consider what the term VoIP means to broadcasters as they push to evolve their facilities for the file-based era. From their point of view, in terms of adopting IT tools and principles, the SMPTE 32NF-60 Working Group is focusing on VoIP specifically as it relates to professional media networks--the professional use of live video over IP.
October - The UHDTV Paradigm
Headed to IBC 2013, where Ultra High Definition Television (UHDTV) was to be a prominent topic, David Wood, Chairman of the International Telecommunication Union (ITU) Working Party 6C, the group responsible for making international recommendations associated with UHDTV, was eager to discuss the latest developments. UHDTV has clearly soared from being a buzzword and interesting technological experiment to a format that will have major implications for broadcasters, filmmakers, and consumers in the coming years.
August - Workflows in the Cloud
Now that media distribution companies such as Hulu, Netflix, and others have successfully pioneered ways to distribute media content using Cloud computing-based systems, the industry is pondering what parts of the content production workflow chain might someday be moved into the Cloud.
July - Broadcast Networking Infrastructures
If you want to see a technological culture clash in action, look no further than the ongoing drive to modernize the networking infrastructures of typical broadcasting institutions. It is there, suggests broadcast media consultant and longtime SMPTE Fellow John Luff, that traditional broadcast techniques, IT technology, and financial pressures are routinely colliding these days.
June - Internet Broadcasting
With all the buzz surrounding digital television these days, the impact of a specific subset of that category--Internet Protocol Television (IPTV) and broadcasting video via the Internet in a myriad of ways--can get lost in the haze of all the new camera, encoding, and display technologies and the fancy, new workflows and infrastructure overhauls that are now permeating the broadcast world. That said, in a sense, no new technology has more fundamentally impacted the traditional broadcast industry model than the Internet. After all, studies consistently show that the rapid growth of IPTV subscribers around the world, the wide-ranging distribution of Internet-enabled television technology, and the ongoing movement toward Web-based services such as Netflix and Hulu have transformed home media consumption, shifting it away from merely "watching TV" toward home media centers, where TV, streaming, games, data, and interactivity with content are all available to the consumer on demand. This development has diluted, disrupted, or altered traditional broadcast viewing numbers, habits, programming strategies, and business models in recent years.
April - Compression Trends: HEVC and More
The July 2012 SMPTE Newswatch covered how the Main Profile of the new High Efficiency Video Coding (HEVC) scheme, technically known as ITU-T H.265|ISO/IEC 23008-2, was on the verge of being finalized--a potentially significant leap forward in helping broadcasters achieve meaningful bandwidth savings over the existing H.264|MPEG-4 AVC standard when distributing high-quality video images. This past January, as expected, the ITU-T Study Group 16 achieved consensus, and MPEG received approval for publication of the standard. According to industry experts, HEVC has the potential to reduce data bit rates by as much as 50% compared with the current AVC standard. The new standard includes a Main (8-bit support) profile and a Main-10 (10-bit support) profile.
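The practical effect of that roughly 50% figure is easy to sketch. The multiplex capacity and service bit rates below are hypothetical illustrations, not figures from the article; they show how halving the per-service rate changes how many services fit in a fixed-capacity multiplex:

```python
def services_per_mux(mux_mbps, service_mbps):
    """How many fixed-rate services fit in a multiplex of a given capacity."""
    return int(mux_mbps // service_mbps)

MUX_MBPS = 38.8          # hypothetical multiplex capacity, Mbit/s
AVC_HD = 8.0             # hypothetical AVC HD service rate, Mbit/s
HEVC_HD = AVC_HD * 0.5   # assuming the ~50% savings cited for HEVC

print(services_per_mux(MUX_MBPS, AVC_HD))   # 4 services under AVC
print(services_per_mux(MUX_MBPS, HEVC_HD))  # 9 services under HEVC
```

The doubling-plus in channel count (integer division leaves headroom that HEVC can also absorb) is the core of the business case for the new codec, even before UHD enters the picture.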
March - What's Next for Digital Cinema?
In an era where manufacturers of digital cinema-related technologies are routinely pushing an onslaught of tools and workflow techniques to advance what is theoretically possible in categories like stereoscopic imagery, higher frame rates, laser projection, content security, image capture, and so many others, the question to ask is what will eventually be feasible, rather than what will be possible. Keeping that question in mind, the follow-up might be, what is the next big thing for digital cinema, or at least, what should be the next big thing?
February - Closed Captioning
In recent years, most of the significant action in advancing the accessibility of closed captions in media has occurred in the world of broadband/Internet-based video. This march forward shifted into high gear in the past few years, since the Federal Communications Commission (FCC) encouraged adoption of the SMPTE Timed Text (SMPTE-TT) format for delivering closed captions over the Internet.
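SMPTE-TT builds on the W3C Timed Text Markup Language (TTML): captions are XML paragraphs carrying begin/end timing attributes. As a rough illustration (a sketch using only the base TTML namespace and elements, not the full SMPTE-TT profile), such a caption document can be generated programmatically:

```python
import xml.etree.ElementTree as ET

TTML_NS = "http://www.w3.org/ns/ttml"
ET.register_namespace("", TTML_NS)  # serialize TTML as the default namespace

def make_caption_doc(captions):
    """Build a minimal TTML-style document from (begin, end, text) tuples."""
    tt = ET.Element(f"{{{TTML_NS}}}tt", {"xml:lang": "en"})
    body = ET.SubElement(tt, f"{{{TTML_NS}}}body")
    div = ET.SubElement(body, f"{{{TTML_NS}}}div")
    for begin, end, text in captions:
        p = ET.SubElement(div, f"{{{TTML_NS}}}p", {"begin": begin, "end": end})
        p.text = text
    return ET.tostring(tt, encoding="unicode")

doc = make_caption_doc([("00:00:01.000", "00:00:03.500", "Hello, world.")])
print(doc)
```

A real SMPTE-TT file adds SMPTE-defined extensions and profile constraints on top of this base structure, but the timed-paragraph model shown here is the part the FCC-endorsed format inherits from TTML.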
January - Technology Trends 2013
Fresh from a visit to the 2013 Consumer Electronics Show (CES), SMPTE Immediate Past President Peter Ludé, also Sr. VP of Engineering at Sony Electronics, recently sat down for his annual conversation with SMPTE Newswatch to offer his thoughts on where media creation, distribution, and viewing technologies might be heading in 2013. His overall impression is that the industry remains in a state of evolution, rather than revolution, but he is pleased with the progress he has seen in many technology categories.
January - Next Generation Cinema Audio
In the world of theatrical sound, there are two important issues that SMPTE Technology Committees and sound system manufacturers are expected to focus on in 2013, closely followed by content creators and exhibitors. The first issue revolves around the growing need to improve ways to standardize testing, measurement, and calibration of theatrical sound systems in order to arrive at consistent playback between theaters. The second major issue involves expanding and improving multichannel playback to provide the cinema listening audience with "immersive" sound.