Program for SMPTE17 (Sydney, Australia on 18-21 July 2017)
Tuesday, July 18
Tuesday, July 18 9:30 - 11:00
This topic will focus on the value proposition SMPTE provides the industry. We will talk through issues focusing on the education of our members and take a look at a new standards initiative.
- 9:30 How the television industry can help create motivating online learning experiences for university students
- Research shows that everyone learns differently, and one of the ways to present material and motivate students is through video. Internet-based learning has created new revenue opportunities for the television industry to provide programming for high school and middle school STEM courses, but there is an untapped opportunity to provide video content for use in tertiary (university) education, not only in Australia but internationally. As more and more higher education institutions adapt their courses to mobile devices and require students to be active over the internet, instructional designers are eagerly looking for video content to enhance this active learning. Unfortunately, much of the desirable video content available is too long and requires editing, demanding time and money the education industry normally does not have. This paper will present research using video clips from a Canadian documentary and sports program to show how the content and teaching themes of a program can be adapted by producers to better fit the needs of the international education industry, and how revenue can be created. The paper will be of interest to program managers and producers looking for additional ways to grow revenue, as well as educational media technicians interested in understanding more about the direction in which international education is moving.
- 10:00 Arts and Science - Building the Educational Bridge
- Art in the media and entertainment industry cannot exist without the science, and without art the science has no meaning and no market for its products. Decades ago, the film and television industry had a significant training scheme for young people entering the industry, and they went on to develop careers within it. In that era there was only PAL or NTSC, no one cared about colour space, and there were just one or two resolutions. The massive changes that took place from the 1990s saw this education process collapse. At the same time, the digitization of media introduced complex new technologies in terms of formats, colour spaces and encoding, and new technologies are further expanding industry complexity. The democratization of technology has made it available to a very large cohort. There is a very large gap between the knowledge of creatives, and of many in engineering, and the technology choices now before them. Experience has shown that the technical education of creatives in the film schools is extremely limited, yet it falls to these creatives to make significant choices when they prepare a project: technical parameters are all part of the decision-making process in modern content creation. We are also now seeing the blending of traditional studio/ENG-style cameras with cinematography cameras, and the inherent issues that occur. This paper explores building a bridge between the arts and sciences and the value this brings to the entire industry. The particular requirements of explaining technical jargon and science so that it is comprehensible and meaningful to both creatives and engineers are also explored.
- 10:30 OBID: A Revolution in Audience Measurement?
- Audience measurement has long been dominated by closed systems created in the age of single-screen viewing. Our world has evolved into one in which multi-platform consumption of media is a reality. All eyeballs watching your content must be accounted for in an open way that encourages innovation and transparency. An effort has been brewing recently to realize an open, standards-based means of achieving true cross-platform measurement. It's known as the Open Binding of IDs (OBID) initiative, and it involves a collection of industry leaders from the network, advertising, and measurement sectors. What it promises to do is nothing short of revolutionary. We'll look at what this group is doing and its potential game-changing impact on audience measurement.
Ever wanted to understand the economics of media? Then this session is for you. We will look at a series of papers that focus on the commercial elements of media production.
- 9:30 The 'Everything on Demand' era is here
- While the broadcast industry has seen 'multiplatform distribution' as one of the most important technology trends for the past several years, the audience shift is happening more rapidly than expected. In some countries in Asia, smartphones are the dominant way people access the internet and mobile video. The explosion of mobile video consumption is both a threat and an opportunity for broadcasters. This presentation will look at the data behind the mobile video explosion, the winners and losers, the new content consumption landscape, and the key technologies that enable broadcasters to efficiently create and distribute mobile content.
- 10:00 Linear Sales Automation and Data Insight as a Step Towards Programmatic TV
- As audiences and advertising revenues shift from linear to digital platforms, data-driven automation of linear sales processes provides a key stepping stone towards programmatic TV and cross-platform trading, which promise to provide operational efficiencies and increased revenue. This paper will examine how correlating linear and digital audience data and establishing converged data management and insight capabilities can increase understanding of audience engagement, consumption and behaviour. The paper will conclude by discussing how broadcasters can develop and implement the required shifts of technologies and operating models and illustrating the ways in which these can enable improved business performance and business controls.
- 10:30 Ooyala Quarterly Video Index / Insights into online video consumption patterns
- Online video revenues across 14 markets in Asia Pacific are projected to rocket from US$12.8 billion last year to US$35 billion by 2021, a compound annual growth rate (CAGR) of 22%, according to a recent report by Media Partners Asia (MPA). This growth includes both advertising and subscription-based online video services, and is the result of consumers (and advertisers) increasingly moving to online video in recent years. OTT video is no longer merely a consideration but a strategy that broadcasters, media companies and operators have to execute effectively in order to thrive in this next wave of the TV industry. Given that the industry is still being disrupted and is still evolving, broadcasters and operators need industry insights to help them make better business decisions. Ooyala publishes a quarterly video index reflecting the anonymized online video metrics of the vast majority of our 500+ customers, whose collective audience of hundreds of millions of viewers spans nearly every country in the world, resulting in a representative view of global consumption and engagement trends. With these insights into how, where and when consumers are watching, we provide an analysis and a conclusion on where the industry is headed.
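As a quick sanity check, the quoted growth rate follows from the revenue figures above; a minimal sketch, assuming a 2016 baseline and a five-year horizon:

```python
# Figures from the MPA projection quoted above; the 2016 baseline and
# five-year horizon are assumptions for illustration.
initial_usd_bn = 12.8
final_usd_bn = 35.0
years = 5

# Compound annual growth rate: final = initial * (1 + cagr) ** years
cagr = (final_usd_bn / initial_usd_bn) ** (1 / years) - 1
print(f"{cagr:.1%}")  # close to the quoted 22%
```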
Tuesday, July 18 11:30 - 13:00
SMPTE invites members and guests to the opening address and keynote for the SMPTE17 Conference, "Embracing Connective Media". Join us as we open the conference and welcome Ms Michelle Rowland MP, Shadow Minister for Communications, as the keynote speaker.
Tuesday, July 18 14:00 - 15:30
This session takes a deep dive into the issues around live streaming, from processing and distribution to cloud storage.
We also take a breather to look at another storm: how to get 10G cables terminated effectively. We've snared a global expert to introduce you to the topic.
- 14:00 Live streaming broadcast-quality video over commodity Internet networks
- Live and near-live streaming of broadcast-quality video content over IP networks has traditionally used linear transport over satellite and dedicated fibre networks -- high quality but limiting for today's use cases as these networks are often not available on demand, and can require large capital investments and long startup times. In addition, the linear feed format can be difficult to customize and integrate with modern file- and cloud-based workflows. Internet broadband bandwidth is increasingly available at production venues, opening the possibility for new live video transport solutions, but despite various techniques being used to groom the IP network, the same universal quality and "zero-delay" experience of traditional live satellite transmission cannot be provided because arrival rates and round trip delays cannot be guaranteed. This session will outline the technology behind Aspera FASPStream Software, an open video transport solution capable of live streaming broadcast-quality video globally over commodity Internet networks.
- 14:30 Dynamic Lifecycle Management of a Data Corpus across multiple Cloud Storage Services
- Every public cloud storage provider and service offers unique combinations of performance, availability and cost. Parking content in one service for an entire lifecycle can result in increased cost and misalignment of performance in one or more technical dimensions. This paper discusses how customers can distribute a single corpus of data over a mix of public storage services to obtain optimal performance and price through each step of content's lifecycle (acquisition through archive storage and retrieval). The paper will provide a number of examples using currently available services, performance and costs, along with strategies for dynamically managing storage.
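The lifecycle-cost argument above can be made concrete with a minimal sketch; the tier names and per-GB prices below are hypothetical, not any provider's actual rates:

```python
# Hypothetical per-GB monthly storage prices, for illustration only;
# real providers' tier names, rates and access charges differ.
TIER_PRICE_PER_GB_MONTH = {"hot": 0.023, "cool": 0.010, "archive": 0.004}

def lifecycle_cost(size_gb, months_per_tier):
    """Storage cost of moving one corpus through tiers over its lifecycle."""
    return sum(size_gb * TIER_PRICE_PER_GB_MONTH[tier] * months
               for tier, months in months_per_tier.items())

SIZE_GB = 100_000  # a 100 TB corpus

# Parked in the hot tier for its entire three-year lifecycle...
single_tier = SIZE_GB * TIER_PRICE_PER_GB_MONTH["hot"] * 36
# ...versus migrated down the tiers as access patterns cool.
tiered = lifecycle_cost(SIZE_GB, {"hot": 3, "cool": 9, "archive": 24})
```

Real planning must also account for retrieval, egress and transaction charges, which this sketch omits.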
- 15:00 Improvements in Data Cables and Connectors
- Everything is heading for Ethernet, so data cables and connectors are something you cannot avoid. Category 6a (10 Gb/s) cables are very hard to terminate while achieving full performance. There are new RJ45 connectors that are easier to assemble, better performing, and less likely to fail. This presentation will look at the development and use of such a connector, key to good performance in the emerging IP world.
This tech-savvy session really hits a note for those who love workflow: how automating it can assist with the demands of modern playout on multiple platforms, the re-timing of live subtitles, and the workflow of distributing content on multiple platforms.
- 14:00 How can automation help the broadcast industry in meeting the demands of digital TV
- The global TV & video industry is experiencing disruption and evolution with the rapid growth of video streaming services like Netflix and Amazon. This session will discuss the importance of eliminating silos of production through tight integrations across the ecosystem, and introducing production automation to eliminate manual tasks, shorten production cycles and increase productivity. These recommendations will include global examples from broadcasters and production companies who were facing some of these industry challenges.
- 14:30 Re-timing of Live Subtitles
- Live subtitles for television, which are produced using speech-to-text software, are inherently late compared to the audio they represent. This delay can result in a sub-optimal experience for the viewer. There is an opportunity to reduce this delay by exploiting the time taken by the broadcast encoder to encode the video for transmission, which is longer than the encoding time for subtitles. Ordinarily, the subtitles are given a compensating delay to ensure that pre-prepared, accurately authored subtitles remain synchronised with the audio. During programmes with live subtitles, this compensating delay can be decreased to reduce the lateness of the live subtitles. This paper explains the technology by detailing a proof of concept carried out to assess the effectiveness and practicality of implementing the technique in a live broadcast environment.
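The compensating-delay mechanism described above can be sketched minimally; the delay values here are hypothetical, purely for illustration:

```python
# Hypothetical encoder delays, purely for illustration; real figures
# come from measuring the actual broadcast chain.
VIDEO_ENCODE_DELAY = 3.0      # seconds the video spends in the broadcast encoder
SUBTITLE_ENCODE_DELAY = 0.2   # subtitles encode much faster than video
COMPENSATING_DELAY = VIDEO_ENCODE_DELAY - SUBTITLE_ENCODE_DELAY

def subtitle_emit_time(authored_time, live=False):
    """Pre-prepared subtitles are held back by the compensating delay so
    they stay synchronised with the delayed video; live subtitles skip
    that hold-back, clawing back the encoder delay to reduce their
    perceived lateness."""
    return authored_time + (0.0 if live else COMPENSATING_DELAY)
```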
- 15:00 Content everywhere
- Assuming that content is going to be delivered, managed and stored in file form, now is the time to see what asset and workflow management functionality should migrate to the cloud. This paper will draw on work with some of the world's major production and broadcast companies, including Discovery Communications, but will also demonstrate how the same cloud techniques can be applied to production asset management and distribution for even the smallest enterprises to achieve maximum efficiency and monetisation.
Tuesday, July 18 16:00 - 17:30
This topic spends lots of time looking at the cloud: how to produce in it and how to store in it.
- 16:00 Lost in the Clouds? Cloud Storage Fits Into Media Workflows—Just Not Everywhere
- Within the media and entertainment space, the question businesses face with respect to cloud infrastructure is not whether to use it but how best to use it. Public cloud storage provides flexible, on-demand storage capacity and an operating expense financial model. Whilst moving media post-production workflows entirely into "The Cloud" would solve content storage, sharing, and distribution challenges, there are limitations and cost implications. Cloud storage has attributes that make it perfectly suited for some workflow stages and perfectly ill-suited for others. The use of private cloud is adding another dimension to this tiered storage approach, offering many advantages. This presentation will describe a multi-tier storage strategy that maximizes the cost, access, and performance benefits of different types of storage with a particular focus on the most appropriate and beneficial uses of private and public cloud. It covers how the right storage and data management technology can aid media companies in making the most of all tiers of storage including cloud infrastructures.
- 16:30 The Future of Storage in IP / Cloud Based Broadcast Workflows
- The emergence of IP-based broadcast workflows, increasing file sizes, and new means of content acquisition create new challenges for organizations as content grows exponentially, placing greater demands on infrastructure, workflows, and storage. To increase business efficiency, organizations are driven to re-evaluate their existing infrastructure and workflows. As traditional archive models struggle to meet evolving business needs, organizations are best served to seek out new technologies and architectural approaches to address their business drivers. The cloud architectural model offers solutions to many of these challenges. As such, organizations are looking to a hybrid cloud approach as they design their next-generation infrastructure and workflows. This hybrid model supports a mix of on-premise, private, and public cloud. During this session, we will examine and review the benefits and concerns associated with the adoption of new technologies, and outline how organizations can align them with the pressing business needs of this ever-demanding and evolving digital world.
- 17:00 SaaS Based Video Production
- Virtualization, cloud and Software as a Service (SaaS) are redefining video production, especially for complex OTT distribution scenarios. Workflows are currently being refreshed, and the traditional product/appliance approach is increasingly questioned by those seeking improved flexibility, streamlined support, and the ability to commission service provision on an ad-hoc basis. While there is often a strong commercial desire to move from CAPEX to OPEX, this paper will explore the technical justification for such a move and highlight successful approaches, with examples to put the move to SaaS in context.
Everything you want and need to know to get started with HDR
The workshop is aimed at non-technical, newbie and getting-started practitioners and technologists alike.
- 16:00 HDR Introduction Workshop - Pat Griffis
- Come and meet the new notion of colour volumes, learn what it takes to deliver more than 100 nits peak white, learn the terminology of High Dynamic Range, and see the benefits of analysing content to provide a consistent image regardless of device. If you don't yet know anything about High Dynamic Range, this is the session you need to start with, because you'll leave knowing what you don't know and how to approach the sessions in the rest of the conference. Plus: find out about SMPTE ST 2094-10, the emerging standard for delivering scene-based content information, because smarter pixels make better pictures.
Wednesday, July 19
Wednesday, July 19 9:00 - 10:30
This session focuses on innovations in workflow and the standards underpinning those changes, as well as the changing infrastructure on which these workflows exist. Whether you're interested in IT or IP (or both), this session is for you.
- 9:00 The Automation of the Grouping Process
- Automating the grouping process turns a tedious, laborious task, one that traditionally takes many hours to do manually and is subject to human error and various other issues, into a task that now takes a matter of seconds. Group It For Me! was acknowledged by the National Academy of Television Arts and Sciences in 2016 with a Primetime Engineering Emmy Award for its technical contribution to the way television is made.
- 9:30 BXF: New Horizons and Help For Implementers
- The Broadcast eXchange Format (BXF) has been around for over a decade now and has become one of the most popular methods of metadata exchange decoupled from the actual content. Most are familiar with its capabilities in integrating sales, traffic, and on-air playout operations. However, it has recently evolved to include such areas as commercial delivery support, traffic instructions, content delivery specification exchange, and quality control. The standard is now at version 5.0, and a BXF SDK has also recently been introduced, making implementation of this wide-ranging standard much simpler for those who might have been intimidated by it in the past. We'll talk about all the latest advances, how they can help you better achieve automated workflows, and how BXF is being used today to solve a wide array of issues. Key in all of this is how BXF fits in with all of the other "puzzle pieces" that promise to help you with your management of metadata.
- 10:00 System Migration from SDI to ST 2110
- Today we build systems on SDI infrastructures, which form the basis for the feature and performance requirements of ST 2110. To be successful, ST 2110 needs to enable an evolutionary transition from SDI to IP, whether that is a change-out to IP with legacy SDI islands and edges, or IP islands in SDI systems; both are achievable using ST 2110 for transport and ST 2059 for synchronization. This presentation reviews the expectations and characteristics of SDI systems as a platform on which to discuss IP evolution, including the handling of essence, timing and synchronisation.
The improvements in vision from SD to HD and now UHD make imperfections more noticeable to the viewer. Papers in this session explain some ways of creating good-quality pictures and sound to maximise the benefits of today's technology, and look at an interesting case study using IT-based switching in Francis Ford Coppola's new "live cinema" project.
- 9:00 Pixel size and the effect on performance in 2K, 4K, 8K acquisition
- Assume that 2K, 4K and 8K are all captured on a 2/3-inch sensor, implying that pixel size has to go down; or assume that 2K, 4K and 8K are captured with the same pixel size, e.g. from 2K on 2/3-inch up to 8K on 35 mm. This presentation will focus on how each choice impacts MTF, depth of field, noise, dynamic range, sensitivity, f-number and the usable operating region, and on which parameters scale with pixel size and which with image diagonal.
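The trade-off in the first scenario, fixed sensor size with rising resolution, can be illustrated with a rough back-of-envelope sketch; the 9.6 mm sensor width and the pitch-squared light-gathering rule are first-order assumptions, not figures from the paper:

```python
def pixel_pitch_um(sensor_width_mm, h_pixels):
    """Pixel pitch in micrometres for a given sensor width and horizontal pixel count."""
    return sensor_width_mm * 1000.0 / h_pixels

# Hold the sensor at 2/3" (about 9.6 mm wide, an assumed figure) while
# raising resolution: pitch shrinks, and per-pixel light gathering falls
# roughly with pitch squared (a first-order simplification that ignores
# fill factor and microlens effects).
p2k = pixel_pitch_um(9.6, 2048)   # pitch at 2K
p8k = pixel_pitch_um(9.6, 8192)   # pitch at 8K, one quarter of the 2K pitch
relative_light_per_pixel = (p8k / p2k) ** 2   # 1/16 of the 2K pixel's light
```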
- 9:30 Calibrated Color and Software for LED Luminaires
- LED luminaires have become increasingly sophisticated over the past several years. Lampheads are essentially computers with light engines attached to them. Some manufacturers have been able to take full advantage of this by pairing advanced electronics with versatile software features. More important is the use of calibrated LED systems to create precise and accurate colors. With a calibrated LED system, software can be developed over the lifetime of the fixture to create new and better ways of generating color. In this paper, we will discuss the possibilities and advantages of such a system and what it means for the way we pick and generate colors using LED luminaires for motion picture and television image capture.
- 10:00 IT-based switching brings Francis Ford Coppola's new "live cinema" project to life
- This session goes behind the scenes of an ambitious "live cinema" project from Francis Ford Coppola. The legendary filmmaker used new IT-based switching technology to cut together feeds from 40 cameras, which captured a live production-film hybrid called Distant Vision. The film's 17 scenes each had their own complex camera requirements, so each scene's inputs and sources were pre-set using an IP-designed switcher with creative software that scales well beyond the limits of standard switchers. After capture, the inputs and sources were displayed on one of three multiviewer screens in UCLA's control room while the next scene's set-up was displayed on another. The system provided more creative freedom, shortened set-up times and simplified production, enabling Coppola and his team to focus only on the sources needed at a given point of the production, and enabled faster and more direct switching for the technical director. The discussion will include insights from three weeks spent on the set, including on-the-fly system changes and advances.
Wednesday, July 19 11:00 - 13:00
IP infrastructure changes everything. This session examines how we transport IP around the plant, whether it is the best choice and some of the new challenges of synchronisation as the time-honoured black burst signal can no longer hold everything together.
- 11:00 Adventures in 4K Video
- 4K video is here, but how will you integrate it into your professional broadcast facility? There are three basic choices, coax, twisted pairs (Cat 5e, 6, 6a and 8 soon), and fiber optic cable. The story of each is surprising. This also covers the history and development of Belden's 4K 6 GHz and 12 GHz coax, what we currently offer, with test data and details, and where we're headed.
- 11:30 12G or IP
- This presentation examines the differences between HD and the many UHD variants, different ways to transport them, and the technical and practical implications of making a choice. It analyses quad-link vs 2SI, HDR content, compressed versus uncompressed transport, emerging IP transport speeds, 12G over copper and the distances attained, and the evolution of quad link. Routing platforms are also considered: traditional processing routers with internal multiviewers and internal audio processing versus the distributed processing approach that comes with IP. Finally, blended systems using top-of-rack implementations are analyzed, as is the overall implication of these systems for live production.
- 12:00 Synchronization in Hybrid SDI / IP Broadcast and Production Facilities
- Using IEEE 1588 Precision Time Protocol (PTP) in conjunction with the SMPTE ST 2059 standard, all legacy reference signals can be virtualized over an IP network fabric. This presentation discusses traditional synchronization mechanisms and how they work, and then explains the fundamentals of PTP. On this foundation, it examines how the ST 2059 standard virtualizes legacy references and how legacy reference infrastructures can be evolved transparently to support IP synchronization. The construction of fully timed hybrid systems is explored through use cases.
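The core idea of virtualizing references over PTP can be sketched minimally; this is a simplification of the ST 2059 concept, not the standard's actual arithmetic, and the 29.97 fps rate is just an example:

```python
from fractions import Fraction

# Simplified sketch of the ST 2059 idea: every reference signal is
# defined to be phase-aligned at the SMPTE epoch, so any device that
# knows PTP time can compute where it sits within the current frame
# without a distributed black-burst feed.
def frame_phase(ptp_seconds_since_epoch, frame_rate=Fraction(30000, 1001)):
    """Fractional position [0, 1) within the current video frame."""
    frames_elapsed = Fraction(ptp_seconds_since_epoch) * frame_rate
    return frames_elapsed - (frames_elapsed // 1)

# Exactly one 29.97 fps frame period after the epoch, the phase wraps to zero.
phase_at_one_frame = frame_phase(Fraction(1001, 30000))
```

Exact rational arithmetic matters here: non-integer frame rates like 30000/1001 never land on whole seconds, so floating point would accumulate phase error.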
- 12:30 Taming the VOD monster - Delivering broadcast quality to consumers anywhere without blowing the budget
- Delivering to a broadcast audience means reaching the viewer on any device, anywhere, at any time. The challenge is to reach the audience with a broadcast-quality product at scale without blowing the budget. In this paper we cover the pain points of operations teams around the world, what VOD platforms require in both essence production and metadata, and how modern software-defined solutions can help.
If you are someone who has a bee in your bonnet about quality production, then this topic is for you and will focus on quality control in 2017.
- 11:00 Quality Control & Monitoring in OTT Workflow
- In the OTT world, viewers are watching content when they want, where they want, and on the device they want. Content needs to be streamed on demand, as the user requires, and at the resolution of the playing device. Broadcasters need to ready their content for this mode of playback: they don't control the delivery, the consumer does. OTT technology is evolving, and the requirements for monitoring are changing with it. Monitoring tools need to be architecturally versatile in order to accommodate this environment and allow broadcasters to work out which issues are the most critical. Ultimately, broadcasters should choose an OTT monitoring solution for live and VOD assets that works in tandem with a file-based quality control (QC) tool. By deploying a complete QC and monitoring solution from ingest to delivery, broadcasters can deliver the best Quality of Service (QoS) and Quality of Experience (QoE) to viewers in the OTT world.
- 11:30 QoE and Video Quality - Operational Challenges and Opportunities
- Video service providers, whether traditional broadcasters, network operators or OTT video service providers, all recognise that video service quality is a crucial element of their offering to viewers, directly affecting the value of their business, their revenues and their operational efficiencies. Understanding and ensuring viewer video service quality is therefore a key concern but, despite best intentions, is often poorly understood from an operational perspective. Bridge Technologies therefore proposes to present an introduction to video service quality, introducing overlapping terms such as QoE, QoS and video quality and considering their meaning. The presentation covers some of the QoE/QoS/video quality tools and techniques currently available, and considers the challenges and opportunities of using these tools within an operational environment.
- 12:00 File based QC challenges for Linear & VOD deliveries
- Audiovisual content has moved beyond traditional television and is now available on a variety of new screens. Content has to pass through various processes and organisations before reaching these screens. To provide the best experience to consumers on all the screens, including television, these organisations need to have an integrated process for detection and rectification of content issues. This is possible only with a holistic QC strategy. This presentation will discuss the various file QC challenges faced during the content cycle and effective ways of handling them.
- 12:30 Synopsis: Improving Operational Efficiency by using Automated File Based Audio Processing
- As the world of broadcast moves away from tape to file-based, IP-centric solutions, processing audio in video files (MXF, .mov) has become increasingly complex. There is also additional pressure to deliver more content with fewer resources. Presently, edit suites are used to perform mundane and non-creative audio processes (loudness correction, Dolby E encode/decode, channel swap, replication, muting, etc.). This paper will examine some of the challenges involved in performing the common audio processes that are required, and compares traditional approaches with a file-based automated solution. We will present a number of use cases that are helping broadcasters and content delivery companies to greatly improve their operational efficiency.
Wednesday, July 19 14:00 - 15:30
Matthew Goldman leads an in-depth tutorial on IT architectures and provides an industry overview of the state of play.
- 14:00 Broadcaster Migration to IT Infrastructure
- In order to become more agile in operations and leverage the economies of scale and flexibility that IT infrastructure brings, Broadcasters increasingly have been migrating from broadcast-specific architectures to IT-based solutions. This goes hand-in-hand with the trend toward software-defined media processing and network function virtualization. This tutorial will first give an overview of the challenges that Broadcasters face, the benefits of transforming to "All IP" and a high-level overview of what exactly is meant by "All IP". It then will dive deeper into the technologies themselves, starting with a discussion of the Joint Task Force on Network Media (JT-NM) roadmap and the data essence and control planes of interoperability, followed by an overview of the major standards beyond "All IP", including Precision Time Protocol, Real-time Transport Protocol, Session Description Protocol, Real-time Media over IP, and Discovery & Registration. The current state of the industry will be reviewed and what's to be done next.
Changing equipment brings with it unexpected issues. Learn to avoid some of the traps in implementing new systems and their impact on workflow and how to optimise performance.
- 14:00 Subtitle Delivery in Automation: Teletext OP-47, IMF, and OTT
- To maximize efficiency and savings, broadcasters today are focused on automating all their video deliverables within their transcoding workflow. If subtitling isn't considered when building that automation, it can become a real speed bump as subtitling requires the ability to handle various formats and protocols such as Teletext OP-47 and TTML for OTT delivery. In addition, future video formats such as IMF will require broadcasters to migrate their existing video archives while preserving and transcoding subtitling data. This paper discusses the many formats, potential pitfalls, and guidelines for success when including the subtitling process in an automated workflow.
- 14:30 How to Coordinate the Streamlined Operations for Commercial Purposes Successfully?
- In today's digital age, where audiences are able to access the content they want quickly, a content delivery network (CDN) is the new medium for serving content with high availability and performance, based on the geographic location of the user, the origin of the webpage, and a content delivery server. For every broadcaster, the real challenge lies in how to move into new media outside traditional licence or transmitter-area boundaries and onto new devices. This paper will explore simple broadcast playout systems, playlists, CDNs and user profiles used to generate different commercial content for each set of audiences, enabling broadcasters to multiply their revenue with minimal effort. Different CDN providers require different metadata schemas, and metadata needs to be delivered on time together with the video; when you need to deliver blocks of content instead of a constant stream, metadata is the key. IP and CDN distribution present new business opportunities, but this is a technology that requires new methods to amplify its benefits.
- 15:00 IP and Scalable Video Performance
- The transition from analog to digital took nearly a quarter of a century and an act of Congress; the nonlinear revolution consumed another decade. By contrast, the transformation to an IP workflow is arriving like a bullet train, without legislation, sanction, or specification. In this article we will examine what is driving this headlong rush and how it will lead to much more elemental transformations. We will make the case that high-performance IP workflows offer meaningful advantages in productivity and can often be achieved without extravagant expenditure or disruption.
Wednesday, July 19 16:00 - 17:30
Ubiquitous content distribution requires re-purposing content into many forms to suit the variety of different display devices. This session looks at technology, standards and cloud-based micro-services offering solutions to tomorrow's problems.
- 16:00 Near real time global video processing in the cloud via micro services
- Utilising modern, globally distributed cloud computing, it is now possible to take a single source clip out of the studio and have it dynamically encoded, packaged, encrypted and edited on the fly for ad insertion using cloud technology. This paper will cover some practical approaches for how it is now possible to globally distribute content from a central repository using a minimal set of mezzanine assets to achieve maximum flexibility in distribution across devices, regional specifics (captions, alternate audio) and digital rights management.
- 16:30 Media Storage in the Cloud
- This paper provides insight into the emergence of cloud-based storage services and illustrates how video providers and content distributors are leveraging the scalability, reliability and speed of Internet storage to meet their requirements. Readers will learn about challenges using storage in the cloud; best practices to achieve the highest level of security and performance, together with cost optimization; transitioning to cloud-based video infrastructures; and, optimizing the performance and cost of their cloud storage. Use cases highlight how media and entertainment companies can benefit from cloud object storage.
- 17:00 Project Suitcase: Using C4, Semantics and NoSQL for Managing Motion Picture Data
- Data has never been more important to the Entertainment industry, and this session will highlight pioneering work getting to the source of data in today's distributed cloud production environments. The Entertainment Technology Center at USC sponsored "The Suitcase" project, a short film created by Abi Damaris Corbin, with a technical test to examine the process of upstream authoritative metadata extraction as well as creating and enhancing metadata farther downstream when necessary. This project used the C4 framework (Cinema Content Creation Cloud), semantics and NoSQL technologies to manage tagging metadata. Subsequent projects have leveraged production data to impact financial reporting in addition to authoritative source data.
Join Anna Lockwood from SMPTE and an exceptional panel of women technology and business leaders at a special session during the SMPTE Conference. Come to be inspired, learn and be challenged as each panellist speaks from different stages and perspectives on their careers in media. If you're a student exploring a career in media technology, one of our industry colleagues wanting to hear more from women in the media and technology business, or an Executive interested in building diverse and high performing teams, this is a great opportunity to participate in an open and important conversation. The panel will feature Beatriz Alonso-Martinez, Business Development Director, Ooyala; Jacqui Feeney, Managing Director, Fox Networks Group ANZ; Tanya Kelly, Senior Consultant, Cadium; Holly Knill, Director of Product & Technology, Seven Network; Audrey Ku, Chief Solutions Architect, Linius Technologies.
Thursday, July 20
Thursday, July 20 9:00 - 10:30
This session will enthrall you with an interesting case study on a big screen at Taronga Zoo and conclude with some interesting results on cinema sound quality. A case study extravaganza!
- 9:00 Delivering Drama for Extreme Screen
- A case study on delivering drama for the unique cinema screen recently constructed at Taronga Zoo. On a scale comparable to IMAX but with a curvature greater than Cinerama, the Centenary Theatre posed a remarkable combination of technical challenges, including managing an average of 1.5 TB of data per day while shooting. This paper will cover such topics as a full workflow beyond 4K; codec selection for shooting as well as post and delivery; working with a 5:1 aspect ratio that's more than double Cinemascope; and testing for an immersive picture and surround sound system that had yet to be built and did not exist anywhere else. It examines issues such as handling an audience with a 270º perspective, the acceptable pixel pitch viewed at close range, and the effects of the large, wrap-around screen with its similarities and differences to both VR and 3D. This will be a highly technical case study with detailed information and materials from behind the scenes, including diagrams and calculations from planning through to delivery.
- 9:30 Designing, Planning and Executing Immersive Audio for the 2016 Rio Olympics Opening Ceremony
- At the Rio 2016 Olympics, US broadcaster NBC, cable provider Comcast and Dolby Laboratories executed the first broadcast of 4K, HDR and Dolby Atmos to selected VIP viewing parties across the US. This broadcast was the result of weeks and months of design and testing with a number of partners spanning the broadcast industry. This will be a discussion of the path taken and choices made to execute this broadcast successfully, as well as capture and edit an Opening Ceremony highlight reel containing Dolby Atmos audio for both Xfinity HD Comcast subscribers and the Comcast 4K Sampler application deployed on Samsung and LG UHD televisions. This case study provides an insight to other broadcasters on possible approaches to deploying similar efforts on other high-profile events.
- 10:00 "Why does cinema sound quality mostly fail to realise its potential?" Some interesting results from SMPTE's 2014 report on cinema sound systems
- In late 2014, SMPTE issued a substantial report titled "Frequency and Temporal Analysis of Cinema Sound Systems", which has become an important milestone for the SMPTE's review of the B Chain systems. Presented in the report are detailed measurements of the time and frequency domain performances of the multi-channel sound systems in four commercial cinemas and two dubbing stages, and a detailed commentary about the process and results. This paper examines some of the interesting results and trends in those measurements, which point to reasons why cinema sound is currently far from ideal.
Reducing cost is a high priority for any business. The costs of producing an outside broadcast event or gathering content from the field involve high staffing and setup costs. With transmission costs reducing, it is now feasible to do less in the field and more in house, so in this session we look at a fascinating case study and innovations in remote production and contribution.
- 9:00 LTE Managed Media Contribution
- Broadcasters are calling for more flexible and instant ways to provide their audio and video feeds from outside broadcast locations into their studios. I have been researching ways to integrate the LTE network with a fixed wide area network to provide customers with a flexible, reliable and non-contended mechanism to contribute audio and video back to their studio locations while preserving the quality of the feeds, minimising latency and guaranteeing delivery.
- 9:30 Cost Effective Alternative Solutions for Remote and Synchronized Multi-Camera Live Video Production and Social Media Distribution
- Live coverage of major broadcast and sporting events using multiple synchronized cameras has traditionally been produced using costly on-location production trucks and crews with dedicated links. Broadcasters around the world are adopting new technology to stay ahead of the curve and streamline remote synchronized multi-camera video production. This session will outline an innovative solution using nothing more than a broadcaster's existing studio infrastructure and a public Internet connection in the field, and cover advancements in IP-based live video acquisition, transmission and distribution.
- 10:00 Remote TV production for F1
- Previously, RTL Television had a full production team on site at the race track. The world feed, produced by F1 host broadcaster Formula One Management (FOM) and completed by RTL Television's own production team on site, resulted in a fully-produced program delivered on bidirectional links between the race circuit and the broadcast station in Cologne. A good part of RTL Television's production team now remains in Cologne - including vision and sound control, along with editing suites and replay servers. On location are presenters, reporters and editors, cameramen, and technicians. The on-site production now includes two wireless cameras and live commentary, transmitting three feeds (plus three return video feeds) besides the host program feed to Cologne. This paper will describe how the groundbreaking remote production concept is supported by a MediorNet on-site backbone, providing decentralised router functionality and a high-density signal distribution network that also provides a perfect connection for the wireless camera receiving infrastructure.
Thursday, July 20 11:00 - 12:30
We are moving from the real world into a more virtual world for the creation of both pictures and sound. Get updated on the latest developments in virtualisation and what it can do for you.
- 11:00 Capturing sound for VR and AR
- Interest in creating programme content for Virtual Reality and Augmented Reality is growing quickly. This session looks at two different ways of capturing audio, in multi-channel and two-channel formats, to suit VR and AR soundtracks, and explores the differences in audience perception that can be expected from these recordings.
- 11:30 The AR/VR Opportunity
- While there are technology similarities between Augmented Reality (AR) and Virtual Reality (VR), there are marked differences in the use cases and potential presented by each. This paper will present ibb's perspective on the different use cases made possible by each of these technologies, as well as a high-level roadmap that aims to provide a structure that could be used when planning the development and rollout of AR/VR products and services. It will analyse consumer devices as well as the hardware, software, network and cloud-based infrastructure required to "end-to-end enable" commercial delivery of AR/VR services. The paper will conclude by illustrating a series of AR/VR adoption possibilities and timelines for service providers.
- 12:00 Designing Spatial Sound: Adapting Contemporary Sound Design Practices for Virtual Reality
- The introduction of 360º film and Virtual Reality (VR) has meant practitioners have had to adapt to a multitude of new platforms and specifications, with no existing pathways or working methodologies. VR audio is continually being redefined, with contemporary film sound practices having to adapt to new forms of spatialisation and headphone delivery. The variations in platforms also necessitate different delivery requirements, with the audio often a secondary consideration. This paper will examine the current state of sound for virtual reality and discuss sound design for VR from a creative practice perspective.
Thursday, July 20 11:00 - 13:00
It's been three years since Australia completed its transition to digital, with spectrum released a year later, and local broadcasters face new challenges in transmission. In this session we will hear about the UK experience, look at outside impacts on transmission and examine how to cater for better services in the future.
- 11:00 Challenges facing UK public service broadcasters
- The UK public broadcasters have transitioned to a fully digital terrestrial broadcast TV service whilst maintaining their viewer share in a very competitive UK broadcast market. They now face new challenges: pressure to release more spectrum, the growth of over-the-top video services, IP television, and increased competition for prime content rights. This paper provides insight into the state of the UK and European FTA broadcast markets and discusses the strategies that the main commercial public service broadcasters have adopted to strengthen the free-to-air Freeview DTT platform.
- 11:30 Ambient Noise levels in broadcast radio frequency bands
- The noise floor within the AM and FM broadcast frequency bands comprises intentional and unintentional radiators. Many new full-power and low-power stations in the last two decades have increased the contribution of intentional radiators, but the growth of unintentional radiators has been dramatic. All microprocessor-based devices have clock frequencies, and many compact fluorescent and LED lights emit substantial RF energy in these bands. The impact of these unintentional radiators and possible mitigation strategies will be discussed.
- 12:00 "Taking the UHD Experience to the Next Level: Phase 2"
- UHD Phase 1 has provided us with higher resolutions. Now with UHD Phase 2 we also get higher Frame Rates, more colours, higher contrast and new audio coding schemes. Different High Dynamic Range technologies will be shown. The consequences for new services in terrestrial distribution such as DVB-T2 will be presented and discussed.
- 12:30 Cost Increase due to UHD Video Broadcasting as Compared to HD
- It is well known that the television industry has been working passionately to bring UHD content to the viewer's home. 4K production cameras, broadcast equipment and television sets are ready to go, but the actual transmission of the UHD content is yet to begin due to transmission cost constraints. Therefore, in this research, cost increase due to UHD video transmission is compared to HD, using signal quality parameters, in a Rician Fading Channel. Results show that the increase in cost for UHD video transmission is negligible when transmitted through 8PSK, but significant for QPSK. This research work is a continuation of work presented at the SMPTE15 conference, "Effect of UHD High Frame Rates (HFR) on DVB-S2 Bit Error Rate (BER)".
Thursday, July 20 14:00 - 15:30
HDR will allow for creation and display of better quality pictures. This session looks at the technology and some of the issues involved in its implementation and the impacts this has on other aspects of production.
- 14:00 HDR from the perspective of a camera manufacturer
- An introduction to HDR from the perspective of a camera manufacturer, covering HDR background and terminology, an introduction to HDR technology, dynamic range, and transfer curves. The presentation may not contain HDR imagery due to the assumed lack of a suitable presentation environment, but could if an HDR environment is present.
- 14:45 HDR production from top to bottom
- The key issues with the emergence of HDR will be covered, including: shooting; lighting considerations; camera capabilities and how that works with HDR; codecs: what to use and why; Post Production considerations; how many masters?; mastering monitors; target EOTF (PQ/HLG); deliverable paths and the moving target of peak luminance in domestic displays. Taking many of the gems learnt from coverage of NAB and the interviews conducted, this presentation will give you an overview of each topic.
This interactive panel session moderated by Anna Lockwood of SMPTE will focus on the latest technology developments in sports broadcasting, including remote production workflows, immersive and interactive technology, wearables, data analytics and artificial intelligence in sports, and OTT video delivery at scale. With representation from a wide range of sports industry leaders, this is a session not to be missed.
The session will feature short technology presentations followed by a panel discussion and audience Q&A. Presenters and panelists include Luke Gooden, NRL; Stuart Newman, Telstra Broadcast Services; Scott O'Brien, Humense VR; Rebecca Reed, Odgers Berndtson; Djuro Sen, Seven Network; Ken Shipp, SBS Television; Stuart Taggart, Envision; and Nick Vanzetti, ESL Gaming Australia.
Thursday, July 20 16:00 - 17:30
Teaming HDR with UHD holds great promise. Learn more about this dynamic duo.
- 16:00 A format independent native HDR live production workflow
- HDR production workflows are well established for non-live applications, typically based on cinematography-style image capture and a file-based post-production process. For live applications these established workflows are not usable, and other ways of producing live content in HDR need to be developed and implemented, catering for a new series of problems including mapping between different HDR transfer functions and conversions. The paper will explain how a format-independent native HDR live production workflow supporting the different HDR transfer functions can be realised, and will in addition present results and experiences from multiple HDR test productions.
- 16:30 HDR Signalling, Metadata and Chroma Downconversion: Key Additions to a UHD Workflow
- High Dynamic Range (HDR)-capable screens are becoming available, so many assume that all is in place for UHD channel rollout. This may be the case for some VOD streaming services, but so far HDR-enabled UHD services complementing Standard Dynamic Range (SDR) for channel delivery need new workflows, HDR scheme support, and high-quality conversion to preserve quality for current HD viewers. These additions will move UHD from being a resolution-only upgrade to unleashing the benefits of HDR. This paper will explore what is needed from a workflow, backwards compatibility and image quality perspective for this move to become reality.
- 17:00 Virtualized 4K-HDR Video Solutions
- This paper examines virtualized solutions for video compression, from concept to deployed solutions and benefits. Leveraging existing servers in the data centre for several applications allows operators to implement new services quickly without having to physically access a specific machine and install the operating system and software. The paper discusses the benefits of HDR distribution solutions, addresses key video distribution issues like bandwidth management and backwards compatibility, and covers state-of-the-art technologies, with or without third-party proprietary HDR technology metadata.
As broadcasters current technology reaches end-of-life, what would prevent or enable a transition to a full IP facility? This session is about the central site infrastructure, but recognising that an IP workflow could be implemented in your own hardware, in a private cloud, or a public cloud or some amalgam of the three. Has technology matured to the point where the full IP facility is possible, and how do we build the skills to make this happen?
The panel will feature a range of industry stakeholders: Terry Manley, Chief Engineer, NEP Australia; Leander Serrao, Senior System Architect, NEP Australia; Dave Bowers, CTO, Nine Network; David Ross, CEO, Ross Video; Paul Briscoe, SMPTE Fellow and Principal Consultant, Televisionary Consulting; and Philipp Lawo, CEO, Lawo AG.