Christian Timmerer, Associate Professor at the Institute of Information Technology (ITEC) and Director of the ATHENA Christian Doppler Laboratory, has been appointed IEEE Communications Society Distinguished Lecturer for the term 2021-2022.
IEEE ComSoc Distinguished Lecture program page: https://www.comsoc.org/membership/distinguished-lecturers
Speaker Profile: https://www.comsoc.org/christian-timmerer
Topics and abstracts (below)
- HTTP Adaptive Streaming (HAS) — Quo Vadis?
- Quality of Experience (QoE) for Traditional and Immersive Media Services
- Immersive Media Services: from Encoding to Consumption
- 20 Years of Streaming in 20 Minutes
- Multimedia Communication, Networking, Protocols, Delivery
- Multimedia Standards (MPEG, IETF, W3C)
HTTP Adaptive Streaming (HAS) — Quo Vadis?
Video traffic on the Internet is constantly growing; networked multimedia applications consume a predominant share of the available Internet bandwidth. A major technical breakthrough and enabler of multimedia systems research and of industrial networked multimedia services was the HTTP Adaptive Streaming (HAS) technique, which led to the standardization of MPEG Dynamic Adaptive Streaming over HTTP (MPEG-DASH). Together with HTTP Live Streaming (HLS), MPEG-DASH is widely used for multimedia delivery in today’s networks. Existing challenges in multimedia systems research center on the trade-off between (i) ever-increasing content complexity, (ii) various timing requirements (most importantly, latency), and (iii) quality of experience (QoE). Optimizing for one aspect usually degrades at least one of the other two, if not both.
This situation sets the stage for our research work in the ATHENA Christian Doppler (CD) Laboratory (Adaptive Streaming over HTTP and Emerging Networked Multimedia Services; https://athena.itec.aau.at/), jointly funded by public sources and industry.
In this talk, we will present selected novel approaches and research results of the first year of the ATHENA CD Lab’s operation. We will highlight HAS-related research on (i) multimedia content provisioning (machine learning for video encoding); (ii) multimedia content delivery (support of edge processing and virtualized network functions for video networking); (iii) multimedia content consumption and end-to-end aspects (player-triggered segment retransmissions to improve video playout quality); and (iv) novel QoE investigations (adaptive point cloud streaming). We will also put the work into the context of international multimedia systems research.
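The complexity/latency/QoE trade-off above is what a HAS player's per-segment adaptation logic has to navigate. As a minimal illustrative sketch (not from the talk; the function name, bitrate ladder, and safety margin are invented for illustration), a simple throughput-based bitrate selection rule might look like this:

```python
def select_bitrate(measured_kbps, ladder_kbps, safety=0.8):
    """Pick the highest rung of the bitrate ladder that fits within a
    safety margin of the measured throughput (illustrative heuristic)."""
    affordable = [b for b in sorted(ladder_kbps) if b <= measured_kbps * safety]
    # Fall back to the lowest rung if even that exceeds the throughput.
    return affordable[-1] if affordable else min(ladder_kbps)

ladder = [400, 800, 1500, 3000, 6000]  # hypothetical encoding ladder in kbps
print(select_bitrate(5000, ladder))  # 3000: the 6000 rung exceeds 5000 * 0.8
print(select_bitrate(300, ladder))   # 400: fallback to the lowest rung
```

Real players combine such throughput estimates with buffer occupancy and latency targets, which is exactly where the trade-off between the three aspects shows up.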
A preview of this talk is available here.
Quality of Experience (QoE) for Traditional and Immersive Media Services
“Quality of Experience (QoE) is the degree of delight or annoyance of the user of an application or service. It results from the fulfillment of his or her expectations with respect to the utility and/or enjoyment of the application or service in the light of the user’s personality and current state.” — QUALINET white paper
In this talk, we will present the fundamentals of Quality of Experience (QoE): its definition, influencing factors, and features, as well as application areas for both traditional media applications and services (e.g., audio-visual content, including adaptive streaming thereof) and immersive ones (e.g., 360-degree video, virtual reality).
Immersive Media Services: from Encoding to Consumption
Universal media access, as proposed in the late 1990s and early 2000s, is now a reality: we can generate, distribute, share, and consume any media content, anywhere, anytime, and with/on any device. A major technical breakthrough was adaptive streaming over HTTP, which resulted in the standardization of MPEG-DASH. The next big thing in adaptive media streaming is virtual reality applications and, specifically, omnidirectional (360-degree) media streaming, which is currently built on top of existing adaptive streaming ecosystems. In this talk, we focus on omnidirectional (360-degree) media from creation to consumption, as well as first thoughts on dynamic adaptive point cloud streaming. We survey means for the acquisition, projection, coding, and packaging of omnidirectional media, as well as delivery, decoding, and rendering methods. Emerging standards and industry practices (e.g., OMAF, VR-IF) are covered as well. Both parts present current research trends, open issues that need further exploration and investigation, and various efforts underway in the streaming industry.
20 Years of Streaming in 20 Minutes
In this talk, we will present an overview of streaming techniques developed over the past 20 years, including early streaming systems, the first adaptive bitrate (ABR) streaming systems, early standards in this domain (e.g., RTSP, ISMA, 3GPP PSS), and the shift towards HTTP-based streaming (progressive download, pseudo-streaming, HTTP adaptive streaming), including the standard formats MPEG-DASH, HLS, and CMAF. The talk concludes with a summary and an outlook on what comes next.
A preview of this talk is available here (which has been jointly provided with Yuriy Reznik, Technology Fellow and Head of Research at Brightcove).
Multimedia Communication, Networking, Protocols, Delivery
In this talk, we will provide an overview of multimedia communication, focusing on formats, networking, protocols, and delivery. In particular, we will review traditional push-based vs. pull-based delivery techniques utilizing different network protocol stacks (RTP/UDP vs. HTTP/TCP), including their underlying data formats.
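To make the pull-based pattern concrete, the following sketch shows the client-driven request loop that underlies HTTP adaptive streaming: the client fetches a manifest and then requests each media segment itself, in contrast to RTP-style push delivery where the server drives the session. All names, the manifest contents, and the segment template are invented for illustration; `fetch()` stands in for a real HTTP GET.

```python
# Illustrative stub of a manifest, as a client might obtain after parsing
# an MPD/playlist (hypothetical fields and segment-name template).
MANIFEST = {"segment_count": 3, "template": "seg_{num}.m4s"}

def fetch(url):
    # Stand-in for an HTTP GET over TCP; returns a fake payload by name.
    return f"<payload of {url}>"

def pull_session(manifest_url):
    """Client-driven (pull-based) delivery loop: request the manifest,
    then request each segment; the server only answers GETs."""
    manifest = MANIFEST  # a real client would parse fetch(manifest_url)
    received = []
    for num in range(manifest["segment_count"]):
        url = manifest["template"].format(num=num)
        received.append(fetch(url))  # each request is client-initiated
    return received

segments = pull_session("https://example.com/manifest.mpd")
print(len(segments))  # 3
```

Because every request originates at the client, pull-based delivery traverses firewalls and caches like ordinary web traffic, which is one reason HTTP/TCP displaced RTP/UDP push for mainstream streaming.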
Multimedia Standards (MPEG, IETF, W3C)
In this talk, we will provide an overview of multimedia standards mainly from MPEG, IETF, and W3C:
- MPEG: video coding formats (AVC, HEVC, VVC); system standards (MPEG-DASH, CMAF)
- IETF: RTP-based family of standards and HTTP/1.x/2/3 in the context of HTTP adaptive streaming
- W3C: HTML5, Media Source Extensions (MSE) and Encrypted Media Extensions (EME)