Author

Jun Yi

Date of Award

12-12-2022

Degree Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Computer Science

First Advisor

Yubao Wu

Second Advisor

Xiaolin Hu

Third Advisor

Ashwin Ashok

Fourth Advisor

Jun Kong

Abstract

360-degree live video streaming is becoming increasingly popular. While providing viewers with an enriched experience, 360-degree live video streaming is challenging to achieve because it requires significantly higher bandwidth and a powerful computation infrastructure. A deeper understanding of this emerging system would benefit both viewers and system designers. Although prior works have extensively studied regular video streaming and 360-degree video-on-demand streaming, we are the first to investigate the performance of 360-degree live video streaming. We conduct a systematic measurement of YouTube’s 360-degree live video streaming using various metrics in multiple practical settings. Our research insights will help build a clear understanding of today’s 360-degree live video streaming and lay a foundation for future research in this emerging yet relatively unexplored area.

To further understand the delay measured in YouTube’s 360-degree live video streaming, we conduct a second measurement study on a 360-degree live video streaming platform. While live 360-degree video streaming provides an enriched viewing experience, it is challenging to guarantee the user experience against the negative effects of start-up delay, event-to-eye delay, and low frame rate. It is therefore imperative to understand how the different computing tasks of a live 360-degree streaming system contribute to these three delay metrics. Our measurements provide insights for future research directions toward improving the user experience of live 360-degree video streaming.

Based on our measurement results, we propose a motion-based trajectory transmission method for 360-degree video streaming. First, we design a testbed for 360-degree video playback that collects users’ viewing data in real time. We then analyze the trajectories of the moving targets in the 360-degree videos; specifically, we use optical flow algorithms and a Gaussian mixture model to pinpoint the trajectories. We then choose which trajectories to deliver based on the size of the moving targets. The experimental results indicate that our method substantially reduces bandwidth consumption.
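The pipeline above (detect moving targets, record their trajectories, then select which trajectories to deliver by target size) can be sketched roughly as follows. This is an illustrative sketch, not the dissertation's implementation: plain frame differencing stands in for the optical-flow and Gaussian-mixture-model motion detection, and the function names, thresholds, and synthetic frames are all assumptions made for the example.

```python
import numpy as np

def extract_trajectory(frames, motion_threshold=30, min_area=50):
    """Track a moving target across frames via simple frame differencing
    (a stand-in for the optical-flow / Gaussian-mixture pipeline) and
    record the target's centroid trajectory and changed-pixel area."""
    trajectory, areas = [], []
    for prev, curr in zip(frames, frames[1:]):
        diff = np.abs(curr.astype(int) - prev.astype(int))
        mask = diff > motion_threshold          # pixels that changed enough
        area = int(mask.sum())
        if area >= min_area:                    # ignore noise-sized motion
            ys, xs = np.nonzero(mask)
            trajectory.append((float(xs.mean()), float(ys.mean())))
            areas.append(area)
    return trajectory, areas

def select_trajectories(candidates, size_threshold):
    """Keep only trajectories whose average target size exceeds the
    threshold, so small targets are not transmitted (saving bandwidth)."""
    return [traj for traj, sizes in candidates
            if np.mean(sizes) >= size_threshold]

# Synthetic example: a bright 10x10 block moving right across dark frames.
frames = []
for x in (5, 10, 15, 20):
    f = np.zeros((64, 64), dtype=np.uint8)
    f[20:30, x:x + 10] = 255
    frames.append(f)

traj, areas = extract_trajectory(frames)
# The large moving block is kept; a tiny target would be filtered out.
kept = select_trajectories([(traj, areas)], size_threshold=50)
```

In a real system the per-trajectory size would come from the segmented moving-target region, and only the selected trajectories' tile or region data would be streamed.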

DOI

https://doi.org/10.57709/32523170

