MVStylizer: An efficient edge-assisted video photorealistic style transfer system for mobile phones

Publication, Conference
Li, A; Wu, C; Chen, Y; Ni, B
Published in: Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)
October 11, 2020

Recent research has made great progress in neural style transfer of images, i.e., transforming an image into a desired style. Many users now record their daily life with their mobile phones and then edit and share the captured images and videos with other users. However, directly applying existing style transfer approaches to videos, i.e., transferring the style of a video frame by frame, requires an extremely large amount of computational resources, and performing video style transfer on mobile phones remains technically unaffordable. To address this challenge, we propose MVStylizer, an efficient edge-assisted photorealistic video style transfer system for mobile phones. Instead of stylizing every frame, only key frames in the original video are processed by a pre-trained deep neural network (DNN) on edge servers, while the remaining intermediate stylized frames are generated by our optical-flow-based frame interpolation algorithm on mobile phones. A meta-smoothing module is also proposed to simultaneously upscale a stylized frame to arbitrary resolution and remove style-transfer-related distortions in the upscaled frames. In addition, to continuously enhance the performance of the DNN models on the edge servers, we adopt a federated learning scheme that keeps retraining each edge-server DNN model with data collected from mobile clients and syncing it with a global DNN model on the cloud server. This scheme effectively leverages the diversity of data collected from various mobile clients and efficiently improves system performance. Our experiments demonstrate that MVStylizer generates stylized videos with better visual quality than the state-of-the-art method while achieving a 75.5× speedup for 1920×1080 videos.
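The core idea described above, stylizing only key frames with the edge-hosted DNN and reconstructing the intermediate frames on the phone via optical flow, can be sketched roughly as follows. This is a minimal illustration under assumptions rather than the paper's implementation: the stylize_on_edge placeholder, the fixed key-frame interval, and the use of OpenCV's Farneback flow for warping are all illustrative choices, not details taken from the paper.

# Rough sketch (Python/OpenCV) of key-frame stylization plus optical-flow interpolation.
# Assumptions: stylize_on_edge stands in for the edge-server DNN; KEY_FRAME_INTERVAL is
# illustrative (the key-frame selection policy is not specified here).
import cv2
import numpy as np

KEY_FRAME_INTERVAL = 8  # assumed fixed interval, for illustration only

def stylize_on_edge(frame):
    """Placeholder for the pre-trained style-transfer DNN hosted on the edge server."""
    raise NotImplementedError("stand-in for the offloaded DNN call")

def warp_stylized_key(stylized_key, cur_gray, key_gray):
    """Sample the stylized key frame at flow-displaced positions to approximate
    the stylized version of the current frame."""
    # Dense flow from the current frame to the key frame: cur(y, x) ~ key(y + dy, x + dx).
    flow = cv2.calcOpticalFlowFarneback(cur_gray, key_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = cur_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    return cv2.remap(stylized_key, map_x, map_y, cv2.INTER_LINEAR)

def stylize_video(frames):
    """Stylize key frames with the DNN; derive the rest by warping on-device."""
    out, key_stylized, key_gray = [], None, None
    for i, frame in enumerate(frames):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if i % KEY_FRAME_INTERVAL == 0:
            key_stylized = stylize_on_edge(frame)   # expensive, offloaded to the edge server
            key_gray = gray
            out.append(key_stylized)
        else:
            out.append(warp_stylized_key(key_stylized, gray, key_gray))  # cheap, on the phone
    return out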

Published In

Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)

DOI

10.1145/3397166.3409140

Publication Date

October 11, 2020

Start / End Page

31 / 40
 

Citation

APA
Li, A., Wu, C., Chen, Y., & Ni, B. (2020). MVStylizer: An efficient edge-assisted video photorealistic style transfer system for mobile phones. In Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc) (pp. 31–40). https://doi.org/10.1145/3397166.3409140

Chicago
Li, A., C. Wu, Y. Chen, and B. Ni. “MVStylizer: An efficient edge-assisted video photorealistic style transfer system for mobile phones.” In Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc), 31–40, 2020. https://doi.org/10.1145/3397166.3409140.

ICMJE
Li A, Wu C, Chen Y, Ni B. MVStylizer: An efficient edge-assisted video photorealistic style transfer system for mobile phones. In: Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc). 2020. p. 31–40.

MLA
Li, A., et al. “MVStylizer: An efficient edge-assisted video photorealistic style transfer system for mobile phones.” Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc), 2020, pp. 31–40. Scopus, doi:10.1145/3397166.3409140.

NLM
Li A, Wu C, Chen Y, Ni B. MVStylizer: An efficient edge-assisted video photorealistic style transfer system for mobile phones. Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc). 2020. p. 31–40.
