UPDATED 17:15 EDT / AUGUST 14 2017

EMERGING TECH

MIT’s new AI might solve slow video streaming

A team of researchers from the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory has developed an artificial intelligence called Pensieve that could make slow or blurry web videos a thing of the past.

Streaming video content has grown by leaps and bounds in recent years, but unfortunately so have its bandwidth requirements. To overcome this challenge, video platforms like YouTube or Netflix use algorithms that divide videos into chunks that are easier to handle. If the system detects a slowdown in your Internet speed, the next chunk of video will be played at a lower resolution as it tries to catch up.
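As a rough illustration of the rate-based heuristics such players use (a simplified sketch, not YouTube’s or Netflix’s actual code), a player might estimate throughput from recent chunk downloads and pick the highest resolution rung the connection looks able to sustain:

```python
def estimate_throughput(recent_kbps):
    """Harmonic mean of recent chunk throughputs (conservative under spikes)."""
    return len(recent_kbps) / sum(1.0 / r for r in recent_kbps)

def pick_bitrate(recent_kbps, ladder_kbps):
    """Choose the highest rung of the bitrate ladder the estimated
    throughput can sustain; fall back to the lowest rung otherwise."""
    estimate = estimate_throughput(recent_kbps)
    viable = [b for b in sorted(ladder_kbps) if b <= estimate]
    return viable[-1] if viable else min(ladder_kbps)

# A connection that briefly dipped: the harmonic mean stays cautious,
# so the player steps down rather than risk a stall.
ladder = [300, 750, 1200, 2350, 4300]            # kbps rungs
print(pick_bitrate([2500, 400, 2600], ladder))   # prints 750
```

The weakness the MIT team targets is visible here: a fixed formula like this reacts the same way to every dip, whether or not the buffer could have absorbed it.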

The idea behind this system is to ensure smooth video playback, but sometimes even with the reduced quality, the video still has to pause for several seconds as it tries to buffer the next chunk. This happens because the algorithms do not always accurately predict what resolution should be used for the next chunk.

That’s where Pensieve comes in. The MIT researchers developed it using a system of rewards and penalties that trained it to recognize effective buffering techniques. For example, Pensieve might be rewarded whenever it successfully played a full video without having to rebuffer, or it might be penalized if video quality dropped below a certain threshold.
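A per-chunk reward of that kind has roughly this shape (a sketch in the spirit of the paper’s linear quality-of-experience metric; the quality proxy and coefficients here are illustrative, not the paper’s tuned values): each chunk earns its quality score, loses a penalty proportional to rebuffering time, and loses another for abrupt quality switches:

```python
def chunk_reward(bitrate_kbps, rebuffer_s, prev_bitrate_kbps,
                 rebuffer_penalty=4.3, smooth_penalty=1.0):
    """Pensieve-style per-chunk reward: value the quality delivered,
    penalize stalls and jarring quality switches. Coefficients are
    illustrative defaults, not values from the paper."""
    quality = bitrate_kbps / 1000.0        # simple quality proxy (Mbps)
    prev_quality = prev_bitrate_kbps / 1000.0
    return (quality
            - rebuffer_penalty * rebuffer_s
            - smooth_penalty * abs(quality - prev_quality))

# A smooth high-bitrate chunk scores well...
print(chunk_reward(4300, 0.0, 4300))   # prints 4.3
# ...while the same chunk after a two-second stall scores negative.
print(chunk_reward(4300, 2.0, 4300))   # prints -4.3
```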

“It learns how different strategies impact performance, and, by looking at actual past performance, it can improve its decision-making policies in a much more robust way,” said Hongzi Mao, a Ph.D. student and the lead author on the paper that outlined Pensieve’s process and function.

According to the research team’s paper, Pensieve can stream video with 10 to 30 percent less rebuffering than other systems, and users rated its “quality of experience” at 10 to 20 percent higher.

Mao said that Pensieve is flexible enough to adapt to different requirements. For example, it could be trained to prioritize continuous video playback at the cost of resolution, or it might be trained for the opposite, prioritizing quality over smoothness.
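One way to read that flexibility: in a reward of this general shape, the relative weight on the rebuffer penalty decides which behavior the trained policy will favor. A toy comparison (the weights and outcomes below are hypothetical, not from the paper):

```python
def reward(quality, rebuffer_s, w_rebuffer):
    """Toy reward: quality earned minus a weighted stall penalty."""
    return quality - w_rebuffer * rebuffer_s

sharp_but_stalls = (4.3, 1.5)   # high quality, 1.5 s of rebuffering
smooth_but_soft  = (1.2, 0.0)   # lower quality, no stall

for w in (0.5, 8.0):
    scores = {name: reward(q, t, w)
              for name, (q, t) in [("sharp", sharp_but_stalls),
                                   ("smooth", smooth_but_soft)]}
    best = max(scores, key=scores.get)
    print(f"w_rebuffer={w}: training favors the {best} outcome")
# w_rebuffer=0.5 favors "sharp"; w_rebuffer=8.0 favors "smooth".
```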

The research team’s next project will be to test Pensieve’s effectiveness in streaming virtual reality content, which requires significantly more bandwidth and has a much lower tolerance for poor quality.

“The bitrates you need for 4K-quality VR can easily top hundreds of megabits per second, which today’s networks simply can’t support,” said Mohammad Alizadeh, a professor at MIT and a co-author of the paper on Pensieve. “We’re excited to see what systems like Pensieve can do for things like VR. This is really just the first step in seeing what we can do.”
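Alizadeh’s figure is easy to sanity-check with back-of-the-envelope arithmetic (the frame rate, bit depth, and compression ratio below are illustrative assumptions, not values from the paper):

```python
# Uncompressed 4K-per-eye stereo video at 60 fps, 12 bits/pixel (4:2:0 sampling):
width, height, eyes, fps, bits_per_px = 3840, 2160, 2, 60, 12
raw_bps = width * height * eyes * fps * bits_per_px
print(f"raw: {raw_bps / 1e9:.1f} Gbps")           # prints: raw: 11.9 Gbps

# Even assuming an optimistic 50:1 compression ratio, the stream still
# lands in the hundreds of megabits per second:
compressed_mbps = raw_bps / 50 / 1e6
print(f"compressed: {compressed_mbps:.0f} Mbps")  # prints: compressed: 239 Mbps
```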


Photo: MIT
