Engineers at Google have developed a new tool that uses AI and machine learning to filter out and replace the background in videos much like a green screen does. They call it mobile real-time video segmentation, and it’s here to up your YouTube game — assuming you have access to the limited stories beta, that is.
That’s because, for the moment, the feature is only to be found there. Even so, it looks like it could have a big impact. After all, just imagine what the wonderfully twisted minds of the internet will do now that they can easily digitally replace the boring scenery behind them with, I don’t know, maybe a mural?
Whatever, the specifics are for them to figure out.
One thing is for sure: Google’s dry explanation for the new tech belies the fun possibilities it creates.
“Our new segmentation technology allows creators to replace and modify the background, effortlessly increasing videos’ production value without specialized equipment,” notes the blog post announcing the feature.
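At its core, a background swap like this comes down to two steps: a segmentation model labels each pixel as person or background, and the app then composites the person over a new image using that mask. Here’s a minimal sketch of the compositing step in Python with NumPy — the mask is assumed to come from some segmentation model, and none of this reflects Google’s actual pipeline:

```python
import numpy as np

def replace_background(frame, mask, background):
    """Composite the foreground of `frame` over `background`.

    frame, background: (H, W, 3) uint8 images of the same size.
    mask: (H, W) float array in [0, 1], where 1.0 means "person".
    In a real app, the mask would be produced per-frame by a
    segmentation model; here it is simply passed in.
    """
    alpha = mask[..., None]  # add a channel axis so it broadcasts over RGB
    blended = alpha * frame.astype(np.float32) + \
              (1.0 - alpha) * background.astype(np.float32)
    return blended.astype(np.uint8)

# Toy example: left column is "person", right column is background.
frame = np.full((2, 2, 3), 200, dtype=np.uint8)       # bright foreground
background = np.full((2, 2, 3), 10, dtype=np.uint8)   # dark replacement
mask = np.array([[1.0, 0.0],
                 [1.0, 0.0]])
result = replace_background(frame, mask, background)
```

Run per frame at camera rate, with a mask that tracks the subject, this is effectively a software green screen — the hard part, which Google’s research addresses, is producing that mask accurately and fast enough on a phone.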
So, how can you get your hands on this? If you don’t have access to the stories beta in the YouTube app, your best bet is to keep an eye on augmented reality.
“Our immediate goal is to use the limited rollout in YouTube stories to test our technology on this first set of effects,” Google explains. “As we improve and expand our segmentation technology to more labels, we plan to integrate it into Google’s broader Augmented Reality services.”
Unfortunately, the company doesn’t mention a specific timeline. So while Google looks to have big plans for its mobile real-time video segmentation, maybe don’t toss out your green screen just yet.