(Anyone else work from that light-filled coffee shop?) The effects are among a variety of new scenes and backgrounds in Meet planned for 2023, which include more holiday-themed and timely backdrops and filters - as well as some updated staples.

“When we started with video effects in 2018, we wanted to add fun and delight to one-to-one calls,” says Magnus Flodman, head of the Meet Media team in Stockholm. They began exploring ways to bring that same technology from mobile devices to other practical applications with video calls on the web.

“The use of video conferencing exploded - and people needed more options to add more professional backgrounds to their calls,” says Dan Gunnarsson, who leads the team building the effects and video processing technology Meet uses. “So we put our efforts into the highest gear to make it work.” The team launched into overdrive, debuting the ability to blur backgrounds during calls in September 2020, followed by the more advanced filters and background effects that we enjoy on our calls today.

Meet identifies the separation between the person and the rest of the frame using MediaPipe, a Google machine learning framework designed for live and streaming video processing. MediaPipe runs a combination of machine learning and signal processing that breaks down each still image of your video stream, distinguishing the person from the background - allowing Meet to replace your living room or other workspace with your selected backdrop.

“The big challenge is that we have to do that background separation for every frame,” Dan says. “And we’re sending up to 30 frames per second, so we only have a few milliseconds between each frame to do that computation.” This work also allowed the team to add 360-degree video backgrounds.

All that processing is made even more complicated by the fact that Meet runs entirely in a web browser. This makes Meet far more accessible to people using a range of devices, but it creates a tougher technical challenge: effects and filters have to run smoothly and lag-free without the access to system resources that a native app might have. “One of our main jobs is to make sure that things work everywhere, whether it's an entry-level laptop or the most powerful PC, by optimizing how that processing works - in close collaboration with Chrome and several other web browsers,” Dan says.

Since the introduction of background blur, the team has continued to expand the library of options over time, replacing old stock photography shots with more curated scenes for professional meetings and even more engaging, fun filters for day-to-day use. The team works with artists from all over the world to design the different filters. “I start by putting together a content strategy for everything that we want to do for the whole year: What permanent backgrounds and what permanent effects do we want? What older content do we want to replace? What seasonal events do we want to hit?” explains Lauren Morrison, lead designer for Meet’s video effects. “Then we collaborate with artists and agencies on nailing down the designs.”

For cultural moments, the team often turns to Google’s Employee Resource Groups (ERGs). “It’s so important to partner with people who really understand all of the symbolic imagery,” says Lauren. “For this year’s Ramadan effects, I was able to work with a chapter of our Inter Belief Network ERG - they talked about how important street markets are at night during Ramadan, when everybody goes out and breaks their fast together. It’s that kind of energy that we wanted to try to capture with this background.”
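The per-frame pipeline described above - segment each frame into person and background, then composite the chosen backdrop behind the person - can be sketched roughly as follows. This is an illustrative TypeScript sketch, not Meet's actual code: the `compositeFrame` function and the flat-array mask format are assumptions made for the example, and in Meet the segmentation mask would come from MediaPipe rather than being supplied by hand.

```typescript
// Illustrative sketch (not Meet's real implementation) of per-frame
// background replacement. A segmentation model produces a per-pixel
// "person" mask; each output pixel blends the camera frame with the
// chosen virtual backdrop according to that mask.

type RGBA = Uint8ClampedArray; // flat pixel data: [r,g,b,a, r,g,b,a, ...]

// Blend one video frame with a virtual background using a soft mask.
// mask[p] is 1.0 where the model sees the person, 0.0 for background,
// and fractional near the edges for a soft boundary.
function compositeFrame(frame: RGBA, backdrop: RGBA, mask: Float32Array): RGBA {
  const out = new Uint8ClampedArray(frame.length);
  for (let p = 0; p < mask.length; p++) {
    const m = mask[p];
    for (let c = 0; c < 4; c++) {
      const i = p * 4 + c;
      // Linear blend: keep the camera pixel where the person is,
      // the backdrop pixel everywhere else.
      out[i] = Math.round(m * frame[i] + (1 - m) * backdrop[i]);
    }
  }
  return out;
}

// At up to 30 frames per second, segmentation plus compositing must
// finish within roughly 1000 / 30 ≈ 33 ms per frame.
const frameBudgetMs = 1000 / 30;
```

In a real browser pipeline the mask would come from a model such as MediaPipe's selfie segmentation and the blend would typically run on the GPU rather than pixel by pixel in JavaScript; the point of the sketch is the shape of the work, and the tight per-frame time budget the quote describes.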