Flutter by the Holobooth for Web-based TensorFlow integration - Electronics Weekly

This article on Medium from the Google Flutter team takes a look at how the team made the Holobooth, a virtual photo booth experience designed to showcase both the toolkit and the use of machine learning. In particular, it highlights the potential for Flutter web apps to integrate with TensorFlow.js models in a relatively straightforward way.

Holobooth

The virtual 'booth' works like this: you select your avatar ('Dash' or 'Sparky'), choose your setting (tropical beach, volcanic mountain, outer space or ocean floor) and then begin recording your video.

The Holobooth can map live video of your face onto a 3D model of an avatar as you travel through its (fairly basic) virtual world. If your face expresses surprise, for example, the avatar will mimic it.


The virtual booth was created as part of the recent Flutter Forward event held in Nairobi (building on the first version of the Photo Booth app from Google I/O 2021).

How to

The article covers detecting faces with TensorFlow.js. Specifically, it uses the MediaPipe FaceMesh model, which estimates 468 3D face landmarks in real time, to detect features of the user's face within the camera frame across web and mobile browsers.
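The Holobooth runs the model in the browser through TensorFlow.js. As a rough illustration of the underlying call, here is a minimal TypeScript sketch using the @tensorflow-models/face-landmarks-detection package (the team's own Flutter plugin and interop layer will look different):

```ts
// A minimal sketch of real-time face tracking with the MediaPipe FaceMesh
// model in TensorFlow.js, via @tensorflow-models/face-landmarks-detection.
// This is an illustration, not the Holobooth's actual wrapper code.
import '@tensorflow/tfjs-backend-webgl';
import * as faceLandmarksDetection from '@tensorflow-models/face-landmarks-detection';

async function trackFace(video: HTMLVideoElement) {
  // Create a detector backed by the MediaPipe FaceMesh model.
  const detector = await faceLandmarksDetection.createDetector(
    faceLandmarksDetection.SupportedModels.MediaPipeFaceMesh,
    { runtime: 'tfjs', refineLandmarks: false },
  );

  // Estimate the 468 3D landmarks for each face in the current video frame.
  const faces = await detector.estimateFaces(video);
  for (const face of faces) {
    // Each keypoint carries x/y pixel coordinates plus a relative depth (z);
    // groups of keypoints (eyes, lips, face oval) can be used to derive
    // expressions such as how wide the mouth or eyes are open.
    console.log(`found a face with ${face.keypoints.length} keypoints`);
  }
}
```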

Then there is animating backgrounds and avatars with Rive and TensorFlow.js…

"The avatars use Rive State Machines that allow us to control how an avatar behaves and looks. In the Rive State Machine, designers specify all of the inputs. Inputs are values that are controlled by your app. You can think of them as the contract between design and engineering teams. Your product's code can change the values of the inputs at any time, and the State Machine reacts to those changes."

Finally, there is capturing dynamic photos with Firebase.

Since the Holobooth was built to promote the Flutter Forward event on social media, its main feature is the GIF or video that you can share at the end.

"We turned to Cloud Functions for Firebase to help us generate and upload your dynamic photo to Cloud Storage for Firebase. Once you press the camera button, the app starts capturing frames for a duration of 5 seconds. With ffmpeg, we use a Cloud Function to convert the frames into a single GIF and video that are then uploaded to Cloud Storage for Firebase. You can choose to download your GIF or video for later viewing or to manually upload it to social media."

You can have a go at the Holobooth itself here (though my Chromebook wasn't really powerful enough).

See also: Flutter GUI toolkit sees Material 3, menu, and debugging support
