Facial Beauty Analysis

WARNING

While the general Vitals™ Health Assessment (VHA) process does not store, transmit, or receive any Personally Identifiable Information (PII) to or from the Vitals™ Cloud Service, enabling the Facial Beauty Analysis feature sends an image of the user's face to the server for analysis. This feature is disabled by default.
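Since the feature is opt-in, one pattern is to derive the `enableBeautyAnalysis` flag from an explicit user-consent state before creating the camera. A minimal sketch; the `hasUserConsented` flag and the `beautyAnalysisEnabled()` helper are application-level assumptions, not part of the SDK:

```javascript
// Hypothetical helper: map an application-level consent flag to the
// enableBeautyAnalysis SDK parameter. The feature is disabled by default,
// so only opt in on explicit consent.
function beautyAnalysisEnabled(hasUserConsented) {
    return hasUserConsented === true;
}

// Usage sketch when creating the camera (config and consent state assumed):
// const cam = createVitalSignCamera({
//     isActive: true,
//     config,
//     enableBeautyAnalysis: beautyAnalysisEnabled(hasUserConsented)
// });
```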

Vitals™ SDK can analyze a person's facial and skin characteristics and provide skin quality metrics for user reference. The analysis results encompass various attributes such as eye shape, lips shape, acne, dark circles, and more. For a comprehensive list of supported facial beauty metrics, please refer to the Interpreting Result page.

Target SDK

JavaScript
React
Vue
Flutter
ReactNative
Android
iOS

TIP

You can refer to the sample code and the API Reference for more details. The most relevant APIs include: createVitalSignCamera(), VitalSignCameraCreationProps, VideoFrameProcessedEvent, and BeautyAnalysis.

To perform facial beauty analysis, first set the enableBeautyAnalysis parameter to true in createVitalSignCamera() when creating the Vital Sign Camera component:

typescript
window.onload = () => {
    /* Create and initialize Vital Sign Camera */
    const video = document.querySelector("video")!;
    const cam = createVitalSignCamera({ 
        isActive: true, 
        config, 
        enableBeautyAnalysis: true // Enable beauty analysis
    });
    cam.bind(video);
}
js
window.onload = () => {
    /* Create and initialize Vital Sign Camera */
    const video = document.querySelector("video");
    const cam = createVitalSignCamera({ 
        isActive: true, 
        config, 
        enableBeautyAnalysis: true // Enable beauty analysis
    });
    cam.bind(video);
}

The Vital Sign Camera component returns the facial beauty analysis results in the event.healthResult?.health?.beautyAnalysis property through the onVideoFrameProcessed callback function. The one exception is the "Facial Skin Age" metric, which is available through the event.healthResult?.health?.vitalSigns?.facialSkinAge variable. The example code below checks whether the facial beauty analysis results are ready and, if so, prints them to the console.

typescript
function printFacialBeautyAnalysisResult(event: VideoFrameProcessedEvent) {
    const facialSkinAge = event.healthResult?.health?.vitalSigns?.facialSkinAge;
    const beautyResults = event.healthResult?.health?.beautyAnalysis;
    if (beautyResults) {
        console.log(`************* Skin Quality *************`)
        console.log(`Skin Quality = ${beautyResults.skinQualityScore.toFixed(2)}`)

        console.log(`************* Skin Characteristics *************`)
        // Facial skin age arrives separately from beautyAnalysis and may not be ready yet
        console.log(`Facial Skin Age = ${facialSkinAge ?? "N/A"} years`)
        console.log(`Skin Moisture = ${beautyResults.skinMoisture}`)
        console.log(`Acne = ${beautyResults.acne}`)
        console.log(`Pigmentation = ${beautyResults.pigmentation}`)
        console.log(`Dark Circles = ${beautyResults.darkCircle}`)
        console.log(`Wrinkles = ${beautyResults.wrinkles}`)
        console.log(`Redness = ${beautyResults.redness}`)

        console.log(`************* Facial Characteristics *************`)
        console.log(`Face Shape = ${beautyResults.faceShape}`)
        console.log(`Lips Shape = ${beautyResults.lipsShape}`)
        console.log(`Eye Bags = ${beautyResults.eyeBags}`)
        console.log(`Eye Shape = ${beautyResults.eyeShape}`)
    }
}
js
function printFacialBeautyAnalysisResult(event) {
    const facialSkinAge = event.healthResult?.health?.vitalSigns?.facialSkinAge;
    const beautyResults = event.healthResult?.health?.beautyAnalysis;
    if (beautyResults) {
        console.log(`************* Skin Quality *************`)
        console.log(`Skin Quality = ${beautyResults.skinQualityScore.toFixed(2)}`)

        console.log(`************* Skin Characteristics *************`)
        // Facial skin age arrives separately from beautyAnalysis and may not be ready yet
        console.log(`Facial Skin Age = ${facialSkinAge ?? "N/A"} years`)
        console.log(`Skin Moisture = ${beautyResults.skinMoisture}`)
        console.log(`Acne = ${beautyResults.acne}`)
        console.log(`Pigmentation = ${beautyResults.pigmentation}`)
        console.log(`Dark Circles = ${beautyResults.darkCircle}`)
        console.log(`Wrinkles = ${beautyResults.wrinkles}`)
        console.log(`Redness = ${beautyResults.redness}`)

        console.log(`************* Facial Characteristics *************`)
        console.log(`Face Shape = ${beautyResults.faceShape}`)
        console.log(`Lips Shape = ${beautyResults.lipsShape}`)
        console.log(`Eye Bags = ${beautyResults.eyeBags}`)
        console.log(`Eye Shape = ${beautyResults.eyeShape}`)
    }
}

Call the function in the onVideoFrameProcessed callback:

typescript
/* Update the onload event handler function */
window.onload = () => {
    // ...

    /* Update the onVideoFrameProcessed callback function */
    cam.onVideoFrameProcessed = (event: VideoFrameProcessedEvent) => {
        // ...

        /* Print the facial beauty analysis result if it is ready */
        printFacialBeautyAnalysisResult(event);
    }
}
js
/* Update the onload event handler function */
window.onload = () => {
    // ...

    /* Update the onVideoFrameProcessed callback function */
    cam.onVideoFrameProcessed = (event) => {
        // ...

        /* Print the facial beauty analysis result if it is ready */
        printFacialBeautyAnalysisResult(event);
    }
}
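Rather than logging to the console, you may want the metrics as a plain object for UI rendering. A small adapter along these lines could pull out the documented fields; the `toBeautyReport()` helper is an assumption for illustration, while the field names are the ones shown in the snippets above:

```javascript
// Hypothetical adapter: collect the documented beauty-analysis fields from a
// processed-frame event into a flat object. Returns null until results exist.
function toBeautyReport(event) {
    const beauty = event?.healthResult?.health?.beautyAnalysis;
    if (!beauty) return null; // results not ready yet
    return {
        skinQualityScore: beauty.skinQualityScore,
        // Facial skin age lives under vitalSigns, not beautyAnalysis
        facialSkinAge: event.healthResult.health.vitalSigns?.facialSkinAge,
        skinMoisture: beauty.skinMoisture,
        acne: beauty.acne,
        pigmentation: beauty.pigmentation,
        darkCircle: beauty.darkCircle,
        wrinkles: beauty.wrinkles,
        redness: beauty.redness,
        faceShape: beauty.faceShape,
        lipsShape: beauty.lipsShape,
        eyeBags: beauty.eyeBags,
        eyeShape: beauty.eyeShape,
    };
}
```

You would call it from the same onVideoFrameProcessed callback, passing the report to your UI layer whenever it is non-null.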