
Quick Start Guide for React Native

The quickest way to get started with the React Native SDK is to download and experiment with our sample code. You can also follow this quick start guide to build your first vital sign app with React Native.

After you've completed this guide, we strongly recommend following our Software Development Guide to ensure that the health reports generated by the Vitals™ SDK are accurate and reliable.

SDK Installation

Please follow the instructions on the Downloads page to install and integrate the React Native SDK into your project.

Camera Setup

As outlined in Integrating Vitals™ Health Assessment (VHA), the first step is to set up the camera. To begin, create a new project and follow the instructions below.

Camera Permission

When developing for iOS or Android, you need to allow the app to use the camera by adding the following to Info.plist for iOS, and to AndroidManifest.xml for Android:

```xml
<key>NSCameraUsageDescription</key>
<string>Use for measuring vital signs</string>
```

```xml
<uses-permission android:name="android.permission.CAMERA" />
```

TIP

In Info.plist, the string "Use for measuring vital signs" is an example description of the camera usage. You should specify your own description that matches the usage of your application.

After that, request camera permission by adding the following code to the App.tsx file:

```tsx
import React from 'react';
import {
  VitalSignCamera,
} from 'react-native-vital-sign-plugin';

function App(): JSX.Element {
  React.useEffect(() => {
    (async () => {
      const cameraPermission =
        await VitalSignCamera.getCameraPermissionStatus();
      if (cameraPermission !== 'authorized') {
        await VitalSignCamera.requestCameraPermission();
      }
    })();
  }, []);

  // ...
}
```
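The permission flow above reduces to a simple pattern: read the status, request only when not yet authorized. The sketch below expresses it as a plain TypeScript function decoupled from the SDK; `getStatus` and `request` are hypothetical stand-ins for `getCameraPermissionStatus` and `requestCameraPermission`, and the `'authorized'` status string follows the example above (the actual SDK may define additional status values).

```typescript
// Generic sketch of the permission flow: given functions that read and
// request the permission status, prompt only when not yet authorized,
// and report whether the app may use the camera afterwards.
async function ensureCameraPermission(
  getStatus: () => Promise<string>,
  request: () => Promise<string>,
): Promise<boolean> {
  const status = await getStatus();
  if (status !== 'authorized') {
    const result = await request();
    return result === 'authorized';
  }
  return true;
}
```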

Vital Sign Camera

As described in Vital Sign Camera, the Vitals™ SDK provides a tailor-made camera component named VitalSignCamera. To set it up, declare the VitalSignCamera component in the App.tsx file. For example:

```tsx
import React from 'react';
import {
  SafeAreaView,
  StyleSheet
} from 'react-native';

import {
  VitalSignCamera
} from 'react-native-vital-sign-plugin';

function App(): JSX.Element {

  // ... request camera permission code ...

  return (
    <SafeAreaView style={styles.container}>
      <VitalSignCamera
          isActive={true}
          /* Vitals™ Cloud Service Config */
          config={{apiKey:'__YOUR_API_KEY__'}}
          style={StyleSheet.absoluteFill}
        />
    </SafeAreaView>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
  },
});

export default App;
```

TIP

In the above example, the string __YOUR_API_KEY__ should be replaced by your own API Key. Please contact us to obtain your API Key, which is the credential for accessing the Vitals™ Cloud Service.

Run the app; you should now see the camera preview. For more details on system and camera setup, please refer to Camera Setup.

Acquire Health Profile

As outlined in Integrating Vitals™ Health Assessment (VHA), user data and medical history have to be gathered to obtain personalized health insights and enhance the accuracy of measurements.

You can provide the gathered user information by supplying it to the userInfo property of the VitalSignCamera component. For example:

```tsx
import { Gender, UserInfo, VitalSignCamera } from 'react-native-vital-sign-plugin';

// ...

const userInfo: UserInfo = {
  age: 30,
  gender: Gender.Male,
  userId: '__YOUR_USER_ID__',
};

/* Provide the user info to the Vitals™ SDK */
<VitalSignCamera
    // ...
    userInfo={userInfo}
  />
```

TIP

In the above example, the string __YOUR_USER_ID__ should be replaced by your own User ID. Please contact us to obtain your User ID, which specifies the subscription plan for the Vitals™ Cloud Service. The set of vital signs obtained from a scan varies depending on the licensed subscription plan, with each plan offering a unique set of vital signs.

For more details, please refer to the Acquire Health Profile page.

Scan Vital Signs

To start a scan, you can simply call the startScanning() API of the VitalSignCamera component.

First, create a camera reference so that we can call the startScanning() API:

```tsx
import React, { useRef } from 'react';
// ...

function App(): JSX.Element {
  const camera = useRef<VitalSignCamera>(null);

  // ...

  <VitalSignCamera
      ref={camera}
      // ...
    />
}
```

After that, add a start button, set up the onPress handler, and call the startScanning() API:

```tsx
import {
  // ...
  TouchableOpacity,
  Text
} from 'react-native';

// ...

function App(): JSX.Element {

  const start = async () => {
    if (!camera.current) return;
    await camera.current.startScanning();
  };

  // ...
  return (
    <SafeAreaView style={styles.container}>
      {/* ... */}
      <TouchableOpacity style={styles.button} onPress={start}>
        <Text style={styles.buttonText}>Start Scanning</Text>
      </TouchableOpacity>
    </SafeAreaView>
  );
}

const styles = StyleSheet.create({
  // ...
  button: {
    backgroundColor: 'black',
    padding: 10,
    borderRadius: 5,
  },
  buttonText: {
    color: 'white'
  }
});

export default App;
```

After calling the API, the component scans the face in the camera view for about 30 seconds, and vital signs are returned upon a successful scan. You can log the scanning progress and observe any errors by adding the onVideoFrameProcessed callback. This callback delivers processing results at about 30 Hz (the exact rate depends on the frame rate of the camera).

```tsx
import React, { useCallback, useRef } from 'react';
import {
  // ...
  VideoFrameProcessedEvent,
  GetHealthStage
} from 'react-native-vital-sign-plugin';

// ...

function App(): JSX.Element {

  // ...

  const onVideoFrameProcessed = useCallback((event: VideoFrameProcessedEvent) => {
    const healthResult = event.healthResult;

    /* Print the scanning stage & remaining seconds while the scan is in progress. */
    if (healthResult && healthResult.stage !== GetHealthStage.Idle) {
      console.log(`Scanning Stage=${healthResult.stage}, Remaining Seconds=${healthResult.remainingTime}`);
    }
    /* Print error if any */
    if (healthResult?.error) {
      console.log(`Error=${healthResult.error}`);
    }
  }, []);

  // ...

  <VitalSignCamera
      // ...
      onVideoFrameProcessed={onVideoFrameProcessed}
    />

  // ...
}

// ...
```

Run the app and, with your face centered in the camera frame, tap the "Start Scanning" button. Then move your face out of the camera frame to simulate a "face lost" error. You should see something similar to this in your console:

 LOG  Scanning Stage=1, Remaining Seconds=22.866644978523254
 LOG  Scanning Stage=1, Remaining Seconds=22.82732403278351
 LOG  Error=face lost
 LOG  Error=face lost
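The remainingTime values in the log are fractional seconds; for a user-facing countdown you would typically round up to whole seconds. A small sketch, independent of the SDK:

```typescript
// Round the fractional remaining time up to whole seconds for display,
// clamping any negative value (which should not occur) to zero.
function formatCountdown(remainingSeconds: number): string {
  return `${Math.max(0, Math.ceil(remainingSeconds))}s`;
}

console.log(formatCountdown(22.866644978523254)); // "23s"
```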

For more details on handling the scanning process, please refer to the Scan Vital Signs page.

Get & Display Health Results

The VitalSignCamera component returns health results via the onVideoFrameProcessed callback function. The example code below checks whether the heart rate is ready and, if so, prints it to the console.

```tsx
/* Update the onVideoFrameProcessed callback function */
const onVideoFrameProcessed = useCallback((event: VideoFrameProcessedEvent) => {
  const healthResult = event.healthResult;
  // ...

  /* Obtain the heart rate result */
  if (healthResult?.health) {
    console.log(`Heart Rate=${healthResult.health.vitalSigns.heartRate}`);
  }
}, []);
```

Run the app and perform the scan; after about 30 seconds, you should see the heart rate result like this:

 LOG  Scanning Stage=2, Remaining Seconds=0.053999900817871094
 LOG  Scanning Stage=2, Remaining Seconds=0.012000083923339844
 LOG  Heart Rate=70.68102456201278
 LOG  Heart Rate=70.68102456201278
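The raw heart rate value carries far more precision than is meaningful; for display you would typically round it to a whole number of beats per minute. A small sketch, independent of the SDK:

```typescript
// Round the raw heart rate to whole beats per minute for display.
function formatHeartRate(bpm: number): string {
  return `${Math.round(bpm)} bpm`;
}

console.log(formatHeartRate(70.68102456201278)); // "71 bpm"
```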

There are more vital signs available in the Vitals™ SDK, and we also provide health result interpretation guides. For more information, please refer to the Get & Display Health Result page.


🎉🎉🎉 Congratulations! 🎉🎉🎉

You have completed your first Vitals™ application!


WARNING

This guide only gives you a minimum viable product (MVP) built with our React Native SDK; there is more to do to ensure accurate and reliable measurements from our Vitals™ Cloud Service, for example performing Conditions Check and using the Signal Quality metric. Please refer to the Software Development Guide page, the sample code, or the API reference for further details.