Getting Started Guide

The easiest and fastest way to get started with the SDK is to download and try the sample code. We provide full-featured samples for different frameworks and platforms for developers to start with. Alternatively, you can follow this guide to create your first vital sign app quickly.

Installation

For Vue, React and React Native

In your Node.js project, install the SDK with the npm install or yarn add command, depending on your preference. Please note that different frameworks require different packages. For example:

bash
yarn add "https://sdk.panoptic.ai/npm/vue-vital-sign-camera-1.1.0.tgz"
bash
yarn add "https://sdk.panoptic.ai/npm/react-vital-sign-camera-1.5.0.tgz"
bash
yarn add "https://sdk.panoptic.ai/npm/react-native-vital-sign-plugin-3.4.1.tgz"

For Flutter

  1. Download the Flutter SDK here.

  2. Unzip the downloaded SDK zip file to a folder in your local drive.

  3. In your Flutter project, add the SDK package by:

    bash
    flutter pub add vital_sign_camera --path "/path_of_downloaded_folder/flutter-vital-sign-sdk/vital_sign_camera"

    Please replace path_of_downloaded_folder above with the actual path to the downloaded folder.

  4. Add the following line to the file android/settings.gradle:

    groovy
    include ':vital-sign-engine'
    project(':vital-sign-engine').projectDir = new File('/path_of_downloaded_folder/flutter-vital-sign-sdk/vital_sign_camera/android/libs')

    Please replace path_of_downloaded_folder above with the actual path to the downloaded folder.

IMPORTANT

Please note that for Android, the minimum SDK version must be at least 24.
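
In a typical Flutter or React Native project, this setting lives in android/app/build.gradle. The exact file and property name vary with your Gradle setup, so treat the following as a sketch:

```groovy
android {
    defaultConfig {
        // The SDK requires Android API level 24 (Android 7.0) or above.
        minSdkVersion 24
    }
}
```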

Camera Component

In the main page of your project, declare VitalSignCamera. For example:

vue
<script setup lang="ts">
import {VitalSignCamera,Gender} from 'vue-vital-sign-camera'
</script>

<template>
  <VitalSignCamera 
      :isActive="true"
      :userInfo="{gender:Gender.Male, age:30, userId:'__YOUR_USER_ID__'}"
      :config="{apiKey:'__YOUR_API_KEY__'}"
  />
</template>
tsx
import React from 'react';
import { VitalSignCamera, Gender } from 'react-vital-sign-camera';

function App() {
  return (
    <div>
      <VitalSignCamera
        isActive={true}
        userInfo={{gender:Gender.Male, age:30, userId: '__YOUR_USER_ID__'}}
        config={{apiKey:'__YOUR_API_KEY__'}}
      />
    </div>
  ) 
}

export default App;
tsx
import React from 'react';
import { View } from 'react-native';
import { VitalSignCamera, Gender } from 'react-native-vital-sign-plugin';

function App() {
  return (
    <View>
      <VitalSignCamera
        isActive={true}
        userInfo={{gender:Gender.Male, age:30, userId: '__YOUR_USER_ID__'}}
        config={{apiKey:'__YOUR_API_KEY__'}}
      />
    </View>
  ) 
}

export default App;
dart
import 'package:flutter/material.dart';
import 'package:vital_sign_camera/vital_sign_camera.dart';

void main() {
  runApp(const MyApp());
}

class MyApp extends StatefulWidget {
  const MyApp({super.key});

  @override
  State<MyApp> createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        body: VitalSignCamera(
          isActive: true,
          userInfo: UserInfo(age: 30, gender: Gender.male, userId: '__YOUR_USER_ID__'),
          config: VitalSignCameraConfig(apiKey: '__YOUR_API_KEY__'),
        ),
      )
    );
  }
}

In the above example, the strings __YOUR_USER_ID__ and __YOUR_API_KEY__ should be replaced by your own userId and apiKey respectively. Please contact us to obtain your userId and apiKey.

TIP

The userId specifies the subscription plan of the Vital Sign Service. The set of vital signs returned by a scan depends on the subscription plan. (See Subscription Plans for details.) The apiKey is the credential for accessing the vital sign API server.
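
Since the apiKey is a credential, you may want to keep it out of source control. As a minimal sketch (the environment variable names below are placeholders, not part of the SDK):

```typescript
// Read a credential from an environment map, with an optional fallback.
// The variable names used here are placeholders, not part of the SDK.
function readCredential(
  env: Record<string, string | undefined>,
  name: string,
  fallback = ''
): string {
  return env[name] ?? fallback;
}

// e.g. const apiKey = readCredential(process.env, 'VITAL_SIGN_API_KEY');
// then pass it to the component: config={{ apiKey }}
```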

Style Sheet

Import the style sheet of the VitalSignCamera component (Vue and React only).

For example, in main.ts:

ts
import 'vue-vital-sign-camera/dist/style.css'
tsx
import 'react-vital-sign-camera/dist/style.css'

Scanning Vital Sign

To start a scan, call the component's startScanning() API. After the call, the component scans the face from the camera for about 30 seconds, and the vital signs are returned upon a successful scan. In the code snippet below, a start button is added that calls startScanning() when pressed.

vue
<script setup lang="ts">
import { ref } from 'vue'
import { VitalSignCamera, Gender } from 'vue-vital-sign-camera'

const camera = ref(null)

const startScanning = ()=> {
  camera.value.startScanning()
}
</script>

<template>
  <button @click="startScanning">start</button>
  <VitalSignCamera 
      ref="camera"
      :isActive="true"
      :userInfo="{gender:Gender.Male, age:30, userId:'__YOUR_USER_ID__'}"
      :config="{apiKey:'__YOUR_API_KEY__'}"
  />
</template>
tsx
import React, { useState } from 'react';
import { 
  VitalSignCamera, 
  VitalSignCameraInstance, 
  Gender 
} from 'react-vital-sign-camera';

function App() {
  const [camera, setCamera] = useState<VitalSignCameraInstance|undefined>(undefined)
  const startScanning = () => {
    camera?.startScanning();
  }
  return (
    <div>
      <button onClick={startScanning}>start</button>
      <VitalSignCamera
        onCreated={setCamera}
        isActive={true}
        userInfo={{gender:Gender.Male, age:30, userId:'__YOUR_USER_ID__'}}
        config={{apiKey:'__YOUR_API_KEY__'}}
      />
    </div>
  )
}

export default App;
tsx
import React, { useRef } from 'react';
import { View, TouchableOpacity } from 'react-native';
import { VitalSignCamera, Gender } from 'react-native-vital-sign-plugin';

const App = () => {
  const camera = useRef<VitalSignCamera>(null);
  const start = async () => {
    if (!camera.current) return
    await camera.current.startScanning()
  }

  return (
    <View>
      <TouchableOpacity onPress={start} />
      <VitalSignCamera
        ref={camera}
        isActive={true}
        userInfo={{gender:Gender.Male, age:30, userId: '__YOUR_USER_ID__'}}
        config={{apiKey:'__YOUR_API_KEY__'}}
      />
    </View>
  ) 
}

export default App;
dart
import 'package:flutter/material.dart';
import 'package:vital_sign_camera/vital_sign_camera.dart';

void main() {
  runApp(const MyApp());
}

class MyApp extends StatefulWidget {
  const MyApp({super.key});

  @override
  State<MyApp> createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  late final VitalSignCameraController _vitalSignCameraController;
  late Future<CameraDevice?> cameraDevice;

  @override
  void initState() {
    super.initState();
    cameraDevice = getFrontCamera();
  }

  Future<CameraDevice?> getFrontCamera() async {
    if (CameraPermissionStatus.authorized != await requestCameraPermission()) {
      return null;
    }
    return queryCameraDevice(CameraPosition.front);
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        body: Stack(children: [
          VitalSignCamera(
              onCreated: _onVitalSignCameraCreated,
              isActive: true,
              userInfo: UserInfo(
                age: 30, gender: Gender.male, userId: '__YOUR_USER_ID__'),
              config: VitalSignCameraConfig(apiKey: '__YOUR_API_KEY__'),
              device: cameraDevice),
          Center(
            child: ElevatedButton(
                onPressed: () {
                  setState(() {
                    _vitalSignCameraController.startScanning();
                  });
                },
                child: const Text('start')),
          ),
        ]),
      ),
    );
  }

  void _onVitalSignCameraCreated(VitalSignCameraController controller) {
    _vitalSignCameraController = controller;
  }
}

Getting Vital Signs

The camera component returns its status and the vital signs through the onVideoFrameProcessed callback function at a rate of about 30 Hz. (The actual rate depends on the frame rate of the camera.)

To get the vital signs, register a callback function with the component. In the example code below, the callback checks whether the heart rate is ready and, if so, shows it on the page.

vue
<script setup lang="ts">
import { ref } from 'vue'
import {VitalSignCamera,Gender} from 'vue-vital-sign-camera'

const camera = ref(null)
const heartRate = ref(undefined)

const startScanning = () => {
  camera.value.startScanning();
}
const onVideoFrameProcessed = (event) => {
  const healthResult = event.healthResult;
  if (healthResult?.health) {
    heartRate.value = healthResult.health.vitalSigns.heartRate
  }
}
</script>

<template>
  <button @click="startScanning()">start</button>
  <p>Heart rate: {{heartRate}}</p>
  <VitalSignCamera 
      ref="camera"
      :isActive="true"
      :userInfo="{gender:Gender.Male, age:30, userId:'__YOUR_USER_ID__'}"
      :config="{apiKey:'__YOUR_API_KEY__'}"
      @onVideoFrameProcessed="onVideoFrameProcessed"
  />
</template>
tsx
import React, { useState, useCallback } from 'react';
import { 
  VitalSignCamera, 
  VitalSignCameraInstance, 
  Gender,
  VideoFrameProcessedEvent 
} from 'react-vital-sign-camera';

function App() {
  const [camera, setCamera] = useState<VitalSignCameraInstance|undefined>(undefined)
  const [heartRate, setHeartRate] = useState<number|undefined>(undefined);
  const startScanning = () => {
    camera?.startScanning();
  }
  const onVideoFrameProcessed = useCallback((event: VideoFrameProcessedEvent) => {
    const healthResult = event.healthResult;
    if(healthResult?.health) {
      setHeartRate(healthResult.health.vitalSigns.heartRate);
    }
  }, [])
  return (
      <div>
      <button onClick={startScanning}>start</button>
      <p>Heart rate: {heartRate}</p>
      <VitalSignCamera
        onCreated={setCamera}
        isActive={true}
        onVideoFrameProcessed={onVideoFrameProcessed}
        userInfo={{ gender: Gender.Male, age: 30, userId: '__YOUR_USER_ID__' }}
        config={{ apiKey: '__YOUR_API_KEY__' }} />
    </div>
  )
}

export default App;
tsx
import React, { useRef, useCallback, useState } from 'react';
import { View, TouchableOpacity, Text } from 'react-native';
import { 
  VitalSignCamera, 
  Gender, 
  VideoFrameProcessedEvent 
} from 'react-native-vital-sign-plugin';

const App = () => {
  const camera = useRef<VitalSignCamera>(null);
  const [heartRate, setHeartRate] = useState<number|undefined>(undefined);
  const start = async () => {
    if (!camera.current) {
      return;
    }
    await camera.current.startScanning();
  }
  const onVideoFrameProcessed = useCallback((event:VideoFrameProcessedEvent) => {
    const healthResult = event.healthResult;
    if (healthResult?.health) {
      setHeartRate(healthResult.health.vitalSigns.heartRate);
    }
  }, [])

  return (
    <View>
      <TouchableOpacity onPress={start} />
      <Text>Heart rate: {heartRate}</Text>
      <VitalSignCamera
        ref={camera}
        isActive={true}
        userInfo={{gender:Gender.Male, age:30, userId: '__YOUR_USER_ID__'}}
        config={{apiKey:'__YOUR_API_KEY__'}}
      />
    </View>
  ) 
}

export default App;
dart
import 'package:flutter/material.dart';
import 'package:vital_sign_camera/vital_sign_camera.dart';

void main() {
  runApp(const MyApp());
}

class MyApp extends StatefulWidget {
  const MyApp({super.key});

  @override
  State<MyApp> createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  late final VitalSignCameraController _vitalSignCameraController;
  late Future<CameraDevice?> cameraDevice;

  @override
  void initState() {
    super.initState();
    cameraDevice = getFrontCamera();
  }

  double? _heartRate;

  Future<CameraDevice?> getFrontCamera() async {
    if (CameraPermissionStatus.authorized != await requestCameraPermission()) {
      return null;
    }
    return queryCameraDevice(CameraPosition.front);
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        body: Stack(children: [
          VitalSignCamera(
              onCreated: _onVitalSignCameraCreated,
              isActive: true,
              userInfo: UserInfo(
                  age: 30, gender: Gender.male, userId: '__YOUR_USER_ID__'),
              config: VitalSignCameraConfig(apiKey: '__YOUR_API_KEY__'),
              device: cameraDevice,
              onVideoFrameProcessed: _onVideoFrameProcessed),
          Center(
            child: Column(
              mainAxisAlignment: MainAxisAlignment.center,
              children: [
                ElevatedButton(
                    onPressed: () {
                      setState(() {
                        _vitalSignCameraController.startScanning();
                      });
                    },
                    child: const Text('start')),
                Text('Heart rate: $_heartRate'),
              ],
            ),
          ),
        ]),
      ),
    );
  }

  void _onVideoFrameProcessed(VideoFrameProcessedEvent event) {
    setState(() {
      _heartRate = event.healthResult?.health?.vitalSigns.heartRate;
    });
  }

  void _onVitalSignCameraCreated(VitalSignCameraController controller) {
    _vitalSignCameraController = controller;
  }
}

🎉 Congratulations! 🎉 You have completed your first vital sign application.

TIP

Besides the vital signs, the onVideoFrameProcessed callback returns much useful information about the status of the component, such as the scanning stage, the remaining time, the bounding box of the detected face, etc. The application can use this information to visualize the scanning process in the user interface. For more information, please refer to the sample code or the API reference.

Conditions Check

To ensure accurate measurements, the component reports the conditions that affect accuracy, such as lighting, head position, etc. These conditions are returned in the scanConditions variable of the onVideoFrameProcessed event. The application can use them to determine whether scanning should be allowed. For example, in the code snippets below, the start button is enabled only if all the conditions are true.

vue
<script setup lang="ts">
...
const enabled = ref(false)
const onVideoFrameProcessed = (event) => {
  const healthResult = event.healthResult;
  const scanConditions = event.scanConditions;
  if (healthResult?.health) {
    heartRate.value = healthResult.health.vitalSigns.heartRate;
  }
  enabled.value = scanConditions ?
    Object.values(scanConditions).every((item) => item) :
    false;
}
</script>
<template>
  <button @click="startScanning()" :disabled="!enabled">start</button>
  ...
</template>
tsx
  ...
  const [enabled, setEnabled] = useState<boolean>(false);
  const onVideoFrameProcessed = useCallback((event: VideoFrameProcessedEvent) => {
    const healthResult = event.healthResult;
    const scanConditions = event.scanConditions;
    if(healthResult?.health) {
      setHeartRate(healthResult.health.vitalSigns.heartRate);
    }
    setEnabled(scanConditions ? 
      Object.values(scanConditions).every((item) => item) :
      false
    )
  }, [])
  return (
    <div>
      <button onClick={startScanning} disabled={!enabled}>start</button>
  ...
tsx
  ...
  const [enabled, setEnabled] = useState<boolean>(false);
  const onVideoFrameProcessed = useCallback((event: VideoFrameProcessedEvent) => {
    const healthResult = event.healthResult;
    const scanConditions = event.scanConditions;
    if(healthResult?.health) {
      setHeartRate(healthResult.health.vitalSigns.heartRate);
    }
    setEnabled(scanConditions ? 
      Object.values(scanConditions).every((item) => item) :
      false
    )
  }, [])
  return (
    <View>
      <TouchableOpacity onPress={start} disabled={!enabled}><Text>start</Text></TouchableOpacity>
  ...
dart
  ...
  bool? _enabled;
  double? _heartRate;
  HealthResult? _healthResult;

  void _onVideoFrameProcessed(VideoFrameProcessedEvent event) {
    setState(() {
      _heartRate = event.healthResult?.health?.vitalSigns.heartRate;
      _healthResult = event.healthResult;
      _enabled = checkIfAllScanConditionsMet(event.scanConditions);
    });
  }

  bool checkIfAllScanConditionsMet(ScanConditions scanConditions) {
    return scanConditions.centered == true &&
        scanConditions.distance == true &&
        scanConditions.frameRate == true &&
        scanConditions.lighting == true &&
        scanConditions.movement == true &&
        scanConditions.serverReady == true;
  }

  return Column(
      children: [
        ElevatedButton(
            onPressed: (_enabled ?? false) ? () {
              setState(() {
                _vitalSignCameraController.startScanning();
              });
            } : null, // a null onPressed disables the button
            child: const Text('start')),
        Text('Heart rate: $_heartRate'),
    ])
  ...

TIP

The scan conditions are fed back in real time for every video frame, so it is recommended to visualize them on screen in real time as well. If some conditions are not met, the user can then react immediately (e.g. by adjusting their head position) to fulfill them. Please refer to the sample code.
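
As a minimal sketch of such a visualization helper, the failing conditions can be collected into a list of names for display. The interface below mirrors the condition names used in the Flutter example above; check the API reference for the exact fields:

```typescript
// Hypothetical shape mirroring the condition names used in the Flutter
// example above; consult the API reference for the actual type.
interface ScanConditions {
  centered: boolean;
  distance: boolean;
  frameRate: boolean;
  lighting: boolean;
  movement: boolean;
  serverReady: boolean;
}

// Names of the conditions that are currently failing, for on-screen hints.
function unmetConditions(conditions: ScanConditions): string[] {
  return Object.entries(conditions)
    .filter(([, ok]) => !ok)
    .map(([name]) => name);
}
```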

Lifecycle

The lifecycle of the camera component is shown in the flowchart below. It consists of the following states:

  1. Inactive - If the value of the isActive attribute is false, the component is in the inactive state and all of its functionality is turned off.
  2. Idle - If the isActive attribute is true, the component enters the Idle state. It starts processing video frames, performing face detection and condition checking. The processing result is returned by the onVideoFrameProcessed callback function at a rate of about 30 Hz (depending on the camera frame rate).
  3. Waiting (Scanning) - When the startScanning() API is called, the component starts the scan. A scan has three stages. The first is Waiting, in which the component waits until it and the server are ready. The waiting time is usually very short and barely noticeable.
  4. Collecting Data (Scanning) - In this stage, the component collects data from the detected face. This stage usually lasts about 25 seconds.
  5. Analysing Data (Scanning) - In the final stage of the scan, the collected data is sent to the vital sign cloud service for analysis. When the analysis finishes, the vital signs are returned from the server and the scan is complete.

During a scan, if the stopScanning() API is called, the scan is aborted and the component goes back to the Idle state.
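
The state machine above can be sketched in TypeScript. The state names mirror the flowchart, but the identifiers themselves are illustrative only; the SDK reports the scanning stage through healthResult.stage, whose actual values are documented in the API reference:

```typescript
// State names mirror the lifecycle flowchart; identifiers are illustrative.
type CameraState =
  | 'Inactive'        // isActive is false
  | 'Idle'            // processing frames, no scan in progress
  | 'Waiting'         // scan started, waiting for component and server
  | 'CollectingData'  // collecting face data (about 25 seconds)
  | 'AnalysingData';  // collected data sent to the cloud for analysis

// startScanning() is only meaningful from the Idle state.
function canStartScanning(state: CameraState): boolean {
  return state === 'Idle';
}

// stopScanning() aborts a scan in progress and returns to Idle;
// an inactive component stays inactive.
function stateAfterStop(state: CameraState): CameraState {
  return state === 'Inactive' ? 'Inactive' : 'Idle';
}
```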

Component States

TIP

Every scanning stage change and the progress of each stage (e.g. the remaining time) are returned in the healthResult.stage variable by the onVideoFrameProcessed callback. The application should visualize the stage changes and progress in the user interface for a good user experience. Please refer to the sample code for a reference implementation.

Camera Devices

In some situations, you may need to use a different camera instead of the front-facing one. In this case, specify the deviceId of the desired camera in the device prop of the component. For example:

vue
<script setup lang="ts">
  ...
  const deviceId = ref()

  let devices = await navigator.mediaDevices.enumerateDevices();
  devices = devices.filter(device => device.kind === 'videoinput');
  deviceId.value = devices[0].deviceId

</script>
<template>
  ...
  <VitalSignCamera 
    ...
    :device="deviceId" />
</template>
tsx
import React, { useState, useCallback, useEffect } from 'react';
...

function App() {
  ...
  const [device, setDevice] = useState<string>('');
  useEffect(() => {
    const listDevice = async () => {
      let devices = await navigator.mediaDevices.enumerateDevices();
      devices = devices.filter(device => device.kind === 'videoinput');
      setDevice(devices[0].deviceId);
    }
    listDevice();
  }, [])

  return (
    <div>
      ...
      <VitalSignCamera 
        ...
        device={device}
      />
    </div>
  )
}

export default App;
tsx
import { 
  VitalSignCamera,
  Gender, 
  VideoFrameProcessedEvent, 
  useCameraDevices 
} from 'react-native-vital-sign-plugin';
...

function App() {
  ...
  const devices = useCameraDevices('wide-angle-camera');
  const device = devices.front;

  return (
    <View>
      ...
      <VitalSignCamera 
        ...
        device={device}
      />
    </View>
  )
}

export default App;
dart
import 'package:vital_sign_camera/vital_sign_camera.dart';
...

class _MyAppState extends State<MyApp> {
  ...
  late Future<CameraDevice?> cameraDevice;

  @override
  void initState() {
    super.initState();
    cameraDevice = getBackCamera();
  }

  Future<CameraDevice?> getBackCamera() async {
    if (CameraPermissionStatus.authorized != await requestCameraPermission()) {
      return null;
    }
    return queryCameraDevice(CameraPosition.back);
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        body: ...
          VitalSignCamera(
            ...
            device: cameraDevice,
          ),
      ),
    );
  }
  ...
}

Camera Permission

When developing for iOS or Android, you must allow the app to use the camera by adding the following to Info.plist for iOS, and to AndroidManifest.xml for Android:

xml
<key>NSCameraUsageDescription</key>
<string>Use for measuring vital signs</string>
xml
<uses-permission android:name="android.permission.CAMERA" />

TIP

In Info.plist, the string "Use for measuring vital signs" is an example description of the camera usage. You should specify your own description that matches the usage of your application.