Flutter Hand Landmarker

A Flutter plugin for real-time hand landmark detection on Android. This package uses Google's MediaPipe Hand Landmarker task, bridged to Flutter using JNI, to deliver high-performance hand tracking.

This plugin provides a simple Dart API to hide the complexity of native code, image format conversion, and multi-threading, allowing you to focus on building your app's features.

Features

  • Live Hand Tracking: Performs real-time detection of hand landmarks from a CameraImage stream.
  • High Performance: Leverages the native Android MediaPipe library for performant ML inference.
  • Smooth UI: Offloads heavy image processing (YUV to RGBA conversion) to a background isolate to prevent UI jank.
  • Simple, Type-Safe API: Provides clean Dart data models (Hand, Landmark) for the detection results.
  • Resource Management: Includes a dispose() method to properly clean up all native and isolate resources.
  • Bundled Model: The required hand_landmarker.task model is bundled with the plugin, so no manual setup is required for users.

How it Works

The plugin follows a robust, multi-threaded architecture to ensure both performance and a smooth UI.

  1. Camera Stream (Flutter): Your application provides a stream of CameraImage frames from the camera plugin.
  2. Background Isolate (Dart): The raw YUV CameraImage is sent to a background isolate managed by the plugin. This isolate performs the computationally expensive conversion to RGBA format without blocking the main UI thread.
  3. JNI Bridge (Dart -> Kotlin): The resulting RGBA ByteBuffer is passed to the native Android side via a JNI bridge generated by jnigen.
  4. Hand Detection (Kotlin): The native code uses the MediaPipe HandLandmarker task to detect hand landmarks in the RGBA image.
  5. Return to Flutter: The detection results are serialized to JSON and returned to Dart, where they are parsed into the plugin's data models (a List<Hand>).
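
The conversion in step 2 can be sketched in pure Dart. This is only an illustration of what the background isolate does, not the plugin's actual implementation: it assumes tightly packed YUV420 planes with 2x2 chroma subsampling, whereas the real code must also honor per-device row and pixel strides.

```dart
import 'dart:typed_data';

/// Illustrative YUV420 -> RGBA conversion (BT.601 coefficients).
/// Assumes tightly packed planes; real camera buffers may have padding.
Uint8List yuv420ToRgba(
  Uint8List yPlane,
  Uint8List uPlane,
  Uint8List vPlane,
  int width,
  int height,
) {
  final rgba = Uint8List(width * height * 4);
  for (var row = 0; row < height; row++) {
    for (var col = 0; col < width; col++) {
      final yIndex = row * width + col;
      // U and V are subsampled 2x2, so one chroma sample covers 4 pixels.
      final uvIndex = (row ~/ 2) * (width ~/ 2) + (col ~/ 2);
      final y = yPlane[yIndex].toDouble();
      final u = uPlane[uvIndex] - 128.0;
      final v = vPlane[uvIndex] - 128.0;
      final o = yIndex * 4;
      rgba[o] = (y + 1.402 * v).round().clamp(0, 255); // R
      rgba[o + 1] = (y - 0.344 * u - 0.714 * v).round().clamp(0, 255); // G
      rgba[o + 2] = (y + 1.772 * u).round().clamp(0, 255); // B
      rgba[o + 3] = 255; // A (fully opaque)
    }
  }
  return rgba;
}
```

Running this per frame on the UI thread would cause exactly the jank described above, which is why the plugin moves it into an isolate.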

Getting Started

Prerequisites

  • Flutter SDK
  • An Android device or emulator (Minimum SDK version 24).

Installation

Add the following dependency to your app's pubspec.yaml file (the usage example below also relies on the camera plugin):

dependencies:  
  hand_landmarker: ^1.0.0

Then, run flutter pub get.

Usage

Here is a basic example of how to use the plugin within a Flutter widget.

1. Initialize the Plugin and Camera

First, create an instance of the HandLandmarkerPlugin and your CameraController. It's best to do this in initState.

import 'package:flutter/material.dart';  
import 'package:camera/camera.dart';  
import 'package:hand_landmarker/hand_landmarker.dart';

class HandTrackerView extends StatefulWidget {  
  const HandTrackerView({super.key});  
  @override  
  State<HandTrackerView> createState() => _HandTrackerViewState();  
}

class _HandTrackerViewState extends State<HandTrackerView> {  
  HandLandmarkerPlugin? _plugin;  
  CameraController? _controller;  
  List<Hand> _landmarks = [];  
  bool _isInitialized = false;

  @override  
  void initState() {  
    super.initState();  
    _initialize();  
  }

  Future<void> _initialize() async {  
    // Get available cameras  
    final cameras = await availableCameras();  
    // Select the front camera  
    final camera = cameras.firstWhere(  
      (cam) => cam.lensDirection == CameraLensDirection.front,  
      orElse: () => cameras.first,  
    );

    _controller = CameraController(  
      camera,  
      ResolutionPreset.medium,  
      enableAudio: false,  
    );

    // Create an instance of our plugin  
    _plugin = await HandLandmarkerPlugin.create();

    // Initialize the camera and start the image stream  
    await _controller!.initialize();  
    await _controller!.startImageStream(_processCameraImage);

    if (mounted) {  
      setState(() => _isInitialized = true);  
    }  
  }

  @override  
  void dispose() {  
    _controller?.dispose();  
    _plugin?.dispose();  
    super.dispose();  
  }

2. Process the Camera Stream

Create a method to pass the CameraImage to the plugin's detect method.

  Future<void> _processCameraImage(CameraImage image) async {  
    if (!_isInitialized || _plugin == null) return;

    try {  
      // The core of the plugin: simply call detect()  
      final hands = await _plugin!.detect(  
        image,  
        _controller!.description.sensorOrientation,  
      );  
      if (mounted) {  
        setState(() => _landmarks = hands);  
      }  
    } catch (e) {  
      debugPrint('Error detecting landmarks: $e');  
    }  
  }
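
Camera frames can arrive faster than inference completes. One common refinement, not required by the plugin but useful in practice, is a busy flag that drops frames while a detection is still in flight. The sketch below is a hypothetical variant of the method above:

```dart
  bool _isDetecting = false;

  Future<void> _processCameraImage(CameraImage image) async {
    // Drop frames while a detection is in flight, so slow inference
    // never builds up a backlog of stale frames.
    if (_isDetecting || !_isInitialized || _plugin == null) return;
    _isDetecting = true;
    try {
      final hands = await _plugin!.detect(
        image,
        _controller!.description.sensorOrientation,
      );
      if (mounted) {
        setState(() => _landmarks = hands);
      }
    } catch (e) {
      debugPrint('Error detecting landmarks: $e');
    } finally {
      _isDetecting = false;
    }
  }
```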

3. Render the Results

You can now use the _landmarks list in a CustomPainter to draw the results over your CameraPreview.

  @override  
  Widget build(BuildContext context) {  
    if (!_isInitialized) {  
      return const Center(child: CircularProgressIndicator());  
    }

    return Stack(
      children: [
        CameraPreview(_controller!),
        // Positioned.fill gives the CustomPaint a concrete size; without it,
        // a child-less CustomPaint in a Stack collapses to zero size.
        Positioned.fill(
          child: CustomPaint(
            painter: LandmarkPainter(hands: _landmarks),
          ),
        ),
      ],
    );
  }
}
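
The LandmarkPainter referenced above is not part of the plugin; you supply your own. A minimal sketch might look like the following. It assumes the normalized coordinates map directly onto the painter's size, and ignores aspect-ratio letterboxing and front-camera mirroring, which you may need to handle in a real app:

```dart
import 'package:flutter/material.dart';
import 'package:hand_landmarker/hand_landmarker.dart';

/// A minimal painter that draws a dot for every landmark of every hand.
class LandmarkPainter extends CustomPainter {
  LandmarkPainter({required this.hands});

  final List<Hand> hands;
  final Paint _dot = Paint()..color = Colors.greenAccent;

  @override
  void paint(Canvas canvas, Size size) {
    for (final hand in hands) {
      for (final lm in hand.landmarks) {
        // Landmarks are normalized (0.0-1.0), so scale to the widget's size.
        canvas.drawCircle(
          Offset(lm.x * size.width, lm.y * size.height),
          4.0,
          _dot,
        );
      }
    }
  }

  @override
  bool shouldRepaint(LandmarkPainter oldDelegate) =>
      oldDelegate.hands != hands;
}
```

Make sure the CustomPaint is given the full preview area (for example via Positioned.fill) so the scaled coordinates line up with the CameraPreview underneath.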

Data Models

The plugin returns a List<Hand>. Each Hand object contains a list of 21 Landmark objects.

Hand

A detected hand.

class Hand {  
  /// A list of 21 landmarks for the detected hand.  
  final List<Landmark> landmarks;  
}

Landmark

A single landmark point with normalized 3D coordinates. The x and y values are normalized to the range 0.0 to 1.0 by the image width and height; z represents the landmark's depth relative to the wrist, with smaller values closer to the camera.

class Landmark {  
  final double x;  
  final double y;  
  final double z;  
}
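
Because the coordinates are normalized, converting a landmark to pixel space is a single multiplication. As an illustration (this helper is not part of the plugin), MediaPipe's 21-point hand topology places the index fingertip at index 8:

```dart
import 'dart:ui';

import 'package:hand_landmarker/hand_landmarker.dart';

/// Returns the index fingertip position in pixel coordinates.
/// In MediaPipe's 21-point hand topology, landmark 8 is the index fingertip.
Offset indexFingertipInPixels(Hand hand, Size imageSize) {
  final tip = hand.landmarks[8];
  return Offset(tip.x * imageSize.width, tip.y * imageSize.height);
}
```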

License

This project is licensed under the MIT License - see the LICENSE file for details.