
@gymbrosinc/react-native-mediapipe-pose
React Native MediaPipe Pose Detection with GPU acceleration, jump detection, and high-performance analysis
⚠️ IMPORTANT NOTICE - TEST PACKAGE
This is a testing/development package that may be deprecated or removed at any time without notice.
🚨 DO NOT USE IN PRODUCTION without explicit permission from the package owner.
📧 Contact: khalidahammeduzzal@gmail.com for licensing and production use.
🔄 This package is subject to breaking changes and may transition to private access in the future.
React Native module for real-time pose detection using Google's MediaPipe BlazePose
Production-ready pose detection with GPU acceleration and automatic hardware optimization
Installation • Quick Start • API Reference • Performance • Examples
⚠️ USAGE WARNING: This package is for testing purposes only. Contact the owner before using in any project.
npm install @gymbrosinc/react-native-mediapipe-pose
# or
yarn add @gymbrosinc/react-native-mediapipe-pose
⚠️ CRITICAL: Manual Configuration Required
Your app will crash without these steps. The package cannot automatically configure these settings for security reasons.
Add the camera permission to your app's ios/YourAppName/Info.plist:
<key>NSCameraUsageDescription</key>
<string>This app needs camera access for pose detection</string>
Alternative descriptions you can use:
<!-- For fitness apps -->
<string>Camera access is required for real-time pose tracking and exercise analysis</string>
<!-- For AR/entertainment apps -->
<string>Camera access enables pose detection for interactive experiences</string>
<!-- Generic -->
<string>This app uses the camera to detect and analyze human poses in real-time</string>
Ensure your app targets iOS 13.0+ (a MediaPipe requirement):

In ios/YourApp.xcodeproj, set the iOS Deployment Target to 13.0 or higher.

In package.json:
{
"engines": {
"ios": ">=13.0"
}
}
iOS dependencies are automatically installed when you build your app. However, if you encounter issues, you can manually run:
cd ios
pod install
cd ..
Note: Modern React Native (0.72+) and Expo automatically handle pod installation during the build process.
For better camera performance, add to ios/YourAppName/Info.plist:
<key>CADisableMinimumFrameDurationOnPhone</key>
<true/>
If you want to support different orientations, configure them in ios/YourAppName/Info.plist:
<!-- For iPhone - Portrait only (recommended for pose detection) -->
<key>UISupportedInterfaceOrientations</key>
<array>
<string>UIInterfaceOrientationPortrait</string>
</array>
<!-- For iPad - All orientations -->
<key>UISupportedInterfaceOrientations~ipad</key>
<array>
<string>UIInterfaceOrientationPortrait</string>
<string>UIInterfaceOrientationPortraitUpsideDown</string>
<string>UIInterfaceOrientationLandscapeLeft</string>
<string>UIInterfaceOrientationLandscapeRight</string>
</array>
The MediaPipe pose detection model (pose_landmarker_full.task) is automatically included with the package. No manual download is required.
Your ios/YourAppName/Info.plist should include at minimum:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<!-- Your existing keys... -->
<!-- REQUIRED: Camera permission -->
<key>NSCameraUsageDescription</key>
<string>This app needs camera access for pose detection</string>
<!-- REQUIRED: Minimum iOS version -->
<key>LSMinimumSystemVersion</key>
<string>13.0</string>
<!-- RECOMMENDED: Performance optimization -->
<key>CADisableMinimumFrameDurationOnPhone</key>
<true/>
<!-- RECOMMENDED: Portrait orientation for best pose detection -->
<key>UISupportedInterfaceOrientations</key>
<array>
<string>UIInterfaceOrientationPortrait</string>
</array>
<!-- Your other existing keys... -->
</dict>
</plist>
After configuration, verify your setup. Common issues:

App crashes immediately: confirm NSCameraUsageDescription is added to Info.plist, rebuild with npx expo run:ios --clear, and reinstall pods with cd ios && pod install.

Camera permission denied: review your NSCameraUsageDescription text and have the user grant camera access in Settings.

Pose detection not working: make sure enablePoseDetection={true} is set on the view.

Performance issues: add CADisableMinimumFrameDurationOnPhone to Info.plist, set enablePoseDataStreaming={false} for production, and use autoAdjustFPS={true} for device optimization.

Note: Android support is currently under development and not yet available.
The Android implementation is in active development; Android support is expected in upcoming releases.
import React, { useState } from 'react';
import { View, Text, TouchableOpacity, StyleSheet, Alert } from 'react-native';
import ReactNativeMediapipePose, {
ReactNativeMediapipePoseView,
PoseDetectionResult,
CameraType,
DeviceCapability,
FrameProcessingInfo,
} from '@gymbrosinc/react-native-mediapipe-pose';
export default function App() {
const [cameraType, setCameraType] = useState<CameraType>('front');
const [isPoseDetectionEnabled, setIsPoseDetectionEnabled] = useState(false);
const [fps, setFps] = useState<number>(0);
// Request camera permissions
const requestPermissions = async () => {
try {
const granted = await ReactNativeMediapipePose.requestCameraPermissions();
if (!granted) {
Alert.alert(
'Camera Permission Required',
'This app requires camera access to detect poses.'
);
}
} catch (error) {
Alert.alert('Error', 'Failed to request camera permissions');
}
};
const handlePoseDetected = (event: { nativeEvent: PoseDetectionResult }) => {
const { landmarks, processingTime, confidence } = event.nativeEvent;
console.log(
`Detected ${landmarks.length} landmarks in ${processingTime}ms`
);
};
const handleFrameProcessed = (event: {
nativeEvent: FrameProcessingInfo;
}) => {
setFps(Math.round(event.nativeEvent.fps));
};
const handleDeviceCapability = (event: { nativeEvent: DeviceCapability }) => {
const { deviceTier, recommendedFPS } = event.nativeEvent;
console.log(`Device: ${deviceTier}, Recommended FPS: ${recommendedFPS}`);
};
const handleGPUStatus = (event: { nativeEvent: any }) => {
const { isUsingGPU, processingUnit } = event.nativeEvent;
console.log(
`GPU Acceleration: ${isUsingGPU ? 'Enabled' : 'Disabled'} (${processingUnit})`
);
};
const switchCamera = () => {
// Enhanced camera switching with full refresh to prevent coordinate system issues
const wasEnabled = isPoseDetectionEnabled;
setIsPoseDetectionEnabled(false); // Temporarily disable
// Switch camera type
setCameraType((current) => (current === 'back' ? 'front' : 'back'));
// Re-enable pose detection after brief delay for proper initialization
setTimeout(() => {
if (wasEnabled) {
setIsPoseDetectionEnabled(true);
}
}, 100);
};
React.useEffect(() => {
requestPermissions();
}, []);
return (
<View style={styles.container}>
<ReactNativeMediapipePoseView
key={`camera-${cameraType}`} // Force re-render on camera switch
style={styles.camera}
cameraType={cameraType}
enablePoseDetection={isPoseDetectionEnabled}
enablePoseDataStreaming={true} // Enable to receive pose data
targetFPS={30}
autoAdjustFPS={true}
onCameraReady={(event) =>
console.log('Camera ready:', event.nativeEvent.ready)
}
onError={(event) =>
Alert.alert('Camera Error', event.nativeEvent.error)
}
onPoseDetected={handlePoseDetected}
onFrameProcessed={handleFrameProcessed}
onDeviceCapability={handleDeviceCapability}
onGPUStatus={handleGPUStatus}
/>
{/* Simple UI overlay showing FPS */}
<View style={styles.overlay}>
<Text style={styles.fpsText}>FPS: {fps}</Text>
<TouchableOpacity style={styles.button} onPress={switchCamera}>
<Text style={styles.buttonText}>Switch Camera</Text>
</TouchableOpacity>
<TouchableOpacity
style={[styles.button, isPoseDetectionEnabled && styles.buttonActive]}
onPress={() => setIsPoseDetectionEnabled(!isPoseDetectionEnabled)}
>
<Text style={styles.buttonText}>
{isPoseDetectionEnabled ? 'Stop Pose' : 'Start Pose'}
</Text>
</TouchableOpacity>
</View>
</View>
);
}
const styles = StyleSheet.create({
container: {
flex: 1,
},
camera: {
flex: 1,
},
overlay: {
position: 'absolute',
top: 50,
left: 20,
right: 20,
flexDirection: 'row',
justifyContent: 'space-between',
alignItems: 'center',
},
fpsText: {
color: 'white',
fontSize: 16,
fontWeight: 'bold',
},
button: {
backgroundColor: 'rgba(255, 255, 255, 0.2)',
padding: 10,
borderRadius: 8,
},
buttonActive: {
backgroundColor: 'rgba(76, 175, 80, 0.8)',
},
buttonText: {
color: 'white',
fontSize: 14,
fontWeight: '600',
},
});
import React, { useState, useEffect } from 'react';
import { View, Text, TouchableOpacity, StyleSheet } from 'react-native';
import ReactNativeMediapipePose, {
ReactNativeMediapipePoseView,
DeviceCapability,
FrameProcessingInfo,
} from '@gymbrosinc/react-native-mediapipe-pose';
export default function OptimizedPoseDetection() {
const [deviceCapability, setDeviceCapability] =
useState<DeviceCapability | null>(null);
const [fps, setFps] = useState<number>(0);
const [targetFPS, setTargetFPS] = useState<number>(30);
const [autoAdjustFPS, setAutoAdjustFPS] = useState<boolean>(true);
const [enablePoseDataStreaming, setEnablePoseDataStreaming] =
useState<boolean>(false);
const [showPerformanceControls, setShowPerformanceControls] =
useState<boolean>(false);
const handleDeviceCapability = (event: { nativeEvent: DeviceCapability }) => {
const capability = event.nativeEvent;
setDeviceCapability(capability);
setTargetFPS(capability.recommendedFPS); // Use device-recommended FPS
};
const handleFrameProcessed = (event: {
nativeEvent: FrameProcessingInfo;
}) => {
const { fps: currentFps, autoAdjusted, newTargetFPS } = event.nativeEvent;
setFps(Math.round(currentFps));
if (autoAdjusted && newTargetFPS) {
setTargetFPS(newTargetFPS);
console.log(`Auto-adjusted FPS to ${newTargetFPS}`);
}
};
const handleGPUStatus = (event: { nativeEvent: any }) => {
const { isUsingGPU, delegate, deviceTier } = event.nativeEvent;
console.log(
`GPU: ${isUsingGPU}, Delegate: ${delegate}, Device: ${deviceTier}`
);
};
return (
<View style={styles.container}>
<ReactNativeMediapipePoseView
key="camera-optimized" // Stable key; Date.now() here would remount the camera on every render
style={styles.camera}
enablePoseDetection={true}
// Performance optimizations (recommended for production)
enablePoseDataStreaming={enablePoseDataStreaming} // Toggle data streaming
enableDetailedLogs={false} // Disable for production
poseDataThrottleMs={100} // Throttle data updates
fpsChangeThreshold={2.0} // Only report significant FPS changes
fpsReportThrottleMs={500} // Throttle FPS reports
// Auto performance adjustment
autoAdjustFPS={autoAdjustFPS} // Enable automatic FPS optimization
targetFPS={targetFPS} // Dynamic target based on device
onFrameProcessed={handleFrameProcessed}
onDeviceCapability={handleDeviceCapability}
onGPUStatus={handleGPUStatus}
onPoseDetected={(event) => {
if (enablePoseDataStreaming) {
const { landmarks, processingTime } = event.nativeEvent;
console.log(
`Pose: ${landmarks.length} landmarks, ${processingTime}ms`
);
}
}}
/>
{/* Performance Controls Overlay */}
<View style={styles.performanceOverlay}>
<Text style={styles.performanceText}>
FPS: {fps} | Target: {targetFPS} | Device:{' '}
{deviceCapability?.deviceTier?.toUpperCase()}
</Text>
<TouchableOpacity
style={styles.toggleButton}
onPress={() => setShowPerformanceControls(!showPerformanceControls)}
>
<Text style={styles.toggleButtonText}>Performance Controls</Text>
</TouchableOpacity>
{showPerformanceControls && (
<View style={styles.controlsPanel}>
<TouchableOpacity
style={[
styles.controlButton,
enablePoseDataStreaming && styles.controlButtonActive,
]}
onPress={() =>
setEnablePoseDataStreaming(!enablePoseDataStreaming)
}
>
<Text style={styles.controlButtonText}>
Data Streaming: {enablePoseDataStreaming ? 'ON' : 'OFF'}
</Text>
</TouchableOpacity>
<TouchableOpacity
style={[
styles.controlButton,
autoAdjustFPS && styles.controlButtonActive,
]}
onPress={() => setAutoAdjustFPS(!autoAdjustFPS)}
>
<Text style={styles.controlButtonText}>
Auto FPS: {autoAdjustFPS ? 'ON' : 'OFF'}
</Text>
</TouchableOpacity>
<View style={styles.fpsButtons}>
{[15, 30, 60].map((fpsValue) => (
<TouchableOpacity
key={fpsValue}
style={[
styles.fpsButton,
targetFPS === fpsValue && styles.fpsButtonActive,
]}
onPress={() => setTargetFPS(fpsValue)}
>
<Text style={styles.fpsButtonText}>{fpsValue}</Text>
</TouchableOpacity>
))}
</View>
</View>
)}
</View>
</View>
);
}
const styles = StyleSheet.create({
container: {
flex: 1,
},
camera: {
flex: 1,
},
performanceOverlay: {
position: 'absolute',
top: 50,
left: 20,
right: 20,
backgroundColor: 'rgba(0, 0, 0, 0.7)',
borderRadius: 10,
padding: 15,
},
performanceText: {
color: 'white',
fontSize: 14,
fontWeight: '600',
marginBottom: 10,
},
toggleButton: {
backgroundColor: '#007AFF',
padding: 8,
borderRadius: 6,
alignItems: 'center',
},
toggleButtonText: {
color: 'white',
fontSize: 12,
fontWeight: '600',
},
controlsPanel: {
marginTop: 10,
},
controlButton: {
backgroundColor: 'rgba(255, 255, 255, 0.2)',
padding: 8,
borderRadius: 6,
marginBottom: 8,
alignItems: 'center',
},
controlButtonActive: {
backgroundColor: '#4CAF50',
},
controlButtonText: {
color: 'white',
fontSize: 12,
fontWeight: '500',
},
fpsButtons: {
flexDirection: 'row',
justifyContent: 'space-around',
marginTop: 8,
},
fpsButton: {
backgroundColor: 'rgba(255, 255, 255, 0.2)',
padding: 8,
borderRadius: 6,
minWidth: 40,
alignItems: 'center',
},
fpsButtonActive: {
backgroundColor: '#007AFF',
},
fpsButtonText: {
color: 'white',
fontSize: 12,
fontWeight: '600',
},
});
import React, { useState } from 'react';
import { View, Text, Alert, TouchableOpacity, StyleSheet } from 'react-native';
import { ReactNativeMediapipePoseView } from '@gymbrosinc/react-native-mediapipe-pose';
export default function JumpDetectionApp() {
const [isJumpEnabled, setIsJumpEnabled] = useState(false);
const [jumpPhase, setJumpPhase] = useState<string>('idle');
const [countdown, setCountdown] = useState<number | null>(null);
const [jumpStats, setJumpStats] = useState({
totalJumps: 0,
bestHeight: 0,
averageHeight: 0,
});
const handleJumpResult = ({ nativeEvent: result }) => {
setJumpPhase('complete');
// Update statistics
setJumpStats((prev) => ({
totalJumps: prev.totalJumps + 1,
bestHeight: Math.max(prev.bestHeight, result.jumpHeight),
averageHeight:
(prev.averageHeight * prev.totalJumps + result.jumpHeight) /
(prev.totalJumps + 1),
}));
// Show completion modal
Alert.alert(
'🎉 Jump Complete!',
`Height: ${result.jumpHeight.toFixed(1)}cm\n` +
`Flight Time: ${result.flightTime.toFixed(3)}s\n` +
`Accuracy: ${(result.accuracy * 100).toFixed(1)}%`,
[{ text: 'OK' }]
);
};
const resetJumpDetection = () => {
setIsJumpEnabled(false);
setJumpPhase('idle');
setCountdown(null);
};
return (
<View style={styles.container}>
<ReactNativeMediapipePoseView
style={styles.camera}
enablePoseDetection={true}
enableJumpDetection={isJumpEnabled}
onBaselineSet={() => setJumpPhase('baseline')}
onCountdownStart={({ nativeEvent: { countdown } }) => {
setJumpPhase('ready');
setCountdown(countdown);
}}
onJumpStart={() => {
setJumpPhase('takeoff');
setCountdown(null);
}}
onPeakHeight={() => setJumpPhase('peak')}
onJumpEnd={() => setJumpPhase('landing')}
onJumpResult={handleJumpResult}
/>
<View style={styles.overlay}>
<Text style={styles.phaseText}>Phase: {jumpPhase}</Text>
{countdown !== null && countdown >= 0 && (
<Text style={styles.countdownText}>
{countdown === 0 ? 'JUMP!' : countdown}
</Text>
)}
<View style={styles.statsPanel}>
<Text style={styles.statsText}>
Total Jumps: {jumpStats.totalJumps}
</Text>
<Text style={styles.statsText}>
Best: {jumpStats.bestHeight.toFixed(1)}cm
</Text>
<Text style={styles.statsText}>
Average: {jumpStats.averageHeight.toFixed(1)}cm
</Text>
</View>
<View style={styles.controls}>
<TouchableOpacity
style={[styles.button, isJumpEnabled && styles.buttonActive]}
onPress={() => setIsJumpEnabled(!isJumpEnabled)}
>
<Text style={styles.buttonText}>
{isJumpEnabled ? 'Stop Jump Detection' : 'Start Jump Detection'}
</Text>
</TouchableOpacity>
{jumpPhase === 'complete' && (
<TouchableOpacity
style={styles.button}
onPress={resetJumpDetection}
>
<Text style={styles.buttonText}>Reset</Text>
</TouchableOpacity>
)}
</View>
</View>
</View>
);
}
const styles = StyleSheet.create({
container: {
flex: 1,
},
camera: {
flex: 1,
},
overlay: {
position: 'absolute',
top: 60,
left: 20,
right: 20,
},
phaseText: {
color: 'white',
fontSize: 18,
fontWeight: 'bold',
textAlign: 'center',
marginBottom: 10,
},
countdownText: {
color: 'yellow',
fontSize: 48,
fontWeight: 'bold',
textAlign: 'center',
marginBottom: 20,
},
statsPanel: {
backgroundColor: 'rgba(0, 0, 0, 0.7)',
padding: 15,
borderRadius: 8,
marginBottom: 20,
},
statsText: {
color: 'white',
fontSize: 14,
marginBottom: 5,
},
controls: {
flexDirection: 'column',
gap: 10,
},
button: {
backgroundColor: 'rgba(255, 255, 255, 0.2)',
padding: 12,
borderRadius: 8,
alignItems: 'center',
},
buttonActive: {
backgroundColor: 'rgba(76, 175, 80, 0.8)',
},
buttonText: {
color: 'white',
fontSize: 16,
fontWeight: '600',
},
});
Prop | Type | Default | Description |
---|---|---|---|
style | StyleProp<ViewStyle> | - | Camera view styling and layout properties |
cameraType | 'front' \| 'back' | 'front' | Camera position selection (front-facing or back) |
enablePoseDetection | boolean | false | Enable/disable real-time pose detection processing |
Prop | Type | Default | Description |
---|---|---|---|
enableJumpDetection | boolean | false | Enable real-time jump detection and analysis |
onBaselineSet | function | - | Triggered when baseline position is established |
onCountdownStart | function | - | Triggered during countdown sequence (3, 2, 1, 0) |
onJumpStart | function | - | Triggered when takeoff is detected |
onPeakHeight | function | - | Triggered when maximum jump height is reached |
onJumpEnd | function | - | Triggered when landing is detected |
onJumpResult | function | - | Final jump analysis with height, flight time, and metrics |
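The jump metrics above are linked by simple projectile physics: for a measured flight time t, jump height is h = g·t²/8. The sketch below shows this textbook flight-time method as an illustration only; it is not necessarily the exact formula the module applies internally (which could instead track hip landmark displacement).

```typescript
// Flight-time method for vertical jump height (illustrative only).
const GRAVITY = 9.81; // m/s^2

function jumpHeightFromFlightTime(flightTimeS: number): number {
  // h = g * t^2 / 8, converted from meters to centimeters
  return ((GRAVITY * flightTimeS * flightTimeS) / 8) * 100;
}
```

A 0.5 s flight time corresponds to roughly a 30.7 cm jump, which matches the centimeter scale of the reported jumpHeight.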
Prop | Type | Default | Description |
---|---|---|---|
enablePoseDataStreaming | boolean | false | CRITICAL: Enable pose data transmission to React Native |
poseDataThrottleMs | number | 100 | Throttle pose data events to reduce bridge overhead (ms) |
enableDetailedLogs | boolean | false | Enable comprehensive logging (disable for production) |
fpsChangeThreshold | number | 2.0 | Minimum FPS change required to trigger onFrameProcessed event |
fpsReportThrottleMs | number | 500 | Throttle frequency for FPS reporting to reduce bridge calls |
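All of the throttling props above gate bridge events by a minimum interval. A minimal sketch of such a gate (a hypothetical helper, not the package's implementation):

```typescript
// Returns a predicate that allows at most one emission per intervalMs,
// mirroring what poseDataThrottleMs / fpsReportThrottleMs imply.
function createThrottleGate(intervalMs: number): (nowMs: number) => boolean {
  let lastEmit = -Infinity;
  return (nowMs: number): boolean => {
    if (nowMs - lastEmit >= intervalMs) {
      lastEmit = nowMs; // record the emission and let this event through
      return true;
    }
    return false; // drop events arriving inside the interval
  };
}
```

Raising the interval trades data freshness for fewer bridge crossings, which is why the production recommendations below increase these values.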
Prop | Type | Default | Description |
---|---|---|---|
targetFPS | number | 30 | Target frames per second for pose detection processing |
autoAdjustFPS | boolean | true | Enable automatic FPS adjustment based on device capability |
Event | Type | Description |
---|---|---|
onCameraReady | (event: { nativeEvent: { ready: boolean } }) => void | Triggered when camera initializes and is ready to use |
onError | (event: { nativeEvent: { error: string } }) => void | Camera or pose detection errors |
onPoseDetected | (event: { nativeEvent: PoseDetectionResult }) => void | Requires enablePoseDataStreaming=true |
onFrameProcessed | (event: { nativeEvent: FrameProcessingInfo }) => void | Frame processing statistics and FPS monitoring |
Event | Type | Description |
---|---|---|
onDeviceCapability | (event: { nativeEvent: DeviceCapability }) => void | Device performance tier and recommended settings |
onGPUStatus | (event: { nativeEvent: GPUStatusEvent }) => void | GPU acceleration status and delegate information |
onPoseServiceLog | (event: { nativeEvent: PoseServiceLogEvent }) => void | Requires enableDetailedLogs=true |
onPoseServiceError | (event: { nativeEvent: PoseServiceErrorEvent }) => void | Pose service specific errors and recovery info |
Available only when enablePoseDataStreaming={true}
interface PoseDetectionResult {
landmarks: PoseLandmark[]; // Array of 33 pose landmarks with coordinates
processingTime: number; // Processing time in milliseconds
timestamp: number; // Detection timestamp (Unix time)
confidence: number; // Overall detection confidence (0.0 - 1.0)
// Extended data (when enableDetailedLogs=true):
deviceTier?: string; // 'high' | 'medium' | 'low'
gpuAccelerated?: boolean; // Current GPU acceleration status
processingUnit?: string; // 'GPU' | 'CPU' processing unit
delegate?: string; // MediaPipe delegate type ('GPU' | 'CPU')
}
33 Body Landmarks with 3D Coordinates
interface PoseLandmark {
x: number; // Normalized x coordinate (0.0 - 1.0)
y: number; // Normalized y coordinate (0.0 - 1.0)
z: number; // Normalized z coordinate (depth, relative to hips)
visibility: number; // Landmark visibility confidence (0.0 - 1.0)
}
// Landmark indices (0-32):
// 0: nose, 1-6: eyes, 7-8: ears, 9-10: mouth corners
// 11-12: shoulders, 13-14: elbows, 15-16: wrists
// 17-22: hand landmarks, 23-24: hips, 25-26: knees
// 27-28: ankles, 29-30: heels, 31-32: foot indices
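With normalized landmarks and the index map above, common fitness metrics reduce to small vector math. For example, a joint angle (say the elbow, from shoulder 11, elbow 13, wrist 15) can be computed as below; jointAngle is a hypothetical helper for illustration, not part of the package API.

```typescript
interface PoseLandmark { x: number; y: number; z: number; visibility: number; }

// Angle in degrees at joint b, formed by segments b->a and b->c (2D projection).
function jointAngle(a: PoseLandmark, b: PoseLandmark, c: PoseLandmark): number {
  const v1 = { x: a.x - b.x, y: a.y - b.y };
  const v2 = { x: c.x - b.x, y: c.y - b.y };
  const dot = v1.x * v2.x + v1.y * v2.y;
  const cos = dot / (Math.hypot(v1.x, v1.y) * Math.hypot(v2.x, v2.y));
  // Clamp to guard against floating-point drift outside [-1, 1]
  return (Math.acos(Math.min(1, Math.max(-1, cos))) * 180) / Math.PI;
}
```

Filtering on the visibility field before computing angles avoids noisy results from occluded landmarks.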
interface FrameProcessingInfo {
fps: number; // Current frames per second
autoAdjusted?: boolean; // Whether FPS was automatically adjusted
newTargetFPS?: number; // New target FPS after auto-adjustment
reason?: string; // Reason for FPS adjustment
processingTime?: number; // Frame processing time (ms)
droppedFrames?: number; // Count of dropped frames
}
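When autoAdjusted fires, the module has stepped the target down because the measured rate lagged the target. A rough sketch of one plausible decision rule, assuming the 15/30/60 FPS tiers mentioned below (the real adjustment heuristics are internal to the module, and the threshold here is invented):

```typescript
// Step targetFps down one tier when measured FPS lags by more than threshold.
function nextTargetFPS(measuredFps: number, targetFps: number, threshold = 2.0): number {
  const tiers = [15, 30, 60];
  if (targetFps - measuredFps > threshold) {
    const lower = tiers.filter((t) => t < targetFps);
    return lower.length > 0 ? lower[lower.length - 1] : targetFps; // 15 is the floor
  }
  return targetFps; // within tolerance, keep the current target
}
```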
Automatic Device Performance Detection
interface DeviceCapability {
deviceTier: 'high' | 'medium' | 'low'; // Performance classification
recommendedFPS: number; // Optimal FPS for this device
processorCount: number; // Number of CPU cores
physicalMemoryGB: number; // Available RAM in GB
// Device classification examples:
// high: iPhone 12 Pro+, iPad Pro M1+ → 60 FPS, GPU acceleration
// medium: iPhone XS-12, iPad Air → 30 FPS, GPU acceleration
// low: iPhone X-, older iPads → 15 FPS, CPU processing
}
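The tier examples in the comments suggest a classification driven by core count and RAM. As an illustration only — the thresholds below are invented, and the module's actual heuristics are internal:

```typescript
type DeviceTier = 'high' | 'medium' | 'low';

// Hypothetical classifier with made-up thresholds, mirroring the
// DeviceCapability fields (processorCount, physicalMemoryGB).
function classifyDevice(
  processorCount: number,
  physicalMemoryGB: number
): { deviceTier: DeviceTier; recommendedFPS: number } {
  if (processorCount >= 6 && physicalMemoryGB >= 6) {
    return { deviceTier: 'high', recommendedFPS: 60 };
  }
  if (processorCount >= 4 && physicalMemoryGB >= 3) {
    return { deviceTier: 'medium', recommendedFPS: 30 };
  }
  return { deviceTier: 'low', recommendedFPS: 15 };
}
```

In practice you should not reimplement this; use the values delivered by onDeviceCapability, as the examples above do.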
interface GPUStatusEvent {
isUsingGPU: boolean; // Current GPU acceleration status
delegate: string; // 'GPU' | 'CPU' delegate type
processingUnit: string; // Detailed processing unit description
deviceTier: string; // Device performance tier
metalSupported?: boolean; // Metal framework availability (iOS)
}
Available only when enableDetailedLogs={true}
interface PoseServiceLogEvent {
message: string; // Log message content
level: string; // Log level ('info', 'warning', 'debug')
timestamp: number; // Log timestamp
}
interface PoseServiceErrorEvent {
error: string; // Error description
processingTime: number; // Time when error occurred (ms)
recoverable?: boolean; // Whether error is recoverable
suggestion?: string; // Recovery suggestion
}
import ReactNativeMediapipePose from '@gymbrosinc/react-native-mediapipe-pose';
// Camera permission management
const granted: boolean = await ReactNativeMediapipePose.requestCameraPermissions();
// Camera switching (requires view reference)
await ReactNativeMediapipePose.switchCamera(viewTag); // viewTag: number

// GPU status inquiry (requires view reference)
const gpuStatus = await ReactNativeMediapipePose.getGPUStatus(viewTag); // viewTag: number
<ReactNativeMediapipePoseView
enablePoseDetection={true}
// CRITICAL: Disable data streaming for max performance
enablePoseDataStreaming={false} // Eliminates bridge overhead
enableDetailedLogs={false} // Reduces logging overhead
// Optimize bridge communication
poseDataThrottleMs={200} // Higher throttling
fpsReportThrottleMs={1000} // Reduce FPS reports
fpsChangeThreshold={5.0} // Only significant changes
// Auto-optimization
autoAdjustFPS={true} // Device-based adjustment
targetFPS={30} // Conservative target
// Essential events only
onFrameProcessed={handleFPS} // Monitor performance
onDeviceCapability={handleDevice} // Device optimization
onError={handleErrors} // Error handling
// onPoseDetected - NOT NEEDED when streaming disabled
/>
<ReactNativeMediapipePoseView
enablePoseDetection={true}
// Enable all data for development
enablePoseDataStreaming={true} // Access pose data
enableDetailedLogs={true} // Comprehensive logging
// Fast updates for development
poseDataThrottleMs={50} // Fast pose data
fpsReportThrottleMs={250} // Frequent FPS updates
fpsChangeThreshold={1.0} // Sensitive monitoring
// Performance monitoring
targetFPS={60} // High target for testing
autoAdjustFPS={true} // Test auto-adjustment
// Full event monitoring
onPoseDetected={handlePoseData} // Process pose landmarks
onFrameProcessed={handleFrames} // Monitor performance
onDeviceCapability={handleDevice} // Device analysis
onGPUStatus={handleGPU} // GPU monitoring
onPoseServiceLog={handleLogs} // Service logs
onPoseServiceError={handleErrors} // Error analysis
/>
const [needsPoseData, setNeedsPoseData] = useState(false);
<ReactNativeMediapipePoseView
enablePoseDetection={true}
// Dynamic data streaming
enablePoseDataStreaming={needsPoseData} // Toggle as needed
enableDetailedLogs={__DEV__} // Dev-only logging
// Balanced performance
poseDataThrottleMs={100} // Moderate throttling
fpsReportThrottleMs={500} // Standard reporting
autoAdjustFPS={true} // Auto-optimization
// Conditional pose data handling
onPoseDetected={needsPoseData ? handlePoseData : undefined}
onFrameProcessed={handlePerformance}
onDeviceCapability={handleCapability}
/>;
For maximum performance in production environments:
<ReactNativeMediapipePoseView
enablePoseDataStreaming={false} // Critical: Disable for max performance
enableDetailedLogs={false} // Critical: Disable for production
poseDataThrottleMs={200} // Increase throttling
fpsReportThrottleMs={1000} // Reduce FPS reporting frequency
autoAdjustFPS={true} // Enable automatic optimization
targetFPS={30} // Conservative target for stability
/>
For development and debugging:
<ReactNativeMediapipePoseView
enablePoseDataStreaming={true} // Enable for data access
enableDetailedLogs={true} // Enable for debugging
poseDataThrottleMs={50} // Fast updates for testing
onPoseDetected={handlePoseData}
onGPUStatus={handleGPUStatus}
onFrameProcessed={handlePerformance}
/>
The module automatically detects device performance and adjusts accordingly:
import React, { useState, useCallback } from 'react';
import { View, Text, ScrollView, StyleSheet } from 'react-native';
import {
ReactNativeMediapipePoseView,
DeviceCapability,
FrameProcessingInfo,
PoseDetectionResult,
} from '@gymbrosinc/react-native-mediapipe-pose';
export default function AdvancedPoseDetection() {
const [performanceMetrics, setPerformanceMetrics] = useState({
fps: 0,
avgProcessingTime: 0,
poseCount: 0,
deviceTier: 'unknown',
isUsingGPU: false,
});
const [logs, setLogs] = useState<string[]>([]);
const [lastError, setLastError] = useState<string | null>(null);
const addLog = useCallback((message: string) => {
setLogs((prev) => [
...prev.slice(-19),
`${new Date().toLocaleTimeString()}: ${message}`,
]);
}, []);
const handleFrameProcessed = useCallback(
(event: { nativeEvent: FrameProcessingInfo }) => {
const { fps, autoAdjusted, newTargetFPS, reason } = event.nativeEvent;
setPerformanceMetrics((prev) => ({
...prev,
fps: Math.round(fps),
}));
if (autoAdjusted && newTargetFPS && reason) {
addLog(`Auto-adjusted FPS to ${newTargetFPS}: ${reason}`);
}
},
[addLog]
);
const handlePoseDetected = useCallback(
(event: { nativeEvent: PoseDetectionResult }) => {
const { landmarks, processingTime, confidence } = event.nativeEvent;
setPerformanceMetrics((prev) => ({
...prev,
avgProcessingTime: (prev.avgProcessingTime + processingTime) / 2,
poseCount: prev.poseCount + 1,
}));
addLog(
`Detected ${landmarks.length} landmarks (${processingTime.toFixed(1)}ms, conf: ${confidence.toFixed(2)})`
);
},
[addLog]
);
const handleDeviceCapability = useCallback(
(event: { nativeEvent: DeviceCapability }) => {
const { deviceTier, recommendedFPS, processorCount, physicalMemoryGB } =
event.nativeEvent;
setPerformanceMetrics((prev) => ({ ...prev, deviceTier }));
addLog(
`Device: ${deviceTier}, ${processorCount} cores, ${physicalMemoryGB.toFixed(1)}GB RAM, rec. FPS: ${recommendedFPS}`
);
},
[addLog]
);
const handleGPUStatus = useCallback(
(event: { nativeEvent: any }) => {
const { isUsingGPU, delegate, processingUnit, deviceTier } =
event.nativeEvent;
setPerformanceMetrics((prev) => ({ ...prev, isUsingGPU }));
addLog(
`GPU: ${isUsingGPU ? 'Enabled' : 'Disabled'}, Delegate: ${delegate}, Unit: ${processingUnit}`
);
},
[addLog]
);
const handlePoseServiceError = useCallback(
(event: { nativeEvent: any }) => {
const { error, processingTime } = event.nativeEvent;
const errorMsg = `${error} (${processingTime}ms)`;
setLastError(errorMsg);
addLog(`ERROR: ${errorMsg}`);
// Auto-clear error after 5 seconds
setTimeout(() => setLastError(null), 5000);
},
[addLog]
);
return (
<View style={styles.container}>
<ReactNativeMediapipePoseView
style={styles.camera}
enablePoseDetection={true}
enablePoseDataStreaming={true}
enableDetailedLogs={true} // Enable for debugging
poseDataThrottleMs={50} // Fast updates for development
targetFPS={30}
autoAdjustFPS={true}
onFrameProcessed={handleFrameProcessed}
onPoseDetected={handlePoseDetected}
onDeviceCapability={handleDeviceCapability}
onGPUStatus={handleGPUStatus}
onPoseServiceError={handlePoseServiceError}
onPoseServiceLog={(event) =>
addLog(`Service: ${event.nativeEvent.message}`)
}
/>
{/* Performance Dashboard */}
<View style={styles.dashboard}>
<Text style={styles.dashboardTitle}>Performance Dashboard</Text>
<View style={styles.metricsRow}>
<Text style={styles.metric}>FPS: {performanceMetrics.fps}</Text>
<Text style={styles.metric}>
Avg Time: {performanceMetrics.avgProcessingTime.toFixed(1)}ms
</Text>
<Text style={styles.metric}>
Poses: {performanceMetrics.poseCount}
</Text>
</View>
<View style={styles.metricsRow}>
<Text style={styles.metric}>
Device: {performanceMetrics.deviceTier.toUpperCase()}
</Text>
<Text
style={[
styles.metric,
{ color: performanceMetrics.isUsingGPU ? '#4CAF50' : '#FF9800' },
]}
>
{performanceMetrics.isUsingGPU ? 'GPU' : 'CPU'}
</Text>
</View>
{lastError && <Text style={styles.errorText}>❌ {lastError}</Text>}
{/* Live Logs */}
<ScrollView
style={styles.logsContainer}
showsVerticalScrollIndicator={false}
>
{logs.map((log, index) => (
<Text key={index} style={styles.logText}>
{log}
</Text>
))}
</ScrollView>
</View>
</View>
);
}
const styles = StyleSheet.create({
container: {
flex: 1,
},
camera: {
flex: 1,
},
dashboard: {
position: 'absolute',
top: 50,
left: 20,
right: 20,
backgroundColor: 'rgba(0, 0, 0, 0.8)',
borderRadius: 10,
padding: 15,
maxHeight: 300,
},
dashboardTitle: {
color: '#4CAF50',
fontSize: 16,
fontWeight: 'bold',
marginBottom: 10,
},
metricsRow: {
flexDirection: 'row',
justifyContent: 'space-between',
marginBottom: 8,
},
metric: {
color: 'white',
fontSize: 12,
fontWeight: '600',
},
errorText: {
color: '#FF5252',
fontSize: 12,
marginVertical: 5,
},
logsContainer: {
maxHeight: 120,
marginTop: 10,
},
logText: {
color: '#E0E0E0',
fontSize: 10,
fontFamily: 'monospace',
marginBottom: 2,
},
});
const handleError = (event: { nativeEvent: { error: string } }) => {
const { error } = event.nativeEvent;
console.error('Pose detection error:', error);
// Handle specific error types
if (error.includes('Camera')) {
Alert.alert(
'Camera Error',
'Please check camera permissions and try again.'
);
} else if (error.includes('GPU') || error.includes('Metal')) {
console.log('GPU error detected, falling back to CPU processing');
// GPU errors are automatically handled by the module
} else if (error.includes('Model')) {
Alert.alert(
'Model Error',
'Please ensure the pose detection model is properly installed.'
);
}
// Implement automatic recovery (assumes an enablePoseDetection state setter in scope)
setEnablePoseDetection(false);
setTimeout(() => {
console.log('Attempting to restart pose detection...');
setEnablePoseDetection(true);
}, 2000);
};
const handleCameraNotReady = () => {
console.log('Camera not ready, checking simulator...');
if (__DEV__ && Platform.OS === 'ios') {
Alert.alert(
'iOS Simulator',
'Camera is not available in iOS Simulator. Please test on a physical device.',
[{ text: 'OK' }]
);
}
};
git clone https://github.com/your-repo/react-native-mediapipe-pose.git
cd react-native-mediapipe-pose
npm install
# Run example app
cd example
npm install
npx expo run:ios
npm run test
npm run lint
npm run build
This package is provided for testing and evaluation purposes only.
For production licensing and commercial use, please contact the package owner.
MIT License - see LICENSE for details.
Contributions are welcome! Please read our Contributing Guide for details on our code of conduct and the process for submitting pull requests.