Sensor SDK
Embed the Runhuman sensor in your mobile or web app to capture network requests, console errors, crashes, and user interactions during human QA testing. The captured telemetry is automatically included in the AI analysis, producing richer and more accurate bug reports.
The sensor adds no overhead when no test is running — it only captures telemetry during an active test session.
**Enterprise Pro** — The Sensor SDK is available on the Enterprise Pro plan.
Installation
```bash
npm install @runhuman/sensor
```
The sensor works with React Native (0.70+) and all modern browsers. React (18+) is required for the provider component.
Quick Start — React (Web or React Native)
Wrap your app with `<RunhumanProvider>` — that’s it. No separate initialization needed:
```tsx
import { RunhumanProvider } from '@runhuman/sensor';

export default function App() {
  return (
    <RunhumanProvider apiKey="rh_your_api_key">
      <YourApp />
    </RunhumanProvider>
  );
}
```
The same import works for both web and React Native — the bundler automatically picks the correct entry point.
Quick Start — Vanilla JS (Vue, Svelte, etc.)
For non-React web apps, use `initWeb()`:
```js
import { initWeb } from '@runhuman/sensor';

const { runhuman, overlay } = initWeb({ apiKey: 'rh_your_api_key' });

// Later: overlay.unmount();
```
This initializes the sensor, sets up URL parameter activation, and mounts the overlay UI in one call.
Configuration
Pass options as props to `<RunhumanProvider>`:
```tsx
<RunhumanProvider
  apiKey="rh_your_api_key"
  debug={true}
  position="top-right"
>
  <YourApp />
</RunhumanProvider>
```
| Prop | Type | Default | Description |
|---|---|---|---|
| `apiKey` | `string` | required | Your Runhuman API key (`rh_...`) |
| `enabled` | `boolean` | `true` | Set to `false` to disable the sensor entirely (e.g., in production) |
| `position` | `'top-left' \| 'top-right' \| 'bottom-left' \| 'bottom-right'` | `'bottom-right'` | Corner for the overlay UI |
| `baseUrl` | `string` | Production URL | Runhuman API server URL |
| `jobId` | `string` | — | If known upfront, start polling for this job immediately |
| `platform` | `'react-native' \| 'web' \| 'ios' \| 'android'` | Auto-detected | Target platform |
| `flushIntervalMs` | `number` | `5000` | How often to send event batches (ms) |
| `pollIntervalMs` | `number` | `10000` | How often to check for active jobs (ms) |
| `maxBufferSize` | `number` | `1000` | Max buffered events before the oldest are dropped |
| `debug` | `boolean` | `false` | Log all captured events to the console |
| `interceptors` | `Interceptor[]` | — | Additional interceptors for custom event capture (e.g., navigation tracker) |
| `storageAdapter` | `StorageAdapter` | Auto (`localStorage` / AsyncStorage) | Custom storage backend for session persistence |
| `sessionTtlMs` | `number` | `1800000` (30 min) | How long a persisted session remains resumable after the app closes |
Activation Methods
The sensor needs to know which job to capture telemetry for. There are three ways to activate it.
Code Entry (All Platforms)
The overlay shows a code input panel. Testers enter the short code from their job assignment. This is the default experience — just wrap your app with `<RunhumanProvider>` and the panel appears automatically.
URL Parameter (Web)
On web, append ?runhuman_job=<id> to the page URL and the sensor activates automatically. This is useful for sharing direct test links with testers.
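For example, a shareable test link can be assembled with the standard `URL` API. This is a convenience sketch, not part of the SDK; the base URL is a placeholder, while `runhuman_job` is the parameter name documented above:

```typescript
// Build a direct test link that auto-activates the sensor on page load.
// The staging URL is a placeholder; 'runhuman_job' is the documented parameter.
function buildTestLink(pageUrl: string, jobId: string): string {
  const url = new URL(pageUrl);
  url.searchParams.set('runhuman_job', jobId);
  return url.toString();
}

console.log(buildTestLink('https://staging.example.com/checkout', 'job_123'));
// https://staging.example.com/checkout?runhuman_job=job_123
```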
Deep Links (React Native)
For React Native apps with custom URL schemes, the sensor auto-activates via deep link. Configure your app’s URL scheme in your React Native project (e.g., in app.json for Expo or Info.plist / AndroidManifest.xml for bare React Native).
The deep link format is: <scheme>://runhuman?jobId=<id>
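Given that format, a deep link for a specific job can be assembled like so (a sketch; `myapp` is a hypothetical scheme, not something the SDK requires):

```typescript
// Assemble a Runhuman deep link for a custom URL scheme.
// 'myapp' is a hypothetical scheme; the host and query follow the format above.
function buildDeepLink(scheme: string, jobId: string): string {
  return `${scheme}://runhuman?jobId=${encodeURIComponent(jobId)}`;
}

console.log(buildDeepLink('myapp', 'job_123')); // myapp://runhuman?jobId=job_123
```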
Programmatic
For custom flows, activate directly:
```ts
import { Runhuman } from '@runhuman/sensor';

const sensor = Runhuman.getInstance();
sensor.activate('job-id-here');
```
To end a session:
```ts
await sensor.deactivate();
```
Provider Component
`<RunhumanProvider>` wraps your app, handles the sensor lifecycle, and provides the activation UI and status indicator.
```tsx
import { RunhumanProvider } from '@runhuman/sensor';

<RunhumanProvider apiKey="rh_your_api_key" position="bottom-right">
  <App />
</RunhumanProvider>
```
What it shows in each state:
| State | UI |
|---|---|
| Idle (no active test) | Floating code entry panel (dismissible to a small “RH” button) |
| Resumable (previous session found) | “Continue session [CODE]?” prompt with Continue / New Session buttons |
| Active (capturing) | Small green pulsing dot |
| Ending (flushing events) | Amber dot |
When dismissed, the overlay collapses to the small “RH” button — it won’t interfere with your app’s UI during normal use.
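The lifecycle implied by these states can be pictured as a small state machine. The sketch below is illustrative only — the transition triggers are inferred from this page, not taken from the SDK's internals:

```typescript
// Illustrative sensor lifecycle: idle -> polling -> active -> ending -> idle.
// Transition events are inferred from this page, not the SDK's actual internals.
type SensorState = 'idle' | 'polling' | 'active' | 'ending';
type SensorEvent = 'activate' | 'jobFound' | 'deactivate' | 'flushed';

const transitions: Record<SensorState, Partial<Record<SensorEvent, SensorState>>> = {
  idle: { activate: 'polling' },        // tester enters a code or follows a link
  polling: { jobFound: 'active', deactivate: 'idle' },
  active: { deactivate: 'ending' },     // capture stops, buffered events flush
  ending: { flushed: 'idle' },
};

function step(state: SensorState, event: SensorEvent): SensorState {
  return transitions[state][event] ?? state; // events with no transition are ignored
}

let s: SensorState = 'idle';
s = step(s, 'activate');   // 'polling'
s = step(s, 'jobFound');   // 'active'
s = step(s, 'deactivate'); // 'ending'
s = step(s, 'flushed');    // 'idle'
```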
What Gets Captured
When active, the sensor captures:
| Category | Web | React Native |
|---|---|---|
| Network requests | URLs, HTTP methods, status codes, timing, response sizes | Same |
| Console errors | Error and warning messages | Same |
| JavaScript crashes | Unhandled exceptions with stack traces | Same |
| Clicks / taps | Click coordinates + element descriptor (tag, id, class, text) | Touch coordinates |
| Keyboard input | Buffered text (flushed after 1s inactivity, passwords redacted to ***) | — |
| Navigation | Page loads + SPA route changes (pushState, popstate, History API) | Opt-in via createNavigationTracker() |
All telemetry is session-scoped — the sensor only captures while a test is active. No data is collected or sent when idle.
Events are buffered locally and flushed to the Runhuman API in batches (every 5 seconds by default). If the network is unavailable, events queue locally until connectivity resumes.
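The buffering behavior amounts to a bounded queue with periodic batch drains. A minimal sketch of the drop-oldest policy behind `maxBufferSize` (illustrative only, not the SDK's actual implementation):

```typescript
// Minimal sketch of a drop-oldest event buffer mirroring the documented
// maxBufferSize behavior. Not the SDK's actual implementation.
class EventBuffer<T> {
  private events: T[] = [];
  constructor(private maxSize: number) {}

  push(event: T): void {
    this.events.push(event);
    if (this.events.length > this.maxSize) {
      this.events.shift(); // drop the oldest event when the buffer is full
    }
  }

  // Called on each flush tick (every flushIntervalMs): empties the buffer and
  // returns the batch to send. If the send fails, the caller can re-queue the
  // batch so events survive temporary network loss.
  drain(): T[] {
    const batch = this.events;
    this.events = [];
    return batch;
  }
}

const buf = new EventBuffer<number>(3);
[1, 2, 3, 4].forEach((n) => buf.push(n));
console.log(buf.drain()); // [2, 3, 4] (1 was dropped as the oldest)
```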
Session Persistence
Sessions survive app restarts. If a tester closes the app (or browser tab) mid-test and reopens within 30 minutes, the sensor shows a “Continue session?” prompt instead of the code entry panel.
How it works:
- **Web** — Uses `localStorage` automatically. No setup needed.
- **React Native** — Uses `@react-native-async-storage/async-storage` if installed. Add it as a dependency for persistence to work:

  ```bash
  npx expo install @react-native-async-storage/async-storage
  ```

- **Custom storage** — Implement the `StorageAdapter` interface and pass it as a prop:
```tsx
import { RunhumanProvider } from '@runhuman/sensor';
import type { StorageAdapter } from '@runhuman/sensor';

const myStorage: StorageAdapter = {
  getItem: (key) => myDb.get(key),
  setItem: (key, value) => myDb.set(key, value),
  removeItem: (key) => myDb.delete(key),
};

<RunhumanProvider apiKey="rh_..." storageAdapter={myStorage}>
  <App />
</RunhumanProvider>
```
**TTL** — By default, persisted sessions expire after 30 minutes. Configure with `sessionTtlMs`:
```tsx
<RunhumanProvider apiKey="rh_..." sessionTtlMs={60 * 60 * 1000}> {/* 1 hour */}
  <App />
</RunhumanProvider>
```
Resumed sessions maintain timestamp continuity — events continue from where the previous session left off, so the full timeline stays chronologically correct.
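The resumability decision itself is a simple timestamp comparison against the TTL. A sketch of that check (the `PersistedSession` field names here are illustrative, not the SDK's actual shape):

```typescript
// Sketch of the TTL check: a persisted session is offered for resume only if
// it was saved within sessionTtlMs. Field names are illustrative, not the
// SDK's actual PersistedSession shape.
interface PersistedSession {
  sessionId: string;
  jobId: string;
  savedAtMs: number; // epoch ms when the session was last persisted
}

function isResumable(
  session: PersistedSession,
  sessionTtlMs: number,
  nowMs: number = Date.now(),
): boolean {
  return nowMs - session.savedAtMs <= sessionTtlMs;
}

const s = { sessionId: 's1', jobId: 'j1', savedAtMs: Date.now() - 10 * 60 * 1000 };
console.log(isResumable(s, 30 * 60 * 1000)); // true: saved 10 minutes ago, TTL is 30
```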
React Native Navigation Tracking
Navigation tracking on React Native is opt-in. Use `createNavigationTracker()` with a React Navigation ref:
```tsx
import { RunhumanProvider, createNavigationTracker } from '@runhuman/sensor';
import { useRef } from 'react';
import { NavigationContainer } from '@react-navigation/native';

export default function App() {
  const navigationRef = useRef(null);

  return (
    <RunhumanProvider
      apiKey="rh_your_api_key"
      interceptors={[createNavigationTracker(navigationRef)]}
    >
      <NavigationContainer ref={navigationRef}>
        <AppNavigator />
      </NavigationContainer>
    </RunhumanProvider>
  );
}
```
This emits navigation events (route name + params) whenever the user navigates, giving the AI full context about where the tester was in the app.
React Hook: useSensorState
Build custom activation UI instead of using the overlay:
```tsx
import { View, Text } from 'react-native';
import { useSensorState } from '@runhuman/sensor';

function MyStatusBar() {
  const { state, activeJobId, sessionId, persistedSession } = useSensorState();

  if (state === 'idle') return null;

  return (
    <View style={styles.statusBar}> {/* styles defined elsewhere in your app */}
      <Text>Sensor: {state}</Text>
      {activeJobId && <Text>Job: {activeJobId}</Text>}
    </View>
  );
}
```
`SensorState` fields:

| Field | Type | Description |
|---|---|---|
| `state` | `'idle' \| 'polling' \| 'active' \| 'ending'` | Current sensor state |
| `activeJobId` | `string \| null` | The job being tested |
| `sessionId` | `string \| null` | Current telemetry session ID |
| `persistedSession` | `PersistedSession \| null` | Previous session data, if resumable within the TTL |
The hook is safe to use before `Runhuman.init()` is called — it returns `idle` state until the sensor is initialized.
Debug Mode
Enable debug logging during development to verify the sensor is working:
```tsx
<RunhumanProvider apiKey="rh_your_api_key" debug>
  <YourApp />
</RunhumanProvider>
```
With `debug` enabled, the sensor logs all captured events to the console, prefixed with `[Runhuman]`. This is useful for:
- Verifying interceptors are installed correctly
- Checking that events are being captured
- Debugging activation issues
Turn off debug mode in production builds.
Disabling in Production
Use the `enabled` prop to disable the sensor in production builds. When `enabled` is `false`, no initialization occurs and no overlay is rendered — the provider becomes a transparent passthrough.
```tsx
<RunhumanProvider
  apiKey="rh_your_api_key"
  enabled={__DEV__}
>
  <YourApp />
</RunhumanProvider>
```
This is the recommended pattern for apps that ship to end users — the sensor stays active during internal QA builds but is completely inert in production.
Full Example — React Native
```tsx
import React, { useRef } from 'react';
import { View, Text, StyleSheet } from 'react-native';
import { NavigationContainer } from '@react-navigation/native';
import { RunhumanProvider, useSensorState, createNavigationTracker } from '@runhuman/sensor';

function StatusBanner() {
  const { state } = useSensorState();
  if (state !== 'active') return null;
  return (
    <View style={styles.banner}>
      <Text style={styles.bannerText}>QA test in progress</Text>
    </View>
  );
}

export default function App() {
  const navigationRef = useRef(null);
  return (
    <RunhumanProvider
      apiKey="rh_your_api_key"
      debug={__DEV__}
      position="bottom-right"
      interceptors={[createNavigationTracker(navigationRef)]}
    >
      <StatusBanner />
      <NavigationContainer ref={navigationRef}>
        <View style={styles.app}>
          <Text>My App</Text>
        </View>
      </NavigationContainer>
    </RunhumanProvider>
  );
}

const styles = StyleSheet.create({
  banner: {
    backgroundColor: '#22c55e',
    padding: 8,
    alignItems: 'center',
  },
  bannerText: {
    color: '#fff',
    fontWeight: '600',
    fontSize: 12,
  },
  app: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
  },
});
```
Full Example — Web (Vanilla JS)
```js
import { initWeb } from '@runhuman/sensor';

const { runhuman, overlay } = initWeb({
  apiKey: 'rh_your_api_key',
  debug: true,
  position: 'bottom-right',
});

// The overlay is now visible. Testers can enter a code or
// activate via ?runhuman_job=<id> in the URL.

// To tear down when your app unmounts:
overlay.unmount();
await runhuman.destroy();
```
Next Steps
| Want to… | Read |
|---|---|
| Get your API key | Setup |
| Create tests via the API | REST API |
| Use with AI coding agents | Agent Skills |
| Look up technical details | Reference |