This guide covers WebRTC-based real-time channels for advanced use cases requiring direct peer-to-peer communication. For most applications, the standard Twin Messaging patterns (Events and Actions) are sufficient.
When to Use Real-Time Channels
Real-time channels establish direct peer-to-peer connections between devices, bypassing the hub for data transfer. This enables true real-time communication but requires specific network conditions.
Use Cases
| Use Case | Why Real-Time Channels |
|---|---|
| Computer vision | Process video frames directly on-device |
| Live video monitoring | Stream camera feeds to screens or dashboards |
| High-frequency sensors | Send 100+ readings/second with sub-10ms latency |
| Large binary transfers | Send files, images, or data directly between devices |
| Real-time control | Joystick, robotics, or interactive applications |
Network Requirements
WebRTC requires devices to establish a direct connection. This works when:
- Same local network - Devices on the same WiFi/LAN (the most reliable case)
- STUN-compatible NAT - Most home routers and many enterprise networks support this
- Public IP - Devices with direct internet access
WebRTC may NOT work when:
- Behind strict corporate firewalls that block UDP
- Using symmetric NAT (found in some enterprise and carrier networks), which STUN alone cannot traverse
- Behind VPNs that don't forward WebRTC traffic
Tip: If you're unsure about network compatibility, try hub-based Events/Actions first. They work across any network. Use WebRTC only when you specifically need real-time streaming or high-frequency data.
For typical command/response patterns, status updates, and event notifications, use the standard Twin Messaging API.
DataChannels
DataChannels provide reliable, low-latency bidirectional messaging between twins. Messages can be strings, objects (automatically serialized to JSON), or binary data.
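To make the serialization behavior concrete, here is a sketch of how a send-side payload normalizer could work. `encodeMessage` and `decodeMessage` are hypothetical helpers written for illustration, not part of the hub-client API; they assume the behavior described above (strings and binary pass through, plain objects become JSON):

```js
// Hypothetical sketch of payload normalization for a DataChannel.
// Strings and binary pass through; plain objects are serialized to JSON.
function encodeMessage(data) {
  if (typeof data === "string") return data;        // sent as-is
  if (data instanceof ArrayBuffer) return data;     // sent as binary
  if (ArrayBuffer.isView(data)) return data.buffer; // typed arrays -> underlying buffer
  return JSON.stringify(data);                      // objects auto-serialized to JSON
}

// The receiving side reverses the process.
function decodeMessage(raw) {
  if (typeof raw !== "string") return raw; // binary stays binary
  try {
    return JSON.parse(raw);                // JSON text -> object
  } catch {
    return raw;                            // plain string stays a string
  }
}
```

The practical consequence: if you send `{ command: "start" }`, the peer's `onMessage` handler receives an object, not a JSON string.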
Creating a DataChannel (Initiator)
```js
import { PhyHubClient } from "@phystack/hub-client";

const client = await PhyHubClient.connect();

// Connect to a target twin
const channel = await client.getDataChannel(targetTwinId);

// Send messages (objects are auto-serialized to JSON)
channel.send({ command: "start", timestamp: Date.now() });

// Receive messages
channel.onMessage((data) => {
  console.log("Received:", data);
});

// Check connection status
if (channel.isOpen()) {
  channel.send({ status: "active" });
}

// Close when done
channel.close();
```

Accepting a DataChannel (Responder)
```js
// Wait for incoming DataChannel connections from a specific twin
client.onDataChannel(sourceTwinId, (channel) => {
  console.log("DataChannel received!");

  channel.onMessage((data) => {
    console.log("Received:", data);
    // Echo back
    channel.send({ type: "echo", original: data });
  });

  channel.onClose(() => {
    console.log("Channel closed");
  });
});
```

Named Channels
Create multiple independent channels to the same peer:
```js
// Create separate channels for different purposes
const controlChannel = await client.getDataChannel(targetTwinId, "control");
const dataChannel = await client.getDataChannel(targetTwinId, "data");
const telemetryChannel = await client.getDataChannel(targetTwinId, "telemetry");

// Each channel is independent
controlChannel.send({ command: "start" });
dataChannel.send({ payload: largeDataObject });
telemetryChannel.send({ cpu: 45, memory: 2048 });

// Accept specific named channels
client.onDataChannel(sourceTwinId, (ch) => handleControl(ch), "control");
client.onDataChannel(sourceTwinId, (ch) => handleData(ch), "data");
client.onDataChannel(sourceTwinId, (ch) => handleTelemetry(ch), "telemetry");
```

MediaStreams
MediaStreams enable live video and audio streaming between devices. This is particularly useful for:
- Computer vision - Run ML models on video frames (object detection, face recognition, OCR)
- Remote monitoring - View camera feeds from edge devices on screens or dashboards
- Video calls - Two-way video/audio communication between devices
Sending Video (Initiator)
```js
// Get camera access
const localStream = await navigator.mediaDevices.getUserMedia({
  video: true,
  audio: true,
});

// Create a MediaStream connection with local tracks included
const { stream, close } = await client.getMediaStream(targetTwinId, {
  direction: "sendonly",
  localStream: localStream,
});

// The localStream tracks are automatically sent to the peer
console.log(`Streaming ${localStream.getTracks().length} tracks to peer`);

// Close when done
close();
```

Receiving Video (Responder)
```js
// Wait for an incoming MediaStream connection
client.onMediaStream(sourceTwinId, (stream) => {
  console.log("MediaStream connected!");

  const videoElement = document.getElementById("remote-video");

  // Get the underlying MediaStream for use as the video srcObject
  const mediaStream = stream.getStream();
  if (mediaStream) {
    videoElement.srcObject = mediaStream;
  }

  // Also listen for new tracks arriving after the initial connection
  stream.onTrack((track) => {
    console.log("Received track:", track.kind);
    // Update the video element with the latest stream
    const currentStream = stream.getStream();
    if (currentStream) {
      videoElement.srcObject = currentStream;
    }
  });

  stream.onClose(() => {
    console.log("Stream closed");
  });
});
```

Two-Way Video
For bidirectional video (both sides send and receive), use `direction: "sendrecv"`:
```js
// Both sides need camera access
const localStream = await navigator.mediaDevices.getUserMedia({
  video: true,
  audio: true,
});

// Initiator: create a two-way stream
const { stream } = await client.getMediaStream(targetTwinId, {
  direction: "sendrecv",
  localStream: localStream,
});

// Handle incoming video from the peer
const mediaStream = stream.getStream();
if (mediaStream) {
  document.getElementById("remote-video").srcObject = mediaStream;
}

stream.onTrack((track) => {
  console.log("Received peer track:", track.kind);
});

// Responder: accept with a local stream for two-way video
client.onMediaStream(sourceTwinId, (stream) => {
  // Add local tracks to send back
  localStream.getTracks().forEach((track) => {
    stream.addTrack(track);
  });

  // Display incoming video
  const remoteStream = stream.getStream();
  if (remoteStream) {
    document.getElementById("remote-video").srcObject = remoteStream;
  }
});
```

Named MediaStreams
Create multiple independent media streams to the same peer:
```js
// Initiator: create named streams
const { stream: videoStream } = await client.getMediaStream(targetTwinId, {
  channelName: "primary-camera",
  direction: "sendonly",
  localStream: cameraStream,
});

const { stream: screenStream } = await client.getMediaStream(targetTwinId, {
  channelName: "screen-share",
  direction: "sendonly",
  localStream: screenCaptureStream,
});

// Responder: accept named streams
client.onMediaStream(sourceTwinId, (stream) => {
  document.getElementById("camera-video").srcObject = stream.getStream();
}, { channelName: "primary-camera" });

client.onMediaStream(sourceTwinId, (stream) => {
  document.getElementById("screen-video").srcObject = stream.getStream();
}, { channelName: "screen-share" });
```

Computer Vision Example
Process video frames for ML inference:
```js
// Wait for a camera stream from another twin (e.g., an edge device)
client.onMediaStream(cameraTwinId, (stream) => {
  // Get the MediaStream for processing
  const mediaStream = stream.getStream();

  // Option 1: Use onFrame for direct frame access (Node.js with @roamhq/wrtc)
  stream.onFrame(async (frameData) => {
    const detections = await objectDetector.detect(frameData);

    // Send results via hub-based messaging (works across any network)
    const instance = await client.getInstance();
    instance.emit("detections", {
      timestamp: Date.now(),
      objects: detections,
    });
  });

  // Option 2: In the browser, use a canvas for frame extraction
  if (mediaStream) {
    const video = document.createElement("video");
    video.srcObject = mediaStream;
    video.play();

    const canvas = document.createElement("canvas");
    const ctx = canvas.getContext("2d");

    setInterval(async () => {
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      ctx.drawImage(video, 0, 0);
      const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);
      const detections = await objectDetector.detect(imageData);
      console.log("Detected:", detections);
    }, 100); // Process 10 frames per second
  }
});
```

Automatic Reconnection
The hub-client automatically handles WebRTC complexity for you:
- Automatic reconnection - If the connection drops, the client automatically re-establishes it
- Handler persistence - Your `onMessage`, `onTrack`, and `onFrame` callbacks survive reconnections
- STUN/TURN negotiation - ICE candidate exchange is handled automatically
- Fallback strategies - Tries direct connection first, falls back to STUN if needed
You don't need to manage connection state, handle ICE candidates, or implement reconnection logic. Just register your handlers once and the client keeps them working:
```js
const channel = await client.getDataChannel(targetTwinId);

// Register once - persists across reconnections
channel.onMessage((data) => {
  console.log("Received:", data);
});

// Check state if needed
if (channel.isOpen()) {
  channel.send(data);
} else if (channel.isConnecting()) {
  // Reconnection in progress - the message would be dropped
}
```

Note: Messages sent while disconnected are dropped, not queued. This is by design for real-time applications, where stale data is worse than no data.
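If your application does need the most recent state to survive a disconnect, you can layer a small latest-value cache on top of the channel yourself. This is a sketch of one possible pattern, not a hub-client feature; the `channel` object here is assumed to expose `isOpen()` and `send()` as documented above, and `flush()` would be wired to your own reconnect handling:

```js
// Keeps only the newest value per key while the channel is down,
// then flushes once it reopens. Intermediate stale values are
// intentionally discarded, matching the client's own philosophy.
function createLatestValueSender(channel) {
  const pending = new Map();
  return {
    send(key, value) {
      if (channel.isOpen()) {
        channel.send(value);
      } else {
        pending.set(key, value); // overwrite: only the latest survives
      }
    },
    flush() {
      if (!channel.isOpen()) return;
      for (const value of pending.values()) channel.send(value);
      pending.clear();
    },
    pendingCount() {
      return pending.size;
    },
  };
}
```

For example, a sensor twin could call `sender.send("temperature", reading)` on every tick and `sender.flush()` from a reconnect callback, so the peer always catches up to the current value without replaying stale history.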
Working with Different Twins
You can create channels to any twin you have access to:
```js
const client = await PhyHubClient.connect();

// Create a DataChannel to a specific twin
const channel = await client.getDataChannel(sensorTwinId);
channel.send({ command: "start" });
channel.onMessage((data) => {
  console.log("Sensor data:", data);
});

// Create a MediaStream from a camera twin
const { stream } = await client.getMediaStream(cameraTwinId, {
  direction: "recvonly", // Just receiving video
});

// Display the video
const mediaStream = stream.getStream();
if (mediaStream) {
  document.getElementById("camera-feed").srcObject = mediaStream;
}

stream.onTrack((track) => {
  console.log("Track received:", track.kind);
});

// Named channels for different data types
const controlChannel = await client.getDataChannel(peripheralTwinId, "control");
const telemetryChannel = await client.getDataChannel(peripheralTwinId, "telemetry");

// Accept named channels from a peer
client.onDataChannel(peripheralTwinId, (ch) => handleControl(ch), "control");
client.onDataChannel(peripheralTwinId, (ch) => handleTelemetry(ch), "telemetry");
```

Connection Events (Advanced)
Subscribe to WebRTC events for detailed connection monitoring:
```js
const manager = await client.getWebRTCManager({ verbose: true });

manager.on("connected", ({ targetTwinId, connectionType }) => {
  console.log(`Connected to ${targetTwinId} (${connectionType})`);
});

manager.on("disconnected", ({ targetTwinId }) => {
  console.log(`Disconnected from ${targetTwinId}`);
});

manager.on("reconnecting", ({ targetTwinId, attempt }) => {
  console.log(`Reconnecting to ${targetTwinId} (attempt ${attempt})`);
});

manager.on("reconnected", ({ targetTwinId }) => {
  console.log(`Reconnected to ${targetTwinId}`);
});

manager.on("error", ({ error }) => {
  console.error("WebRTC error:", error);
});
```

API Reference
PhygridDataChannel
| Method | Description |
|---|---|
| `send(data)` | Send a string, object, or ArrayBuffer |
| `onMessage(cb)` | Register a message handler |
| `offMessage(cb)` | Remove a message handler |
| `onClose(cb)` | Register a close handler |
| `offClose(cb)` | Remove a close handler |
| `close()` | Close the channel |
| `isOpen()` | Check whether the channel is open |
| `isConnecting()` | Check whether the channel is connecting/reconnecting |
| `getTargetTwinId()` | Get the connected twin ID |
| `getChannelName()` | Get the channel name (default: `'default'`) |
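Because `send(data)` accepts an ArrayBuffer, high-frequency numeric readings can be packed into a typed array instead of JSON, avoiding per-message serialization overhead. A minimal sketch; the `[timestamp, x, y, z]` field layout is an arbitrary example chosen for illustration, not a format the hub-client prescribes:

```js
// Pack an array of { t, x, y, z } readings into one Float64Array buffer.
function packReadings(readings) {
  const out = new Float64Array(readings.length * 4);
  readings.forEach((r, i) => {
    out.set([r.t, r.x, r.y, r.z], i * 4);
  });
  return out.buffer; // pass this ArrayBuffer to channel.send()
}

// Reverse the packing on the receiving side.
function unpackReadings(buffer) {
  const view = new Float64Array(buffer);
  const readings = [];
  for (let i = 0; i < view.length; i += 4) {
    readings.push({ t: view[i], x: view[i + 1], y: view[i + 2], z: view[i + 3] });
  }
  return readings;
}
```

Batching a burst of readings into one binary message like this is usually cheaper than 100+ individual JSON sends per second.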
PhygridMediaStream
| Method | Description |
|---|---|
| `getStream()` | Get the underlying MediaStream (use as the video `srcObject`) |
| `getTracks()` | Get all current tracks |
| `addTrack(track)` | Add a track to send to the peer |
| `onTrack(cb)` | Register a track-received handler |
| `offTrack(cb)` | Remove a track handler |
| `onFrame(cb)` | Register a frame-data handler (Node.js) |
| `offFrame(cb)` | Remove a frame handler |
| `onClose(cb)` | Register a close handler |
| `offClose(cb)` | Remove a close handler |
| `close()` | Close the stream |
| `isReceivingFrames()` | Check whether frames are being received |
| `isConnecting()` | Check whether the stream is connecting/reconnecting |
| `getTargetTwinId()` | Get the connected twin ID |
| `getChannelName()` | Get the channel name (default: `'default'`) |
Node.js Setup
For Node.js environments, install the WebRTC polyfill:
```sh
npm install @roamhq/wrtc
```

The hub-client automatically detects and uses this package when available.
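Detection of an optional package like this typically amounts to a guarded require. If you want the same check in your own code (for example, to warn at startup when the polyfill is missing), a sketch under that assumption:

```js
// Try to load the WebRTC polyfill; fall back gracefully when it isn't installed.
function loadWrtc() {
  try {
    return require("@roamhq/wrtc");
  } catch {
    return null; // not installed - WebRTC features unavailable in this Node process
  }
}

const wrtc = loadWrtc();
if (!wrtc) {
  console.warn("@roamhq/wrtc not found; install it to enable WebRTC in Node.js");
}
```

Failing fast with a clear warning here is friendlier than a cryptic error later when the first channel is opened.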
Troubleshooting
Connection Not Establishing
- Ensure both devices are connected to PhyStack
- Verify twin IDs are correct (use twin ID, not device ID)
- Check that both devices are in the same tenant
- For Node.js, ensure `@roamhq/wrtc` is installed
Messages Not Being Received
- Verify the channel is open with `channel.isOpen()`
- Check that message handlers are registered
- Messages sent during disconnection are dropped by design
MediaStream Not Working
- In browsers, ensure camera/microphone permissions are granted
- Check that the correct `direction` is set (`sendrecv`, `recvonly`, or `sendonly`)
- For Node.js, media streams require additional setup with `@roamhq/wrtc`
Related Guides
- Communication Overview - Compare messaging approaches
- Twin Messaging - Events and Actions API
- Build Your First Edge App - Create an edge application
- Peripherals and Descriptors - Hardware integration