Cyberwave connects to 80+ robotic devices out of the box through its hardware catalog. Every device in the catalog has a pre-built driver that is automatically installed when you pair hardware with a digital twin — no manual integration required. Beyond the catalog, Cyberwave’s open driver architecture means any device with an API, serial interface, or network protocol can be connected. You can write a driver in Python or C++ using our SDKs, or use our AI scaffolding tool to generate one in minutes.

How It Works

Every hardware connection follows the same pattern:
  1. Add a digital twin from the catalog (or create a custom one)
  2. Install the Cyberwave edge stack on a device connected to your hardware
  3. Pair — the driver is installed automatically and your hardware starts syncing in real time
The Edge Core orchestrates drivers on the edge device and bridges all communication to the Cyberwave cloud over MQTT.
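Concretely, every message a driver sends to the cloud is an MQTT topic plus a payload. The sketch below shows what building one telemetry sample could look like; the topic namespace is a hypothetical illustration, not the Edge Core's actual scheme.

```python
import json
import time

def telemetry_message(twin_id: str, sensor: str, value: float) -> tuple[str, bytes]:
    """Build an MQTT (topic, payload) pair for one telemetry sample.

    The topic layout here is hypothetical -- the real Edge Core defines
    its own namespace when it installs a driver.
    """
    topic = f"cyberwave/twins/{twin_id}/telemetry/{sensor}"
    payload = json.dumps({"value": value, "ts": time.time()}).encode()
    return topic, payload

topic, payload = telemetry_message("arm-01", "joint_0", 1.57)
```

In a real driver this pair would be handed to an MQTT client (for example `paho-mqtt`'s `client.publish(topic, payload)`); the Edge Core handles broker credentials and connectivity.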

The following platforms have dedicated setup guides with step-by-step instructions:

SO101 Robot Arms

Open-source 6-DOF manipulator arms for desk-based manipulation, teleoperation, and imitation learning.

UGV Beast Rover

Off-road tracked rover with a Raspberry Pi + ESP32 dual-controller architecture and ROS 2 stack.

Unitree Go2

Intelligent quadruped with 4D LiDAR, AI-powered locomotion, and autonomous navigation.

Universal Robot UR7e

Industrial collaborative arm connected via a Raspberry Pi running ROS 2 and an MQTT bridge.

Boston Dynamics Spot

Quadruped robot for inspection, patrol, and autonomous missions.

Camera

USB webcams and laptop cameras for live streaming, vision workflows, and dataset recording.

For devices not listed above, see the Other Hardware guide.

Full ROS Compatibility

Cyberwave is fully compatible with ROS 1 and ROS 2. The open-source cyberwave-os GitHub organization provides SDKs and reference implementations for bridging ROS topics with Cyberwave’s MQTT-based digital twin layer. A typical ROS integration uses the MQTT bridge pattern: a lightweight node subscribes to ROS topics (joint states, odometry, camera feeds) and publishes them to the Cyberwave MQTT broker. Commands flow in the opposite direction — from the dashboard or API, through MQTT, and into ROS action servers or publishers.
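The translation step at the heart of that bridge pattern can be sketched in plain Python. In a real node, the function below would be the body of an `rclpy` subscription callback on `/joint_states`, and its result would go to a `paho-mqtt` client; the MQTT topic layout is a hypothetical example, not Cyberwave's actual schema.

```python
import json

def joint_state_to_mqtt(msg: dict, twin_id: str) -> tuple[str, bytes]:
    """Translate a sensor_msgs/JointState-shaped dict into an MQTT
    (topic, payload) pair.

    `msg` mirrors the ROS message fields: `name`, `position`, and an
    optional `velocity`. The topic namespace is illustrative only.
    """
    topic = f"cyberwave/twins/{twin_id}/joint_states"
    payload = json.dumps({
        "names": msg["name"],
        "positions": msg["position"],
        "velocities": msg.get("velocity", []),
    }).encode()
    return topic, payload

# Example JointState as it would arrive from a ROS 2 subscription:
msg = {"name": ["shoulder", "elbow"], "position": [0.12, -0.87]}
topic, payload = joint_state_to_mqtt(msg, "ur7e-01")
```

Commands in the reverse direction follow the same shape: the bridge subscribes to a command topic on the MQTT side, decodes the JSON payload, and forwards it to a ROS action server or publisher.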

Custom Integrations

Integrate with ROS, VDA5050, OPC UA, Modbus, and other industrial protocols

Industrial Protocol Support

Cyberwave’s driver architecture is not limited to ROS. The same pattern works with any protocol your hardware speaks:
Protocol          Use Case
ROS 1 / ROS 2     Robot arms, mobile robots, sensor stacks
VDA5050           AGV and AMR fleet communication
OPC UA            Industrial automation and PLC connectivity
Modbus TCP/RTU    Sensor and actuator networks
gRPC / REST       Custom services and microservice architectures
Serial / USB      Direct device control (servos, microcontrollers)
Each protocol is bridged to Cyberwave through a driver — a Docker container managed by the Edge Core that translates between your device’s native interface and Cyberwave’s MQTT layer.
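For many industrial protocols, most of a driver's work is decoding raw device data before republishing it over MQTT. As one example, a Modbus driver might decode a float32 spread across two 16-bit holding registers; the big-endian word order assumed below is common but device-specific, so check your hardware's register map.

```python
import struct

def registers_to_float(hi: int, lo: int) -> float:
    """Decode two 16-bit Modbus holding registers into one IEEE-754
    float32, assuming big-endian word order (device manuals vary)."""
    return struct.unpack(">f", struct.pack(">HH", hi, lo))[0]

# 0x42F6, 0xE979 is the big-endian float32 encoding of ~123.456
value = registers_to_float(0x42F6, 0xE979)
```

The decoded value would then be packaged into an MQTT telemetry payload exactly like any other sensor reading.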

Extend Cyberwave with Custom Drivers

If your hardware isn’t in the catalog, you can write a compatible driver and connect it in minutes. A driver is a Docker container that translates between your device’s native API and Cyberwave’s MQTT interface.
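The shape of such a driver can be sketched as a simple poll-translate-publish loop. The class below is illustrative only: method names, the topic layout, and the payload format are assumptions, not the actual driver contract documented in the Writing Compatible Drivers guide.

```python
import json

class SketchDriver:
    """Minimal shape of a custom driver: poll the device's native
    interface, translate the reading, and hand it to an MQTT publish
    function (e.g. a paho-mqtt client's publish method)."""

    def __init__(self, twin_id: str, publish):
        self.twin_id = twin_id
        self.publish = publish

    def read_device(self) -> dict:
        # Replace with your hardware's native API (serial, REST, SDK...).
        return {"temperature_c": 21.5}

    def step(self) -> None:
        sample = self.read_device()
        topic = f"cyberwave/twins/{self.twin_id}/telemetry"
        self.publish(topic, json.dumps(sample).encode())

# Exercise one loop iteration with a stub publisher:
sent = []
SketchDriver("sensor-01", lambda topic, payload: sent.append((topic, payload))).step()
```

Packaged as a Docker container, a loop like this is all the Edge Core needs to manage: it starts the container, injects broker configuration, and supervises the connection.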

Writing Compatible Drivers

Full guide on driver architecture, environment variables, and packaging

Driver Overview

How drivers are registered and managed by Edge Core

Python SDK

Build drivers and applications with the Cyberwave Python SDK

C++ SDK

High-performance SDK for embedded and latency-sensitive drivers

You can also use the Cyberwave Driver Skill to scaffold a complete driver project interactively with AI assistance.

Reference Implementations

The open-source drivers in the cyberwave-os GitHub organization are good starting points for your own implementation.

What You Can Do Once Connected

Regardless of which hardware you connect, every digital twin on Cyberwave gives you access to the same platform capabilities:
  • Real-time teleoperation — control your hardware from the dashboard, SDK, or API
  • Live telemetry streaming — monitor sensor data, joint states, camera feeds, and more
  • Dataset recording — capture episodic datasets for training and evaluation
  • ML model training and deployment — train models on recorded data and deploy them as autonomous controller policies
  • Simulation — test in a browser-based 3D environment before deploying to physical hardware
  • Workflows and automation — chain actions, models, and logic into repeatable workflows

Browse the Catalog

Hardware Catalog

Browse all 80+ supported devices and add them to your environment