How to Build a mmWave Radar Presence Detection Agent with OpenClaw on Raspberry Pi
PIR sensors miss people who sit still. mmWave radar does not. This guide covers wiring a 24GHz FMCW radar module to a Raspberry Pi, reading presence and distance data over UART or I2C, and running an OpenClaw agent that reasons about occupancy across zones, makes automation decisions, and logs events to a persistent workspace.
Why mmWave Radar Solves the Stationary Presence Problem
Every office worker has experienced the lights shutting off mid-meeting. PIR sensors detect infrared heat differentials caused by movement. Sit still for a few minutes and the sensor assumes the room is empty. This false-negative problem wastes energy (lights cycle on and off repeatedly), frustrates occupants, and produces unreliable occupancy data for building automation systems.
mmWave radar takes a fundamentally different approach. A 24GHz FMCW (Frequency Modulated Continuous Wave) transmitter sends a chirp signal that sweeps across a frequency range. When that signal reflects off a human body, the sensor measures the frequency shift between the transmitted and received waves. Moving targets produce a large Doppler shift. Stationary targets still produce a detectable shift because human bodies are never still: your chest rises and falls with each breath, your heart beats, and micro-movements in your limbs create tiny reflections the radar can isolate.
This means mmWave presence detection works in complete darkness and through non-metallic enclosures, and is unaffected by temperature changes or HVAC airflow. It does not rely on the target emitting heat or moving across the sensor's field of view. A person sleeping in a chair, reading a book, or meditating with eyes closed still registers as present.
The limitation of most Raspberry Pi mmWave tutorials is that they stop at reading the sensor output: "presence detected" or "no presence." That binary signal is useful for a light switch. It is not useful for reasoning about multi-room occupancy patterns, deciding whether a room has been vacated or someone just stepped out briefly, or triggering different responses depending on how many people are present and where they are. An OpenClaw agent adds that reasoning layer.
mmWave radar occupies a privacy sweet spot: it detects stationary humans at long range without capturing any visual data. No images, no biometrics, no personally identifiable information. The sensor outputs presence state, distance, and velocity as numeric values.
How to Choose a mmWave Sensor for Raspberry Pi
Three categories of 24GHz mmWave modules work with Raspberry Pi. They differ in range, interface complexity, and what data they expose.
Entry-level: DFRobot C4001 (12m range)
The DFRobot Gravity C4001 is the most beginner-friendly option for Pi projects. It supports both I2C and UART communication (selectable via a physical switch), detects human presence at up to 8 meters and motion at up to 12 meters, measures distance and speed to the nearest target, and costs under $15. The Gravity connector simplifies wiring. Two I2C address options (0x2A and 0x2B) let you run two sensors on the same bus for multi-zone coverage without additional hardware.
Detection field of view is 100 degrees horizontal, which covers most of a standard room from a corner-mounted position. The sensor runs on 3.3V or 5V, drawing minimal current.
Mid-range: Waveshare HMMD sensor (S3KM1110-based)
The Waveshare Human Micro-Motion Detection sensor uses the AIoT S3KM1110 SoC with a built-in DSP that runs presence detection algorithms on-chip. It communicates over UART at 115200 baud by default, outputs presence state plus distance measurements, and provides configurable detection thresholds. At around $10-12, it is the cheapest option, but UART-only communication means it occupies the Pi's serial port unless you add a USB-to-serial adapter for additional sensors.
Advanced: Seeed Studio MR24HPC1 / Hi-Link LD2410
These modules provide richer data including multi-target tracking, zone configuration, and sensitivity tuning. The LD2410 has become popular in the Home Assistant community and has a well-documented Python library (mmwave_presence on GitHub). It outputs energy values per distance gate, letting you build occupancy heatmaps rather than simple binary presence readings.
Recommendation for this guide: The DFRobot C4001 provides the best balance of simplicity, dual-interface support, and data richness for an OpenClaw agent project. I2C communication leaves the UART port free for other sensors or debugging, and the distance/speed output gives the agent more to reason about than a binary presence flag.
How to Wire and Read mmWave Sensor Data
I2C wiring (recommended for C4001)
Set the C4001's interface switch to I2C. Connect four wires:
- SDA to GPIO 2 (Pi pin 3)
- SCL to GPIO 3 (Pi pin 5)
- VCC to 3.3V (Pi pin 1)
- GND to any ground pin (Pi pin 6)
Verify the connection with i2cdetect -y 1. You should see address 0x2A (or 0x2B if you toggled the address switch). If nothing appears, check that I2C is enabled in raspi-config under Interface Options.
UART wiring (alternative, or for Waveshare/LD2410)
For UART sensors, connect:
- Sensor TX to Pi GPIO 15 / RXD (pin 10)
- Sensor RX to Pi GPIO 14 / TXD (pin 8)
- VCC to 5V (pin 2)
- GND to ground (pin 6)
Disable the serial console in raspi-config (Interface Options > Serial Port > No for login shell, Yes for serial hardware). On Raspberry Pi 5, the UART pins map to /dev/ttyAMA0 by default. On Pi 4, you may need to add dtoverlay=disable-bt to /boot/config.txt to reclaim the full UART from Bluetooth.
Reading data with Python
DFRobot provides a Python library (DFRobot_C4001) that handles the I2C register reads. A minimal polling loop:
import time
from DFRobot_C4001 import DFRobot_C4001_I2C

sensor = DFRobot_C4001_I2C(bus=1, addr=0x2A)
sensor.sensor_init()

while True:
    presence = sensor.get_human_presence()
    distance = sensor.get_target_distance()
    speed = sensor.get_target_speed()
    print(f"Present: {presence}, Distance: {distance}cm, Speed: {speed}cm/s")
    time.sleep(0.5)
For UART-based sensors like the LD2410, the mmwave_presence library handles frame parsing:
import serial
from mmwave_presence import LD2410

ser = serial.Serial('/dev/ttyAMA0', 256000)
radar = LD2410(ser)

while True:
    if radar.update():
        print(f"Present: {radar.presence}, Distance: {radar.distance}mm")
Both approaches give you the raw data an OpenClaw agent needs: whether someone is there, how far away they are, and (for sensors that support it) how fast they are moving.
Persist your occupancy data where any team member can query it
Fast.io gives your OpenClaw agent 50 GB of indexed storage, Intelligence Mode for natural language queries over historical data, and workspace sharing so facilities teams access occupancy insights without SSH. No credit card, no expiration.
Building the OpenClaw Presence Detection Agent
With sensor data flowing, the next step is wrapping the polling logic in an OpenClaw skill that the agent can invoke, interpret, and act upon. The agent's job is not to replace the sensor's firmware detection, but to add temporal reasoning, multi-zone correlation, and decision-making that a bare sensor cannot do.
What the agent adds beyond raw sensor readings:
- Temporal smoothing: a 2-second absence does not mean the room is empty. The agent tracks presence state over a configurable window (e.g., 5 minutes of continuous absence before declaring a room vacant)
- Occupancy patterns: the agent learns that Conference Room B is always empty after 6 PM and can proactively dim lights or reduce HVAC without waiting for the timeout
- Multi-sensor fusion: if you deploy two C4001 sensors at different angles (using the dual I2C addresses), the agent correlates their readings to estimate occupant count and position within the room
- Context-aware responses: presence detected at 3 AM triggers a security alert. Presence detected at 9 AM triggers a welcome routine. Same sensor reading, different response based on time and history
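The temporal-smoothing behavior described above can be sketched as a small state machine. This is an illustrative sketch, not part of the DFRobot or OpenClaw APIs; the class name and default timeout are assumptions you would tune per room.

```python
import time

class PresenceDebouncer:
    """Declare a room vacant only after a sustained absence window."""

    def __init__(self, vacancy_timeout=300.0, clock=time.monotonic):
        self.vacancy_timeout = vacancy_timeout  # seconds of absence before "vacant"
        self.clock = clock                      # injectable clock, useful for testing
        self.occupied = False
        self._absence_started = None

    def update(self, raw_presence):
        """Feed one raw sensor reading; return the smoothed occupancy state."""
        now = self.clock()
        if raw_presence:
            self.occupied = True
            self._absence_started = None
        elif self.occupied:
            if self._absence_started is None:
                self._absence_started = now           # absence just began
            elif now - self._absence_started >= self.vacancy_timeout:
                self.occupied = False                 # sustained absence: vacant
        return self.occupied
```

Because the clock is injected, the logic can be tested without waiting five real minutes.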
Skill structure for OpenClaw
An OpenClaw skill for mmWave presence detection exposes the sensor interface to the agent's reasoning loop. The skill reads sensor data on a polling interval, maintains a state buffer of recent readings, and provides the agent with a structured summary it can query and reason about. The agent decides what to do with that information: log it, trigger an automation, send an alert, or combine it with data from other skills.
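One way to structure such a skill is sketched below. This is hypothetical: OpenClaw's actual skill interface may differ, and the class and field names are made up. The idea is a polling wrapper that keeps a bounded buffer of recent readings and exposes a structured summary the agent can query.

```python
from collections import deque
from statistics import mean

class MmWaveSkill:
    """Buffers recent sensor readings and summarizes them for the agent."""

    def __init__(self, read_fn, window=60):
        self.read_fn = read_fn              # callable returning (presence, distance_cm)
        self.buffer = deque(maxlen=window)  # bounded state buffer of recent readings

    def poll(self):
        """Take one reading and append it to the rolling buffer."""
        self.buffer.append(self.read_fn())

    def summary(self):
        """Structured snapshot the agent reasons over."""
        if not self.buffer:
            return {"samples": 0}
        present = [p for p, _ in self.buffer]
        distances = [d for p, d in self.buffer if p]
        return {
            "samples": len(self.buffer),
            "presence_ratio": sum(present) / len(present),
            "avg_distance_cm": mean(distances) if distances else None,
            "present_now": bool(present[-1]),
        }
```

Injecting the read function keeps the skill testable and lets the same logic wrap either the I2C or the UART sensor.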
Decision logic examples:
The agent can implement graduated responses based on presence patterns:
- Presence detected, duration under 30 seconds: likely a passerby in a hallway, no action needed
- Presence detected, duration over 2 minutes: room is occupied, adjust lighting and HVAC
- Presence lost after extended occupancy: start a 5-minute cooldown timer before powering down
- Multiple transitions in short period: high-traffic area, keep systems active regardless of momentary absences
- Presence detected in zone A and zone B simultaneously: meeting in progress, suppress individual desk automations
This graduated logic is where the agent provides value that a simple sensor-to-relay wiring cannot. The sensor tells you what is happening right now. The agent tells you what it means and what to do about it.
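The graduated responses above can be expressed as a small classifier. The function name, labels, and thresholds are illustrative, mirroring the examples in the text; tune them per space.

```python
def classify_event(duration_s, transitions_per_min=0):
    """Map a presence episode to one of the graduated responses.

    duration_s: how long presence has been continuously detected.
    transitions_per_min: recent presence/absence flips (high-traffic signal).
    """
    if transitions_per_min >= 4:
        return "high_traffic"   # keep systems active despite momentary absences
    if duration_s < 30:
        return "passerby"       # likely a hallway pass, no action
    if duration_s >= 120:
        return "occupied"       # adjust lighting and HVAC
    return "pending"            # between thresholds: keep watching
```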
Multi-Zone Occupancy and Cross-Room Automation
A single mmWave sensor covers one room. Real building automation requires coordinating across spaces. An OpenClaw agent running on a central Raspberry Pi can aggregate presence data from multiple sensors, each connected via I2C (using different addresses or I2C multiplexers) or via networked satellite Pis reporting back to the central node.
Scaling beyond one sensor
For two sensors in the same room (different angles for better coverage), use the C4001's dual I2C addresses: one at 0x2A, one at 0x2B on the same bus. For more than two sensors, add a TCA9548A I2C multiplexer to expand to 8 channels.
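To give a sense of how the agent might combine readings from the two addresses, here is a fusion sketch. The heuristic (agreement between sensors raises confidence) and all names are illustrative, not part of any library.

```python
def fuse_zone(reading_a, reading_b):
    """Combine two same-room sensor readings into one occupancy estimate.

    Each reading is (presence: bool, distance_cm: int or None).
    Agreement between sensors yields high confidence; disagreement is
    flagged so the agent can wait for more samples before acting.
    """
    pa, da = reading_a
    pb, db = reading_b
    present = pa or pb
    confidence = "high" if pa == pb else "low"
    distances = [d for p, d in (reading_a, reading_b) if p and d is not None]
    return {
        "present": present,
        "confidence": confidence,
        "nearest_cm": min(distances) if distances else None,
    }
```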
For multi-room deployments, a common pattern is one Raspberry Pi per floor or zone running OpenClaw, with each Pi polling its local sensors and reporting occupancy state to a central coordinator. The agent on each Pi makes local decisions (lights, HVAC for its zone) while sharing state for building-wide reasoning.
Cross-zone decision patterns
Multi-zone data enables automation that no single sensor can achieve:
- If all zones on a floor show no presence for 15 minutes, the agent can signal the BMS to enter energy-saving mode for the entire floor
- If occupancy in the break room spikes while the adjacent meeting room empties, the agent infers the meeting ended and a break started, pre-conditioning the meeting room for the next booking
- Sequential presence across hallway sensors indicates direction of travel, letting the agent pre-activate lighting and climate ahead of the occupant's path
- Unusual presence patterns (someone in the server room at 2 AM who was not detected entering the main office) trigger security logging
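The direction-of-travel pattern can be sketched as a comparison of the last two hallway triggers. The event format and the 5-second gap threshold are assumptions for illustration.

```python
def infer_direction(events, max_gap_s=5.0):
    """Infer direction of travel from sequential hallway triggers.

    events: list of (timestamp_s, sensor_index) ordered by time, where
    sensor_index increases along the hallway. Returns "forward",
    "backward", or None when triggers are too far apart or ambiguous.
    """
    if len(events) < 2:
        return None
    (t1, s1), (t2, s2) = events[-2], events[-1]
    if t2 - t1 > max_gap_s or s1 == s2:
        return None
    return "forward" if s2 > s1 else "backward"
```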
Logging and historical analysis
Raw presence events accumulate quickly. At 2 readings per second per sensor across 10 zones, you generate over 1.7 million data points per day. The agent should aggregate these into meaningful summaries: average occupancy per zone per hour, peak usage times, rooms that are booked but consistently empty.
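A minimal sketch of that aggregation, turning raw readings into per-zone, per-hour occupancy ratios; the event tuple format is an assumption:

```python
from collections import defaultdict

def hourly_occupancy(events):
    """Aggregate raw readings into per-zone, per-hour occupancy ratios.

    events: iterable of (epoch_seconds, zone, presence_bool).
    Returns {(zone, hour_start_epoch): fraction_of_readings_present}.
    """
    counts = defaultdict(lambda: [0, 0])  # (zone, hour) -> [present, total]
    for ts, zone, present in events:
        key = (zone, int(ts) // 3600 * 3600)  # bucket by hour start
        counts[key][0] += bool(present)
        counts[key][1] += 1
    return {k: p / n for k, (p, n) in counts.items()}
```

Run daily, output like this compresses millions of raw points into a few hundred summary rows.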
Storing these summaries in a Fast.io workspace gives you persistent, searchable storage accessible from any device or agent. Enable Intelligence Mode on the workspace, and you can ask natural language questions about historical occupancy patterns: "Which meeting rooms had less than 30% utilization last month?" The indexed data becomes queryable without writing SQL or building a custom dashboard.
Persisting Occupancy Data and Handing Off to Building Teams
An OpenClaw agent running on a Pi generates valuable occupancy intelligence. That data needs to survive SD card failures, be accessible to facilities teams who are not SSH-ing into a Raspberry Pi, and work alongside existing building management workflows.
Local vs. cloud storage
Local SQLite on the Pi works for buffering recent data and making fast decisions. But SD cards wear out, Pis reboot, and nobody wants to pull occupancy reports by connecting to a headless device. A hybrid approach keeps the last 24 hours locally for real-time decisions and syncs daily summaries to cloud storage.
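A minimal sketch of the local buffer, using Python's built-in sqlite3 module; the schema and the 24-hour retention window are illustrative choices:

```python
import sqlite3
import time

def open_buffer(path=":memory:"):
    """Open (or create) the local buffer of presence events."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS events
                  (ts REAL, zone TEXT, present INTEGER)""")
    return db

def record(db, zone, present, now=None):
    """Insert one reading and prune anything older than 24 hours."""
    now = time.time() if now is None else now
    db.execute("INSERT INTO events VALUES (?, ?, ?)", (now, zone, int(present)))
    db.execute("DELETE FROM events WHERE ts < ?", (now - 86400,))
    db.commit()
```

Pruning on every write keeps the table small, which also limits SD card wear.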
Options for cloud persistence:
- S3-compatible storage works but requires writing export scripts, managing credentials, and building a separate query interface
- Google Drive or Dropbox provide file sync but no built-in search or AI query capability
- A Fast.io workspace provides file storage with automatic indexing, meaning uploaded occupancy reports become searchable and queryable through Intelligence Mode without additional infrastructure
The handoff workflow
The OpenClaw agent generates occupancy reports (JSON or CSV summaries), uploads them to a shared workspace, and the facilities team accesses them through a web interface or asks questions about the data directly. No SSH access needed, no custom dashboard to build and maintain.
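As a sketch of the report-generation step, the function below renders per-zone stats as a JSON document ready for upload; the filename convention and field names are made up for illustration.

```python
import json
from datetime import date

def daily_report(zone_stats, day=None):
    """Render per-zone stats as a JSON report ready for upload.

    zone_stats: {zone: {"avg_occupancy": float, "peak_hour": int}}
    Returns (filename, json_text).
    """
    day = day or date.today().isoformat()
    payload = {"date": day, "zones": zone_stats}
    return f"occupancy-{day}.json", json.dumps(payload, indent=2, sort_keys=True)
```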
With Fast.io's MCP server, the agent can upload files, organize them into workspace folders by date or zone, and set permissions so that only the facilities team sees the occupancy data while the security team sees the after-hours alerts. The free tier provides 50 GB of storage and 5,000 AI credits per month, enough for years of compressed occupancy logs.
When it is time to hand the system off to a building operations team, workspace ownership transfer lets the agent's creator pass full control to the ops team while retaining admin access for maintenance. The team gets a working occupancy intelligence system without needing to understand OpenClaw, Python, or Raspberry Pi internals. They interact with the data through a web interface and natural language queries.
Integration with existing BMS
Most commercial building management systems accept inputs via MQTT, BACnet, or REST APIs. The OpenClaw agent can publish presence state changes to an MQTT broker that the BMS subscribes to, bridging the gap between a $15 mmWave sensor on a $60 Pi and enterprise building automation infrastructure. The agent acts as the intelligence layer between raw sensor hardware and the BMS control plane.
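As a sketch of the MQTT bridge, the function below builds a topic and JSON payload for a state change. The topic layout and field names are assumptions to adapt to whatever schema your BMS expects; actual publishing would go through a client library such as paho-mqtt (client.publish(topic, payload)).

```python
import json

def presence_message(zone, present, distance_cm=None):
    """Build an MQTT topic and JSON payload for a presence state change."""
    topic = f"building/occupancy/{zone}/state"  # illustrative topic layout
    payload = json.dumps({
        "present": bool(present),
        "distance_cm": distance_cm,
        "source": "mmwave-c4001",
    })
    return topic, payload
```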
Frequently Asked Questions
What is mmWave radar presence detection?
mmWave (millimeter wave) radar presence detection uses 24GHz FMCW radio signals to detect human bodies in a space. The sensor transmits a continuous frequency-sweeping signal and analyzes the reflected waves. Unlike PIR sensors that only detect moving heat sources, mmWave radar can identify stationary humans by picking up micro-movements like breathing and heartbeat. Detection ranges typically span 8-12 meters with a horizontal field of view around 100 degrees.
Can mmWave radar detect stationary people?
Yes. This is the primary advantage of mmWave over PIR sensors. Even when a person is sitting completely still, their body produces micro-movements from respiration and cardiac activity. A 24GHz FMCW radar detects these as tiny Doppler frequency shifts in the reflected signal. The DFRobot C4001, for example, can detect stationary humans at up to 8 meters. This eliminates the common problem of lights turning off during meetings or while reading.
How do you connect a mmWave sensor to a Raspberry Pi?
Most mmWave sensors support UART or I2C. For I2C (e.g., DFRobot C4001), connect SDA to GPIO 2 (pin 3), SCL to GPIO 3 (pin 5), VCC to 3.3V, and GND to ground. For UART sensors, connect the sensor's TX to the Pi's RXD (GPIO 15, pin 10) and RX to TXD (GPIO 14, pin 8). Enable the appropriate interface in raspi-config. On Pi 5, UART maps to /dev/ttyAMA0 by default.
How is an AI agent different from a simple automation script?
A script executes fixed rules: if presence detected, turn on lights. An AI agent adds reasoning over time and context. It can distinguish between a brief hallway passerby and a settled room occupant, correlate presence across multiple zones to infer building-wide patterns, adjust responses based on time of day or historical occupancy, and make nuanced decisions like delaying HVAC shutdown because the meeting room is typically reoccupied within 10 minutes at this time of day.
Do mmWave sensors work through walls or enclosures?
mmWave signals pass through most non-metallic materials including plastic enclosures, drywall, wood, and glass. This means sensors can be mounted inside a housing for aesthetic purposes or behind a thin wall panel for concealed installation. However, they cannot penetrate metal or concrete. Thick glass or water-filled objects attenuate the signal. For multi-room deployments, each room needs its own sensor.
What Raspberry Pi model works best for a mmWave presence agent?
A Raspberry Pi 4 with 4GB or 8GB RAM runs OpenClaw and sensor polling comfortably. The Pi 5 offers faster processing but costs more and may be overkill for a sensor monitoring application. A Pi Zero 2 W is sufficient if you only need sensor reading and MQTT publishing without running the full OpenClaw agent locally. For multi-sensor deployments with on-device AI reasoning, the Pi 4 8GB or Pi 5 provides headroom for the language model inference.