<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
        <title>Human Pose on Producthunt daily</title>
        <link>https://producthunt.programnotes.cn/en/tags/human-pose/</link>
        <description>Recent content in Human Pose on Producthunt daily</description>
        <generator>Hugo -- gohugo.io</generator>
        <language>en</language>
        <lastBuildDate>Wed, 22 Apr 2026 16:52:45 +0800</lastBuildDate><atom:link href="https://producthunt.programnotes.cn/en/tags/human-pose/index.xml" rel="self" type="application/rss+xml" /><item>
        <title>RuView</title>
        <link>https://producthunt.programnotes.cn/en/p/ruview/</link>
        <pubDate>Wed, 22 Apr 2026 16:52:45 +0800</pubDate>
        
        <guid>https://producthunt.programnotes.cn/en/p/ruview/</guid>
        <description>&lt;img src="https://images.unsplash.com/photo-1535191198992-fe460a2d0af1?ixid=M3w0NjAwMjJ8MHwxfHJhbmRvbXx8fHx8fHx8fDE3NzY4NDc5MTh8&amp;ixlib=rb-4.1.0" alt="Featured image of post RuView" /&gt;&lt;h1 id=&#34;ruvnetruview&#34;&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvnet/RuView&lt;/a&gt;
&lt;/h1&gt;&lt;h1 id=&#34;π-ruview&#34;&gt;π RuView
&lt;/h1&gt;&lt;p align=&#34;center&#34;&gt;
  &lt;a href=&#34;https://x.com/rUv/status/2037556932802761004&#34;&gt;
    &lt;img src=&#34;assets/ruview-small-gemini.jpg&#34; alt=&#34;RuView - WiFi DensePose&#34; width=&#34;100%&#34;&gt;
  &lt;/a&gt;
&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Beta Software&lt;/strong&gt; — Under active development. APIs and firmware may change. Known limitations:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;ESP32-C3 (single-core) and the original ESP32 are not supported; neither has enough compute for the CSI DSP workload&lt;/li&gt;
&lt;li&gt;Single ESP32 deployments have limited spatial resolution — use 2+ nodes or add a &lt;a class=&#34;link&#34; href=&#34;https://cognitum.one&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Cognitum Seed&lt;/a&gt; for best results&lt;/li&gt;
&lt;li&gt;Camera-free pose accuracy is limited — use &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-079-camera-ground-truth-training.md&#34; &gt;camera ground-truth training&lt;/a&gt; for 92.9% PCK@20&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Contributions and bug reports welcome at &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Issues&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2 id=&#34;see-through-walls-with-wifi&#34;&gt;&lt;strong&gt;See through walls with WiFi&lt;/strong&gt;
&lt;/h2&gt;&lt;p&gt;&lt;strong&gt;Turn ordinary WiFi into a sensing system.&lt;/strong&gt; Detect people, measure breathing and heart rate, track movement, and monitor rooms — through walls, in the dark, with no cameras or wearables. Just physics.&lt;/p&gt;
&lt;h3 id=&#34;π-ruview-is-a-wifi-sensing-platform-that-turns-radio-signals-into-spatial-intelligence&#34;&gt;π RuView is a WiFi sensing platform that turns radio signals into spatial intelligence.
&lt;/h3&gt;&lt;p&gt;Every WiFi router already fills your space with radio waves. When people move, breathe, or even sit still, they disturb those waves in measurable ways. RuView captures these disturbances using Channel State Information (CSI) from low-cost ESP32 sensors and turns them into actionable data: who&amp;rsquo;s there, what they&amp;rsquo;re doing, and whether they&amp;rsquo;re okay.&lt;/p&gt;
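To make the raw material concrete: CSI is one complex channel estimate per OFDM subcarrier per frame, and sensing pipelines work from its amplitude and phase over time. A minimal sketch with synthetic values (array shapes are illustrative; real frames come from the ESP32 stream):

```python
import numpy as np

# Toy CSI matrix: one complex channel estimate per OFDM subcarrier per
# frame. Real frames come from the ESP32 stream; these are random values.
rng = np.random.default_rng(1)
frames = rng.standard_normal((100, 64)) + 1j * rng.standard_normal((100, 64))

amplitude = np.abs(frames)                   # drives presence/motion features
phase = np.unwrap(np.angle(frames), axis=0)  # drives vital-sign features

# Per-subcarrier variation over time is the "disturbance" being measured
activity = amplitude.std(axis=0)
```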
&lt;p&gt;&lt;strong&gt;What it senses:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Presence and occupancy&lt;/strong&gt; — detect people through walls, count them, track entries and exits&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Vital signs&lt;/strong&gt; — breathing rate and heart rate, contactless, while sleeping or sitting&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Activity recognition&lt;/strong&gt; — walking, sitting, gestures, falls — from temporal CSI patterns&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Environment mapping&lt;/strong&gt; — RF fingerprinting identifies rooms, detects moved furniture, spots new objects&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Sleep quality&lt;/strong&gt; — overnight monitoring with sleep stage classification and apnea screening&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Built on &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;RuVector&lt;/a&gt; and &lt;a class=&#34;link&#34; href=&#34;https://cognitum.one&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Cognitum Seed&lt;/a&gt;, RuView runs entirely on edge hardware — an ESP32 mesh (as low as $9 per node) paired with a Cognitum Seed for persistent memory, cryptographic attestation, and AI integration. No cloud, no cameras, no internet required.&lt;/p&gt;
&lt;p&gt;The system learns each environment locally using spiking neural networks that adapt in under 30 seconds, with multi-frequency mesh scanning across 6 WiFi channels that uses your neighbors&amp;rsquo; routers as free radar illuminators. Every measurement is cryptographically attested via an Ed25519 witness chain.&lt;/p&gt;
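The witness-chain idea above can be sketched in a few lines: hash-chain each measurement to its predecessor and sign the digest with Ed25519. This uses the Python `cryptography` package; the payload shape and helper name are illustrative assumptions, not RuView's actual API.

```python
import hashlib
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Illustrative sketch only, not RuView's actual witness-chain API.
key = Ed25519PrivateKey.generate()

def attest(measurement: dict, prev_hash: bytes):
    """Hash-chain a measurement to its predecessor and sign the digest.

    Tampering with any earlier entry changes every later digest, so one
    signature check per entry is enough to audit the whole chain.
    """
    payload = json.dumps(measurement, sort_keys=True).encode()
    entry_hash = hashlib.sha256(prev_hash + payload).digest()
    return entry_hash, key.sign(entry_hash)

genesis = b"\x00" * 32
h1, sig1 = attest({"breath_bpm": 14.8, "ts": 1700000000}, genesis)
h2, sig2 = attest({"breath_bpm": 15.1, "ts": 1700000030}, h1)

# Anyone holding the public key can verify each link
key.public_key().verify(sig1, h1)  # raises InvalidSignature if tampered
key.public_key().verify(sig2, h2)
```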
&lt;p&gt;RuView also supports pose estimation (17 COCO keypoints via the WiFlow architecture), trained entirely without cameras using 10 sensor signals — a technique building on the original &lt;em&gt;DensePose From WiFi&lt;/em&gt; research at Carnegie Mellon University.&lt;/p&gt;
&lt;h3 id=&#34;built-for-low-power-edge-applications&#34;&gt;Built for low-power edge applications
&lt;/h3&gt;&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;#edge-intelligence-adr-041&#34; &gt;Edge modules&lt;/a&gt; are small programs that run directly on the ESP32 sensor — no internet needed, no cloud fees, instant response.&lt;/p&gt;
&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;https://www.rust-lang.org/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;img src=&#34;https://img.shields.io/badge/rust-1.85&amp;#43;-orange.svg&#34; loading=&#34;lazy&#34; alt=&#34;Rust 1.85&amp;#43;&#34;&gt;&lt;/a&gt;
&lt;a class=&#34;link&#34; href=&#34;https://opensource.org/licenses/MIT&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;img src=&#34;https://img.shields.io/badge/License-MIT-yellow.svg&#34; loading=&#34;lazy&#34; alt=&#34;License: MIT&#34;&gt;&lt;/a&gt;
&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;img src=&#34;https://img.shields.io/badge/tests-1463%20passed-brightgreen.svg&#34; loading=&#34;lazy&#34; alt=&#34;Tests: 1463&#34;&gt;&lt;/a&gt;
&lt;a class=&#34;link&#34; href=&#34;https://hub.docker.com/r/ruvnet/wifi-densepose&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;img src=&#34;https://img.shields.io/badge/docker-amd64%20%2B%20arm64-blue.svg&#34; loading=&#34;lazy&#34; alt=&#34;Docker: multi-arch&#34;&gt;&lt;/a&gt;
&lt;a class=&#34;link&#34; href=&#34;#vital-sign-detection&#34;&gt;&lt;img src=&#34;https://img.shields.io/badge/vital%20signs-breathing%20%2B%20heartbeat-red.svg&#34; loading=&#34;lazy&#34; alt=&#34;Vital Signs&#34;&gt;&lt;/a&gt;
&lt;a class=&#34;link&#34; href=&#34;#esp32-s3-hardware-pipeline&#34;&gt;&lt;img src=&#34;https://img.shields.io/badge/ESP32--S3-CSI%20streaming-purple.svg&#34; loading=&#34;lazy&#34; alt=&#34;ESP32 Ready&#34;&gt;&lt;/a&gt;
&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-ruvector&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-ruvector.svg&#34; loading=&#34;lazy&#34; alt=&#34;crates.io&#34;&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;What&lt;/th&gt;
          &lt;th&gt;How&lt;/th&gt;
          &lt;th&gt;Speed&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Pose estimation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;CSI subcarrier amplitude/phase → 17 COCO keypoints&lt;/td&gt;
          &lt;td&gt;171K emb/s (M4 Pro)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Breathing detection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Bandpass 0.1-0.5 Hz → zero-crossing BPM&lt;/td&gt;
          &lt;td&gt;6-30 BPM&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Heart rate&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Bandpass 0.8-2.0 Hz → zero-crossing BPM&lt;/td&gt;
          &lt;td&gt;48-120 BPM&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Presence sensing&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Trained model + PIR fusion — 100% accuracy&lt;/td&gt;
          &lt;td&gt;0.012 ms latency&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Through-wall&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Fresnel zone geometry + multipath modeling&lt;/td&gt;
          &lt;td&gt;Up to 5m depth&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Edge intelligence&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;8-dim feature vectors + RVF store on Cognitum Seed&lt;/td&gt;
          &lt;td&gt;$140 total BOM&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Camera-free training&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;10 sensor signals, no labels needed&lt;/td&gt;
          &lt;td&gt;84s on M4 Pro&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Camera-supervised training&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;MediaPipe + ESP32 CSI → 92.9% PCK@20&lt;/td&gt;
          &lt;td&gt;19 min on laptop&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Multi-frequency mesh&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Channel hopping across 6 bands, neighbor APs as illuminators&lt;/td&gt;
          &lt;td&gt;3x sensing bandwidth&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/blockquote&gt;
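The breathing and heart-rate rows above describe the same pipeline with different bands: band-pass the CSI amplitude trace, then convert zero crossings to BPM. A minimal sketch on synthetic data using SciPy (filter order and parameter choices are illustrative):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_bpm(trace, fs, low_hz, high_hz):
    """Band-pass a CSI amplitude trace, then count zero crossings.

    Each oscillation crosses zero twice, so rate = crossings / 2 / minutes.
    """
    b, a = butter(2, [low_hz, high_hz], btype="band", fs=fs)
    filtered = filtfilt(b, a, trace - np.mean(trace))
    crossings = int(np.sum(np.diff(np.signbit(filtered))))
    minutes = len(trace) / fs / 60.0
    return crossings / 2.0 / minutes

# Synthetic 60 s trace: 0.25 Hz (15 BPM) breathing ripple on a CSI amplitude
fs = 10.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(42)
trace = 1.0 + 0.1 * np.sin(2 * np.pi * 0.25 * t) + 0.01 * rng.standard_normal(t.size)

breathing_bpm = estimate_bpm(trace, fs, 0.1, 0.5)  # ~15
# Heart rate would reuse the same function with the 0.8-2.0 Hz band
```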
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;15
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;16
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;17
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;18
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Option 1: Docker (simulated data, no hardware needed)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker pull ruvnet/wifi-densepose:latest
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker run -p 3000:3000 ruvnet/wifi-densepose:latest
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Open http://localhost:3000&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Option 2: Live sensing with ESP32-S3 hardware ($9)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Flash firmware, provision WiFi, and start sensing:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python -m esptool --chip esp32s3 --port COM9 --baud &lt;span class=&#34;m&#34;&gt;460800&lt;/span&gt; &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  write_flash 0x0 bootloader.bin 0x8000 partition-table.bin &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  0xf000 ota_data_initial.bin 0x20000 esp32-csi-node.bin
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python firmware/esp32-csi-node/provision.py --port COM9 &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --ssid &lt;span class=&#34;s2&#34;&gt;&amp;#34;YourWiFi&amp;#34;&lt;/span&gt; --password &lt;span class=&#34;s2&#34;&gt;&amp;#34;secret&amp;#34;&lt;/span&gt; --target-ip 192.168.1.20
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Option 3: Full system with Cognitum Seed ($140)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# ESP32 streams CSI → bridge forwards to Seed for persistent storage + kNN + witness chain&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/rf-scan.js --port &lt;span class=&#34;m&#34;&gt;5006&lt;/span&gt;           &lt;span class=&#34;c1&#34;&gt;# Live RF room scan&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/snn-csi-processor.js --port &lt;span class=&#34;m&#34;&gt;5006&lt;/span&gt;  &lt;span class=&#34;c1&#34;&gt;# SNN real-time learning&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/mincut-person-counter.js --port &lt;span class=&#34;m&#34;&gt;5006&lt;/span&gt;  &lt;span class=&#34;c1&#34;&gt;# Min-cut person counting&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;blockquote&gt;
&lt;p&gt;
&lt;strong&gt;CSI-capable hardware recommended.&lt;/strong&gt; Presence, vital signs, through-wall sensing, and all advanced capabilities require Channel State Information (CSI) from an ESP32-S3 ($9) or research NIC. The Docker image runs with simulated data for evaluation. Consumer WiFi laptops provide RSSI-only presence detection.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Hardware options&lt;/strong&gt; for live CSI capture:&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Option&lt;/th&gt;
          &lt;th&gt;Hardware&lt;/th&gt;
          &lt;th&gt;Cost&lt;/th&gt;
          &lt;th&gt;Full CSI&lt;/th&gt;
          &lt;th&gt;Capabilities&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;ESP32 + Cognitum Seed&lt;/strong&gt; (recommended)&lt;/td&gt;
          &lt;td&gt;ESP32-S3 + &lt;a class=&#34;link&#34; href=&#34;https://cognitum.one&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Cognitum Seed&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;~$140&lt;/td&gt;
          &lt;td&gt;Yes&lt;/td&gt;
          &lt;td&gt;Pose, breathing, heartbeat, motion, presence + persistent vector store, kNN search, witness chain, MCP proxy&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;ESP32 Mesh&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;3-6x ESP32-S3 + WiFi router&lt;/td&gt;
          &lt;td&gt;~$54&lt;/td&gt;
          &lt;td&gt;Yes&lt;/td&gt;
          &lt;td&gt;Pose, breathing, heartbeat, motion, presence&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Research NIC&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Intel 5300 / Atheros AR9580&lt;/td&gt;
          &lt;td&gt;~$50-100&lt;/td&gt;
          &lt;td&gt;Yes&lt;/td&gt;
          &lt;td&gt;Full CSI with 3x3 MIMO&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Any WiFi&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Windows, macOS, or Linux laptop&lt;/td&gt;
          &lt;td&gt;$0&lt;/td&gt;
          &lt;td&gt;No&lt;/td&gt;
          &lt;td&gt;RSSI-only: coarse presence and motion&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;No hardware? Verify the signal processing pipeline with the deterministic reference signal: &lt;code&gt;python v1/data/proof/verify.py&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;hr&gt;
&lt;h3 id=&#34;real-time-dense-point-cloud-new&#34;&gt;Real-Time Dense Point Cloud (NEW)
&lt;/h3&gt;&lt;p&gt;RuView now generates &lt;strong&gt;real-time 3D point clouds&lt;/strong&gt; by fusing camera depth + WiFi CSI + mmWave radar. All sensors stream simultaneously into a unified spatial model.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Sensor&lt;/th&gt;
          &lt;th&gt;Data&lt;/th&gt;
          &lt;th&gt;Integration&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Camera&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;MiDaS monocular depth (GPU)&lt;/td&gt;
          &lt;td&gt;640×480 → 19,200+ depth points per frame&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;ESP32 CSI&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;ADR-018 binary frames (UDP)&lt;/td&gt;
          &lt;td&gt;RF tomography → 8×8×4 occupancy grid&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;WiFlow Pose&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;17 COCO keypoints from CSI&lt;/td&gt;
          &lt;td&gt;Skeleton overlay on point cloud&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Vital Signs&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Breathing rate from CSI phase&lt;/td&gt;
          &lt;td&gt;Stored in ruOS brain every 60s&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Motion&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;CSI amplitude variance&lt;/td&gt;
          &lt;td&gt;Adaptive capture rate (skip depth when still)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
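The motion row above reduces to a rolling amplitude-variance gate on the CSI stream; the window and threshold below are illustrative assumptions that would need per-room calibration:

```python
import numpy as np

# Illustrative constants; real deployments would calibrate these per room.
WINDOW = 50
THRESHOLD = 0.01

def should_capture_depth(amplitudes):
    """Gate the expensive depth capture on recent CSI amplitude variance."""
    return float(np.var(amplitudes[-WINDOW:])) > THRESHOLD

rng = np.random.default_rng(7)
still = 1.0 + 0.001 * rng.standard_normal(200)   # empty / static scene
moving = 1.0 + 0.5 * rng.standard_normal(200)    # person crossing the RF path

should_capture_depth(still)    # False: scene is still, skip this depth frame
should_capture_depth(moving)   # True: motion detected, capture
```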
&lt;p&gt;&lt;strong&gt;Quick start:&lt;/strong&gt;&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; rust-port/wifi-densepose-rs
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo build --release -p wifi-densepose-pointcloud
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/ruview-pointcloud serve --bind 127.0.0.1:9880
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Open http://localhost:9880 for live 3D viewer&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;CLI commands:&lt;/strong&gt;&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;ruview-pointcloud demo                            &lt;span class=&#34;c1&#34;&gt;# synthetic demo&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;ruview-pointcloud serve --bind 127.0.0.1:9880     &lt;span class=&#34;c1&#34;&gt;# live server + Three.js viewer&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;ruview-pointcloud capture --output room.ply       &lt;span class=&#34;c1&#34;&gt;# capture to PLY&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;ruview-pointcloud train                           &lt;span class=&#34;c1&#34;&gt;# depth calibration + DPO pairs&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;ruview-pointcloud cameras                         &lt;span class=&#34;c1&#34;&gt;# list available cameras&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;ruview-pointcloud csi-test --count &lt;span class=&#34;m&#34;&gt;100&lt;/span&gt;            &lt;span class=&#34;c1&#34;&gt;# send test CSI frames&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;ruview-pointcloud fingerprint office --seconds &lt;span class=&#34;m&#34;&gt;5&lt;/span&gt;  &lt;span class=&#34;c1&#34;&gt;# record named CSI room fingerprint&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;The HTTP/viewer server defaults to &lt;strong&gt;loopback (&lt;code&gt;127.0.0.1&lt;/code&gt;)&lt;/strong&gt; — exposing live camera/CSI/vitals on &lt;code&gt;0.0.0.0&lt;/code&gt; is an explicit opt-in. Brain URL defaults to &lt;code&gt;http://127.0.0.1:9876&lt;/code&gt; and is overridable via &lt;code&gt;RUVIEW_BRAIN_URL&lt;/code&gt; env var or the &lt;code&gt;--brain&lt;/code&gt; flag on &lt;code&gt;serve&lt;/code&gt;/&lt;code&gt;train&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;The pose overlay currently uses an &lt;strong&gt;amplitude-energy heuristic&lt;/strong&gt; (&lt;code&gt;heuristic_pose_from_amplitude&lt;/code&gt;) rather than trained WiFlow inference — real ONNX/Candle inference is tracked as a follow-up.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Performance:&lt;/strong&gt; 22ms pipeline, 905 req/s API, 40K voxel room model from 20 frames.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Brain integration:&lt;/strong&gt; Spatial observations (motion, vitals, skeleton, occupancy) sync to the ruOS brain every 60 seconds for agent reasoning.&lt;/p&gt;
&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/pull/405&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;PR #405&lt;/a&gt; for full details.&lt;/p&gt;
&lt;h3 id=&#34;whats-new-in-v070&#34;&gt;What&amp;rsquo;s New in v0.7.0
&lt;/h3&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Camera Ground-Truth Training — 92.9% PCK@20&lt;/strong&gt;&lt;/summary&gt;
&lt;p&gt;&lt;strong&gt;v0.7.0 adds camera-supervised pose training&lt;/strong&gt; using MediaPipe + real ESP32 CSI data:&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Capability&lt;/th&gt;
          &lt;th&gt;What it does&lt;/th&gt;
          &lt;th&gt;ADR&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Camera ground-truth collection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;MediaPipe PoseLandmarker captures 17 COCO keypoints at 30fps, synced with ESP32 CSI&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-079-camera-ground-truth-training.md&#34; &gt;ADR-079&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;ruvector subcarrier selection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Variance-based top-K reduces input by 50% (70→35 subcarriers)&lt;/td&gt;
          &lt;td&gt;ADR-079 O6&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Stoer-Wagner min-cut&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Person-specific subcarrier cluster separation for multi-person training&lt;/td&gt;
          &lt;td&gt;ADR-079 O8&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Scalable WiFlow model&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;4 presets: lite (189K) → small (474K) → medium (800K) → full (7.7M params)&lt;/td&gt;
          &lt;td&gt;ADR-079&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
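The variance-based top-K selection in the table reduces to a few lines of NumPy. The 70-to-35 split below mirrors the numbers above, with synthetic data standing in for real CSI:

```python
import numpy as np

def select_subcarriers(csi_amplitude, k):
    """Keep the k subcarriers with the highest temporal amplitude variance.

    csi_amplitude: (frames, subcarriers) matrix. High variance over time
    usually marks subcarriers that respond to motion in the scene.
    """
    variance = csi_amplitude.var(axis=0)
    return np.sort(np.argsort(variance)[-k:])

# Synthetic stand-in: 70 subcarriers, of which the first 35 react to motion
rng = np.random.default_rng(0)
frames = rng.normal(1.0, 0.01, size=(500, 70))
frames[:, :35] += rng.normal(0.0, 0.5, size=(500, 35))

kept = select_subcarriers(frames, 35)   # indices 0..34
```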
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Collect ground truth (camera + ESP32 simultaneously)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python scripts/collect-ground-truth.py --duration &lt;span class=&#34;m&#34;&gt;300&lt;/span&gt; --preview
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python scripts/record-csi-udp.py --duration &lt;span class=&#34;m&#34;&gt;300&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Align CSI windows with camera keypoints&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/align-ground-truth.js --gt data/ground-truth/*.jsonl --csi data/recordings/*.csi.jsonl
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Train WiFlow model (start lite, scale up as data grows)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/train-wiflow-supervised.js --data data/paired/*.jsonl --scale lite
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Evaluate&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/eval-wiflow.js --model models/wiflow-real/wiflow-v1.json --data data/paired/*.jsonl
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Result: 92.9% PCK@20&lt;/strong&gt; from a 5-minute data collection session with one ESP32-S3 and one webcam.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Metric&lt;/th&gt;
          &lt;th&gt;Before (proxy)&lt;/th&gt;
          &lt;th&gt;After (camera-supervised)&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;PCK@20&lt;/td&gt;
          &lt;td&gt;0%&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;92.9%&lt;/strong&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Eval loss&lt;/td&gt;
          &lt;td&gt;0.700&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;0.082&lt;/strong&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Bone constraint&lt;/td&gt;
          &lt;td&gt;N/A&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;0.008&lt;/strong&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Training time&lt;/td&gt;
          &lt;td&gt;N/A&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;19 minutes&lt;/strong&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Model size&lt;/td&gt;
          &lt;td&gt;N/A&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;974 KB&lt;/strong&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;Pre-trained model: &lt;a class=&#34;link&#34; href=&#34;https://huggingface.co/ruv/ruview&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;HuggingFace ruv/ruview/wiflow-v1&lt;/a&gt;&lt;/p&gt;
&lt;/details&gt;
&lt;h3 id=&#34;pre-trained-models-v060--no-training-required&#34;&gt;Pre-Trained Models (v0.6.0) — No Training Required
&lt;/h3&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Download from HuggingFace and start sensing immediately&lt;/strong&gt;&lt;/summary&gt;
&lt;p&gt;Pre-trained models are available on HuggingFace:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;&lt;a class=&#34;link&#34; href=&#34;https://huggingface.co/ruv/ruview&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;https://huggingface.co/ruv/ruview&lt;/a&gt;&lt;/strong&gt; (primary) | &lt;a class=&#34;link&#34; href=&#34;https://huggingface.co/ruvnet/wifi-densepose-pretrained&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;mirror&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Trained on 60,630 real-world samples from an 8-hour overnight collection. Just download and run — no datasets, no GPU, no training needed.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Model&lt;/th&gt;
          &lt;th&gt;Size&lt;/th&gt;
          &lt;th&gt;What it does&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;model.safetensors&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;48 KB&lt;/td&gt;
          &lt;td&gt;Contrastive encoder — 128-dim embeddings for presence, activity, environment&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;model-q4.bin&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;8 KB&lt;/td&gt;
          &lt;td&gt;4-bit quantized — fits in ESP32-S3 SRAM for edge inference&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;model-q2.bin&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;4 KB&lt;/td&gt;
          &lt;td&gt;2-bit ultra-compact for memory-constrained devices&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;presence-head.json&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;2.6 KB&lt;/td&gt;
          &lt;td&gt;Presence detection head (100% accuracy in benchmarks)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;node-1.json&lt;/code&gt; / &lt;code&gt;node-2.json&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;21 KB&lt;/td&gt;
          &lt;td&gt;Per-room LoRA adapters (swap for new rooms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Download and use (Python)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;pip install huggingface_hub
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;huggingface-cli download ruv/ruview --local-dir models/
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Or retrain and benchmark locally&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/train-ruvllm.js --data data/recordings/*.csi.jsonl  &lt;span class=&#34;c1&#34;&gt;# retrain on your own data&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/benchmark-ruvllm.js --model models/csi-ruvllm       &lt;span class=&#34;c1&#34;&gt;# benchmark&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Benchmarks (Apple M4 Pro, retrained on overnight data):&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;What we measured&lt;/th&gt;
          &lt;th&gt;Result&lt;/th&gt;
          &lt;th&gt;Why it matters&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Presence detection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;100% accuracy&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;No missed detections or false alarms in evaluation&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Inference speed&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;0.008 ms&lt;/strong&gt; per embedding&lt;/td&gt;
          &lt;td&gt;125,000x faster than real-time&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Throughput&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;164,183 embeddings/sec&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;One Mac Mini handles 1,600+ ESP32 nodes&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Contrastive learning&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;51.6% improvement&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Strong pattern learning from real overnight data&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Model size&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;8 KB&lt;/strong&gt; (4-bit quantized)&lt;/td&gt;
          &lt;td&gt;Fits in ESP32 SRAM — no server needed&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Total hardware cost&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;$140&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;ESP32 ($9) + &lt;a class=&#34;link&#34; href=&#34;https://cognitum.one&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Cognitum Seed&lt;/a&gt; ($131)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
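&lt;p&gt;The &amp;ldquo;1,600+ ESP32 nodes&amp;rdquo; figure follows directly from the throughput row, assuming each node streams on the order of 100 CSI frames per second (the per-node rate is not stated in the table):&lt;/p&gt;

```python
# Back-of-the-envelope check of the "1,600+ nodes" capacity claim.
# Assumption (not stated in the table): each ESP32 streams ~100 CSI frames/sec.
THROUGHPUT = 164_183   # embeddings/sec, from the benchmark table
FRAMES_PER_NODE = 100  # assumed per-node CSI frame rate

max_nodes = THROUGHPUT // FRAMES_PER_NODE
print(max_nodes)  # 1641, i.e. "1,600+ ESP32 nodes"
```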
&lt;/details&gt;
&lt;h3 id=&#34;17-sensing-applications-v060&#34;&gt;17 Sensing Applications (v0.6.0)
&lt;/h3&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Health, environment, security, and multi-frequency mesh sensing&lt;/strong&gt;&lt;/summary&gt;
&lt;p&gt;All applications run from a single ESP32 + optional Cognitum Seed. No camera, no cloud, no internet.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Health &amp;amp; Wellness:&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Application&lt;/th&gt;
          &lt;th&gt;Script&lt;/th&gt;
          &lt;th&gt;What it detects&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Sleep Monitor&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/sleep-monitor.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Sleep stages (deep/light/REM/awake), efficiency, hypnogram&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Apnea Detector&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/apnea-detector.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Breathing pauses &amp;gt;10s, AHI severity scoring&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Stress Monitor&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/stress-monitor.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Heart rate variability, LF/HF stress ratio&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Gait Analyzer&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/gait-analyzer.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Walking cadence, stride asymmetry, tremor detection&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
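&lt;p&gt;The apnea detector&amp;rsquo;s &amp;ldquo;AHI severity scoring&amp;rdquo; refers to the apnea&amp;ndash;hypopnea index: breathing-pause events per hour of sleep. A minimal sketch using the standard clinical cut-offs of 5 / 15 / 30 events per hour; the script&amp;rsquo;s own thresholds may differ:&lt;/p&gt;

```python
import bisect

# Apnea-Hypopnea Index: breathing-pause events per hour of sleep.
# Severity cut-offs (5 / 15 / 30 events/hour) follow the standard clinical
# convention; the project's actual thresholds are an assumption here.
LABELS = ["normal", "mild", "moderate", "severe"]

def ahi_severity(event_count, sleep_hours):
    ahi = event_count / sleep_hours
    # bisect_right maps the AHI value onto the severity band index
    return ahi, LABELS[bisect.bisect_right([5, 15, 30], ahi)]

print(ahi_severity(56, 7.0))  # (8.0, 'mild')
```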
&lt;p&gt;&lt;strong&gt;Environment &amp;amp; Security:&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Application&lt;/th&gt;
          &lt;th&gt;Script&lt;/th&gt;
          &lt;th&gt;What it detects&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Person Counter&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/mincut-person-counter.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Correct occupancy count (fixes #348)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Room Fingerprint&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/room-fingerprint.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Activity state clustering, daily patterns, anomalies&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Material Detector&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/material-detector.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;New/moved objects via subcarrier null changes&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Device Fingerprint&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/device-fingerprint.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Electronic device activity (printer, router, etc.)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;Multi-Frequency Mesh&lt;/strong&gt; (requires &lt;code&gt;--hop-channels&lt;/code&gt; provisioning):&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Application&lt;/th&gt;
          &lt;th&gt;Script&lt;/th&gt;
          &lt;th&gt;What it detects&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;RF Tomography&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/rf-tomography.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;2D room imaging via RF backprojection&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Passive Radar&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/passive-radar.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Neighbor WiFi APs as bistatic radar illuminators&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Material Classifier&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/material-classifier.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Metal/water/wood/glass from frequency response&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Through-Wall&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/through-wall-detector.js&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Motion behind walls using lower-frequency penetration&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;All scripts support &lt;code&gt;--replay data/recordings/*.csi.jsonl&lt;/code&gt; for offline analysis and &lt;code&gt;--json&lt;/code&gt; for programmatic output.&lt;/p&gt;
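&lt;p&gt;A downstream consumer of the &lt;code&gt;--json&lt;/code&gt; stream can be a few lines of Python. The field names below (&lt;code&gt;ts&lt;/code&gt;, &lt;code&gt;presence&lt;/code&gt;) are illustrative assumptions; check each script&amp;rsquo;s actual output for the real schema:&lt;/p&gt;

```python
import json

# Filter a JSONL sensing stream for occupancy events.
# The field names ("ts", "presence") are illustrative assumptions;
# inspect each script's --json output for the real schema.
def occupied_events(jsonl_lines):
    events = (json.loads(line) for line in jsonl_lines if line.strip())
    return [e["ts"] for e in events if e.get("presence")]

sample = [
    '{"ts": 1, "presence": true}',
    '{"ts": 2, "presence": false}',
    '{"ts": 3, "presence": true}',
]
print(occupied_events(sample))  # [1, 3]
```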
&lt;/details&gt;
&lt;h3 id=&#34;whats-new-in-v055&#34;&gt;What&amp;rsquo;s New in v0.5.5
&lt;/h3&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Advanced Sensing: SNN + MinCut + WiFlow + Multi-Frequency Mesh&lt;/strong&gt;&lt;/summary&gt;
&lt;p&gt;&lt;strong&gt;v0.5.5 adds four new sensing capabilities&lt;/strong&gt; built on the &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvector&lt;/a&gt; ecosystem:&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Capability&lt;/th&gt;
          &lt;th&gt;What it does&lt;/th&gt;
          &lt;th&gt;ADR&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Spiking Neural Network&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Adapts to your room in &amp;lt;30s with STDP online learning — no labels, no batches, 16-160x less compute&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-074-spiking-neural-csi-sensing.md&#34; &gt;ADR-074&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;MinCut Person Counting&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Stoer-Wagner min-cut on subcarrier correlation graph — &lt;strong&gt;fixes #348&lt;/strong&gt; (was always 4, now correct)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-075-mincut-person-separation.md&#34; &gt;ADR-075&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;CNN Spectrogram Embeddings&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Treat CSI as a 64×20 image → 128-dim embedding for environment fingerprinting (0.95+ similarity)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-076-csi-spectrogram-embeddings.md&#34; &gt;ADR-076&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;WiFlow SOTA Architecture&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;TCN + axial attention + pose decoder → 17 COCO keypoints, 1.8M params (881 KB at 4-bit)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-072-wiflow-architecture.md&#34; &gt;ADR-072&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Multi-Frequency Mesh&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Channel hopping across 6 bands, neighbor WiFi as passive radar illuminators&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-073-multifrequency-mesh-scan.md&#34; &gt;ADR-073&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
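&lt;p&gt;For reference, the Stoer&amp;ndash;Wagner primitive behind the MinCut person counter can be sketched compactly. This shows only the global minimum cut on a toy weighted graph, not RuView&amp;rsquo;s subcarrier correlation-graph construction:&lt;/p&gt;

```python
# Stoer-Wagner global minimum cut on a dense weighted graph given as an
# adjacency matrix. Sketches only the graph primitive the person counter
# builds on; RuView's correlation-graph setup is not reproduced here.
def stoer_wagner(mat):
    mat = [row[:] for row in mat]
    vertices = list(range(len(mat)))
    best = float("inf")
    while len(vertices) != 1:
        w = {v: 0.0 for v in vertices}   # connectivity to the growing set A
        in_a, prev, last = set(), None, None
        for _ in vertices:
            rest = [v for v in vertices if v not in in_a]
            sel = max(rest, key=w.get)   # most tightly connected vertex
            in_a.add(sel)
            prev, last = last, sel
            for v in rest:
                if v != sel:
                    w[v] += mat[sel][v]
        best = min(best, w[last])        # cut-of-the-phase
        for v in vertices:               # contract `last` into `prev`
            mat[prev][v] += mat[last][v]
            mat[v][prev] += mat[v][last]
        vertices.remove(last)
    return best

# Triangle with unit weights: separating any vertex cuts two unit edges.
tri = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(stoer_wagner(tri))  # 2.0
```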
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;15
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;16
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;17
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Live RF room scan (spectrum visualization)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/rf-scan.js --port &lt;span class=&#34;m&#34;&gt;5006&lt;/span&gt; --duration &lt;span class=&#34;m&#34;&gt;30&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Correct person counting (fixes #348)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/mincut-person-counter.js --port &lt;span class=&#34;m&#34;&gt;5006&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# SNN real-time adaptation&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/snn-csi-processor.js --port &lt;span class=&#34;m&#34;&gt;5006&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# CNN spectrogram embeddings&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/csi-spectrogram.js --replay data/recordings/*.csi.jsonl
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# WiFlow 17-keypoint pose training&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/train-wiflow.js --data data/recordings/*.csi.jsonl
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Enable channel hopping on ESP32&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python firmware/esp32-csi-node/provision.py --port COM9 --hop-channels &lt;span class=&#34;s2&#34;&gt;&amp;#34;1,6,11&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Validated benchmarks:&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Metric&lt;/th&gt;
          &lt;th&gt;v0.5.4&lt;/th&gt;
          &lt;th&gt;v0.5.5&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Person counting&lt;/td&gt;
          &lt;td&gt;Broken (always 4)&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Correct&lt;/strong&gt; (MinCut, 24/24)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;WiFi channels&lt;/td&gt;
          &lt;td&gt;1&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;6&lt;/strong&gt; (multi-freq hopping)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Null subcarriers&lt;/td&gt;
          &lt;td&gt;19% blocked&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;16%&lt;/strong&gt; (frequency diversity)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Pose model&lt;/td&gt;
          &lt;td&gt;16K params (FC only)&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;1.8M params&lt;/strong&gt; (WiFlow)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Online adaptation&lt;/td&gt;
          &lt;td&gt;None&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;&amp;lt;30s&lt;/strong&gt; (SNN STDP)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Fingerprint dims&lt;/td&gt;
          &lt;td&gt;8&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;128&lt;/strong&gt; (CNN spectrogram)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Multi-node fusion&lt;/td&gt;
          &lt;td&gt;Average&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;GATv2 attention&lt;/strong&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;New scripts&lt;/td&gt;
          &lt;td&gt;0&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;15+&lt;/strong&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;New ADRs&lt;/td&gt;
          &lt;td&gt;3&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;8&lt;/strong&gt; (069-076)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;h3 id=&#34;whats-new-in-v054&#34;&gt;What&amp;rsquo;s New in v0.5.4
&lt;/h3&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Cognitum Seed Integration + Camera-Free Pose Training&lt;/strong&gt;&lt;/summary&gt;
&lt;p&gt;&lt;strong&gt;v0.5.4 transforms RuView from a real-time sensing tool into a persistent edge AI system.&lt;/strong&gt; Your ESP32 now remembers what it senses, learns without cameras, and proves its data cryptographically.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Capability&lt;/th&gt;
          &lt;th&gt;Details&lt;/th&gt;
          &lt;th&gt;Hardware&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Persistent vector store&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Every sensing event stored as searchable 8-dim vector in RVF format&lt;/td&gt;
          &lt;td&gt;ESP32 + &lt;a class=&#34;link&#34; href=&#34;https://cognitum.one&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Cognitum Seed&lt;/a&gt; ($140)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;kNN similarity search&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&amp;ldquo;Find the 10 most similar states to right now&amp;rdquo; — anomaly detection, fingerprinting&lt;/td&gt;
          &lt;td&gt;Cognitum Seed&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Witness chain&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;SHA-256 tamper-evident audit trail for every measurement (1,747 entries validated)&lt;/td&gt;
          &lt;td&gt;Cognitum Seed&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Camera-free pose training&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;17 COCO keypoints from 10 sensor signals — PIR, RSSI triangulation, subcarrier asymmetry, vibration, BME280&lt;/td&gt;
          &lt;td&gt;2x ESP32 + Seed&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Pre-trained model&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;82.8 KB (8 KB at 4-bit quantization), 100% presence accuracy, 0 skeleton violations&lt;/td&gt;
          &lt;td&gt;Download from release&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Sub-ms inference&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;0.012 ms latency, 171,472 embeddings/sec on M4 Pro&lt;/td&gt;
          &lt;td&gt;Any machine with Node.js&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;SONA adaptation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Adapts to new rooms in &amp;lt;1ms without retraining&lt;/td&gt;
          &lt;td&gt;ruvllm runtime&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;LoRA room adapters&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Per-node fine-tuning with 2,048 parameters per adapter&lt;/td&gt;
          &lt;td&gt;Automatic&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;114-tool MCP proxy&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;AI assistants (Claude, GPT) query sensors directly via JSON-RPC&lt;/td&gt;
          &lt;td&gt;Cognitum Seed&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Multi-frequency mesh&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Channel hopping across ch 1/3/5/6/9/11 — neighbor WiFi as passive radar&lt;/td&gt;
          &lt;td&gt;2x ESP32 ($18)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;RF room scanner&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Real-time spectrum visualization: nulls, reflectors, movement, multipath&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;node scripts/rf-scan.js&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Security hardened&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Bearer tokens, TLS, source IP filtering, NaN rejection, credential rotation&lt;/td&gt;
          &lt;td&gt;All components&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
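&lt;p&gt;The witness chain is a SHA-256 hash chain: each entry commits to the previous entry&amp;rsquo;s digest, so altering any stored measurement invalidates every later link. A minimal sketch of that tamper-evidence property (the actual RVF/Cognitum entry format differs):&lt;/p&gt;

```python
import hashlib, json

# Minimal SHA-256 witness chain: each entry commits to the previous entry's
# hash, so editing any measurement breaks every later link. The real
# RVF/Cognitum entry format differs; this shows the idea only.
def chain(measurements):
    entries, prev = [], "0" * 64
    for m in measurements:
        payload = prev + json.dumps(m, sort_keys=True)
        digest = hashlib.sha256(payload.encode()).hexdigest()
        entries.append({"data": m, "prev": prev, "hash": digest})
        prev = digest
    return entries

def verify(entries):
    prev = "0" * 64
    for e in entries:
        payload = prev + json.dumps(e["data"], sort_keys=True)
        expected = hashlib.sha256(payload.encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True

log = chain([{"rssi": -41}, {"rssi": -44}, {"rssi": -39}])
print(verify(log))            # True
log[1]["data"]["rssi"] = -10  # tamper with one stored measurement
print(verify(log))            # False
```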
&lt;p&gt;&lt;strong&gt;Training pipeline (ruvllm, no PyTorch needed):&lt;/strong&gt;&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Collect data (2 min, ESP32s must be streaming)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python scripts/collect-training-data.py --port &lt;span class=&#34;m&#34;&gt;5006&lt;/span&gt; --duration &lt;span class=&#34;m&#34;&gt;120&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Train — contrastive pretraining + task heads + LoRA + quantization + EWC&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/train-ruvllm.js --data data/recordings/pretrain-*.csi.jsonl
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Camera-free 17-keypoint pose (uses PIR + RSSI + vibration + subcarrier asymmetry)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/train-camera-free.js --data data/recordings/pretrain-*.csi.jsonl
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Benchmark&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;node scripts/benchmark-ruvllm.js --model models/csi-ruvllm
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Benchmarks — validated on real hardware (Apple M4 Pro + ESP32-S3 + Cognitum Seed):&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;What we measured&lt;/th&gt;
          &lt;th&gt;Result&lt;/th&gt;
          &lt;th&gt;Why it matters&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Presence detection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;100% accuracy&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;No missed detections or false alarms in evaluation&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Person counting&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;24/24 correct&lt;/strong&gt; (MinCut)&lt;/td&gt;
          &lt;td&gt;Fixed the #1 user-reported issue&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Inference speed&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;0.012 ms&lt;/strong&gt; per embedding&lt;/td&gt;
          &lt;td&gt;83,000x faster than real-time&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Throughput&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;171,472 embeddings/sec&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;One Mac Mini handles 1,700+ ESP32 nodes&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Training time&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;84 seconds&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;From zero to trained model in under 2 minutes&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Contrastive learning&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;33.9% improvement&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Model learns meaningful patterns from CSI&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Model size&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;8 KB&lt;/strong&gt; (4-bit quantized)&lt;/td&gt;
          &lt;td&gt;Fits in ESP32 SRAM — no server needed&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Skeleton physics&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;0 violations&lt;/strong&gt; in 100 frames&lt;/td&gt;
          &lt;td&gt;Every pose is anatomically valid&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Pose keypoints&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;17 COCO keypoints&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Full body pose, no camera required&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;WiFi channels&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;6&lt;/strong&gt; (channel hopping)&lt;/td&gt;
          &lt;td&gt;3x more sensing data than single-channel&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Online adaptation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;&amp;lt;30 seconds&lt;/strong&gt; (SNN)&lt;/td&gt;
          &lt;td&gt;Learns a new room without retraining&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Witness chain&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;2,547 entries&lt;/strong&gt; verified&lt;/td&gt;
          &lt;td&gt;Cryptographic proof every measurement is real&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Test suite&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;1,463 tests passed&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Rock-solid foundation&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Total hardware cost&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;$140&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;ESP32 ($9) + &lt;a class=&#34;link&#34; href=&#34;https://cognitum.one&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Cognitum Seed&lt;/a&gt; ($131)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-069-cognitum-seed-csi-pipeline.md&#34; &gt;ADR-069&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-071-ruvllm-training-pipeline.md&#34; &gt;ADR-071&lt;/a&gt;, and the &lt;a class=&#34;link&#34; href=&#34;docs/tutorials/cognitum-seed-pretraining.md&#34; &gt;Cognitum Seed tutorial&lt;/a&gt; for full details.&lt;/p&gt;
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-documentation&#34;&gt;📖 Documentation
&lt;/h2&gt;&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Document&lt;/th&gt;
          &lt;th&gt;Description&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/user-guide.md&#34; &gt;User Guide&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Step-by-step guide: installation, first run, API usage, hardware setup, training&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/build-guide.md&#34; &gt;Build Guide&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Building from source (Rust and Python)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/README.md&#34; &gt;Architecture Decisions&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;79 ADRs — why each technical choice was made, organized by domain (hardware, signal processing, ML, platform, infrastructure)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/ddd/README.md&#34; &gt;Domain Models&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;7 DDD models (RuvSense, Signal Processing, Training Pipeline, Hardware Platform, Sensing Server, WiFi-Mat, CHCI) — bounded contexts, aggregates, domain events, and ubiquitous language&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-desktop/README.md&#34; &gt;Desktop App&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;WIP&lt;/strong&gt; — Tauri v2 desktop app for node management, OTA updates, WASM deployment, and mesh visualization&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;examples/medical/README.md&#34; &gt;Medical Examples&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Contactless blood pressure, heart rate, breathing rate via 60 GHz mmWave radar — $15 hardware, no wearable&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;hr&gt;
  &lt;a href=&#34;https://ruvnet.github.io/RuView/&#34;&gt;
    &lt;img src=&#34;assets/v2-screen.png&#34; alt=&#34;WiFi DensePose — Live pose detection with setup guide&#34; width=&#34;800&#34;&gt;
  &lt;/a&gt;
  &lt;br&gt;
  &lt;em&gt;Real-time pose skeleton from WiFi CSI signals — no cameras, no wearables&lt;/em&gt;
  &lt;br&gt;&lt;br&gt;
  &lt;a href=&#34;https://ruvnet.github.io/RuView/&#34;&gt;&lt;strong&gt;▶ Live Observatory Demo&lt;/strong&gt;&lt;/a&gt;
  &amp;nbsp;|&amp;nbsp;
  &lt;a href=&#34;https://ruvnet.github.io/RuView/pose-fusion.html&#34;&gt;&lt;strong&gt;▶ Dual-Modal Pose Fusion Demo&lt;/strong&gt;&lt;/a&gt;
&lt;blockquote&gt;
&lt;p&gt;The &lt;a class=&#34;link&#34; href=&#34;#-quick-start&#34; &gt;server&lt;/a&gt; is optional for visualization and aggregation — the ESP32 &lt;a class=&#34;link&#34; href=&#34;#esp32-s3-hardware-pipeline&#34; &gt;runs independently&lt;/a&gt; for presence detection, vital signs, and fall alerts.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Live ESP32 pipeline&lt;/strong&gt;: Connect an ESP32-S3 node → run the &lt;a class=&#34;link&#34; href=&#34;#sensing-server&#34; &gt;sensing server&lt;/a&gt; → open the &lt;a class=&#34;link&#34; href=&#34;https://ruvnet.github.io/RuView/pose-fusion.html&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;pose fusion demo&lt;/a&gt; for real-time dual-modal pose estimation (webcam + WiFi CSI). See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-059-live-esp32-csi-pipeline.md&#34; &gt;ADR-059&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2 id=&#34;-key-features&#34;&gt;🚀 Key Features
&lt;/h2&gt;&lt;h3 id=&#34;sensing&#34;&gt;Sensing
&lt;/h3&gt;&lt;p&gt;See people, breathing, and heartbeats through walls — using only WiFi signals already in the room.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;&lt;/th&gt;
          &lt;th&gt;Feature&lt;/th&gt;
          &lt;th&gt;What It Means&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;🔒&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Privacy-First&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Tracks human pose using only WiFi signals — no cameras, no video, no images stored&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;💓&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Vital Signs&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Detects breathing rate (6-30 breaths/min) and heart rate (40-120 bpm) without any wearable&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;👥&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Multi-Person&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Tracks multiple people simultaneously, each with independent pose and vitals — no hard software limit (physics: ~3-5 per AP with 56 subcarriers, more with multi-AP)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🧱&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Through-Wall&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;WiFi passes through walls, furniture, and debris — works where cameras cannot&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🚑&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Disaster Response&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Detects trapped survivors through rubble and classifies injury severity (START triage)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;📡&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Multistatic Mesh&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;4-6 low-cost sensor nodes work together, combining 12+ overlapping signal paths for full 360-degree room coverage with sub-inch accuracy and no person mix-ups (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-029-ruvsense-multistatic-sensing-mode.md&#34; &gt;ADR-029&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🌐&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Persistent Field Model&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;The system learns the RF signature of each room — then subtracts the room to isolate human motion, detect drift over days, predict intent before movement starts, and flag spoofing attempts (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-030-ruvsense-persistent-field-model.md&#34; &gt;ADR-030&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
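&lt;p&gt;For intuition, here is a minimal sketch (hypothetical, not the project&amp;rsquo;s actual DSP chain) of how a breathing rate in the 6-30 breaths/min band could be recovered from a CSI amplitude trace: subtract the static room component, then scan the band for the dominant frequency.&lt;/p&gt;

```python
# Illustrative sketch only — stdlib Python, not the project's pipeline.
# Estimates breathing rate from a CSI amplitude trace by finding the
# dominant frequency in the 0.1-0.5 Hz (6-30 breaths/min) band.
import math

def breathing_rate_bpm(samples, fs):
    """Return the dominant breathing-band frequency, in breaths/min."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]  # remove static (room) component
    best_f, best_power = 0.0, -1.0
    f = 0.10
    while f <= 0.50:  # direct DFT scan of the breathing band
        re = sum(c * math.cos(2 * math.pi * f * i / fs) for i, c in enumerate(centered))
        im = sum(c * math.sin(2 * math.pi * f * i / fs) for i, c in enumerate(centered))
        power = re * re + im * im
        if power > best_power:
            best_f, best_power = f, power
        f += 0.01
    return best_f * 60.0

# Synthetic CSI amplitude: 0.25 Hz chest motion = 15 breaths/min
fs = 20.0  # assumed CSI sample rate in Hz
trace = [1.0 + 0.2 * math.sin(2 * math.pi * 0.25 * i / fs) for i in range(600)]
print(round(breathing_rate_bpm(trace, fs)))  # → 15
```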
&lt;h3 id=&#34;intelligence&#34;&gt;Intelligence
&lt;/h3&gt;&lt;p&gt;The system learns on its own and gets smarter over time — no hand-tuning, no labeled data required.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;&lt;/th&gt;
          &lt;th&gt;Feature&lt;/th&gt;
          &lt;th&gt;What It Means&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;🧠&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Self-Learning&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Teaches itself from raw WiFi data — no labeled training sets, no cameras needed to bootstrap (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-024-contrastive-csi-embedding-model.md&#34; &gt;ADR-024&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🎯&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;AI Signal Processing&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Attention networks, graph algorithms, and smart compression replace hand-tuned thresholds — adapts to each room automatically (&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;RuVector&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🌍&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Works Everywhere&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Train once, deploy in any room — adversarial domain generalization strips environment bias so models transfer across rooms, buildings, and hardware (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-027-cross-environment-domain-generalization.md&#34; &gt;ADR-027&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;👁️&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Cross-Viewpoint Fusion&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;AI combines what each sensor sees from its own angle — fills in blind spots and depth ambiguity that no single viewpoint can resolve on its own (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-031-ruview-sensing-first-rf-mode.md&#34; &gt;ADR-031&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🔮&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Signal-Line Protocol&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;A 6-stage processing pipeline transforms raw WiFi signals into structured body representations — from signal cleanup through graph-based spatial reasoning to final pose output (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-033-crv-signal-line-sensing-integration.md&#34; &gt;ADR-033&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🔒&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;QUIC Mesh Security&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;All sensor-to-sensor communication is encrypted end-to-end with tamper detection, replay protection, and seamless reconnection if a node moves or drops offline (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-032-multistatic-mesh-security-hardening.md&#34; &gt;ADR-032&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🎯&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Adaptive Classifier&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Records labeled CSI sessions, trains a 15-feature logistic regression model in pure Rust, and learns your room&amp;rsquo;s unique signal characteristics — replaces hand-tuned thresholds with data-driven classification (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-048-adaptive-csi-classifier.md&#34; &gt;ADR-048&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
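&lt;p&gt;The adaptive classifier in ADR-048 is a logistic regression over CSI features. A toy sketch of the idea (the feature names and data below are invented; the real model uses 15 features and is implemented in pure Rust):&lt;/p&gt;

```python
# Hypothetical sketch of a presence classifier trained on labeled CSI
# feature vectors via logistic regression, in the spirit of ADR-048.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(features, labels, lr=0.5, epochs=500):
    """Stochastic gradient descent on log-loss; returns (weights, bias)."""
    w = [0.0] * len(features[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of log-loss w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Invented features per CSI window: [amplitude variance, phase jitter]
empty    = [[0.05, 0.10], [0.07, 0.12], [0.06, 0.09]]  # label 0
occupied = [[0.80, 0.60], [0.95, 0.70], [0.85, 0.65]]  # label 1
w, b = train(empty + occupied, [0, 0, 0, 1, 1, 1])

p = sigmoid(sum(wi * xi for wi, xi in zip(w, [0.9, 0.7])) + b)
print(p > 0.5)  # → True (window classified as occupied)
```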
&lt;h3 id=&#34;performance--deployment&#34;&gt;Performance &amp;amp; Deployment
&lt;/h3&gt;&lt;p&gt;Fast enough for real-time use, small enough for edge devices, simple enough for one-command setup.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;&lt;/th&gt;
          &lt;th&gt;Feature&lt;/th&gt;
          &lt;th&gt;What It Means&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;⚡&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Real-Time&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Analyzes WiFi signals in under 100 microseconds per frame — fast enough for live monitoring&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🦀&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;810x Faster&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Complete Rust rewrite: 54,000 frames/sec pipeline, multi-arch Docker image, 1,031+ tests&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🐳&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;One-Command Setup&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;docker pull ruvnet/wifi-densepose:latest&lt;/code&gt; — live sensing in 30 seconds, no toolchain needed (amd64 + arm64 / Apple Silicon)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;📡&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Fully Local&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Runs completely on a $9 ESP32 — no internet connection, no cloud account, no recurring fees. Detects presence, vital signs, and falls on-device with instant response&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;📦&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Portable Models&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Trained models package into a single &lt;code&gt;.rvf&lt;/code&gt; file — runs on edge, cloud, or browser (WASM)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🔭&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Observatory Visualization&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Cinematic Three.js dashboard with 5 holographic panels — subcarrier manifold, vital signs oracle, presence heatmap, phase constellation, convergence engine — all driven by live or demo CSI data (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-047-psychohistory-observatory-visualization.md&#34; &gt;ADR-047&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;📟&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;AMOLED Display&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;ESP32-S3 boards with built-in AMOLED screens show real-time presence, vital signs, and room status directly on the sensor — no phone or PC needed (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-045-amoled-display-support.md&#34; &gt;ADR-045&lt;/a&gt;)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
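&lt;p&gt;The two throughput figures above are mutually consistent; a quick check (plain arithmetic, no project code) shows the Rust pipeline&amp;rsquo;s average per-frame time sits well inside the 100-microsecond budget.&lt;/p&gt;

```python
# Back-of-the-envelope check of the quoted throughput figures.
frames_per_sec = 54_000                      # Rust pipeline throughput
us_per_frame = 1_000_000 / frames_per_sec    # microseconds per frame
print(round(us_per_frame, 1))                # → 18.5
```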
&lt;hr&gt;
&lt;h2 id=&#34;-how-it-works&#34;&gt;🔬 How It Works
&lt;/h2&gt;&lt;p&gt;WiFi routers flood every room with radio waves. When a person moves — or even breathes — those waves scatter differently. WiFi DensePose reads that scattering pattern and reconstructs what happened:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;WiFi Router → radio waves pass through room → hit human body → scatter
    ↓
ESP32 mesh (4-6 nodes) captures CSI on channels 1/6/11 via TDM protocol
    ↓
Multi-Band Fusion: 3 channels × 56 subcarriers = 168 virtual subcarriers per link
    ↓
Multistatic Fusion: N×(N-1) links → attention-weighted cross-viewpoint embedding
    ↓
Coherence Gate: accept/reject measurements → stable for days without tuning
    ↓
Signal Processing: Hampel, SpotFi, Fresnel, BVP, spectrogram → clean features
    ↓
AI Backbone (RuVector): attention, graph algorithms, compression, field model
    ↓
Signal-Line Protocol (CRV): 6-stage gestalt → sensory → topology → coherence → search → model
    ↓
Neural Network: processed signals → 17 body keypoints + vital signs + room model
    ↓
Output: real-time pose, breathing, heart rate, room fingerprint, drift alerts
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;No training cameras required — the &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-024-contrastive-csi-embedding-model.md&#34; &gt;Self-Learning system (ADR-024)&lt;/a&gt; bootstraps from raw WiFi data alone. &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-027-cross-environment-domain-generalization.md&#34; &gt;MERIDIAN (ADR-027)&lt;/a&gt; ensures the model works in any room, not just the one it trained in.&lt;/p&gt;
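&lt;p&gt;The counts in the diagram follow from two simple formulas (illustrative arithmetic only): each link carries 3 channels × 56 subcarriers = 168 virtual subcarriers, and an N-node multistatic mesh has N×(N-1) directed links, which gives 12 for 4 nodes and 30 for 6.&lt;/p&gt;

```python
# Illustrative arithmetic behind the pipeline diagram (values from the text).
channels, subcarriers = 3, 56
virtual_subcarriers = channels * subcarriers
print(virtual_subcarriers)  # → 168 virtual subcarriers per link

def mesh_links(n_nodes):
    """Directed TX→RX links in an n-node multistatic mesh: N×(N-1)."""
    return n_nodes * (n_nodes - 1)

print(mesh_links(4), mesh_links(6))  # → 12 30
```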
&lt;hr&gt;
&lt;h2 id=&#34;-use-cases--applications&#34;&gt;🏢 Use Cases &amp;amp; Applications
&lt;/h2&gt;&lt;p&gt;WiFi sensing works anywhere WiFi exists. No new hardware in most cases — just software on existing access points or an $8 ESP32 add-on. Because there are no cameras, deployments avoid camera-specific privacy regulations (GDPR video surveillance, HIPAA imaging) by design.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Scaling:&lt;/strong&gt; Each AP distinguishes ~3-5 people (56 subcarriers). Multi-AP multiplies linearly — a 4-AP retail mesh covers ~15-20 occupants. No hard software limit; the practical ceiling is signal physics.&lt;/p&gt;
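&lt;p&gt;The scaling rule above can be written down directly (a hypothetical helper, not a project API): per-AP capacity is a physics ceiling, and adding APs multiplies it linearly.&lt;/p&gt;

```python
# Sketch of the linear scaling rule stated above: ~3-5 people per AP
# (56 subcarriers), multiplied by the number of APs in the mesh.
def capacity(n_aps, per_ap=(3, 5)):
    """Return the (low, high) occupant-count range for an n-AP mesh."""
    lo, hi = per_ap
    return n_aps * lo, n_aps * hi

print(capacity(1))  # → (3, 5)
print(capacity(4))  # → (12, 20), bracketing the "~15-20 occupants" quoted
```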
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;&lt;/th&gt;
          &lt;th&gt;Why WiFi sensing wins&lt;/th&gt;
          &lt;th&gt;Traditional alternative&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;🔒&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;No video, no GDPR/HIPAA imaging rules&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Cameras require consent, signage, data retention policies&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🧱&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Works through walls, shelving, debris&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Cameras need line-of-sight per room&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🌙&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Works in total darkness&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Cameras need IR or visible light&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;💰&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;$0-$8 per zone&lt;/strong&gt; (existing WiFi or ESP32)&lt;/td&gt;
          &lt;td&gt;Camera systems: $200-$2,000 per zone&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🔌&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;WiFi already deployed everywhere&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;PIR/radar sensors require new wiring per room&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🏥 Everyday&lt;/strong&gt; — Healthcare, retail, office, hospitality (commodity WiFi)&lt;/summary&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Use Case&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Hardware&lt;/th&gt;
          &lt;th&gt;Key Metric&lt;/th&gt;
          &lt;th&gt;Edge Module&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Elderly care / assisted living&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Fall detection, nighttime activity monitoring, breathing rate during sleep — no wearable compliance needed&lt;/td&gt;
          &lt;td&gt;1 ESP32-S3 per room ($8)&lt;/td&gt;
          &lt;td&gt;Fall alert &amp;lt;2s&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Sleep Apnea&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Gait Analysis&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Hospital patient monitoring&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Continuous breathing + heart rate for non-critical beds without wired sensors; nurse alert on anomaly&lt;/td&gt;
          &lt;td&gt;1-2 APs per ward&lt;/td&gt;
          &lt;td&gt;Breathing: 6-30 BPM&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Respiratory Distress&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Cardiac Arrhythmia&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Emergency room triage&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Automated occupancy count + wait-time estimation; detect patient distress (abnormal breathing) in waiting areas&lt;/td&gt;
          &lt;td&gt;Existing hospital WiFi&lt;/td&gt;
          &lt;td&gt;Occupancy accuracy &amp;gt;95%&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Queue Length&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Panic Motion&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Retail occupancy &amp;amp; flow&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Real-time foot traffic, dwell time by zone, queue length — no cameras, no opt-in, GDPR-friendly&lt;/td&gt;
          &lt;td&gt;Existing store WiFi + 1 ESP32&lt;/td&gt;
          &lt;td&gt;Dwell resolution ~1m&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Customer Flow&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Dwell Heatmap&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Office space utilization&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Which desks/rooms are actually occupied, meeting room no-shows, HVAC optimization based on real presence&lt;/td&gt;
          &lt;td&gt;Existing enterprise WiFi&lt;/td&gt;
          &lt;td&gt;Presence latency &amp;lt;1s&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/building.md&#34; &gt;Meeting Room&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/building.md&#34; &gt;HVAC Presence&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Hotel &amp;amp; hospitality&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Room occupancy without door sensors, minibar/bathroom usage patterns, energy savings on empty rooms&lt;/td&gt;
          &lt;td&gt;Existing hotel WiFi&lt;/td&gt;
          &lt;td&gt;15-30% HVAC savings&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/building.md&#34; &gt;Energy Audit&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/building.md&#34; &gt;Lighting Zones&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Restaurants &amp;amp; food service&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Table turnover tracking, kitchen staff presence, restroom occupancy displays — no cameras in dining areas&lt;/td&gt;
          &lt;td&gt;Existing WiFi&lt;/td&gt;
          &lt;td&gt;Queue wait ±30s&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Table Turnover&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Queue Length&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Parking garages&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Pedestrian presence in stairwells and elevators where cameras have blind spots; security alert if someone lingers&lt;/td&gt;
          &lt;td&gt;Existing WiFi&lt;/td&gt;
          &lt;td&gt;Through-concrete walls&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Loitering&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/building.md&#34; &gt;Elevator Count&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🏟️ Specialized&lt;/strong&gt; — Events, fitness, education, civic (CSI-capable hardware)&lt;/summary&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Use Case&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Hardware&lt;/th&gt;
          &lt;th&gt;Key Metric&lt;/th&gt;
          &lt;th&gt;Edge Module&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Smart home automation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Room-level presence triggers (lights, HVAC, music) that work through walls — no dead zones, no motion-sensor timeouts&lt;/td&gt;
          &lt;td&gt;2-3 ESP32-S3 nodes ($24)&lt;/td&gt;
          &lt;td&gt;Through-wall range ~5m&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/building.md&#34; &gt;HVAC Presence&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/building.md&#34; &gt;Lighting Zones&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Fitness &amp;amp; sports&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Rep counting, posture correction, breathing cadence during exercise — no wearable, no camera in locker rooms&lt;/td&gt;
          &lt;td&gt;3+ ESP32-S3 mesh&lt;/td&gt;
          &lt;td&gt;Pose: 17 keypoints&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/exotic.md&#34; &gt;Breathing Sync&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Gait Analysis&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Childcare &amp;amp; schools&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Naptime breathing monitoring, playground headcount, restricted-area alerts — privacy-safe for minors&lt;/td&gt;
          &lt;td&gt;2-4 ESP32-S3 per zone&lt;/td&gt;
          &lt;td&gt;Breathing: ±1 BPM&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Sleep Apnea&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Perimeter Breach&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Event venues &amp;amp; concerts&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Crowd density mapping, crush-risk detection via breathing compression, emergency evacuation flow tracking&lt;/td&gt;
          &lt;td&gt;Multi-AP mesh (4-8 APs)&lt;/td&gt;
          &lt;td&gt;Density per m²&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Customer Flow&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Panic Motion&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Stadiums &amp;amp; arenas&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Section-level occupancy for dynamic pricing, concession staffing, emergency egress flow modeling&lt;/td&gt;
          &lt;td&gt;Enterprise AP grid&lt;/td&gt;
          &lt;td&gt;15-20 people per AP&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Dwell Heatmap&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Queue Length&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Houses of worship&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Attendance counting without facial recognition — privacy-sensitive congregations, multi-room campus tracking&lt;/td&gt;
          &lt;td&gt;Existing WiFi&lt;/td&gt;
          &lt;td&gt;Zone-level accuracy&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/building.md&#34; &gt;Elevator Count&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/building.md&#34; &gt;Energy Audit&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Warehouse &amp;amp; logistics&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Worker safety zones, forklift proximity alerts, occupancy in hazardous areas — works through shelving and pallets&lt;/td&gt;
          &lt;td&gt;Industrial AP mesh&lt;/td&gt;
          &lt;td&gt;Alert latency &amp;lt;500ms&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Forklift Proximity&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Confined Space&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Civic infrastructure&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Public restroom occupancy (no cameras possible), subway platform crowding, shelter headcount during emergencies&lt;/td&gt;
          &lt;td&gt;Municipal WiFi + ESP32&lt;/td&gt;
          &lt;td&gt;Real-time headcount&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Customer Flow&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Loitering&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Museums &amp;amp; galleries&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Visitor flow heatmaps, exhibit dwell time, crowd bottleneck alerts — no cameras near artwork (flash/theft risk)&lt;/td&gt;
          &lt;td&gt;Existing WiFi&lt;/td&gt;
          &lt;td&gt;Zone dwell ±5s&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Dwell Heatmap&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;Shelf Engagement&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🤖 Robotics &amp;amp; Industrial&lt;/strong&gt; — Autonomous systems, manufacturing, android spatial awareness&lt;/summary&gt;
&lt;p&gt;WiFi sensing gives robots and autonomous systems a spatial awareness layer that works where LIDAR and cameras fail — through dust, smoke, fog, and around corners. The CSI signal field acts as a &amp;ldquo;sixth sense&amp;rdquo; for detecting humans in the environment without requiring line-of-sight.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Use Case&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Hardware&lt;/th&gt;
          &lt;th&gt;Key Metric&lt;/th&gt;
          &lt;th&gt;Edge Module&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Cobot safety zones&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Detect human presence near collaborative robots — auto-slow or stop before contact, even behind obstructions&lt;/td&gt;
          &lt;td&gt;2-3 ESP32-S3 per cell&lt;/td&gt;
          &lt;td&gt;Presence latency &amp;lt;100ms&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Forklift Proximity&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Perimeter Breach&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Warehouse AMR navigation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Autonomous mobile robots sense humans around blind corners, through shelving racks — no LIDAR occlusion&lt;/td&gt;
          &lt;td&gt;ESP32 mesh along aisles&lt;/td&gt;
          &lt;td&gt;Through-shelf detection&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Forklift Proximity&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Loitering&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Android / humanoid spatial awareness&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Ambient human pose sensing for social robots — detect gestures, approach direction, and personal space without cameras always on&lt;/td&gt;
          &lt;td&gt;Onboard ESP32-S3 module&lt;/td&gt;
          &lt;td&gt;17-keypoint pose&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/exotic.md&#34; &gt;Gesture Language&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/exotic.md&#34; &gt;Emotion Detection&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Manufacturing line monitoring&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Worker presence at each station, ergonomic posture alerts, headcount for shift compliance — works through equipment&lt;/td&gt;
          &lt;td&gt;Industrial AP per zone&lt;/td&gt;
          &lt;td&gt;Pose + breathing&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Confined Space&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Gait Analysis&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Construction site safety&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Exclusion zone enforcement around heavy machinery, fall detection from scaffolding, personnel headcount&lt;/td&gt;
          &lt;td&gt;Ruggedized ESP32 mesh&lt;/td&gt;
          &lt;td&gt;Alert &amp;lt;2s, through-dust&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Panic Motion&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Structural Vibration&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Agricultural robotics&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Detect farm workers near autonomous harvesters in dusty/foggy field conditions where cameras are unreliable&lt;/td&gt;
          &lt;td&gt;Weatherproof ESP32 nodes&lt;/td&gt;
          &lt;td&gt;Range ~10m open field&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Forklift Proximity&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/exotic.md&#34; &gt;Rain Detection&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Drone landing zones&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Verify landing area is clear of humans — WiFi sensing works in rain, dust, and low light where downward cameras fail&lt;/td&gt;
          &lt;td&gt;Ground ESP32 nodes&lt;/td&gt;
          &lt;td&gt;Presence: &amp;gt;95% accuracy&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Perimeter Breach&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Tailgating&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Clean room monitoring&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Personnel tracking without cameras (particle contamination risk from camera fans) — gown compliance via pose&lt;/td&gt;
          &lt;td&gt;Existing cleanroom WiFi&lt;/td&gt;
          &lt;td&gt;No particulate emission&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Clean Room&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Livestock Monitor&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🔥 Extreme&lt;/strong&gt; — Through-wall, disaster, defense, underground&lt;/summary&gt;
&lt;p&gt;These scenarios exploit WiFi&amp;rsquo;s ability to penetrate solid materials — concrete, rubble, earth — where no optical or infrared sensor can reach. The WiFi-Mat disaster module (ADR-001) is specifically designed for this tier.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Use Case&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Hardware&lt;/th&gt;
          &lt;th&gt;Key Metric&lt;/th&gt;
          &lt;th&gt;Edge Module&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Search &amp;amp; rescue (WiFi-Mat)&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Detect survivors through rubble/debris via breathing signature, START triage color classification, 3D localization&lt;/td&gt;
          &lt;td&gt;Portable ESP32 mesh + laptop&lt;/td&gt;
          &lt;td&gt;Through 30cm concrete&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Respiratory Distress&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Seizure Detection&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Firefighting&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Locate occupants through smoke and walls before entry; breathing detection confirms life signs remotely&lt;/td&gt;
          &lt;td&gt;Portable mesh on truck&lt;/td&gt;
          &lt;td&gt;Works in zero visibility&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Sleep Apnea&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Panic Motion&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Prison &amp;amp; secure facilities&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Cell occupancy verification, distress detection (abnormal vitals), perimeter sensing — no camera blind spots&lt;/td&gt;
          &lt;td&gt;Dedicated AP infrastructure&lt;/td&gt;
          &lt;td&gt;24/7 vital signs&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Cardiac Arrhythmia&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Loitering&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Military / tactical&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Through-wall personnel detection, room clearing confirmation, hostage vital signs at standoff distance&lt;/td&gt;
          &lt;td&gt;Directional WiFi + custom FW&lt;/td&gt;
          &lt;td&gt;Range: 5m through wall&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Perimeter Breach&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Weapon Detection&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Border &amp;amp; perimeter security&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Detect human presence in tunnels, behind fences, in vehicles — passive sensing, no active illumination to reveal position&lt;/td&gt;
          &lt;td&gt;Concealed ESP32 mesh&lt;/td&gt;
          &lt;td&gt;Passive / covert&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Perimeter Breach&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Tailgating&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Mining &amp;amp; underground&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Worker presence in tunnels where GPS/cameras fail, breathing detection after collapse, headcount at safety points&lt;/td&gt;
          &lt;td&gt;Ruggedized ESP32 mesh&lt;/td&gt;
          &lt;td&gt;Through rock/earth&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Confined Space&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;Respiratory Distress&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Maritime &amp;amp; naval&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Below-deck personnel tracking through steel bulkheads (limited range, requires tuning), man-overboard detection&lt;/td&gt;
          &lt;td&gt;Ship WiFi + ESP32&lt;/td&gt;
          &lt;td&gt;Through 1-2 bulkheads&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Structural Vibration&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;Panic Motion&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Wildlife research&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Non-invasive animal activity monitoring in enclosures or dens — no light pollution, no visual disturbance&lt;/td&gt;
          &lt;td&gt;Weatherproof ESP32 nodes&lt;/td&gt;
          &lt;td&gt;Zero light emission&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;Livestock Monitor&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/exotic.md&#34; &gt;Dream Stage&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;h3 id=&#34;edge-intelligence-adr-041&#34;&gt;Edge Intelligence (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-041-wasm-module-collection.md&#34; &gt;ADR-041&lt;/a&gt;)
&lt;/h3&gt;&lt;p&gt;Small programs that run directly on the ESP32 sensor — no internet needed, no cloud fees, instant response. Each module is a tiny WASM file (5-30 KB) that you upload to the device over-the-air. It reads WiFi signal data and makes decisions locally in under 10 ms. &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-041-wasm-module-collection.md&#34; &gt;ADR-041&lt;/a&gt; defines 60 modules across 13 categories — all 60 are implemented with 609 tests passing.&lt;/p&gt;
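&lt;p&gt;As an illustration only (the actual 12-function host API is specified in ADR-041 and is not reproduced here), a module&amp;rsquo;s local decision step can be pictured as a single function that maps per-frame CSI features to a small decision code. All names, thresholds, and codes below are invented for the sketch:&lt;/p&gt;

```rust
// Purely hypothetical sketch of an edge module's decision entry point; in a
// real module this would be exported to the WASM host (e.g. as an
// `extern "C"` function). Names, thresholds, and decision codes are invented.
fn process_frame(motion_energy: f32, coherence: f32) -> i32 {
    // Decide locally, well under the 10 ms budget:
    // 0 = idle, 1 = human presence, 2 = low signal quality (recalibrate).
    if coherence < 0.3 {
        2
    } else if motion_energy > 0.5 {
        1
    } else {
        0
    }
}
```

&lt;p&gt;The point of the design is that this whole decision loop runs on the sensor itself: no frame ever has to leave the device for a presence or alert decision.&lt;/p&gt;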
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;&lt;/th&gt;
          &lt;th&gt;Category&lt;/th&gt;
          &lt;th&gt;Examples&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;🏥&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/medical.md&#34; &gt;&lt;strong&gt;Medical &amp;amp; Health&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Sleep apnea detection, cardiac arrhythmia, gait analysis, seizure detection&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🔐&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/security.md&#34; &gt;&lt;strong&gt;Security &amp;amp; Safety&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Intrusion detection, perimeter breach, loitering, panic motion&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🏢&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/building.md&#34; &gt;&lt;strong&gt;Smart Building&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Zone occupancy, HVAC control, elevator counting, meeting room tracking&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🛒&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/retail.md&#34; &gt;&lt;strong&gt;Retail &amp;amp; Hospitality&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Queue length, dwell heatmaps, customer flow, table turnover&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🏭&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/industrial.md&#34; &gt;&lt;strong&gt;Industrial&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Forklift proximity, confined space monitoring, structural vibration&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🔮&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/exotic.md&#34; &gt;&lt;strong&gt;Exotic &amp;amp; Research&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Sleep staging, emotion detection, sign language, breathing sync&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;📡&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/signal-intelligence.md&#34; &gt;&lt;strong&gt;Signal Intelligence&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Cleans and sharpens raw WiFi signals — focuses on important regions, filters noise, fills in missing data, and tracks which person is which&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🧠&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/adaptive-learning.md&#34; &gt;&lt;strong&gt;Adaptive Learning&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;The sensor learns new gestures and patterns on its own over time — no cloud needed, remembers what it learned even after updates&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🗺️&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/spatial-temporal.md&#34; &gt;&lt;strong&gt;Spatial Reasoning&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Figures out where people are in a room, which zones matter most, and tracks movement across areas using graph-based spatial logic&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;⏱️&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/spatial-temporal.md&#34; &gt;&lt;strong&gt;Temporal Analysis&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Learns daily routines, detects when patterns break (someone didn&amp;rsquo;t get up), and verifies safety rules are being followed over time&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🛡️&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/ai-security.md&#34; &gt;&lt;strong&gt;AI Security&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Detects signal replay attacks, WiFi jamming, injection attempts, and flags abnormal behavior that could indicate tampering&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;⚛️&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/autonomous.md&#34; &gt;&lt;strong&gt;Quantum-Inspired&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Uses quantum-inspired math to map room-wide signal coherence and search for optimal sensor configurations&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;🤖&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/autonomous.md&#34; &gt;&lt;strong&gt;Autonomous &amp;amp; Exotic&lt;/strong&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Self-managing sensor mesh — auto-heals dropped nodes, plans its own actions, and explores experimental signal representations&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;All implemented modules are &lt;code&gt;no_std&lt;/code&gt; Rust, share a &lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/vendor_common.rs&#34; &gt;common utility library&lt;/a&gt;, and talk to the host through a 12-function API. Full documentation: &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/README.md&#34; &gt;&lt;strong&gt;Edge Modules Guide&lt;/strong&gt;&lt;/a&gt;. See the &lt;a class=&#34;link&#34; href=&#34;#edge-module-list&#34; &gt;complete implemented module list&lt;/a&gt; below.&lt;/p&gt;
&lt;details id=&#34;edge-module-list&#34;&gt;
&lt;summary&gt;&lt;strong&gt;🧩 Edge Intelligence — &lt;a href=&#34;docs/edge-modules/README.md&#34;&gt;All 60 Modules Implemented&lt;/a&gt;&lt;/strong&gt; (ADR-041 complete)&lt;/summary&gt;
&lt;p&gt;All 60 modules are implemented, tested (609 tests passing), and ready to deploy. They compile to &lt;code&gt;wasm32-unknown-unknown&lt;/code&gt;, run on ESP32-S3 via WASM3, and share a &lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/vendor_common.rs&#34; &gt;common utility library&lt;/a&gt;. Source: &lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/&#34; &gt;&lt;code&gt;crates/wifi-densepose-wasm-edge/src/&lt;/code&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Core modules&lt;/strong&gt; (ADR-040 flagship + early implementations):&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Gesture Classifier&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/gesture.rs&#34; &gt;&lt;code&gt;gesture.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;DTW template matching for hand gestures&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Coherence Filter&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/coherence.rs&#34; &gt;&lt;code&gt;coherence.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Phase coherence gating for signal quality&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Adversarial Detector&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/adversarial.rs&#34; &gt;&lt;code&gt;adversarial.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Detects physically impossible signal patterns&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Intrusion Detector&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/intrusion.rs&#34; &gt;&lt;code&gt;intrusion.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Human vs non-human motion classification&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Occupancy Counter&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/occupancy.rs&#34; &gt;&lt;code&gt;occupancy.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Zone-level person counting&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Vital Trend&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/vital_trend.rs&#34; &gt;&lt;code&gt;vital_trend.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Long-term breathing and heart rate trending&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;RVF Parser&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/rvf.rs&#34; &gt;&lt;code&gt;rvf.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;RVF container format parsing&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
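&lt;p&gt;The DTW template matching used by the Gesture Classifier can be sketched in a few lines: a minimal textbook dynamic-time-warping distance over 1-D motion traces, not the module&amp;rsquo;s actual implementation:&lt;/p&gt;

```rust
// Minimal DTW distance between two 1-D traces: fill a cost table where each
// cell adds the local |a[i]-b[j]| cost to the cheapest of the three
// predecessor alignments (match, insert, delete).
fn dtw(a: &[f32], b: &[f32]) -> f32 {
    let (n, m) = (a.len(), b.len());
    let mut d = vec![vec![f32::INFINITY; m + 1]; n + 1];
    d[0][0] = 0.0;
    for i in 1..=n {
        for j in 1..=m {
            let cost = (a[i - 1] - b[j - 1]).abs();
            d[i][j] = cost + d[i - 1][j].min(d[i][j - 1]).min(d[i - 1][j - 1]);
        }
    }
    d[n][m]
}
```

&lt;p&gt;A gesture classifier keeps one stored template per gesture and reports the template with the smallest DTW distance to the live trace.&lt;/p&gt;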
&lt;p&gt;&lt;strong&gt;Vendor-integrated modules&lt;/strong&gt; (24 modules, ADR-041 Category 7):&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;📡 Signal Intelligence&lt;/strong&gt; — Real-time CSI analysis and feature extraction&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Flash Attention&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sig_flash_attention.rs&#34; &gt;&lt;code&gt;sig_flash_attention.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Tiled attention over 8 subcarrier groups — finds spatial focus regions and entropy&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Coherence Gate&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sig_coherence_gate.rs&#34; &gt;&lt;code&gt;sig_coherence_gate.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Z-score phasor gating with hysteresis: Accept / PredictOnly / Reject / Recalibrate&lt;/td&gt;
          &lt;td&gt;L (&amp;lt;2ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Temporal Compress&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sig_temporal_compress.rs&#34; &gt;&lt;code&gt;sig_temporal_compress.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;3-tier adaptive quantization (8-bit hot / 5-bit warm / 3-bit cold)&lt;/td&gt;
          &lt;td&gt;L (&amp;lt;2ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Sparse Recovery&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sig_sparse_recovery.rs&#34; &gt;&lt;code&gt;sig_sparse_recovery.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;ISTA L1 reconstruction for dropped subcarriers&lt;/td&gt;
          &lt;td&gt;H (&amp;lt;10ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Person Match&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sig_mincut_person_match.rs&#34; &gt;&lt;code&gt;sig_mincut_person_match.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Hungarian-lite bipartite assignment for multi-person tracking&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Optimal Transport&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sig_optimal_transport.rs&#34; &gt;&lt;code&gt;sig_optimal_transport.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Sliced Wasserstein-1 distance with 4 projections&lt;/td&gt;
          &lt;td&gt;L (&amp;lt;2ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
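&lt;p&gt;The sliced Wasserstein-1 distance in the Optimal Transport module reduces a hard 2-D transport problem to a few cheap 1-D ones: project both sample sets onto a handful of directions, sort each projection, and average the per-projection costs. A minimal sketch under simplifying assumptions (fixed projection directions, equal-weight and equal-length samples; not the module&amp;rsquo;s real code):&lt;/p&gt;

```rust
/// 1-D Wasserstein-1 between equal-weight samples: sort both, average |a_i - b_i|.
fn w1_1d(a: &mut [f32], b: &mut [f32]) -> f32 {
    a.sort_by(|x, y| x.partial_cmp(y).unwrap());
    b.sort_by(|x, y| x.partial_cmp(y).unwrap());
    a.iter().zip(b.iter()).map(|(x, y)| (x - y).abs()).sum::<f32>() / a.len() as f32
}

/// Sliced W1: project 2-D samples onto each direction, average the 1-D distances.
fn sliced_w1(xs: &[[f32; 2]], ys: &[[f32; 2]], dirs: &[[f32; 2]]) -> f32 {
    let mut total = 0.0;
    for d in dirs {
        let mut pa: Vec<f32> = xs.iter().map(|p| p[0] * d[0] + p[1] * d[1]).collect();
        let mut pb: Vec<f32> = ys.iter().map(|p| p[0] * d[0] + p[1] * d[1]).collect();
        total += w1_1d(&mut pa, &mut pb);
    }
    total / dirs.len() as f32
}
```

&lt;p&gt;Sorting makes each 1-D transport exact, which is why a few projections are enough to get a useful distance well inside a sub-2ms budget.&lt;/p&gt;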
&lt;p&gt;&lt;strong&gt;🧠 Adaptive Learning&lt;/strong&gt; — On-device learning without cloud connectivity&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;DTW Gesture Learn&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/lrn_dtw_gesture_learn.rs&#34; &gt;&lt;code&gt;lrn_dtw_gesture_learn.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;User-teachable gesture recognition — 3-rehearsal protocol, 16 templates&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Anomaly Attractor&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/lrn_anomaly_attractor.rs&#34; &gt;&lt;code&gt;lrn_anomaly_attractor.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;4D dynamical system attractor classification with Lyapunov exponents&lt;/td&gt;
          &lt;td&gt;H (&amp;lt;10ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Meta Adapt&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/lrn_meta_adapt.rs&#34; &gt;&lt;code&gt;lrn_meta_adapt.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Hill-climbing self-optimization with safety rollback&lt;/td&gt;
          &lt;td&gt;L (&amp;lt;2ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;EWC Lifelong&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/lrn_ewc_lifelong.rs&#34; &gt;&lt;code&gt;lrn_ewc_lifelong.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Elastic Weight Consolidation — remembers past tasks while learning new ones&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
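&lt;p&gt;The core idea behind EWC Lifelong is a quadratic penalty: parameters that mattered for a previous task (high Fisher importance) are expensive to move, so new learning flows into parameters the old task did not need. A hedged sketch of just the penalty term (variable names and weighting are illustrative, not the module&amp;rsquo;s real code):&lt;/p&gt;

```rust
// Elastic Weight Consolidation penalty: (lambda/2) * sum_i F_i * (theta_i - theta*_i)^2.
// `fisher` holds per-parameter importance estimates from the previous task;
// `lambda` trades off old-task retention against new-task fit.
fn ewc_penalty(theta: &[f32], theta_old: &[f32], fisher: &[f32], lambda: f32) -> f32 {
    theta
        .iter()
        .zip(theta_old)
        .zip(fisher)
        .map(|((t, t0), f)| f * (t - t0) * (t - t0))
        .sum::<f32>()
        * (lambda / 2.0)
}
```

&lt;p&gt;During on-device training this penalty is added to the task loss, so the sensor keeps earlier gestures intact while fitting new ones.&lt;/p&gt;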
&lt;p&gt;&lt;strong&gt;🗺️ Spatial Reasoning&lt;/strong&gt; — Location, proximity, and influence mapping&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;PageRank Influence&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/spt_pagerank_influence.rs&#34; &gt;&lt;code&gt;spt_pagerank_influence.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;4x4 cross-correlation graph with power iteration PageRank&lt;/td&gt;
          &lt;td&gt;L (&amp;lt;2ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Micro HNSW&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/spt_micro_hnsw.rs&#34; &gt;&lt;code&gt;spt_micro_hnsw.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;64-vector navigable small-world graph for nearest-neighbor search&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Spiking Tracker&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/spt_spiking_tracker.rs&#34; &gt;&lt;code&gt;spt_spiking_tracker.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;32 LIF neurons + 4 output zone neurons with STDP learning&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
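The PageRank Influence row describes power iteration over a 4x4 cross-correlation graph. A minimal sketch of that idea, assuming a column-normalized adjacency matrix and the usual damping factor (the function names and the 0.85 constant are illustrative, not taken from `spt_pagerank_influence.rs`):

```rust
/// Power-iteration PageRank on a 4-node correlation graph.
fn pagerank(adj: [[f32; 4]; 4], damping: f32, iters: usize) -> [f32; 4] {
    // Column-normalize so each node distributes its full score.
    let mut m = [[0.0f32; 4]; 4];
    for j in 0..4 {
        let col_sum: f32 = (0..4).map(|i| adj[i][j]).sum();
        for i in 0..4 {
            m[i][j] = if col_sum > 0.0 { adj[i][j] / col_sum } else { 0.25 };
        }
    }
    let mut rank = [0.25f32; 4];
    for _ in 0..iters {
        // Teleport term plus damped propagation along graph edges.
        let mut next = [(1.0 - damping) / 4.0; 4];
        for i in 0..4 {
            for j in 0..4 {
                next[i] += damping * m[i][j] * rank[j];
            }
        }
        rank = next;
    }
    rank
}

fn main() {
    // Node 0 is strongly correlated with all other nodes.
    let adj = [
        [0.0, 0.9, 0.9, 0.9],
        [0.9, 0.0, 0.1, 0.1],
        [0.9, 0.1, 0.0, 0.1],
        [0.9, 0.1, 0.1, 0.0],
    ];
    let rank = pagerank(adj, 0.85, 50);
    // The hub node ends up with the highest influence score.
    assert!(rank[0] > rank[1] && rank[0] > rank[2] && rank[0] > rank[3]);
}
```

Because the normalized matrix is column-stochastic, the rank vector keeps summing to 1, which keeps fixed-budget iteration counts predictable on an embedded target.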
&lt;p&gt;&lt;strong&gt;⏱️ Temporal Analysis&lt;/strong&gt; — Activity patterns, logic verification, autonomous planning&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Pattern Sequence&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/tmp_pattern_sequence.rs&#34; &gt;&lt;code&gt;tmp_pattern_sequence.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Activity routine detection and deviation alerts&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Temporal Logic Guard&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/tmp_temporal_logic_guard.rs&#34; &gt;&lt;code&gt;tmp_temporal_logic_guard.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;LTL formula verification on CSI event streams&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;GOAP Autonomy&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/tmp_goap_autonomy.rs&#34; &gt;&lt;code&gt;tmp_goap_autonomy.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Goal-Oriented Action Planning for autonomous module management&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;🛡️ AI Security&lt;/strong&gt; — Tamper detection and behavioral anomaly profiling&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Prompt Shield&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ais_prompt_shield.rs&#34; &gt;&lt;code&gt;ais_prompt_shield.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;FNV-1a replay detection, injection detection (10x amplitude), and SNR-based jamming detection&lt;/td&gt;
          &lt;td&gt;L (&amp;lt;2ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Behavioral Profiler&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ais_behavioral_profiler.rs&#34; &gt;&lt;code&gt;ais_behavioral_profiler.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;6D behavioral profile with Mahalanobis anomaly scoring&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
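The Behavioral Profiler row mentions Mahalanobis anomaly scoring on a 6D profile. A cheap form suited to an ESP32-class budget uses a diagonal covariance (one variance per dimension); this sketch assumes that simplification and invents its own names:

```rust
/// Mahalanobis distance with diagonal covariance: each dimension's
/// deviation from the learned mean is scaled by that dimension's variance.
fn mahalanobis_diag(x: &[f32; 6], mean: &[f32; 6], var: &[f32; 6]) -> f32 {
    x.iter()
        .zip(mean)
        .zip(var)
        .map(|((&xi, &mu), &v)| (xi - mu) * (xi - mu) / v.max(1e-6))
        .sum::<f32>()
        .sqrt()
}

fn main() {
    let mean = [0.5; 6];
    let var = [0.01; 6]; // tight profile: std-dev 0.1 per dimension
    let typical = [0.52, 0.48, 0.50, 0.55, 0.47, 0.51];
    let intruder = [0.9, 0.1, 0.8, 0.2, 0.9, 0.1];
    let score_ok = mahalanobis_diag(&typical, &mean, &var);
    let score_bad = mahalanobis_diag(&intruder, &mean, &var);
    // A common rule: flag anything beyond ~3 "standard deviations".
    assert!(score_ok < 3.0 && score_bad > 3.0);
}
```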
&lt;p&gt;&lt;strong&gt;⚛️ Quantum-Inspired&lt;/strong&gt; — Quantum computing metaphors applied to CSI analysis&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Quantum Coherence&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/qnt_quantum_coherence.rs&#34; &gt;&lt;code&gt;qnt_quantum_coherence.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Bloch sphere mapping, Von Neumann entropy, decoherence detection&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Interference Search&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/qnt_interference_search.rs&#34; &gt;&lt;code&gt;qnt_interference_search.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;16 room-state hypotheses with Grover-inspired oracle + diffusion&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
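The Interference Search row's "Grover-inspired oracle + diffusion" has a compact classical analogue: keep 16 real amplitudes, let the oracle flip the sign of the hypothesis that matches the observed CSI, then reflect all amplitudes about their mean. This sketch is a generic Grover simulation under those assumptions, not the module's actual interface:

```rust
/// One Grover iteration over 16 room-state hypotheses.
fn grover_step(amp: &mut [f32; 16], marked: usize) {
    amp[marked] = -amp[marked]; // oracle: flip the matching state
    let mean: f32 = amp.iter().sum::<f32>() / 16.0;
    for a in amp.iter_mut() {
        *a = 2.0 * mean - *a; // diffusion: inversion about the mean
    }
}

fn main() {
    let mut amp = [1.0f32 / 4.0; 16]; // uniform "superposition" over 16 states
    for _ in 0..3 {
        // ~(pi/4)*sqrt(16) ~= 3 iterations is the sweet spot for N = 16.
        grover_step(&mut amp, 7);
    }
    let p_marked = amp[7] * amp[7];
    // After three iterations the marked hypothesis dominates.
    assert!(p_marked > 0.9);
}
```

Both steps are reflections, so the squared amplitudes keep summing to 1 and can be read directly as hypothesis probabilities.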
&lt;p&gt;&lt;strong&gt;🤖 Autonomous Systems&lt;/strong&gt; — Self-governing and self-healing behaviors&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Psycho-Symbolic&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/aut_psycho_symbolic.rs&#34; &gt;&lt;code&gt;aut_psycho_symbolic.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;16-rule forward-chaining knowledge base with contradiction detection&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Self-Healing Mesh&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/aut_self_healing_mesh.rs&#34; &gt;&lt;code&gt;aut_self_healing_mesh.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;8-node mesh with health tracking, degradation/recovery, coverage healing&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;🔮 Exotic (Vendor)&lt;/strong&gt; — Novel mathematical models for CSI interpretation&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Time Crystal&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/exo_time_crystal.rs&#34; &gt;&lt;code&gt;exo_time_crystal.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Autocorrelation subharmonic detection in 256-frame history&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Hyperbolic Space&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/exo_hyperbolic_space.rs&#34; &gt;&lt;code&gt;exo_hyperbolic_space.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Poincaré ball embedding with 32 reference locations and hyperbolic distance&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
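The Hyperbolic Space row relies on the standard Poincaré ball metric: for points u, v with norm &lt; 1, d(u, v) = arcosh(1 + 2·|u−v|² / ((1 − |u|²)(1 − |v|²))). A minimal 2D sketch (the real module reportedly keeps 32 reference locations; everything else here is illustrative):

```rust
/// Geodesic distance between two points in the 2D Poincaré disk.
fn poincare_dist(u: [f64; 2], v: [f64; 2]) -> f64 {
    let sq = |p: [f64; 2]| p[0] * p[0] + p[1] * p[1];
    let diff = [u[0] - v[0], u[1] - v[1]];
    let x = 1.0 + 2.0 * sq(diff) / ((1.0 - sq(u)) * (1.0 - sq(v)));
    x.acosh()
}

fn main() {
    let origin = [0.0, 0.0];
    let near_center = [0.1, 0.0];
    let near_edge = [0.9, 0.0];
    // Distances blow up toward the boundary: the same Euclidean step
    // costs far more hyperbolic distance at the rim than at the center,
    // which is why tree-like location hierarchies embed well here.
    assert!(poincare_dist(origin, near_edge) > 5.0 * poincare_dist(origin, near_center));
}
```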
&lt;p&gt;&lt;strong&gt;🏥 Medical &amp;amp; Health&lt;/strong&gt; (Category 1) — Contactless health monitoring&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Sleep Apnea&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/med_sleep_apnea.rs&#34; &gt;&lt;code&gt;med_sleep_apnea.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Detects breathing pauses during sleep&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Cardiac Arrhythmia&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/med_cardiac_arrhythmia.rs&#34; &gt;&lt;code&gt;med_cardiac_arrhythmia.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Monitors heart rate for irregular rhythms&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Respiratory Distress&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/med_respiratory_distress.rs&#34; &gt;&lt;code&gt;med_respiratory_distress.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Alerts on abnormal breathing patterns&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Gait Analysis&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/med_gait_analysis.rs&#34; &gt;&lt;code&gt;med_gait_analysis.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Tracks walking patterns and detects changes&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Seizure Detection&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/med_seizure_detect.rs&#34; &gt;&lt;code&gt;med_seizure_detect.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;6-state machine for tonic-clonic seizure recognition&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
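One plausible reading of the Sleep Apnea row is a sliding scan over a breathing-amplitude trace (derived upstream from CSI), flagging any stretch that stays below a quiet-threshold for long enough to count as a pause. The threshold, the 1 Hz sample rate, and the function name below are all assumptions for illustration, not the logic of `med_sleep_apnea.rs`:

```rust
/// Count pauses where breathing amplitude stays below `threshold`
/// for at least `min_pause` consecutive samples.
fn count_apnea_events(amplitude: &[f32], threshold: f32, min_pause: usize) -> usize {
    let mut events = 0;
    let mut quiet = 0usize;
    for &a in amplitude {
        if a < threshold {
            quiet += 1;
            if quiet == min_pause {
                events += 1; // count each pause once, when it qualifies
            }
        } else {
            quiet = 0; // breathing resumed, reset the run
        }
    }
    events
}

fn main() {
    // 1 sample/s: 20 s of normal breathing, a 12 s pause, then recovery.
    let mut trace = vec![1.0f32; 20];
    trace.extend(vec![0.05; 12]);
    trace.extend(vec![1.0; 20]);
    assert_eq!(count_apnea_events(&trace, 0.2, 10), 1);
}
```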
&lt;p&gt;&lt;strong&gt;🔐 Security &amp;amp; Safety&lt;/strong&gt; (Category 2) — Perimeter and threat detection&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Perimeter Breach&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sec_perimeter_breach.rs&#34; &gt;&lt;code&gt;sec_perimeter_breach.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Detects boundary crossings, including approach and departure events&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Weapon Detection&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sec_weapon_detect.rs&#34; &gt;&lt;code&gt;sec_weapon_detect.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Metal anomaly detection via CSI amplitude shifts&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Tailgating&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sec_tailgating.rs&#34; &gt;&lt;code&gt;sec_tailgating.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Detects unauthorized follow-through at access points&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Loitering&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sec_loitering.rs&#34; &gt;&lt;code&gt;sec_loitering.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Alerts when someone lingers too long in a zone&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Panic Motion&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/sec_panic_motion.rs&#34; &gt;&lt;code&gt;sec_panic_motion.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Detects fleeing, struggling, or panic movement&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;🏢 Smart Building&lt;/strong&gt; (Category 3) — Automation and energy efficiency&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;HVAC Presence&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/bld_hvac_presence.rs&#34; &gt;&lt;code&gt;bld_hvac_presence.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Occupancy-driven HVAC control with departure countdown&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Lighting Zones&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/bld_lighting_zones.rs&#34; &gt;&lt;code&gt;bld_lighting_zones.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Auto-dim/off lighting based on zone activity&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Elevator Count&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/bld_elevator_count.rs&#34; &gt;&lt;code&gt;bld_elevator_count.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Counts people entering/leaving with overload warning&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Meeting Room&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/bld_meeting_room.rs&#34; &gt;&lt;code&gt;bld_meeting_room.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Tracks meeting lifecycle: start, headcount, end, availability&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Energy Audit&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/bld_energy_audit.rs&#34; &gt;&lt;code&gt;bld_energy_audit.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Tracks after-hours usage and room utilization rates&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;🛒 Retail &amp;amp; Hospitality&lt;/strong&gt; (Category 4) — Customer insights without cameras&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Queue Length&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ret_queue_length.rs&#34; &gt;&lt;code&gt;ret_queue_length.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Estimates queue size and wait times&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Dwell Heatmap&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ret_dwell_heatmap.rs&#34; &gt;&lt;code&gt;ret_dwell_heatmap.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Shows where people spend time (hot/cold zones)&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Customer Flow&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ret_customer_flow.rs&#34; &gt;&lt;code&gt;ret_customer_flow.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Counts ins/outs and tracks net occupancy&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Table Turnover&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ret_table_turnover.rs&#34; &gt;&lt;code&gt;ret_table_turnover.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Restaurant table lifecycle: seated, dining, vacated&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Shelf Engagement&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ret_shelf_engagement.rs&#34; &gt;&lt;code&gt;ret_shelf_engagement.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Detects browsing, considering, and reaching for products&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;🏭 Industrial &amp;amp; Specialized&lt;/strong&gt; (Category 5) — Safety and compliance&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Forklift Proximity&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ind_forklift_proximity.rs&#34; &gt;&lt;code&gt;ind_forklift_proximity.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Warns when people get too close to vehicles&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Confined Space&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ind_confined_space.rs&#34; &gt;&lt;code&gt;ind_confined_space.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;OSHA-compliant worker monitoring with extraction alerts&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Clean Room&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ind_clean_room.rs&#34; &gt;&lt;code&gt;ind_clean_room.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Occupancy limits and turbulent motion detection&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Livestock Monitor&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ind_livestock_monitor.rs&#34; &gt;&lt;code&gt;ind_livestock_monitor.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Animal presence, stillness, and escape alerts&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Structural Vibration&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/ind_structural_vibration.rs&#34; &gt;&lt;code&gt;ind_structural_vibration.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Seismic events, mechanical resonance, structural drift&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;🔮 Exotic &amp;amp; Research&lt;/strong&gt; (Category 6) — Experimental sensing applications&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Budget&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Dream Stage&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/exo_dream_stage.rs&#34; &gt;&lt;code&gt;exo_dream_stage.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Contactless sleep stage classification (wake/light/deep/REM)&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Emotion Detection&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/exo_emotion_detect.rs&#34; &gt;&lt;code&gt;exo_emotion_detect.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Arousal, stress, and calm detection from micro-movements&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Gesture Language&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/exo_gesture_language.rs&#34; &gt;&lt;code&gt;exo_gesture_language.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Sign language letter recognition via WiFi&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Music Conductor&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/exo_music_conductor.rs&#34; &gt;&lt;code&gt;exo_music_conductor.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Tempo and dynamic tracking from conducting gestures&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Plant Growth&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/exo_plant_growth.rs&#34; &gt;&lt;code&gt;exo_plant_growth.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Monitors plant growth, circadian rhythms, wilt detection&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Ghost Hunter&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/exo_ghost_hunter.rs&#34; &gt;&lt;code&gt;exo_ghost_hunter.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Environmental anomaly classification (draft/insect/wind/unknown)&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Rain Detection&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/exo_rain_detect.rs&#34; &gt;&lt;code&gt;exo_rain_detect.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Detects rain onset, intensity, and cessation via signal scatter&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Breathing Sync&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/wifi-densepose-wasm-edge/src/exo_breathing_sync.rs&#34; &gt;&lt;code&gt;exo_breathing_sync.rs&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Detects synchronized breathing between multiple people&lt;/td&gt;
          &lt;td&gt;S (&amp;lt;5ms)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;hr&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🧠 Self-Learning WiFi AI (ADR-024)&lt;/strong&gt; — Adaptive recognition, self-optimization, and intelligent anomaly detection&lt;/summary&gt;
&lt;p&gt;Every WiFi signal that passes through a room creates a unique fingerprint of that space. WiFi-DensePose already reads these fingerprints to track people, but until now it threw away the internal &amp;ldquo;understanding&amp;rdquo; after each reading. The Self-Learning WiFi AI captures and preserves that understanding as compact, reusable vectors — and continuously optimizes itself for each new environment.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;What it does in plain terms:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Turns any WiFi signal into a 128-number &amp;ldquo;fingerprint&amp;rdquo; that uniquely describes what&amp;rsquo;s happening in a room&lt;/li&gt;
&lt;li&gt;Learns entirely on its own from raw WiFi data — no cameras, no labeling, no human supervision needed&lt;/li&gt;
&lt;li&gt;Recognizes rooms, detects intruders, identifies people, and classifies activities using only WiFi&lt;/li&gt;
&lt;li&gt;Runs on an $8 ESP32 chip (the entire model fits in 55 KB of memory)&lt;/li&gt;
&lt;li&gt;Produces both body pose tracking AND environment fingerprints in a single computation&lt;/li&gt;
&lt;/ul&gt;
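Since the 128-number fingerprint is just a vector, "is this the kitchen?" reduces to a similarity test against stored room vectors. Cosine similarity is the usual choice for such embeddings; this standalone sketch shortens the vectors to 4 dimensions purely for readability, and the names are illustrative:

```rust
/// Cosine similarity between two fingerprint vectors.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (na * nb).max(1e-9)
}

fn main() {
    let kitchen = [0.9, 0.1, 0.4, 0.2]; // stored room fingerprint
    let bedroom = [0.1, 0.8, 0.1, 0.6];
    let live = [0.85, 0.15, 0.38, 0.22]; // fingerprint from live CSI
    // Classify the live reading as whichever room it most resembles.
    let best_is_kitchen =
        cosine_similarity(&live, &kitchen) > cosine_similarity(&live, &bedroom);
    assert!(best_is_kitchen);
}
```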
&lt;p&gt;&lt;strong&gt;Key Capabilities&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;What&lt;/th&gt;
          &lt;th&gt;How it works&lt;/th&gt;
          &lt;th&gt;Why it matters&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Self-supervised learning&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;The model watches WiFi signals and teaches itself what &amp;ldquo;similar&amp;rdquo; and &amp;ldquo;different&amp;rdquo; look like, without any human-labeled data&lt;/td&gt;
          &lt;td&gt;Deploy anywhere — just plug in a WiFi sensor and wait 10 minutes&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Room identification&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Each room produces a distinct WiFi fingerprint pattern&lt;/td&gt;
          &lt;td&gt;Know which room someone is in without GPS or beacons&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Anomaly detection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;An unexpected person or event creates a fingerprint that doesn&amp;rsquo;t match anything seen before&lt;/td&gt;
          &lt;td&gt;Automatic intrusion and fall detection as a free byproduct&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Person re-identification&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Each person disturbs WiFi in a slightly different way, creating a personal signature&lt;/td&gt;
          &lt;td&gt;Track individuals across sessions without cameras&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Environment adaptation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;MicroLoRA adapters (1,792 parameters per room) fine-tune the model for each new space&lt;/td&gt;
          &lt;td&gt;Adapts to a new room with minimal data — 93% less than retraining from scratch&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Memory preservation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;EWC++ regularization remembers what was learned during pretraining&lt;/td&gt;
          &lt;td&gt;Switching to a new task doesn&amp;rsquo;t erase prior knowledge&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Hard-negative mining&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Training focuses on the most confusing examples to learn faster&lt;/td&gt;
          &lt;td&gt;Better accuracy with the same amount of training data&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
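The hard-negative mining row can be sketched in a few lines: during contrastive training, rank the candidate negatives by similarity to the anchor and keep only the most confusing ones, since those carry the most gradient signal. A hypothetical sketch over plain vectors; the actual training loop will differ:

```rust
/// Return the indices of the `k` candidates most similar to the anchor,
/// i.e. the negatives the model currently confuses most.
fn hardest_negatives(anchor: &[f32], candidates: &[Vec<f32>], k: usize) -> Vec<usize> {
    let dot = |a: &[f32], b: &[f32]| a.iter().zip(b).map(|(x, y)| x * y).sum::<f32>();
    let mut scored: Vec<(usize, f32)> = candidates
        .iter()
        .enumerate()
        .map(|(i, c)| (i, dot(anchor, c)))
        .collect();
    // Highest similarity first.
    scored.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    scored.into_iter().take(k).map(|(i, _)| i).collect()
}

fn main() {
    let anchor = vec![1.0f32, 0.0];
    let candidates = vec![
        vec![0.0, 1.0],   // easy negative (orthogonal)
        vec![0.95, 0.05], // hard negative (nearly identical)
        vec![-1.0, 0.0],  // easy negative (opposite)
    ];
    // The near-duplicate is selected as the hardest negative.
    assert_eq!(hardest_negatives(&anchor, &candidates, 1), vec![1]);
}
```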
&lt;p&gt;&lt;strong&gt;Architecture&lt;/strong&gt;&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-fallback&#34; data-lang=&#34;fallback&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;WiFi Signal [56 channels] → Transformer + Graph Neural Network
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                                  ├→ 128-dim environment fingerprint (for search + identification)
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                                  └→ 17-joint body pose (for human tracking)
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Quick Start&lt;/strong&gt;&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Step 1: Learn from raw WiFi data (no labels needed)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo run -p wifi-densepose-sensing-server -- --pretrain --dataset data/csi/ --pretrain-epochs &lt;span class=&#34;m&#34;&gt;50&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Step 2: Fine-tune with pose labels for full capability&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo run -p wifi-densepose-sensing-server -- --train --dataset data/mmfi/ --epochs &lt;span class=&#34;m&#34;&gt;100&lt;/span&gt; --save-rvf model.rvf
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Step 3: Use the model — extract fingerprints from live WiFi&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo run -p wifi-densepose-sensing-server -- --model model.rvf --embed
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Step 4: Search — find similar environments or detect anomalies&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo run -p wifi-densepose-sensing-server -- --model model.rvf --build-index env
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Training Modes&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Mode&lt;/th&gt;
          &lt;th&gt;What you need&lt;/th&gt;
          &lt;th&gt;What you get&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Self-Supervised&lt;/td&gt;
          &lt;td&gt;Just raw WiFi data&lt;/td&gt;
          &lt;td&gt;A model that understands WiFi signal structure&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Supervised&lt;/td&gt;
          &lt;td&gt;WiFi data + body pose labels&lt;/td&gt;
          &lt;td&gt;Full pose tracking + environment fingerprints&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Cross-Modal&lt;/td&gt;
          &lt;td&gt;WiFi data + camera footage&lt;/td&gt;
          &lt;td&gt;Fingerprints aligned with visual understanding&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
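The self-supervised mode described above typically rests on a contrastive objective: two views of the same CSI window are pulled together in embedding space while unrelated windows are pushed apart. The sketch below is a minimal InfoNCE-style loss in plain Python — illustrative only, not the project's actual training code; the vectors and temperature value are made up for the example.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style loss: pull the augmented view (positive) toward the
    anchor while pushing unrelated windows (negatives) away."""
    pos = math.exp(cosine(anchor, positive) / temperature)
    neg = sum(math.exp(cosine(anchor, n) / temperature) for n in negatives)
    return -math.log(pos / (pos + neg))

# Two views of the same CSI window should yield a lower loss
# than pairing the anchor with an unrelated window.
anchor   = [0.9, 0.1, 0.0]
positive = [0.8, 0.2, 0.1]   # jittered copy of the anchor
negative = [0.0, 0.1, 0.9]   # a different room / activity

loss_good = contrastive_loss(anchor, positive, [negative])
loss_bad  = contrastive_loss(anchor, negative, [positive])
```

Because no labels enter the loss, raw WiFi captures are sufficient — which is why the Self-Supervised row needs "just raw WiFi data".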
&lt;p&gt;&lt;strong&gt;Fingerprint Index Types&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Index&lt;/th&gt;
          &lt;th&gt;What it stores&lt;/th&gt;
          &lt;th&gt;Real-world use&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;env_fingerprint&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Average room fingerprint&lt;/td&gt;
          &lt;td&gt;&amp;ldquo;Is this the kitchen or the bedroom?&amp;rdquo;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;activity_pattern&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Activity boundaries&lt;/td&gt;
          &lt;td&gt;&amp;ldquo;Is someone cooking, sleeping, or exercising?&amp;rdquo;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;temporal_baseline&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Normal conditions&lt;/td&gt;
          &lt;td&gt;&amp;ldquo;Something unusual just happened in this room&amp;rdquo;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;person_track&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Individual movement signatures&lt;/td&gt;
          &lt;td&gt;&amp;ldquo;Person A just entered the living room&amp;rdquo;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
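Conceptually, each index answers its question by nearest-neighbour search over stored fingerprints. The sketch below shows the idea for `env_fingerprint` with cosine similarity — a toy illustration under assumed 3-dimensional embeddings, not the project's actual index implementation or API.

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Average room fingerprints previously stored in the env index
# (toy 3-d vectors; real embeddings are higher-dimensional)
env_index = {
    "kitchen": [0.9, 0.2, 0.1],
    "bedroom": [0.1, 0.8, 0.3],
}

def classify(fingerprint, index):
    # "Is this the kitchen or the bedroom?" — pick the stored
    # fingerprint with the highest cosine similarity.
    return max(index, key=lambda room: cosine(fingerprint, index[room]))

live = [0.85, 0.25, 0.15]   # fingerprint embedded from a live CSI window
room = classify(live, env_index)
```

The other index types follow the same pattern but store different things: activity boundaries, rolling baselines for anomaly detection, or per-person movement signatures.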
&lt;p&gt;&lt;strong&gt;Model Size&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Component&lt;/th&gt;
          &lt;th&gt;Parameters&lt;/th&gt;
          &lt;th&gt;Memory (on ESP32)&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Transformer backbone&lt;/td&gt;
          &lt;td&gt;~28,000&lt;/td&gt;
          &lt;td&gt;28 KB&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Embedding projection head&lt;/td&gt;
          &lt;td&gt;~25,000&lt;/td&gt;
          &lt;td&gt;25 KB&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Per-room MicroLoRA adapter&lt;/td&gt;
          &lt;td&gt;~1,800&lt;/td&gt;
          &lt;td&gt;2 KB&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Total&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;~55,000&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;55 KB&lt;/strong&gt; (of 520 KB available)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
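The table's params-to-KB mapping implies roughly one byte per parameter (i.e. 8-bit weights) — an inference from the numbers, not something the table states outright. Under that assumption, the budget arithmetic checks out with ample headroom:

```python
# Component parameter counts from the table above
components = {
    "transformer_backbone": 28_000,
    "embedding_head": 25_000,
    "microlora_adapter": 1_800,
}
BYTES_PER_PARAM = 1   # assumed 8-bit weights, implied by the params -> KB mapping
SRAM_KB = 520         # available ESP32 memory quoted in the table

total_params = sum(components.values())            # 54,800, i.e. "~55,000"
total_kb = total_params * BYTES_PER_PARAM / 1000   # ~55 KB
headroom_kb = SRAM_KB - total_kb                   # ~465 KB left for buffers
```

At roughly a tenth of available SRAM, the model leaves most of the chip free for CSI buffers and the rest of the firmware.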
&lt;p&gt;The self-learning system builds on the &lt;a class=&#34;link&#34; href=&#34;#ai-backbone-ruvector&#34; &gt;AI Backbone (RuVector)&lt;/a&gt; signal-processing layer — attention, graph algorithms, and compression — adding contrastive learning on top.&lt;/p&gt;
&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-024-contrastive-csi-embedding-model.md&#34; &gt;&lt;code&gt;docs/adr/ADR-024-contrastive-csi-embedding-model.md&lt;/code&gt;&lt;/a&gt; for full architectural details.&lt;/p&gt;
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-installation&#34;&gt;📦 Installation
&lt;/h2&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Guided Installer&lt;/strong&gt; — Interactive hardware detection and profile selection&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./install.sh
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;The installer walks through 7 steps: system detection, toolchain check, WiFi hardware scan, profile recommendation, dependency install, build, and verification.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Profile&lt;/th&gt;
          &lt;th&gt;What it installs&lt;/th&gt;
          &lt;th&gt;Size&lt;/th&gt;
          &lt;th&gt;Requirements&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;verify&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Pipeline verification only&lt;/td&gt;
          &lt;td&gt;~5 MB&lt;/td&gt;
          &lt;td&gt;Python 3.8+&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;python&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Full Python API server + sensing&lt;/td&gt;
          &lt;td&gt;~500 MB&lt;/td&gt;
          &lt;td&gt;Python 3.8+&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;rust&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Rust pipeline (~810x faster)&lt;/td&gt;
          &lt;td&gt;~200 MB&lt;/td&gt;
          &lt;td&gt;Rust 1.70+&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;browser&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;WASM for in-browser execution&lt;/td&gt;
          &lt;td&gt;~10 MB&lt;/td&gt;
          &lt;td&gt;Rust + wasm-pack&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;iot&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;ESP32 sensor mesh + aggregator&lt;/td&gt;
          &lt;td&gt;varies&lt;/td&gt;
          &lt;td&gt;Rust + ESP-IDF&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;docker&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Docker-based deployment&lt;/td&gt;
          &lt;td&gt;~1 GB&lt;/td&gt;
          &lt;td&gt;Docker&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;field&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;WiFi-Mat disaster response kit&lt;/td&gt;
          &lt;td&gt;~62 MB&lt;/td&gt;
          &lt;td&gt;Rust + wasm-pack&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;full&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Everything available&lt;/td&gt;
          &lt;td&gt;~2 GB&lt;/td&gt;
          &lt;td&gt;All toolchains&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Non-interactive&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./install.sh --profile rust --yes
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Hardware check only&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./install.sh --check-only
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;From Source&lt;/strong&gt; — Rust (primary) or Python&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;15
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;16
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;git clone https://github.com/ruvnet/RuView.git
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; RuView
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Rust (primary — 810x faster)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; rust-port/wifi-densepose-rs
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo build --release
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo &lt;span class=&#34;nb&#34;&gt;test&lt;/span&gt; --workspace
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Python (legacy v1)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;pip install -r requirements.txt
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;pip install -e .
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Or via pip&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;pip install wifi-densepose
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;pip install wifi-densepose&lt;span class=&#34;o&#34;&gt;[&lt;/span&gt;gpu&lt;span class=&#34;o&#34;&gt;]&lt;/span&gt;   &lt;span class=&#34;c1&#34;&gt;# GPU acceleration&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;pip install wifi-densepose&lt;span class=&#34;o&#34;&gt;[&lt;/span&gt;all&lt;span class=&#34;o&#34;&gt;]&lt;/span&gt;   &lt;span class=&#34;c1&#34;&gt;# All optional deps&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Docker&lt;/strong&gt; — Pre-built images, no toolchain needed&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Rust sensing server (132 MB — recommended)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker pull ruvnet/wifi-densepose:latest
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker run -p 3000:3000 -p 3001:3001 -p 5005:5005/udp ruvnet/wifi-densepose:latest
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Python sensing pipeline (569 MB)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker pull ruvnet/wifi-densepose:python
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker run -p 8765:8765 -p 8080:8080 ruvnet/wifi-densepose:python
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Both via docker-compose&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; docker &lt;span class=&#34;o&#34;&gt;&amp;amp;&amp;amp;&lt;/span&gt; docker compose up
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Export RVF model&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker run --rm -v &lt;span class=&#34;k&#34;&gt;$(&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;pwd&lt;/span&gt;&lt;span class=&#34;k&#34;&gt;)&lt;/span&gt;:/out ruvnet/wifi-densepose:latest --export-rvf /out/model.rvf
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Image&lt;/th&gt;
          &lt;th&gt;Tag&lt;/th&gt;
          &lt;th&gt;Platforms&lt;/th&gt;
          &lt;th&gt;Ports&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;ruvnet/wifi-densepose&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;latest&lt;/code&gt;, &lt;code&gt;rust&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;linux/amd64, linux/arm64&lt;/td&gt;
          &lt;td&gt;3000 (REST), 3001 (WS), 5005/udp (ESP32)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;ruvnet/wifi-densepose&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;python&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;linux/amd64&lt;/td&gt;
          &lt;td&gt;8765 (WS), 8080 (UI)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;System Requirements&lt;/strong&gt;&lt;/summary&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Rust&lt;/strong&gt;: 1.70+ (primary runtime — install via &lt;a class=&#34;link&#34; href=&#34;https://rustup.rs/&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;rustup&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Python&lt;/strong&gt;: 3.8+ (for verification and legacy v1 API)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;OS&lt;/strong&gt;: Linux (Ubuntu 18.04+), macOS (10.15+), Windows 10+&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Memory&lt;/strong&gt;: 4 GB RAM minimum, 8 GB+ recommended&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Storage&lt;/strong&gt;: 2 GB free space for models and data&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Network&lt;/strong&gt;: WiFi interface with CSI capability (optional — installer detects what you have)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;GPU&lt;/strong&gt;: Optional (NVIDIA CUDA or Apple Metal)&lt;/li&gt;
&lt;/ul&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Rust Crates&lt;/strong&gt; — Individual crates on crates.io&lt;/summary&gt;
&lt;p&gt;The Rust workspace contains 17 crates, 15 of which are published to &lt;a class=&#34;link&#34; href=&#34;https://crates.io/&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crates.io&lt;/a&gt; (the remaining two are workspace-only for now):&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Add individual crates to your Cargo.toml&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo add wifi-densepose-core       &lt;span class=&#34;c1&#34;&gt;# Types, traits, errors&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo add wifi-densepose-signal     &lt;span class=&#34;c1&#34;&gt;# CSI signal processing (6 SOTA algorithms)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo add wifi-densepose-nn         &lt;span class=&#34;c1&#34;&gt;# Neural inference (ONNX, PyTorch, Candle)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo add wifi-densepose-vitals     &lt;span class=&#34;c1&#34;&gt;# Vital sign extraction (breathing + heart rate)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo add wifi-densepose-mat        &lt;span class=&#34;c1&#34;&gt;# Disaster response (MAT survivor detection)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo add wifi-densepose-hardware   &lt;span class=&#34;c1&#34;&gt;# ESP32, Intel 5300, Atheros sensors&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo add wifi-densepose-train      &lt;span class=&#34;c1&#34;&gt;# Training pipeline (MM-Fi dataset)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo add wifi-densepose-wifiscan   &lt;span class=&#34;c1&#34;&gt;# Multi-BSSID WiFi scanning&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo add wifi-densepose-ruvector   &lt;span class=&#34;c1&#34;&gt;# RuVector v2.0.4 integration layer (ADR-017)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Crate&lt;/th&gt;
          &lt;th&gt;Description&lt;/th&gt;
          &lt;th&gt;RuVector&lt;/th&gt;
          &lt;th&gt;crates.io&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-core&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-core&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Foundation types, traits, and utilities&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-core&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-core.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-signal&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-signal&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;SOTA CSI signal processing (SpotFi, FarSense, Widar 3.0)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;mincut&lt;/code&gt;, &lt;code&gt;attn-mincut&lt;/code&gt;, &lt;code&gt;attention&lt;/code&gt;, &lt;code&gt;solver&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-signal&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-signal.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-nn&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-nn&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Multi-backend inference (ONNX, PyTorch, Candle)&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-nn&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-nn.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-train&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-train&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Training pipeline with MM-Fi dataset (NeurIPS 2023)&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;All 5&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-train&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-train.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-mat&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-mat&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Mass Casualty Assessment Tool (disaster survivor detection)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;solver&lt;/code&gt;, &lt;code&gt;temporal-tensor&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-mat&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-mat.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-ruvector&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;RuVector v2.0.4 integration layer — 7 signal+MAT integration points (ADR-017)&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;All 5&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-ruvector.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-vitals&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-vitals&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Vital signs: breathing (6-30 BPM), heart rate (40-120 BPM)&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-vitals&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-vitals.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-hardware&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-hardware&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;ESP32, Intel 5300, Atheros CSI sensor interfaces&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-hardware&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-hardware.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-wifiscan&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-wifiscan&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Multi-BSSID WiFi scanning (Windows, macOS, Linux)&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-wifiscan&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-wifiscan.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-wasm&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-wasm&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;WebAssembly bindings for browser deployment&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-wasm&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-wasm.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-sensing-server&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-sensing-server&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Axum server: UDP ingestion, WebSocket broadcast&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-sensing-server&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-sensing-server.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-cli&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-cli&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Command-line tool for MAT disaster scanning&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-cli&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-cli.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-api&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-api&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;REST + WebSocket API layer&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-api&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-api.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-config&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-config&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Configuration management&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-config&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-config.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-db&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-db&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Database persistence (PostgreSQL, SQLite, Redis)&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-db&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://img.shields.io/crates/v/wifi-densepose-db.svg&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;crates.io&#34;
	
	
&gt;&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-pointcloud&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Real-time dense point cloud from camera + WiFi CSI fusion (Three.js viewer, brain bridge). Workspace-only for now.&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;—&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-geo&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Geospatial context (Sentinel-2 tiles, SRTM elevation, OSM, weather, night-mode). Workspace-only for now.&lt;/td&gt;
          &lt;td&gt;&amp;ndash;&lt;/td&gt;
          &lt;td&gt;—&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;All crates integrate with &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;RuVector v2.0.4&lt;/a&gt; — see &lt;a class=&#34;link&#34; href=&#34;#ai-backbone-ruvector&#34; &gt;AI Backbone&lt;/a&gt; below.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/ruv-neural/&#34; &gt;rUv Neural&lt;/a&gt;&lt;/strong&gt; — A separate 12-crate workspace for brain network topology analysis, neural decoding, and medical sensing. See &lt;a class=&#34;link&#34; href=&#34;#ruv-neural&#34; &gt;rUv Neural&lt;/a&gt; in Models &amp;amp; Training.&lt;/p&gt;
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-quick-start&#34;&gt;🚀 Quick Start
&lt;/h2&gt;&lt;details open&gt;
&lt;summary&gt;&lt;strong&gt;First API call in 3 commands&lt;/strong&gt;&lt;/summary&gt;
&lt;h3 id=&#34;1-install&#34;&gt;1. Install
&lt;/h3&gt;&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Fastest path — Docker&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker pull ruvnet/wifi-densepose:latest
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker run -p 3000:3000 ruvnet/wifi-densepose:latest
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Or from source (Rust)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./install.sh --profile rust --yes
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;h3 id=&#34;2-start-the-system&#34;&gt;2. Start the System
&lt;/h3&gt;&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;from&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;wifi_densepose&lt;/span&gt; &lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;WiFiDensePose&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;system&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;WiFiDensePose&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;system&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;start&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;poses&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;system&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;get_latest_poses&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;print&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;sa&#34;&gt;f&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;Detected &lt;/span&gt;&lt;span class=&#34;si&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;len&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;poses&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt; persons&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;system&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;stop&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;h3 id=&#34;3-rest-api&#34;&gt;3. REST API
&lt;/h3&gt;&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Health check&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;curl http://localhost:3000/health
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Latest sensing frame&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;curl http://localhost:3000/api/v1/sensing/latest
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Vital signs&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;curl http://localhost:3000/api/v1/vital-signs
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Pose estimation&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;curl http://localhost:3000/api/v1/pose/current
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Server info&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;curl http://localhost:3000/api/v1/info
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;h3 id=&#34;4-real-time-websocket&#34;&gt;4. Real-time WebSocket
&lt;/h3&gt;&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;9
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;asyncio&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;websockets&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;nn&#34;&gt;json&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;k&#34;&gt;async&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;def&lt;/span&gt; &lt;span class=&#34;nf&#34;&gt;stream&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;():&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;k&#34;&gt;async&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;with&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;websockets&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;connect&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;ws://localhost:3001/ws/sensing&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;as&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ws&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;k&#34;&gt;async&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;for&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;msg&lt;/span&gt; &lt;span class=&#34;ow&#34;&gt;in&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ws&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;json&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;loads&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;msg&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;            &lt;span class=&#34;nb&#34;&gt;print&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;sa&#34;&gt;f&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;Persons: &lt;/span&gt;&lt;span class=&#34;si&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;len&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;get&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s1&#34;&gt;&amp;#39;persons&amp;#39;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[]))&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;asyncio&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;run&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;stream&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;())&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-table-of-contents&#34;&gt;📋 Table of Contents
&lt;/h2&gt;&lt;details open&gt;
&lt;summary&gt;&lt;strong&gt;📡 Signal Processing &amp;amp; Sensing&lt;/strong&gt; — From raw WiFi frames to vital signs&lt;/summary&gt;
&lt;p&gt;The signal processing stack transforms raw WiFi Channel State Information (CSI) into actionable human sensing data. Starting from complex values across 56-192 subcarriers captured at 20 Hz, the pipeline applies research-grade algorithms (SpotFi phase correction, Hampel outlier rejection, Fresnel zone modeling) to extract breathing rate, heart rate, motion level, and multi-person body pose — all in pure Rust with zero external ML dependencies.&lt;/p&gt;
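&lt;p&gt;To make the outlier-rejection step concrete, a minimal Hampel filter over a window of CSI amplitudes can be sketched as follows (an illustrative Python sketch, not the Rust implementation; the window size and threshold are assumed defaults):&lt;/p&gt;

```python
from statistics import median

def hampel_filter(samples, window=5, n_sigmas=3.0):
    """Replace spikes with the local median (illustrative sketch).

    A sample is rejected when it deviates from the window median by more
    than n_sigmas * 1.4826 * MAD (1.4826 maps MAD to a Gaussian sigma).
    """
    out = list(samples)
    for i in range(len(samples)):
        lo, hi = max(0, i - window), min(len(samples), i + window + 1)
        win = samples[lo:hi]
        med = median(win)
        mad = median(abs(v - med) for v in win)
        if mad > 0 and abs(samples[i] - med) > n_sigmas * 1.4826 * mad:
            out[i] = med
    return out

# A single interference spike in a smooth CSI amplitude series is removed
# while the genuine small fluctuations around it are left untouched.
amps = [1.0, 1.1, 0.9, 9.0, 1.0, 1.05, 0.95]
cleaned = hampel_filter(amps)
```

&lt;p&gt;The 1.4826 factor scales the median absolute deviation to a Gaussian standard deviation, so &lt;code&gt;n_sigmas&lt;/code&gt; behaves like a familiar z-score threshold.&lt;/p&gt;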
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Section&lt;/th&gt;
          &lt;th&gt;Description&lt;/th&gt;
          &lt;th&gt;Docs&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#key-features&#34; &gt;Key Features&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Sensing, Intelligence, and Performance &amp;amp; Deployment capabilities&lt;/td&gt;
          &lt;td&gt;—&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#how-it-works&#34; &gt;How It Works&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;End-to-end pipeline: radio waves → CSI capture → signal processing → AI → pose + vitals&lt;/td&gt;
          &lt;td&gt;—&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#esp32-s3-hardware-pipeline&#34; &gt;ESP32-S3 Hardware Pipeline&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;20 Hz CSI streaming, binary frame parsing, flash &amp;amp; provision&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-018-esp32-dev-implementation.md&#34; &gt;ADR-018&lt;/a&gt; · &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues/34&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Tutorial #34&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#vital-sign-detection&#34; &gt;Vital Sign Detection&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Breathing 6-30 BPM, heartbeat 40-120 BPM, FFT peak detection&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-021-vital-sign-detection-rvdna-pipeline.md&#34; &gt;ADR-021&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#wifi-scan-domain-layer&#34; &gt;WiFi Scan Domain Layer&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;8-stage RSSI pipeline, multi-BSSID fingerprinting, Windows WiFi&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-022-windows-wifi-enhanced-fidelity-ruvector.md&#34; &gt;ADR-022&lt;/a&gt; · &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues/36&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Tutorial #36&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#wifi-mat-disaster-response&#34; &gt;WiFi-Mat Disaster Response&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Search &amp;amp; rescue, START triage, 3D localization through debris&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-001-wifi-mat-disaster-detection.md&#34; &gt;ADR-001&lt;/a&gt; · &lt;a class=&#34;link&#34; href=&#34;docs/wifi-mat-user-guide.md&#34; &gt;User Guide&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#sota-signal-processing&#34; &gt;SOTA Signal Processing&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;SpotFi, Hampel, Fresnel, STFT spectrogram, subcarrier selection, BVP&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-014-sota-signal-processing.md&#34; &gt;ADR-014&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
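&lt;p&gt;The FFT peak detection used for vital signs (table above) can be sketched in a few lines: synthesize a breathing-like amplitude oscillation at the 20 Hz frame rate, then pick the dominant spectral peak inside the 6-30 BPM band (illustrative sketch; the signal and rates are assumed for the example):&lt;/p&gt;

```python
import numpy as np

fs = 20.0                      # CSI frame rate (Hz), as described above
t = np.arange(0, 60, 1 / fs)   # 60-second analysis window
breathing_hz = 15 / 60         # simulate a 15 BPM chest oscillation
amplitude = 1.0 + 0.05 * np.sin(2 * np.pi * breathing_hz * t)

# Remove DC, take the magnitude spectrum, and search only the 6-30 BPM band.
spectrum = np.abs(np.fft.rfft(amplitude - amplitude.mean()))
freqs = np.fft.rfftfreq(len(amplitude), d=1 / fs)
band = (freqs >= 6 / 60) & (freqs <= 30 / 60)
bpm = 60 * freqs[band][np.argmax(spectrum[band])]
```

&lt;p&gt;A 60 s window at 20 Hz gives a 1 BPM frequency resolution (20 / 1200 samples = 1/60 Hz per bin), which is why longer windows sharpen the estimate.&lt;/p&gt;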
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🧠 Models &amp;amp; Training&lt;/strong&gt; — DensePose pipeline, RVF containers, SONA adaptation, RuVector integration&lt;/summary&gt;
&lt;p&gt;The neural pipeline uses a graph transformer with cross-attention to map CSI feature matrices to 17 COCO body keypoints and DensePose UV coordinates. Models are packaged as single-file &lt;code&gt;.rvf&lt;/code&gt; containers with progressive loading (Layer A instant, Layer B warm, Layer C full). SONA (Self-Optimizing Neural Architecture) enables continuous on-device adaptation via micro-LoRA + EWC++ without catastrophic forgetting. Signal processing is powered by 5 &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;RuVector&lt;/a&gt; crates (v2.0.4) with 7 integration points across the Rust workspace, plus 6 additional vendored crates for inference and graph intelligence.&lt;/p&gt;
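&lt;p&gt;The micro-LoRA idea can be illustrated with a generic low-rank adapter: the frozen weight matrix gets a trainable correction &lt;code&gt;B @ A&lt;/code&gt; with far fewer parameters than the base layer (a sketch of the general technique with assumed dimensions; SONA specifics are in ADR-023):&lt;/p&gt;

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank, alpha = 64, 32, 4, 8.0

W = rng.standard_normal((d_out, d_in))    # frozen base weight
A = rng.standard_normal((rank, d_in)) * 0.01
B = np.zeros((d_out, rank))               # zero-init: adapter starts as a no-op

def adapted_forward(x):
    # Base projection plus a scaled low-rank correction; only A and B adapt.
    return W @ x + (alpha / rank) * (B @ (A @ x))

# The adapter carries 384 trainable values vs 2048 in the full matrix,
# which is what makes continuous on-device adaptation affordable.
adapter_params = A.size + B.size
full_params = W.size
```

&lt;p&gt;Because &lt;code&gt;B&lt;/code&gt; starts at zero, the adapted model is exactly the base model until adaptation begins, so deployment never regresses on day one.&lt;/p&gt;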
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Section&lt;/th&gt;
          &lt;th&gt;Description&lt;/th&gt;
          &lt;th&gt;Docs&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#rvf-model-container&#34; &gt;RVF Model Container&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Binary packaging with Ed25519 signing, progressive 3-layer loading, SIMD quantization&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-023-trained-densepose-model-ruvector-pipeline.md&#34; &gt;ADR-023&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#training--fine-tuning&#34; &gt;Training &amp;amp; Fine-Tuning&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;8-phase pure Rust pipeline (7,832 lines), MM-Fi/Wi-Pose pre-training, 6-term composite loss, SONA LoRA&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-023-trained-densepose-model-ruvector-pipeline.md&#34; &gt;ADR-023&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#ruvector-crates&#34; &gt;RuVector Crates&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;11 vendored Rust crates from &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvector&lt;/a&gt;: attention, min-cut, solver, GNN, HNSW, temporal compression, sparse inference&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;GitHub&lt;/a&gt; · &lt;a class=&#34;link&#34; href=&#34;vendor/ruvector/&#34; &gt;Source&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#ruv-neural&#34; &gt;rUv Neural&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;12-crate brain topology analysis ecosystem: neural decoding, quantum sensor integration, cognitive state classification, BCI output&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/ruv-neural/README.md&#34; &gt;README&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#ai-backbone-ruvector&#34; &gt;AI Backbone (RuVector)&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;5 AI capabilities replacing hand-tuned thresholds: attention, graph min-cut, sparse solvers, tiered compression&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crates.io&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#self-learning-wifi-ai-adr-024&#34; &gt;Self-Learning WiFi AI (ADR-024)&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Contrastive self-supervised learning, room fingerprinting, anomaly detection, 55 KB model&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-024-contrastive-csi-embedding-model.md&#34; &gt;ADR-024&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-027-cross-environment-domain-generalization.md&#34; &gt;Cross-Environment Generalization (ADR-027)&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Domain-adversarial training, geometry-conditioned inference, hardware normalization, zero-shot deployment&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-027-cross-environment-domain-generalization.md&#34; &gt;ADR-027&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🖥️ Usage &amp;amp; Configuration&lt;/strong&gt; — CLI flags, API endpoints, hardware setup&lt;/summary&gt;
&lt;p&gt;The Rust sensing server is the primary interface, offering a comprehensive CLI with flags for data source selection, model loading, training, benchmarking, and RVF export. A REST API (Axum) and WebSocket server provide real-time data access. The Python v1 CLI remains available for legacy workflows.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Section&lt;/th&gt;
          &lt;th&gt;Description&lt;/th&gt;
          &lt;th&gt;Docs&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#cli-usage&#34; &gt;CLI Usage&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;--source&lt;/code&gt;, &lt;code&gt;--train&lt;/code&gt;, &lt;code&gt;--benchmark&lt;/code&gt;, &lt;code&gt;--export-rvf&lt;/code&gt;, &lt;code&gt;--model&lt;/code&gt;, &lt;code&gt;--progressive&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;—&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#rest-api--websocket&#34; &gt;REST API &amp;amp; WebSocket&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;6 REST endpoints (sensing, vitals, BSSID, SONA), WebSocket real-time stream&lt;/td&gt;
          &lt;td&gt;—&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#hardware-support-1&#34; &gt;Hardware Support&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;ESP32-S3 ($8), Intel 5300 ($15), Atheros AR9580 ($20), Windows RSSI ($0)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-012-esp32-csi-sensor-mesh.md&#34; &gt;ADR-012&lt;/a&gt; · &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-013-feature-level-sensing-commodity-gear.md&#34; &gt;ADR-013&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;⚙️ Development &amp;amp; Testing&lt;/strong&gt; — 542+ tests, CI, deployment&lt;/summary&gt;
&lt;p&gt;The project maintains 542+ pure-Rust tests across 7 crate suites with zero mocks — every test runs against real algorithm implementations. Hardware-free simulation mode (&lt;code&gt;--source simulate&lt;/code&gt;) enables full-stack testing without physical devices. Docker images are published on Docker Hub for zero-setup deployment.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Section&lt;/th&gt;
          &lt;th&gt;Description&lt;/th&gt;
          &lt;th&gt;Docs&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#testing&#34; &gt;Testing&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;7 test suites, including sensing-server (229), signal (83), mat (139), wifiscan (91), RVF (16), and vitals (18)&lt;/td&gt;
          &lt;td&gt;—&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#deployment&#34; &gt;Deployment&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Docker images (132 MB Rust / 569 MB Python), docker-compose, env vars&lt;/td&gt;
          &lt;td&gt;—&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#contributing&#34; &gt;Contributing&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Fork → branch → test → PR workflow, Rust and Python dev setup&lt;/td&gt;
          &lt;td&gt;—&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;📊 Performance &amp;amp; Benchmarks&lt;/strong&gt; — Measured throughput, latency, resource usage&lt;/summary&gt;
&lt;p&gt;All benchmarks are measured on the Rust sensing server using &lt;code&gt;cargo bench&lt;/code&gt; and the built-in &lt;code&gt;--benchmark&lt;/code&gt; CLI flag. The Rust v2 implementation delivers an 810x end-to-end speedup over the Python v1 baseline, with motion detection reaching a 5,400x improvement. The vital sign detector processes 11,665 frames/second in a single-threaded benchmark.&lt;/p&gt;
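&lt;p&gt;A quick back-of-envelope check on those figures (illustrative arithmetic only): at 11,665 frames/second single-threaded, one 20 Hz CSI stream costs well under 1% of a core, leaving headroom for hundreds of concurrent nodes:&lt;/p&gt;

```python
# Headroom implied by the benchmark numbers above (illustrative arithmetic).
vitals_fps = 11_665      # measured single-threaded vital-sign throughput
csi_rate_hz = 20         # per-node CSI frame rate

streams_per_core = vitals_fps // csi_rate_hz           # concurrent 20 Hz nodes
core_utilization_pct = 100 * csi_rate_hz / vitals_fps  # cost of a single node
```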
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Section&lt;/th&gt;
          &lt;th&gt;Description&lt;/th&gt;
          &lt;th&gt;Key Metric&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#performance-metrics&#34; &gt;Performance Metrics&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Vital signs, CSI pipeline, motion detection, Docker image, memory&lt;/td&gt;
          &lt;td&gt;11,665 fps vitals · 54K fps pipeline&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#python-vs-rust&#34; &gt;Rust vs Python&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Side-by-side benchmarks across 5 operations&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;810x&lt;/strong&gt; full pipeline speedup&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;📄 Meta&lt;/strong&gt; — License, changelog, support&lt;/summary&gt;
&lt;p&gt;WiFi DensePose is MIT-licensed open source, developed by &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvnet&lt;/a&gt;. The project has been in active development since March 2025, with 3 major releases delivering the Rust port, SOTA signal processing, disaster response module, and end-to-end training pipeline.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Section&lt;/th&gt;
          &lt;th&gt;Description&lt;/th&gt;
          &lt;th&gt;Link&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#changelog&#34; &gt;Changelog&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;v3.0.0 (AETHER AI + Docker), v2.0.0 (Rust port + SOTA + WiFi-Mat)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;CHANGELOG.md&#34; &gt;CHANGELOG.md&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#license&#34; &gt;License&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;MIT License&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;LICENSE&#34; &gt;LICENSE&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;#support&#34; &gt;Support&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Bug reports, feature requests, community discussion&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Issues&lt;/a&gt; · &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/discussions&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Discussions&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;hr&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🌍 Cross-Environment Generalization (ADR-027 — Project MERIDIAN)&lt;/strong&gt; — Train once, deploy in any room without retraining&lt;/summary&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;What&lt;/th&gt;
          &lt;th&gt;How it works&lt;/th&gt;
          &lt;th&gt;Why it matters&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Gradient Reversal Layer&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;An adversarial classifier tries to guess which room the signal came from; the main network is trained to fool it&lt;/td&gt;
          &lt;td&gt;Forces the model to discard room-specific shortcuts&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Geometry Encoder (FiLM)&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Transmitter/receiver positions are Fourier-encoded and injected as scale+shift conditioning on every layer&lt;/td&gt;
          &lt;td&gt;The model knows &lt;em&gt;where&lt;/em&gt; the hardware is, so it doesn&amp;rsquo;t need to memorize layout&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Hardware Normalizer&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Resamples any chipset&amp;rsquo;s CSI to a canonical 56-subcarrier format with standardized amplitude&lt;/td&gt;
          &lt;td&gt;Intel 5300 and ESP32 data look identical to the model&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Virtual Domain Augmentation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Generates synthetic environments with random room scale, wall reflections, scatterers, and noise profiles&lt;/td&gt;
          &lt;td&gt;Training sees thousands of synthetic rooms even with real data from just 2-3&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Rapid Adaptation (TTT)&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Contrastive test-time training with LoRA weight generation from a few unlabeled frames&lt;/td&gt;
          &lt;td&gt;Zero-shot deployment — the model self-tunes on arrival&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Cross-Domain Evaluator&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Leave-one-out evaluation across all training environments with per-environment PCK/OKS metrics&lt;/td&gt;
          &lt;td&gt;Proves generalization, not just memorization&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
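The gradient reversal trick in the first row is simple enough to sketch directly. The following is an illustrative numpy/Python sketch, not the project's Rust implementation; the function names and the cosine λ schedule are assumptions chosen to match the "λ ramps 0→1" note in the architecture diagram below.

```python
import math
import numpy as np

def grl_forward(x):
    # Identity in the forward pass: features reach the domain classifier unchanged.
    return x

def grl_backward(grad_from_domain_classifier, lam):
    # Backward pass flips the sign and scales by lambda, so the feature encoder
    # is pushed to *increase* the domain classifier's loss, i.e. to become
    # room-agnostic rather than exploiting room-specific shortcuts.
    return -lam * grad_from_domain_classifier

def lambda_schedule(step, total_steps):
    # Cosine ramp 0 -> 1 over training, one plausible form of the schedule.
    t = min(step / total_steps, 1.0)
    return 0.5 * (1.0 - math.cos(math.pi * t))

grad = np.ones(4)
print(grl_backward(grad, lambda_schedule(500, 1000)))  # prints [-0.5 -0.5 -0.5 -0.5]
```

At λ = 0 the adversarial branch has no effect; as λ ramps to 1 the encoder is increasingly penalized for any feature the domain classifier can exploit.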
&lt;p&gt;&lt;strong&gt;Architecture&lt;/strong&gt;&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;15
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-gdscript3&#34; data-lang=&#34;gdscript3&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;CSI&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Frame&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;[&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;any&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;chipset&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;▼&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;HardwareNormalizer&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;──→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;canonical&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;56&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;subcarriers&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;N&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;amplitude&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;▼&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;CSI&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Encoder&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;existing&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;──→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;latent&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;features&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;├──→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Pose&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Head&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;──→&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;17&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;joint&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;pose&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;environment&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;invariant&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;├──→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Gradient&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Reversal&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Layer&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;──→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Domain&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Classifier&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;adversarial&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;         &lt;span class=&#34;err&#34;&gt;λ&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ramps&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;→&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;1&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;via&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;cosine&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;/&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;exponential&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;schedule&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;err&#34;&gt;└──→&lt;/span&gt; &lt;span class=&#34;ne&#34;&gt;Geometry&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Encoder&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;──→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;FiLM&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;conditioning&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;scale&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;shift&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;              &lt;span class=&#34;n&#34;&gt;Fourier&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;positional&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;encoding&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;DeepSets&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;per&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;layer&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;modulation&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Security hardening:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Bounded calibration buffer (max 10,000 frames) prevents memory exhaustion&lt;/li&gt;
&lt;li&gt;&lt;code&gt;adapt()&lt;/code&gt; returns &lt;code&gt;Result&amp;lt;_, AdaptError&amp;gt;&lt;/code&gt; — no panics on bad input&lt;/li&gt;
&lt;li&gt;Atomic instance counter ensures unique weight initialization across threads&lt;/li&gt;
&lt;li&gt;Division-by-zero guards on all augmentation parameters&lt;/li&gt;
&lt;/ul&gt;
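The first two hardening points follow a common pattern: a fixed-capacity buffer plus error values instead of panics. A minimal Python sketch of that contract (the class and error names here are illustrative; the actual Rust code uses `Result&lt;_, AdaptError&gt;`):

```python
from collections import deque

MAX_CALIBRATION_FRAMES = 10_000  # matches the bounded-buffer limit above

class CalibrationBuffer:
    """Illustrative bounded buffer: old frames are evicted, memory stays fixed."""
    def __init__(self, capacity=MAX_CALIBRATION_FRAMES):
        self._frames = deque(maxlen=capacity)

    def push(self, frame):
        self._frames.append(frame)  # deque silently drops the oldest when full

    def __len__(self):
        return len(self._frames)

def adapt(frames):
    # Mirrors the Result<_, AdaptError> contract: bad input yields an error
    # value instead of raising / panicking.
    if len(frames) == 0:
        return None, "AdaptError::EmptyInput"
    return sum(frames) / len(frames), None

buf = CalibrationBuffer(capacity=3)
for i in range(10):
    buf.push(i)
print(len(buf))  # prints 3 -- never exceeds capacity
```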
&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-027-cross-environment-domain-generalization.md&#34; &gt;&lt;code&gt;docs/adr/ADR-027-cross-environment-domain-generalization.md&lt;/code&gt;&lt;/a&gt; for full architectural details.&lt;/p&gt;
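The Hardware Normalizer row above can also be sketched concretely. This is a hedged Python illustration, assuming linear resampling and z-score amplitude standardization; the project's actual resampling method and the example subcarrier counts are assumptions, not confirmed details.

```python
import numpy as np

CANONICAL_SUBCARRIERS = 56  # canonical grid from the table above

def normalize_csi(amplitudes: np.ndarray) -> np.ndarray:
    """Resample a chipset's CSI amplitudes to 56 subcarriers, then z-score."""
    src = np.linspace(0.0, 1.0, num=len(amplitudes))
    dst = np.linspace(0.0, 1.0, num=CANONICAL_SUBCARRIERS)
    resampled = np.interp(dst, src, amplitudes)  # linear resampling (assumption)
    return (resampled - resampled.mean()) / (resampled.std() + 1e-8)

esp32_frame = np.abs(np.random.default_rng(2).normal(size=64))  # e.g. 64 subcarriers
intel_frame = np.abs(np.random.default_rng(3).normal(size=30))  # e.g. 30 subcarriers
print(normalize_csi(esp32_frame).shape, normalize_csi(intel_frame).shape)  # (56,) (56,)
```

After this step, frames from different chipsets share one shape and one amplitude scale, which is what lets the model treat them identically.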
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🔍 Independent Capability Audit (ADR-028)&lt;/strong&gt; — 1,031 tests, SHA-256 proof, self-verifying witness bundle&lt;/summary&gt;
&lt;p&gt;A &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-028-esp32-capability-audit.md&#34; &gt;3-agent parallel audit&lt;/a&gt; independently verified every claim in this repository — ESP32 hardware, signal processing, neural networks, training pipeline, deployment, and security. Results:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-fallback&#34; data-lang=&#34;fallback&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Rust tests:     1,031 passed, 0 failed
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Python proof:   VERDICT: PASS (SHA-256: 8c0680d7...)
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Bundle verify:  7/7 checks PASS
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;33-row attestation matrix:&lt;/strong&gt; 31 capabilities verified YES, 2 not measured at audit time (benchmark throughput, Kubernetes deploy).&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Verify it yourself&lt;/strong&gt; (no hardware needed):&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;9
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Run all tests&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; rust-port/wifi-densepose-rs &lt;span class=&#34;o&#34;&gt;&amp;amp;&amp;amp;&lt;/span&gt; cargo &lt;span class=&#34;nb&#34;&gt;test&lt;/span&gt; --workspace --no-default-features
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Run the deterministic proof&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python v1/data/proof/verify.py
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Generate + verify the witness bundle&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;bash scripts/generate-witness-bundle.sh
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; dist/witness-bundle-ADR028-*/ &lt;span class=&#34;o&#34;&gt;&amp;amp;&amp;amp;&lt;/span&gt; bash VERIFY.sh
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Document&lt;/th&gt;
          &lt;th&gt;What it contains&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-028-esp32-capability-audit.md&#34; &gt;ADR-028&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Full audit: ESP32 specs, signal algorithms, NN architectures, training phases, deployment infra&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/WITNESS-LOG-028.md&#34; &gt;Witness Log&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;11 reproducible verification steps + 33-row attestation matrix with evidence per row&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;scripts/generate-witness-bundle.sh&#34; &gt;&lt;code&gt;generate-witness-bundle.sh&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Creates self-contained tar.gz with test logs, proof output, firmware hashes, crate versions, VERIFY.sh&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
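The witness bundle's self-verification amounts to recomputing a hash for every artifact and comparing against a recorded manifest. A minimal sketch of that check (filenames and manifest format here are illustrative, not the actual bundle layout produced by `generate-witness-bundle.sh`):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_bundle(artifacts: dict, manifest: dict) -> bool:
    """True only if every manifest entry matches a freshly recomputed hash."""
    return all(
        name in artifacts and sha256_hex(artifacts[name]) == digest
        for name, digest in manifest.items()
    )

artifacts = {"proof.log": b"VERDICT: PASS"}
manifest = {"proof.log": sha256_hex(b"VERDICT: PASS")}
print(verify_bundle(artifacts, manifest))                      # True
print(verify_bundle({"proof.log": b"tampered"}, manifest))     # False
```

Because the manifest digests are themselves shipped inside the bundle, any post-hoc edit to a test log or proof output fails the comparison.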
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;📡 Multistatic Sensing (ADR-029/030/031 — Project RuvSense + RuView)&lt;/strong&gt; — Multiple ESP32 nodes fuse viewpoints for production-grade pose, tracking, and exotic sensing&lt;/summary&gt;
&lt;p&gt;A single WiFi receiver can track people, but has blind spots — limbs behind the torso are invisible, depth is ambiguous, and two people at similar range create overlapping signals. RuvSense solves this by coordinating multiple ESP32 nodes into a &lt;strong&gt;multistatic mesh&lt;/strong&gt; where every node acts as both transmitter and receiver, creating N×(N-1) measurement links from N devices.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;What it does in plain terms:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;4 ESP32-S3 nodes ($48 total) provide 12 TX-RX measurement links covering 360 degrees&lt;/li&gt;
&lt;li&gt;Each node hops across WiFi channels 1/6/11, tripling effective bandwidth from 20→60 MHz&lt;/li&gt;
&lt;li&gt;Coherence gating rejects noisy frames automatically — no manual tuning, stable for days&lt;/li&gt;
&lt;li&gt;Two-person tracking at 20 Hz with zero identity swaps over 10 minutes&lt;/li&gt;
&lt;li&gt;The room itself becomes a persistent model — the system remembers, predicts, and explains&lt;/li&gt;
&lt;/ul&gt;
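The arithmetic behind these bullets is easy to check: every ordered transmitter/receiver pair is a distinct measurement link, and hopping across the three non-overlapping 2.4 GHz channels stacks their bandwidth. A small illustrative calculation (not project code):

```python
def measurement_links(n_nodes: int) -> int:
    # Every ordered TX -> RX pair is a distinct link: N * (N - 1).
    return n_nodes * (n_nodes - 1)

def effective_bandwidth_mhz(channels=(1, 6, 11), per_channel_mhz=20):
    # Non-overlapping channels sampled per dwell stack their bandwidth.
    return len(channels) * per_channel_mhz

print(measurement_links(4))       # prints 12 -- links from 4 nodes
print(effective_bandwidth_mhz())  # prints 60 -- MHz from three 20 MHz channels
```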
&lt;p&gt;&lt;strong&gt;Three ADRs, one pipeline:&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;ADR&lt;/th&gt;
          &lt;th&gt;Codename&lt;/th&gt;
          &lt;th&gt;What it adds&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-029-ruvsense-multistatic-sensing-mode.md&#34; &gt;ADR-029&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;RuvSense&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Channel hopping, TDM protocol, multi-node fusion, coherence gating, 17-keypoint Kalman tracker&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-030-ruvsense-persistent-field-model.md&#34; &gt;ADR-030&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;RuvSense Field&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Room electromagnetic eigenstructure (SVD), RF tomography, longitudinal drift detection, intention prediction, gesture recognition, adversarial detection&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-031-ruview-sensing-first-rf-mode.md&#34; &gt;ADR-031&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;RuView&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Cross-viewpoint attention with geometric bias, viewpoint diversity optimization, embedding-level fusion&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;Architecture&lt;/strong&gt;&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;15
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;16
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;17
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;18
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;19
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;20
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-gdscript3&#34; data-lang=&#34;gdscript3&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;mi&#34;&gt;4&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;x&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ESP32&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;S3&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;nodes&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;48&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;     &lt;span class=&#34;n&#34;&gt;TDM&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;each&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;transmits&lt;/span&gt; &lt;span class=&#34;ow&#34;&gt;in&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;turn&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;all&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;others&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;receive&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;                    &lt;span class=&#34;n&#34;&gt;Channel&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;hop&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;:&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ch1&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;→&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ch6&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;→&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ch11&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;per&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;dwell&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;50&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ms&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;▼&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;Per&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;ne&#34;&gt;Node&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Signal&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Processing&lt;/span&gt;   &lt;span class=&#34;n&#34;&gt;Phase&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;sanitize&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Hampel&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;BVP&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;subcarrier&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;select&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;                    &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ADR&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;014&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;unchanged&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;per&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;viewpoint&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;▼&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;Multi&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Band&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Frame&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Fusion&lt;/span&gt;      &lt;span class=&#34;mi&#34;&gt;3&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;channels&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;×&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;56&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;subcarriers&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;168&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;virtual&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;subcarriers&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;                    &lt;span class=&#34;n&#34;&gt;Cross&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;channel&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;phase&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;alignment&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;via&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;NeumannSolver&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;▼&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;Multistatic&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Viewpoint&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Fusion&lt;/span&gt;  &lt;span class=&#34;n&#34;&gt;N&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;nodes&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;attention&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;weighted&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;fusion&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;single&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;embedding&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;                    &lt;span class=&#34;n&#34;&gt;Geometric&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;bias&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;from&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;node&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;placement&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;angles&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;▼&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;Coherence&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Gate&lt;/span&gt;               &lt;span class=&#34;n&#34;&gt;Accept&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;/&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;PredictOnly&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;/&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Reject&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;/&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Recalibrate&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;                    &lt;span class=&#34;n&#34;&gt;Prevents&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;model&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;drift&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;stable&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;for&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;days&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;▼&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;Persistent&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Field&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Model&lt;/span&gt;       &lt;span class=&#34;n&#34;&gt;SVD&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;baseline&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;→&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;body&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;observation&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;-&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;environment&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;                    &lt;span class=&#34;n&#34;&gt;RF&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;tomography&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;drift&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;detection&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;intention&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;signals&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;err&#34;&gt;▼&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;Pose&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Tracker&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;DensePose&lt;/span&gt;     &lt;span class=&#34;mi&#34;&gt;17&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;keypoint&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Kalman&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;re&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ID&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;via&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;AETHER&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;embeddings&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                             &lt;span class=&#34;n&#34;&gt;Multi&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;person&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;min&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;cut&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;separation&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;zero&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ID&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;swaps&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Seven Exotic Sensing Tiers (ADR-030)&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Tier&lt;/th&gt;
          &lt;th&gt;Capability&lt;/th&gt;
          &lt;th&gt;What it detects&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;1&lt;/td&gt;
          &lt;td&gt;Field Normal Modes&lt;/td&gt;
          &lt;td&gt;Room electromagnetic eigenstructure via SVD&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;2&lt;/td&gt;
          &lt;td&gt;Coarse RF Tomography&lt;/td&gt;
          &lt;td&gt;3D occupancy volume from link attenuations&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;3&lt;/td&gt;
          &lt;td&gt;Intention Lead Signals&lt;/td&gt;
          &lt;td&gt;Pre-movement prediction 200-500ms before action&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;4&lt;/td&gt;
          &lt;td&gt;Longitudinal Biomechanics&lt;/td&gt;
          &lt;td&gt;Personal movement changes over days/weeks&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;5&lt;/td&gt;
          &lt;td&gt;Cross-Room Continuity&lt;/td&gt;
          &lt;td&gt;Identity preserved across rooms without cameras&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;6&lt;/td&gt;
          &lt;td&gt;Invisible Interaction&lt;/td&gt;
          &lt;td&gt;Multi-user gesture control through walls&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;7&lt;/td&gt;
          &lt;td&gt;Adversarial Detection&lt;/td&gt;
          &lt;td&gt;Physically impossible signal identification&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
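Tier 1's "body = observation − environment" idea can be sketched with a rank-k SVD baseline: learn the dominant singular subspace of empty-room CSI, then subtract each live frame's projection onto it. The shapes, rank, and function names below are illustrative assumptions, not the project's implementation.

```python
import numpy as np

def learn_baseline(empty_room_frames: np.ndarray, rank: int = 3) -> np.ndarray:
    """Dominant singular subspace of empty-room CSI = static room structure."""
    u, _, _ = np.linalg.svd(empty_room_frames, full_matrices=False)
    return u[:, :rank]  # (subcarriers, rank) orthonormal basis

def remove_environment(frame: np.ndarray, basis: np.ndarray) -> np.ndarray:
    # Project the observation onto the room subspace and subtract it; what
    # remains is attributed to bodies moving through the field.
    return frame - basis @ (basis.T @ frame)

rng = np.random.default_rng(0)
room = rng.normal(size=(56, 3)) @ rng.normal(size=(3, 200))  # rank-3 "empty room"
basis = learn_baseline(room, rank=3)
residual = remove_environment(room[:, 0], basis)
print(float(np.linalg.norm(residual)) < 1e-6)  # True: pure room frame -> ~zero
```

A frame containing anything outside the room subspace (a person) leaves a large residual, which downstream tiers then interpret.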
&lt;p&gt;&lt;strong&gt;Acceptance Test&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Metric&lt;/th&gt;
          &lt;th&gt;Threshold&lt;/th&gt;
          &lt;th&gt;What it proves&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Torso keypoint jitter&lt;/td&gt;
          &lt;td&gt;&amp;lt; 30mm RMS&lt;/td&gt;
          &lt;td&gt;Keypoint precision is sufficient for downstream applications&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Identity swaps&lt;/td&gt;
          &lt;td&gt;0 over 10 minutes (12,000 frames)&lt;/td&gt;
          &lt;td&gt;Reliable multi-person tracking&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Update rate&lt;/td&gt;
          &lt;td&gt;20 Hz (50ms cycle)&lt;/td&gt;
          &lt;td&gt;Real-time response&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Breathing SNR&lt;/td&gt;
          &lt;td&gt;&amp;gt; 10 dB at 3m&lt;/td&gt;
          &lt;td&gt;Small-motion sensitivity confirmed&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
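The 30 mm RMS jitter threshold in the first row corresponds to a simple metric. This is one plausible way to compute it (RMS of frame-to-frame keypoint displacement for a stationary subject), offered as a hedged sketch rather than the project's exact definition:

```python
import numpy as np

def keypoint_jitter_rms_mm(track_mm: np.ndarray) -> float:
    """RMS frame-to-frame displacement for one keypoint track.

    track_mm: (T, 3) positions in millimetres for a stationary subject.
    """
    steps = np.diff(track_mm, axis=0)  # (T-1, 3) per-frame displacements
    return float(np.sqrt(np.mean(np.sum(steps**2, axis=1))))

rng = np.random.default_rng(1)
# Simulated stationary torso keypoint with 5 mm Gaussian noise per axis:
still = np.array([100.0, 200.0, 900.0]) + rng.normal(scale=5.0, size=(1200, 3))
print(keypoint_jitter_rms_mm(still) < 30.0)  # True: passes the threshold
```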
&lt;p&gt;&lt;strong&gt;New Rust modules (9,000+ lines)&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Crate&lt;/th&gt;
          &lt;th&gt;New modules&lt;/th&gt;
          &lt;th&gt;Purpose&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-signal&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvsense/&lt;/code&gt; (10 modules)&lt;/td&gt;
          &lt;td&gt;Multiband fusion, phase alignment, multistatic fusion, coherence, field model, tomography, longitudinal drift, intention detection&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-ruvector&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;viewpoint/&lt;/code&gt; (5 modules)&lt;/td&gt;
          &lt;td&gt;Cross-viewpoint attention with geometric bias, diversity index, coherence gating, fusion orchestrator&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-hardware&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;esp32/tdm.rs&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;TDM sensing protocol, sync beacons, clock drift compensation&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;Firmware extensions (C, backward-compatible)&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;File&lt;/th&gt;
          &lt;th&gt;Addition&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;csi_collector.c&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Channel hop table, timer-driven hop, NDP injection stub&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;nvs_config.c&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;5 new NVS keys: hop_count, channel_list, dwell_ms, tdm_slot, tdm_node_count&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;DDD Domain Model&lt;/strong&gt; — 6 bounded contexts: Multistatic Sensing, Coherence, Pose Tracking, Field Model, Cross-Room Identity, Adversarial Detection. Full specification: &lt;a class=&#34;link&#34; href=&#34;docs/ddd/ruvsense-domain-model.md&#34; &gt;&lt;code&gt;docs/ddd/ruvsense-domain-model.md&lt;/code&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;See the ADR documents for full architectural details, GOAP integration plans, and research references.&lt;/p&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;b&gt;🔮 Signal-Line Protocol (CRV)&lt;/b&gt;&lt;/summary&gt;
&lt;h3 id=&#34;6-stage-csi-signal-line&#34;&gt;6-Stage CSI Signal Line
&lt;/h3&gt;&lt;p&gt;Maps the CRV (Coordinate Remote Viewing) signal-line methodology to WiFi CSI processing via &lt;code&gt;ruvector-crv&lt;/code&gt;:&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Stage&lt;/th&gt;
          &lt;th&gt;CRV Name&lt;/th&gt;
          &lt;th&gt;WiFi CSI Mapping&lt;/th&gt;
          &lt;th&gt;ruvector Component&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;I&lt;/td&gt;
          &lt;td&gt;Ideograms&lt;/td&gt;
          &lt;td&gt;Raw CSI gestalt (manmade/natural/movement/energy)&lt;/td&gt;
          &lt;td&gt;Poincare ball hyperbolic embeddings&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;II&lt;/td&gt;
          &lt;td&gt;Sensory&lt;/td&gt;
          &lt;td&gt;Amplitude textures, phase patterns, frequency colors&lt;/td&gt;
          &lt;td&gt;Multi-head attention vectors&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;III&lt;/td&gt;
          &lt;td&gt;Dimensional&lt;/td&gt;
          &lt;td&gt;AP mesh spatial topology, node geometry&lt;/td&gt;
          &lt;td&gt;GNN graph topology&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;IV&lt;/td&gt;
          &lt;td&gt;Emotional/AOL&lt;/td&gt;
          &lt;td&gt;Coherence gating — signal vs noise separation&lt;/td&gt;
          &lt;td&gt;SNN temporal encoding&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;V&lt;/td&gt;
          &lt;td&gt;Interrogation&lt;/td&gt;
          &lt;td&gt;Cross-stage probing — query pose against CSI history&lt;/td&gt;
          &lt;td&gt;Differentiable search&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;VI&lt;/td&gt;
          &lt;td&gt;3D Model&lt;/td&gt;
          &lt;td&gt;Composite person estimation, MinCut partitioning&lt;/td&gt;
          &lt;td&gt;Graph partitioning&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;Cross-Session Convergence&lt;/strong&gt;: When multiple AP clusters observe the same person, CRV convergence analysis finds agreement in their signal embeddings — directly mapping to cross-room identity continuity.&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-rust&#34; data-lang=&#34;rust&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;k&#34;&gt;use&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;wifi_densepose_ruvector&lt;/span&gt;::&lt;span class=&#34;n&#34;&gt;crv&lt;/span&gt;::&lt;span class=&#34;n&#34;&gt;WifiCrvPipeline&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;mut&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;pipeline&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;WifiCrvPipeline&lt;/span&gt;::&lt;span class=&#34;n&#34;&gt;new&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;WifiCrvConfig&lt;/span&gt;::&lt;span class=&#34;n&#34;&gt;default&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;());&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;pipeline&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;create_session&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;room-a&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;person-001&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;?&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// Process CSI frames through 6-stage pipeline
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;result&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;pipeline&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;process_csi_frame&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;room-a&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;amp;&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;amplitudes&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;amp;&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;phases&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;?&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// result.gestalt = Movement, confidence = 0.87
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// result.sensory_embedding = [0.12, -0.34, ...]
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;// Cross-room identity matching via convergence
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;convergence&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;pipeline&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;find_cross_room_convergence&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;person-001&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;mf&#34;&gt;0.75&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;?&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Architecture&lt;/strong&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;CsiGestaltClassifier&lt;/code&gt; — Maps CSI amplitude/phase patterns to 6 gestalt types&lt;/li&gt;
&lt;li&gt;&lt;code&gt;CsiSensoryEncoder&lt;/code&gt; — Extracts texture/color/temperature/luminosity features from subcarriers&lt;/li&gt;
&lt;li&gt;&lt;code&gt;MeshTopologyEncoder&lt;/code&gt; — Encodes AP mesh as GNN graph (Stage III)&lt;/li&gt;
&lt;li&gt;&lt;code&gt;CoherenceAolDetector&lt;/code&gt; — Maps coherence gate states to AOL noise detection (Stage IV)&lt;/li&gt;
&lt;li&gt;&lt;code&gt;WifiCrvPipeline&lt;/code&gt; — Orchestrates all 6 stages into unified sensing session&lt;/li&gt;
&lt;/ul&gt;
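&lt;p&gt;The Stage I gestalt step above can be sketched in host-side Python. This is an illustrative toy only: the real &lt;code&gt;CsiGestaltClassifier&lt;/code&gt; is a Rust component, and the feature (amplitude variance over a window) and thresholds here are hypothetical.&lt;/p&gt;

```python
import statistics

# Hypothetical thresholds -- the real classifier's features and cutoffs
# live in the Rust CsiGestaltClassifier, not here.
MOVEMENT_VAR = 4.0   # amplitude variance above this suggests motion
ENERGY_MEAN = 20.0   # mean amplitude above this suggests a strong reflector

def classify_gestalt(amplitudes: list[float]) -> str:
    """Map a window of CSI amplitudes to a coarse gestalt label."""
    var = statistics.pvariance(amplitudes)
    mean = statistics.fmean(amplitudes)
    if var > MOVEMENT_VAR:
        return "movement"
    if mean > ENERGY_MEAN:
        return "energy"
    return "static"
```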
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-signal-processing--sensing&#34;&gt;📡 Signal Processing &amp;amp; Sensing
&lt;/h2&gt;&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;esp32-s3-hardware-pipeline&#34;&gt;&lt;/a&gt;&lt;strong&gt;📡 ESP32-S3 Hardware Pipeline (ADR-018)&lt;/strong&gt; — 28 Hz CSI streaming, flash &amp; provision&lt;/summary&gt;
&lt;p&gt;A single ESP32-S3 board (~$9) captures WiFi signal data 28 times per second and streams it over UDP. A host server can visualize and record the data, but the ESP32 can also run on its own — detecting presence, measuring breathing and heart rate, and alerting on falls without any server at all.&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-gdscript3&#34; data-lang=&#34;gdscript3&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;ESP32&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;S3&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;node&lt;/span&gt;                    &lt;span class=&#34;n&#34;&gt;UDP&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;/&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;5005&lt;/span&gt;        &lt;span class=&#34;n&#34;&gt;Host&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;server&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;optional&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;┌───────────────────────┐&lt;/span&gt;      &lt;span class=&#34;err&#34;&gt;──────────&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&lt;/span&gt;      &lt;span class=&#34;err&#34;&gt;┌──────────────────────┐&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;│&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Captures&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;WiFi&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;signals&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;      &lt;span class=&#34;n&#34;&gt;binary&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;frames&lt;/span&gt;    &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Parses&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;frames&lt;/span&gt;        &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;│&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;28&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Hz&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;up&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;to&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;192&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;sub&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;      &lt;span class=&#34;ow&#34;&gt;or&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;32&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;byte&lt;/span&gt;       &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Visualizes&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;poses&lt;/span&gt;     &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;│&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;carriers&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;per&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;frame&lt;/span&gt;     &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;      &lt;span class=&#34;n&#34;&gt;vitals&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;packets&lt;/span&gt;   &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Records&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;CSI&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;     &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;                        &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;                       &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;REST&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;API&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;WebSocket&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;│&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;On&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;device&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;optional&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;):&lt;/span&gt;  &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;                       &lt;span class=&#34;err&#34;&gt;└──────────────────────┘&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;  &lt;span class=&#34;n&#34;&gt;Presence&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;detection&lt;/span&gt;    &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;  &lt;span class=&#34;n&#34;&gt;Breathing&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;heart&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;rate&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;  &lt;span class=&#34;n&#34;&gt;Fall&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;detection&lt;/span&gt;        &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;  &lt;span class=&#34;n&#34;&gt;WASM&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;custom&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;modules&lt;/span&gt;   &lt;span class=&#34;err&#34;&gt;│&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;err&#34;&gt;└───────────────────────┘&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Metric&lt;/th&gt;
          &lt;th&gt;Measured on hardware&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;CSI frame rate&lt;/td&gt;
          &lt;td&gt;28.5 Hz (channel 5, BW20)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Subcarriers per frame&lt;/td&gt;
          &lt;td&gt;64 / 128 / 192 (depends on WiFi mode)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;UDP latency&lt;/td&gt;
          &lt;td&gt;&amp;lt; 1 ms on local network&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Presence detection range&lt;/td&gt;
          &lt;td&gt;Reliable at 3 m through walls&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Binary size&lt;/td&gt;
          &lt;td&gt;990 KB (8MB flash) / 773 KB (4MB flash)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Boot to ready&lt;/td&gt;
          &lt;td&gt;~3.9 seconds&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
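&lt;p&gt;A minimal host-side listener for the UDP/5005 stream can be sketched in Python. The frame layout here (a small header followed by int8 I/Q pairs) is an assumption for illustration; the actual binary wire format is defined by the firmware.&lt;/p&gt;

```python
import socket
import struct

UDP_PORT = 5005  # port shown in the diagram above

def parse_frame(payload: bytes):
    """Parse a hypothetical CSI frame: <node_id:u16><n_sub:u16><i8 I/Q pairs>.

    The real format ships with the firmware; this only illustrates
    turning raw bytes into complex subcarrier values.
    """
    node_id, n_sub = struct.unpack_from("<HH", payload, 0)
    iq = struct.unpack_from(f"<{2 * n_sub}b", payload, 4)
    csi = [complex(iq[2 * i], iq[2 * i + 1]) for i in range(n_sub)]
    return node_id, csi

def listen(port: int = UDP_PORT):
    """Receive and print frames forever (run on the host machine)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        payload, addr = sock.recvfrom(2048)
        node_id, csi = parse_frame(payload)
        print(f"node {node_id} from {addr[0]}: {len(csi)} subcarriers")
```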
&lt;h3 id=&#34;flash-and-provision&#34;&gt;Flash and provision
&lt;/h3&gt;&lt;p&gt;Download a pre-built binary — no build toolchain needed:&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Release&lt;/th&gt;
          &lt;th&gt;What&amp;rsquo;s included&lt;/th&gt;
          &lt;th&gt;Tag&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/releases/tag/v0.7.0&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;v0.7.0&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Latest&lt;/strong&gt; — Camera-supervised WiFlow model (92.9% PCK@20), ground-truth training pipeline, ruvector optimizations&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;v0.7.0&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/releases/tag/v0.6.0-esp32&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;v0.6.0&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://huggingface.co/ruv/ruview&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Pre-trained models on HuggingFace&lt;/a&gt;, 17 sensing apps, 51.6% contrastive improvement, 0.008 ms inference&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;v0.6.0-esp32&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/releases/tag/v0.5.5-esp32&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;v0.5.5&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;SNN + MinCut (#348 fix) + CNN spectrogram + WiFlow + multi-freq mesh + graph transformer&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;v0.5.5-esp32&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/releases/tag/v0.5.4-esp32&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;v0.5.4&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Cognitum Seed integration (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-069-cognitum-seed-csi-pipeline.md&#34; &gt;ADR-069&lt;/a&gt;), 8-dim feature vectors, RVF store, witness chain, security hardening&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;v0.5.4-esp32&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/releases/tag/v0.5.0-esp32&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;v0.5.0&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;mmWave sensor fusion (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-063-mmwave-sensor-fusion.md&#34; &gt;ADR-063&lt;/a&gt;), auto-detect MR60BHA2/LD2410, 48-byte fused vitals, all v0.4.3.1 fixes&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;v0.5.0-esp32&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/releases/tag/v0.4.3.1-esp32&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;v0.4.3.1&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Fall detection fix (&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues/263&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;#263&lt;/a&gt;), 4MB flash (&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues/265&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;#265&lt;/a&gt;), watchdog fix (&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues/266&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;#266&lt;/a&gt;)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;v0.4.3.1-esp32&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/releases/tag/v0.4.1-esp32&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;v0.4.1&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;CSI build fix, compile guard, AMOLED display, edge intelligence (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-057-firmware-csi-build-guard.md&#34; &gt;ADR-057&lt;/a&gt;)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;v0.4.1-esp32&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/releases/tag/v0.3.0-alpha-esp32&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;v0.3.0-alpha&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Alpha — adds on-device edge intelligence and WASM modules (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-039-esp32-edge-intelligence.md&#34; &gt;ADR-039&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-040-wasm-programmable-sensing.md&#34; &gt;ADR-040&lt;/a&gt;)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;v0.3.0-alpha-esp32&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/releases/tag/v0.2.0-esp32&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;v0.2.0&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Raw CSI streaming, multi-node TDM, channel hopping&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;v0.2.0-esp32&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;15
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;16
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;17
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;18
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;19
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# 1. Flash the firmware to your ESP32-S3 (8MB flash — most boards)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python -m esptool --chip esp32s3 --port COM7 --baud &lt;span class=&#34;m&#34;&gt;460800&lt;/span&gt; &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  write_flash --flash-mode dio --flash-size 8MB --flash-freq 80m &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  0x0 bootloader.bin 0x8000 partition-table.bin &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  0xf000 ota_data_initial.bin 0x20000 esp32-csi-node.bin
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# 1b. For 4MB flash boards (e.g. ESP32-S3 SuperMini 4MB) — use the 4MB binaries:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python -m esptool --chip esp32s3 --port COM7 --baud &lt;span class=&#34;m&#34;&gt;460800&lt;/span&gt; &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  write_flash --flash-mode dio --flash-size 4MB --flash-freq 80m &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  0x0 bootloader.bin 0x8000 partition-table-4mb.bin &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  0xF000 ota_data_initial.bin 0x20000 esp32-csi-node-4mb.bin
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# 2. Set WiFi credentials and server address (stored in flash, survives reboots)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python firmware/esp32-csi-node/provision.py --port COM7 &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --ssid &lt;span class=&#34;s2&#34;&gt;&amp;#34;YourWiFi&amp;#34;&lt;/span&gt; --password &lt;span class=&#34;s2&#34;&gt;&amp;#34;secret&amp;#34;&lt;/span&gt; --target-ip 192.168.1.20
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# 3. (Optional) Start the host server to visualize data&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo run -p wifi-densepose-sensing-server -- --http-port &lt;span class=&#34;m&#34;&gt;3000&lt;/span&gt; --source auto
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Open http://localhost:3000&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;h3 id=&#34;multi-node-mesh&#34;&gt;Multi-node mesh
&lt;/h3&gt;&lt;p&gt;For better accuracy and room coverage, deploy 3-6 nodes with time-division multiplexing (TDM) so they take turns transmitting:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;9
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Node 0 of a 3-node mesh&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python firmware/esp32-csi-node/provision.py --port COM7 &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --ssid &lt;span class=&#34;s2&#34;&gt;&amp;#34;YourWiFi&amp;#34;&lt;/span&gt; --password &lt;span class=&#34;s2&#34;&gt;&amp;#34;secret&amp;#34;&lt;/span&gt; --target-ip 192.168.1.20 &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --node-id &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt; --tdm-slot &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt; --tdm-total &lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Node 1&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python firmware/esp32-csi-node/provision.py --port COM8 &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --ssid &lt;span class=&#34;s2&#34;&gt;&amp;#34;YourWiFi&amp;#34;&lt;/span&gt; --password &lt;span class=&#34;s2&#34;&gt;&amp;#34;secret&amp;#34;&lt;/span&gt; --target-ip 192.168.1.20 &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --node-id &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt; --tdm-slot &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt; --tdm-total &lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;Nodes can also hop across WiFi channels (1, 6, 11) to increase sensing bandwidth — configured via &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-029-ruvsense-multistatic-sensing-mode.md&#34; &gt;ADR-029&lt;/a&gt; channel hopping.&lt;/p&gt;
&lt;h3 id=&#34;cognitum-seed-integration-adr-069&#34;&gt;Cognitum Seed integration (ADR-069)
&lt;/h3&gt;&lt;p&gt;Connect an ESP32 to a &lt;a class=&#34;link&#34; href=&#34;https://cognitum.one&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Cognitum Seed&lt;/a&gt; ($131) for persistent vector storage, kNN search, cryptographic witness chain, and AI-accessible MCP proxy:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-gdscript3&#34; data-lang=&#34;gdscript3&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;ESP32&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;S3&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;9&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;  &lt;span class=&#34;err&#34;&gt;──&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;UDP&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;──&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&lt;/span&gt;  &lt;span class=&#34;n&#34;&gt;Host&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;bridge&lt;/span&gt;  &lt;span class=&#34;err&#34;&gt;──&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;HTTPS&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;──&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&lt;/span&gt;  &lt;span class=&#34;n&#34;&gt;Cognitum&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Seed&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;$&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;15&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;CSI&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;capture&lt;/span&gt;              &lt;span class=&#34;n&#34;&gt;seed_csi_bridge&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;py&lt;/span&gt;         &lt;span class=&#34;n&#34;&gt;RVF&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;vector&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;store&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;mi&#34;&gt;8&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;dim&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;features&lt;/span&gt; &lt;span class=&#34;err&#34;&gt;@&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;1&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Hz&lt;/span&gt;                              &lt;span class=&#34;n&#34;&gt;kNN&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;similarity&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;search&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &lt;span class=&#34;n&#34;&gt;Vitals&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;+&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;presence&lt;/span&gt;                                  &lt;span class=&#34;n&#34;&gt;Ed25519&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;witness&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;chain&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                                                     &lt;span class=&#34;mi&#34;&gt;114&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;k&#34;&gt;tool&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;MCP&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;proxy&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# 1. Provision ESP32 to send features to your laptop&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python firmware/esp32-csi-node/provision.py --port COM9 &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --ssid &lt;span class=&#34;s2&#34;&gt;&amp;#34;YourWiFi&amp;#34;&lt;/span&gt; --password &lt;span class=&#34;s2&#34;&gt;&amp;#34;secret&amp;#34;&lt;/span&gt; --target-ip 192.168.1.20 --target-port &lt;span class=&#34;m&#34;&gt;5006&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# 2. Run the bridge (forwards to Seed via HTTPS)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;export&lt;/span&gt; &lt;span class=&#34;nv&#34;&gt;SEED_TOKEN&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;your-pairing-token&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python scripts/seed_csi_bridge.py &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --seed-url https://169.254.42.1:8443 --token &lt;span class=&#34;s2&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span class=&#34;nv&#34;&gt;$SEED_TOKEN&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;&lt;/span&gt; --validate
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# 3. Check Seed stats&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python scripts/seed_csi_bridge.py --token &lt;span class=&#34;s2&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span class=&#34;nv&#34;&gt;$SEED_TOKEN&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;&lt;/span&gt; --stats
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;The 8-dim feature vector captures: presence, motion, breathing rate, heart rate, phase variance, person count, fall detection, and RSSI — all normalized to [0.0, 1.0]. See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-069-cognitum-seed-csi-pipeline.md&#34; &gt;ADR-069&lt;/a&gt; for the full architecture.&lt;/p&gt;
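A minimal sketch of assembling that 8-dim vector, assuming the field order follows the prose above; the actual seed_csi_bridge.py wire format may differ, and the field names here are hypothetical:

```python
# Hypothetical sketch of the 8-dim CSI feature vector described above.
# Field order follows the prose; the real bridge format may differ.
FIELDS = [
    "presence", "motion", "breathing_rate", "heart_rate",
    "phase_variance", "person_count", "fall_detected", "rssi",
]

def normalize_features(raw):
    """Clamp each named feature into [0.0, 1.0], in the documented order."""
    vec = []
    for name in FIELDS:
        value = float(raw.get(name, 0.0))
        vec.append(min(1.0, max(0.0, value)))
    return vec

sample = {
    "presence": 1.0, "motion": 0.42, "breathing_rate": 0.27, "heart_rate": 0.55,
    "phase_variance": 1.7,  # out of range, gets clamped to 1.0
    "person_count": 0.2, "fall_detected": 0.0, "rssi": 0.8,
}
vec = normalize_features(sample)
```

Normalizing every field into [0.0, 1.0] keeps the vector directly usable for the Seed's kNN similarity search without per-field scaling.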
&lt;h3 id=&#34;on-device-intelligence-v030-alpha&#34;&gt;On-device intelligence (v0.3.0-alpha)
&lt;/h3&gt;&lt;p&gt;The alpha firmware can analyze signals locally and send compact results instead of raw data. This means the ESP32 works standalone — no server needed for basic sensing. Disabled by default for backward compatibility.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Tier&lt;/th&gt;
          &lt;th&gt;What it does&lt;/th&gt;
          &lt;th&gt;RAM used&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;0&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Off — streams raw CSI only (same as v0.2.0)&lt;/td&gt;
          &lt;td&gt;0 KB&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;1&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Cleans up signals, picks the best subcarriers, compresses data (saves 30-50% bandwidth)&lt;/td&gt;
          &lt;td&gt;~30 KB&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;2&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Everything in Tier 1 + detects presence, measures breathing and heart rate, detects falls&lt;/td&gt;
          &lt;td&gt;~33 KB&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;3&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Everything in Tier 2 + runs custom WASM modules (gesture recognition, intrusion detection, and &lt;a class=&#34;link&#34; href=&#34;docs/edge-modules/README.md&#34; &gt;63 more&lt;/a&gt;)&lt;/td&gt;
          &lt;td&gt;~160 KB/module&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;Enable without reflashing — just reprovision:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;8
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Turn on Tier 2 (vitals) on an already-flashed node&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python firmware/esp32-csi-node/provision.py --port COM7 &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --ssid &lt;span class=&#34;s2&#34;&gt;&amp;#34;YourWiFi&amp;#34;&lt;/span&gt; --password &lt;span class=&#34;s2&#34;&gt;&amp;#34;secret&amp;#34;&lt;/span&gt; --target-ip 192.168.1.20 &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --edge-tier &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Fine-tune detection thresholds (fall-thresh in milli-units: 15000 = 15.0 rad/s²)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python firmware/esp32-csi-node/provision.py --port COM7 &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --edge-tier &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; --vital-int &lt;span class=&#34;m&#34;&gt;500&lt;/span&gt; --fall-thresh &lt;span class=&#34;m&#34;&gt;15000&lt;/span&gt; --subk-count &lt;span class=&#34;m&#34;&gt;16&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;When Tier 2 is active, the node sends a 32-byte vitals packet once per second containing: presence, motion level, breathing BPM, heart rate BPM, confidence scores, fall alert flag, and occupancy count.&lt;/p&gt;
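One way those fields could fit into 32 bytes, as an illustration only; the real wire format is defined by the firmware (ADR-039), so the layout and field names below are assumptions:

```python
import struct

# Hypothetical layout for the 32-byte Tier 2 vitals packet; the firmware's
# actual wire format (ADR-039) may differ.
# 4 flag bytes + 5 little-endian-free standard floats + 8 reserved = 32 bytes
VITALS_FMT = "=BBBBfffff8x"

def parse_vitals(packet):
    """Unpack one vitals packet into a labeled dict."""
    (presence, occupancy, fall_alert, _pad, motion, breathing_bpm,
     heart_bpm, breathing_conf, heart_conf) = struct.unpack(VITALS_FMT, packet)
    return {
        "presence": bool(presence),
        "occupancy": occupancy,
        "fall_alert": bool(fall_alert),
        "motion": motion,
        "breathing_bpm": breathing_bpm,
        "heart_bpm": heart_bpm,
        "breathing_conf": breathing_conf,
        "heart_conf": heart_conf,
    }

# Round-trip a synthetic packet through the assumed layout
packet = struct.pack(VITALS_FMT, 1, 2, 0, 0, 0.4, 14.0, 68.0, 0.9, 0.7)
vitals = parse_vitals(packet)
```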
&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;firmware/esp32-csi-node/README.md&#34; &gt;firmware/esp32-csi-node/README.md&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-039-esp32-edge-intelligence.md&#34; &gt;ADR-039&lt;/a&gt;, &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-044-provisioning-tool-enhancements.md&#34; &gt;ADR-044&lt;/a&gt;, and &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues/34&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Tutorial #34&lt;/a&gt;.&lt;/p&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🦀 Rust Implementation (v2)&lt;/strong&gt; — 810x faster, 54K fps pipeline&lt;/summary&gt;
&lt;h3 id=&#34;performance-benchmarks-validated&#34;&gt;Performance Benchmarks (Validated)
&lt;/h3&gt;&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Operation&lt;/th&gt;
          &lt;th&gt;Python (v1)&lt;/th&gt;
          &lt;th&gt;Rust (v2)&lt;/th&gt;
          &lt;th&gt;Speedup&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;CSI Preprocessing (4x64)&lt;/td&gt;
          &lt;td&gt;~5ms&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;5.19 µs&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;~1000x&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Phase Sanitization (4x64)&lt;/td&gt;
          &lt;td&gt;~3ms&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;3.84 µs&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;~780x&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Feature Extraction (4x64)&lt;/td&gt;
          &lt;td&gt;~8ms&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;9.03 µs&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;~890x&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Motion Detection&lt;/td&gt;
          &lt;td&gt;~1ms&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;186 ns&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;~5400x&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Full Pipeline&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;~15ms&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;18.47 µs&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;~810x&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Vital Signs&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;N/A&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;86 µs&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;11,665 fps&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Resource&lt;/th&gt;
          &lt;th&gt;Python (v1)&lt;/th&gt;
          &lt;th&gt;Rust (v2)&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Memory&lt;/td&gt;
          &lt;td&gt;~500 MB&lt;/td&gt;
          &lt;td&gt;~100 MB&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Docker Image&lt;/td&gt;
          &lt;td&gt;569 MB&lt;/td&gt;
          &lt;td&gt;132 MB&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Tests&lt;/td&gt;
          &lt;td&gt;41&lt;/td&gt;
          &lt;td&gt;542+&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;WASM Support&lt;/td&gt;
          &lt;td&gt;No&lt;/td&gt;
          &lt;td&gt;Yes&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; rust-port/wifi-densepose-rs
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo build --release
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo &lt;span class=&#34;nb&#34;&gt;test&lt;/span&gt; --workspace
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo bench --package wifi-densepose-signal
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;vital-sign-detection&#34;&gt;&lt;/a&gt;&lt;strong&gt;💓 Vital Sign Detection (ADR-021)&lt;/strong&gt; — Breathing and heartbeat via FFT&lt;/summary&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Capability&lt;/th&gt;
          &lt;th&gt;Range&lt;/th&gt;
          &lt;th&gt;Method&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Breathing Rate&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;6-30 BPM (0.1-0.5 Hz)&lt;/td&gt;
          &lt;td&gt;Bandpass filter + FFT peak detection&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Heart Rate&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;40-120 BPM (0.8-2.0 Hz)&lt;/td&gt;
          &lt;td&gt;Bandpass filter + FFT peak detection&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Sampling Rate&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;20 Hz (ESP32 CSI)&lt;/td&gt;
          &lt;td&gt;Real-time streaming&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Confidence&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;0.0-1.0 per sign&lt;/td&gt;
          &lt;td&gt;Spectral coherence + signal quality&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --source simulate --http-port &lt;span class=&#34;m&#34;&gt;3000&lt;/span&gt; --ws-port &lt;span class=&#34;m&#34;&gt;3001&lt;/span&gt; --ui-path ../../ui
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;curl http://localhost:3000/api/v1/vital-signs
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-021-vital-sign-detection-rvdna-pipeline.md&#34; &gt;ADR-021&lt;/a&gt;.&lt;/p&gt;
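The bandpass + FFT peak method in the table can be sketched in a few lines, assuming a 20 Hz CSI amplitude stream; the shipped pipeline is Rust, so this Python version only illustrates the signal processing, and the crude bin-masking stands in for a proper bandpass filter:

```python
import numpy as np

FS = 20.0  # ESP32 CSI sampling rate in Hz

def estimate_rate_bpm(signal, lo_hz, hi_hz):
    """Return the dominant frequency in [lo_hz, hi_hz], in cycles per minute."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                       # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)  # crude bandpass: mask FFT bins
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic chest movement: 0.25 Hz breathing (15 BPM) over 60 seconds
t = np.arange(0, 60, 1.0 / FS)
breathing_bpm = estimate_rate_bpm(np.sin(2 * np.pi * 0.25 * t), 0.1, 0.5)
```

Restricting the search to the 0.1-0.5 Hz band for breathing (0.8-2.0 Hz for heart rate) is what keeps the two rates from masking each other in the same spectrum.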
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;wifi-scan-domain-layer&#34;&gt;&lt;/a&gt;&lt;strong&gt;📡 WiFi Scan Domain Layer (ADR-022/025)&lt;/strong&gt; — 8-stage RSSI pipeline for Windows, macOS, and Linux WiFi&lt;/summary&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Stage&lt;/th&gt;
          &lt;th&gt;Purpose&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Predictive Gating&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Pre-filter scan results using temporal prediction&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Attention Weighting&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Weight BSSIDs by signal relevance&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Spatial Correlation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Cross-AP spatial signal correlation&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Motion Estimation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Detect movement from RSSI variance&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Breathing Extraction&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Extract respiratory rate from sub-Hz oscillations&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Quality Gating&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Reject low-confidence estimates&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Fingerprint Matching&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Location and posture classification via RF fingerprints&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Orchestration&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Fuse all stages into unified sensing output&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
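As a rough sketch of the Motion Estimation stage above: flag movement when the rolling RSSI variance exceeds a threshold. The window size and threshold here are hypothetical; the actual wifi-densepose-wifiscan crate fuses this stage with the other seven:

```python
from collections import deque

class MotionEstimator:
    """Toy version of the RSSI-variance motion stage; thresholds are made up."""

    def __init__(self, window=20, threshold_db2=4.0):
        self.samples = deque(maxlen=window)
        self.threshold = threshold_db2  # variance threshold in dB^2

    def update(self, rssi_dbm):
        """Feed one RSSI reading; return True when variance suggests motion."""
        self.samples.append(float(rssi_dbm))
        if len(self.samples) < 2:
            return False
        mean = sum(self.samples) / len(self.samples)
        var = sum((s - mean) ** 2 for s in self.samples) / len(self.samples)
        return var > self.threshold

est = MotionEstimator()
quiet = [est.update(-60.0) for _ in range(20)]  # static channel: no motion
moving = est.update(-50.0)                      # sudden 10 dB swing: motion
```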
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo &lt;span class=&#34;nb&#34;&gt;test&lt;/span&gt; -p wifi-densepose-wifiscan
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-022-windows-wifi-enhanced-fidelity-ruvector.md&#34; &gt;ADR-022&lt;/a&gt; and &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues/36&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Tutorial #36&lt;/a&gt;.&lt;/p&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;wifi-mat-disaster-response&#34;&gt;&lt;/a&gt;&lt;strong&gt;🚨 WiFi-Mat: Disaster Response&lt;/strong&gt; — Search &amp; rescue, START triage, 3D localization&lt;/summary&gt;
&lt;p&gt;WiFi signals penetrate non-metallic debris (concrete, wood, drywall) where cameras and thermal sensors cannot reach. The WiFi-Mat module (&lt;code&gt;wifi-densepose-mat&lt;/code&gt;, 139 tests) uses CSI analysis to detect survivors trapped under rubble, classify their condition using the START triage protocol, and estimate their 3D position — giving rescue teams actionable intelligence within seconds of deployment.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Capability&lt;/th&gt;
          &lt;th&gt;How It Works&lt;/th&gt;
          &lt;th&gt;Performance Target&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Breathing Detection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Bandpass 0.07-1.0 Hz + Fresnel zone modeling detects chest displacement of 5-10mm at 5 GHz&lt;/td&gt;
          &lt;td&gt;4-60 BPM, &amp;lt;500ms latency&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Heartbeat Detection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Micro-Doppler shift extraction from fine-grained CSI phase variation&lt;/td&gt;
          &lt;td&gt;Via ruvector-temporal-tensor&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;3D Localization&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Multi-AP triangulation + CSI fingerprint matching + depth estimation through rubble layers&lt;/td&gt;
          &lt;td&gt;3-5m penetration&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;START Triage&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Ensemble classifier votes on breathing + movement + vital stability → P1-P4 priority&lt;/td&gt;
          &lt;td&gt;&amp;lt;1% false negative&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Zone Scanning&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;16+ concurrent scan zones with periodic re-scan and audit logging&lt;/td&gt;
          &lt;td&gt;Full disaster site&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;Triage classification (START protocol compatible):&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Status&lt;/th&gt;
          &lt;th&gt;Color&lt;/th&gt;
          &lt;th&gt;Detection Criteria&lt;/th&gt;
          &lt;th&gt;Priority&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Immediate&lt;/td&gt;
          &lt;td&gt;Red&lt;/td&gt;
          &lt;td&gt;Breathing detected, no movement&lt;/td&gt;
          &lt;td&gt;P1&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Delayed&lt;/td&gt;
          &lt;td&gt;Yellow&lt;/td&gt;
          &lt;td&gt;Movement + breathing, stable vitals&lt;/td&gt;
          &lt;td&gt;P2&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Minor&lt;/td&gt;
          &lt;td&gt;Green&lt;/td&gt;
          &lt;td&gt;Strong movement, responsive patterns&lt;/td&gt;
          &lt;td&gt;P3&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Deceased&lt;/td&gt;
          &lt;td&gt;Black&lt;/td&gt;
          &lt;td&gt;No vitals for &amp;gt;30 min continuous scan&lt;/td&gt;
          &lt;td&gt;P4&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
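The table above can be approximated as a single decision rule; the real wifi-densepose-mat module uses an ensemble vote across breathing, movement, and vital-stability detectors, so the argument names and fall-through ordering here are a hedged simplification:

```python
def triage(breathing, movement, vitals_stable, no_vitals_minutes=0.0):
    """Map detections onto the (status, priority) pairs from the table above."""
    if no_vitals_minutes > 30.0:
        return ("Deceased", "P4")           # no vitals over a sustained scan
    if breathing and not movement:
        return ("Immediate", "P1")          # breathing but unresponsive
    if breathing and movement:
        # Stable vitals can wait; unstable vitals escalate back to P1
        return ("Delayed", "P2") if vitals_stable else ("Immediate", "P1")
    if movement:
        return ("Minor", "P3")              # responsive movement patterns
    return ("Immediate", "P1")              # no confirmed signal: assume worst
```

Defaulting ambiguous cases to P1 reflects the stated design target of keeping false negatives under 1%: it is cheaper to over-triage than to miss a survivor.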
&lt;p&gt;&lt;strong&gt;Deployment modes:&lt;/strong&gt; portable (single TX/RX handheld), distributed (multiple APs around collapse site), drone-mounted (UAV scanning), vehicle-mounted (mobile command post).&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-rust&#34; data-lang=&#34;rust&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;k&#34;&gt;use&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;wifi_densepose_mat&lt;/span&gt;::&lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;DisasterResponse&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;DisasterConfig&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;DisasterType&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ScanZone&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ZoneBounds&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;};&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;config&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;DisasterConfig&lt;/span&gt;::&lt;span class=&#34;n&#34;&gt;builder&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;()&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;    &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;disaster_type&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;DisasterType&lt;/span&gt;::&lt;span class=&#34;n&#34;&gt;Earthquake&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;    &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;sensitivity&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;mf&#34;&gt;0.85&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;    &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;max_depth&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;mf&#34;&gt;5.0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;    &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;build&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;();&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kd&#34;&gt;let&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;mut&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;response&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;DisasterResponse&lt;/span&gt;::&lt;span class=&#34;n&#34;&gt;new&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;config&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;response&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;initialize_event&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;location&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;Building collapse&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;?&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;response&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;add_zone&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ScanZone&lt;/span&gt;::&lt;span class=&#34;n&#34;&gt;new&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;North Wing&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;ZoneBounds&lt;/span&gt;::&lt;span class=&#34;n&#34;&gt;rectangle&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;mf&#34;&gt;0.0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;mf&#34;&gt;0.0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;mf&#34;&gt;30.0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;mf&#34;&gt;20.0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)))&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;?&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;response&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;start_scanning&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;().&lt;/span&gt;&lt;span class=&#34;k&#34;&gt;await&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;?&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Safety guarantees:&lt;/strong&gt; fail-safe defaults (assume life present on ambiguous signals), redundant multi-algorithm voting, complete audit trail, offline-capable (no network required).&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class=&#34;link&#34; href=&#34;docs/wifi-mat-user-guide.md&#34; &gt;WiFi-Mat User Guide&lt;/a&gt; | &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-001-wifi-mat-disaster-detection.md&#34; &gt;ADR-001&lt;/a&gt; | &lt;a class=&#34;link&#34; href=&#34;docs/ddd/wifi-mat-domain-model.md&#34; &gt;Domain Model&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;sota-signal-processing&#34;&gt;&lt;/a&gt;&lt;strong&gt;🔬 SOTA Signal Processing (ADR-014)&lt;/strong&gt; — 6 research-grade algorithms&lt;/summary&gt;
&lt;p&gt;The signal processing layer bridges the gap between raw commodity WiFi hardware output and research-grade sensing accuracy. Each algorithm addresses a specific limitation of naive CSI processing — from hardware-induced phase corruption to environment-dependent multipath interference. All six are implemented in &lt;code&gt;wifi-densepose-signal/src/&lt;/code&gt; with deterministic tests and no mock data.&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Algorithm&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Why It Matters&lt;/th&gt;
          &lt;th&gt;Math&lt;/th&gt;
          &lt;th&gt;Source&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Conjugate Multiplication&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Multiplies CSI antenna pairs: &lt;code&gt;H₁[k] × conj(H₂[k])&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Cancels CFO, SFO, and packet detection delay that corrupt raw phase — preserves only environment-caused phase differences&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;CSI_ratio[k] = H₁[k] * conj(H₂[k])&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://dl.acm.org/doi/10.1145/2789168.2790124&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;SpotFi&lt;/a&gt; (SIGCOMM 2015)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Hampel Filter&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Replaces outliers using running median ± scaled MAD&lt;/td&gt;
          &lt;td&gt;Z-score uses mean/std which are corrupted by the very outliers it detects (masking effect). Hampel uses median/MAD, resisting up to 50% contamination&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;σ̂ = 1.4826 × MAD&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Standard DSP; WiGest (2015)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Fresnel Zone Model&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Models signal variation from chest displacement crossing Fresnel zone boundaries&lt;/td&gt;
          &lt;td&gt;Zero-crossing counting fails in multipath-rich environments. Fresnel predicts &lt;em&gt;where&lt;/em&gt; breathing should appear based on TX-RX-body geometry&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ΔΦ = 2π × 2Δd / λ&lt;/code&gt;, &lt;code&gt;A = |sin(ΔΦ/2)|&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://dl.acm.org/doi/10.1145/3300061.3345431&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;FarSense&lt;/a&gt; (MobiCom 2019)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;CSI Spectrogram&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Sliding-window FFT (STFT) per subcarrier → 2D time-frequency matrix&lt;/td&gt;
          &lt;td&gt;Breathing = 0.2-0.4 Hz band, walking = 1-2 Hz, static = noise. 2D structure enables CNN spatial pattern recognition that 1D features miss&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;S[t,f] = |Σₙ x[n] w[n-t] e^{-j2πfn}|²&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Standard since 2018&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Subcarrier Selection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Ranks subcarriers by motion sensitivity (variance ratio) and selects top-K&lt;/td&gt;
          &lt;td&gt;Not all subcarriers respond to motion — some sit in multipath nulls. Selecting the 10-20 most sensitive improves SNR by 6-10 dB&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;sensitivity[k] = var_motion / var_static&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://dl.acm.org/doi/10.1145/3117811.3117826&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;WiDance&lt;/a&gt; (MobiCom 2017)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Body Velocity Profile&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Extracts velocity distribution from Doppler shifts across subcarriers&lt;/td&gt;
          &lt;td&gt;BVP is domain-independent — same velocity profile regardless of room layout, furniture, or AP placement. Basis for cross-environment recognition&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;BVP[v,t] = Σₖ |STFTₖ[v,t]|&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://dl.acm.org/doi/10.1145/3328916&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Widar 3.0&lt;/a&gt; (MobiSys 2019)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;Processing pipeline order:&lt;/strong&gt; Raw CSI → Conjugate multiplication (phase cleaning) → Hampel filter (outlier removal) → Subcarrier selection (top-K) → CSI spectrogram (time-frequency) → Fresnel model (breathing) + BVP (activity)&lt;/p&gt;
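The Hampel step in the pipeline above (running median ± scaled MAD, `σ̂ = 1.4826 × MAD`) fits in a few lines of Rust. This is a minimal sketch, not the `wifi-densepose-signal` implementation — the function name and signature here are illustrative:

```rust
/// Replace samples that deviate from the window median by more than
/// `n_sigma` robust standard deviations, where sigma ≈ 1.4826 × MAD.
fn hampel_filter(signal: &[f64], half_window: usize, n_sigma: f64) -> Vec<f64> {
    let mut out = signal.to_vec();
    for i in 0..signal.len() {
        let lo = i.saturating_sub(half_window);
        let hi = (i + half_window + 1).min(signal.len());
        let mut win: Vec<f64> = signal[lo..hi].to_vec();
        win.sort_by(|a, b| a.partial_cmp(b).unwrap());
        let med = win[win.len() / 2];
        // Median absolute deviation of the window — robust to outliers,
        // unlike mean/std, which the outliers themselves corrupt.
        let mut devs: Vec<f64> = win.iter().map(|x| (x - med).abs()).collect();
        devs.sort_by(|a, b| a.partial_cmp(b).unwrap());
        let sigma = 1.4826 * devs[devs.len() / 2];
        if (signal[i] - med).abs() > n_sigma * sigma {
            out[i] = med; // outlier: replace with the running median
        }
    }
    out
}
```

A spike of 100 in an otherwise flat CSI amplitude trace is replaced by the window median, while the surrounding samples pass through unchanged.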
&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-014-sota-signal-processing.md&#34; &gt;ADR-014&lt;/a&gt; for full mathematical derivations.&lt;/p&gt;
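For intuition, the Fresnel amplitude formula in the table (`ΔΦ = 2π × 2Δd / λ`, `A = |sin(ΔΦ/2)|`) reduces to a two-line helper. The function name is illustrative, and the test values assume 2.4 GHz WiFi (λ ≈ 0.125 m):

```rust
/// Predicted amplitude swing for a chest displacement `delta_d` (meters)
/// at wavelength `lambda` (meters): delta_phi = 2*pi * 2*delta_d / lambda,
/// amplitude = |sin(delta_phi / 2)|.
fn fresnel_amplitude(delta_d: f64, lambda: f64) -> f64 {
    let delta_phi = 2.0 * std::f64::consts::PI * 2.0 * delta_d / lambda;
    (delta_phi / 2.0).sin().abs()
}
```

At `delta_d = λ/4` the round-trip path changes by half a wavelength, `ΔΦ = π`, and the predicted swing peaks at 1 — which is why breathing visibility depends on where the chest sits relative to the Fresnel zone boundaries.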
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-models--training&#34;&gt;🧠 Models &amp;amp; Training
&lt;/h2&gt;&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;ai-backbone-ruvector&#34;&gt;&lt;/a&gt;&lt;strong&gt;🤖 AI Backbone: RuVector&lt;/strong&gt; — Attention, graph algorithms, and edge-AI compression powering the sensing pipeline&lt;/summary&gt;
&lt;p&gt;Raw WiFi signals are noisy, redundant, and environment-dependent. &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;RuVector&lt;/a&gt; is the AI intelligence layer that transforms them into clean, structured input for the DensePose neural network. It uses &lt;strong&gt;attention mechanisms&lt;/strong&gt; to learn which signals to trust, &lt;strong&gt;graph algorithms&lt;/strong&gt; that automatically discover which WiFi channels are sensitive to body motion, and &lt;strong&gt;compressed representations&lt;/strong&gt; that make edge inference possible on an $8 microcontroller.&lt;/p&gt;
&lt;p&gt;Without RuVector, WiFi DensePose would need hand-tuned thresholds, brute-force matrix math, and 4x more memory — making real-time edge inference impossible.&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-gdscript3&#34; data-lang=&#34;gdscript3&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;Raw&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;WiFi&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;CSI&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;mi&#34;&gt;56&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;subcarriers&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;noisy&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;o&#34;&gt;|&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;o&#34;&gt;+--&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ruvector&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;mincut&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;----------&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Which&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;channels&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;carry&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;body&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;motion&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;signal&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;?&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;learned&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;graph&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;partitioning&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;o&#34;&gt;+--&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ruvector&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;attn&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;mincut&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;-----&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Which&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;time&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;frames&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;are&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;signal&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;vs&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;noise&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;?&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;attention&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;gated&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;filtering&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;o&#34;&gt;+--&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ruvector&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;attention&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;-------&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;How&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;to&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;fuse&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;multi&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;antenna&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;data&lt;/span&gt;&lt;span class=&#34;err&#34;&gt;?&lt;/span&gt; &lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;learned&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;weighted&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;aggregation&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;o&#34;&gt;|&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;n&#34;&gt;v&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;n&#34;&gt;Clean&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;structured&lt;/span&gt; &lt;span class=&#34;k&#34;&gt;signal&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;--&amp;gt;&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;DensePose&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Neural&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Network&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;--&amp;gt;&lt;/span&gt; &lt;span class=&#34;mi&#34;&gt;17&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;keypoint&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;body&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;pose&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                         &lt;span class=&#34;o&#34;&gt;--&amp;gt;&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;FFT&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Vital&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;Signs&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;-----------&amp;gt;&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;breathing&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;rate&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;heart&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;rate&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;                         &lt;span class=&#34;o&#34;&gt;--&amp;gt;&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;ruvector&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;solver&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;------------&amp;gt;&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;physics&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;-&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;based&lt;/span&gt; &lt;span class=&#34;n&#34;&gt;localization&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;The &lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;wifi-densepose-ruvector&lt;/code&gt;&lt;/a&gt; crate (&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-017-ruvector-signal-mat-integration.md&#34; &gt;ADR-017&lt;/a&gt;) connects the integration points below:&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;AI Capability&lt;/th&gt;
          &lt;th&gt;What It Replaces&lt;/th&gt;
          &lt;th&gt;RuVector Crate&lt;/th&gt;
          &lt;th&gt;Result&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Self-optimizing channel selection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Hand-tuned thresholds that break when rooms change&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvector-mincut&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Graph min-cut adapts to any environment automatically&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Attention-based signal cleaning&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Fixed energy cutoffs that miss subtle breathing&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvector-attn-mincut&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Learned gating amplifies body signals, suppresses noise&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Learned signal fusion&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Simple averaging where one bad channel corrupts all&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvector-attention&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Transformer-style attention downweights corrupted channels&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Physics-informed localization&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Expensive nonlinear solvers&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvector-solver&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Sparse least-squares Fresnel geometry in real-time&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;O(1) survivor triangulation&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;O(N³) matrix inversion&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvector-solver&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Neumann series linearization for instant position updates&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;75% memory compression&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;13.4 MB breathing buffers that overflow edge devices&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvector-temporal-tensor&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Tiered 3-8 bit quantization fits 60s of vitals in 3.4 MB&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
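The learned-fusion row above can be illustrated with a softmax-weighted aggregation: channels with low learned scores contribute almost nothing, so one corrupted antenna no longer drags down the average. This is a toy sketch of the idea, not the `ruvector-attention` API:

```rust
/// Fuse per-channel readings with softmax weights derived from per-channel
/// scores, so low-scoring (noisy) channels are downweighted instead of
/// corrupting a plain average.
fn attention_fuse(readings: &[f64], scores: &[f64]) -> f64 {
    // Subtract the max score before exponentiating for numerical stability.
    let max = scores.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = scores.iter().map(|s| (s - max).exp()).collect();
    let z: f64 = exps.iter().sum();
    readings.iter().zip(&exps).map(|(r, e)| r * e / z).sum()
}
```

With readings `[1.0, 1.0, 10.0]` and a strongly negative score on the third (corrupted) channel, the fused value stays near 1.0, whereas a plain mean would be 4.0.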
&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues/67&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;issue #67&lt;/a&gt; for a deep dive with code examples, or &lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;cargo add wifi-densepose-ruvector&lt;/code&gt;&lt;/a&gt; to use it directly.&lt;/p&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;rvf-model-container&#34;&gt;&lt;/a&gt;&lt;strong&gt;📦 RVF Model Container&lt;/strong&gt; — Single-file deployment with progressive loading&lt;/summary&gt;
&lt;p&gt;The &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/rvf&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;RuVector Format (RVF)&lt;/a&gt; packages an entire trained model — weights, HNSW indexes, quantization codebooks, SONA adaptation deltas, and WASM inference runtime — into a single self-contained binary file. No external dependencies are needed at deployment time.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Container structure:&lt;/strong&gt;&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;15
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;16
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;17
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;18
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;19
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-fallback&#34; data-lang=&#34;fallback&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;┌──────────────────────────────────────────────────────┐
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│ RVF Container (.rvf)                                  │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│                                                       │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  ┌─────────────┐  64-byte header per segment          │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  │ Manifest     │  Magic: 0x52564653 (&amp;#34;RVFS&amp;#34;)         │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  ├─────────────┤  Type + content hash + compression   │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  │ Weights      │  Model parameters (f32/f16/u8)      │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  ├─────────────┤                                      │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  │ HNSW Index   │  Vector search index                │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  ├─────────────┤                                      │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  │ Quant        │  Quantization codebooks              │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  ├─────────────┤                                      │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  │ SONA Profile │  LoRA deltas + EWC++ Fisher matrix  │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  ├─────────────┤                                      │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  │ Witness      │  Ed25519 training proof              │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  ├─────────────┤                                      │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  │ Vitals Config│  Breathing/HR filter parameters     │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;│  └─────────────┘                                      │
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;└──────────────────────────────────────────────────────┘
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Deployment targets:&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Target&lt;/th&gt;
          &lt;th&gt;Quantization&lt;/th&gt;
          &lt;th&gt;Size&lt;/th&gt;
          &lt;th&gt;Load Time&lt;/th&gt;
          &lt;th&gt;Use Case&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;ESP32 / IoT&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;int4&lt;/td&gt;
          &lt;td&gt;~0.7 MB&lt;/td&gt;
          &lt;td&gt;&amp;lt;5ms (Layer A)&lt;/td&gt;
          &lt;td&gt;Presence + breathing only&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Mobile / WebView&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;int8&lt;/td&gt;
          &lt;td&gt;~6 MB&lt;/td&gt;
          &lt;td&gt;~200ms (Layer B)&lt;/td&gt;
          &lt;td&gt;Pose estimation on phone&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Browser (WASM)&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;int8&lt;/td&gt;
          &lt;td&gt;~10 MB&lt;/td&gt;
          &lt;td&gt;~500ms (Layer B)&lt;/td&gt;
          &lt;td&gt;In-browser demo&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Field (WiFi-Mat)&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;fp16&lt;/td&gt;
          &lt;td&gt;~62 MB&lt;/td&gt;
          &lt;td&gt;~2s (Layer C)&lt;/td&gt;
          &lt;td&gt;Full DensePose + disaster triage&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Server / Cloud&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;f32&lt;/td&gt;
          &lt;td&gt;50+ MB&lt;/td&gt;
          &lt;td&gt;~3s (Layer C)&lt;/td&gt;
          &lt;td&gt;Training + full inference&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
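As a toy illustration of the per-segment header check implied by the container diagram above: the magic constant `0x52564653` spells &#34;RVFS&#34; in ASCII. The byte order and layout here are assumptions for the sketch, not the `rvf-wire` specification:

```rust
/// RVF segment headers begin with the magic 0x52564653 ("RVFS" in ASCII:
/// 0x52='R', 0x56='V', 0x46='F', 0x53='S'). Byte order is assumed
/// big-endian for this sketch.
const RVF_MAGIC: u32 = 0x5256_4653;

fn has_rvf_magic(header: &[u8]) -> bool {
    header.len() >= 4
        && u32::from_be_bytes([header[0], header[1], header[2], header[3]]) == RVF_MAGIC
}
```

A loader would run a check like this on each 64-byte segment header before trusting the type, content hash, and compression fields that follow.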
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Property&lt;/th&gt;
          &lt;th&gt;Detail&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Format&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Segment-based binary, 20+ segment types, CRC32 integrity per segment&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Progressive Loading&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;Layer A&lt;/strong&gt; (&amp;lt;5ms): manifest + entry points → &lt;strong&gt;Layer B&lt;/strong&gt; (100ms-1s): hot weights + adjacency → &lt;strong&gt;Layer C&lt;/strong&gt; (seconds): full graph&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Signing&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Ed25519 training proofs for verifiable provenance — chain of custody from training data to deployed model&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Quantization&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Per-segment temperature-tiered: f32 (full), f16 (half), u8 (int8), int4 — with SIMD-accelerated distance computation&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;CLI&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;--export-rvf&lt;/code&gt; (generate), &lt;code&gt;--load-rvf&lt;/code&gt; (config), &lt;code&gt;--save-rvf&lt;/code&gt; (persist), &lt;code&gt;--model&lt;/code&gt; (inference), &lt;code&gt;--progressive&lt;/code&gt; (3-layer load)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;8
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Export model package&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --export-rvf wifi-densepose-v1.rvf
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Load and run with progressive loading&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --model wifi-densepose-v1.rvf --progressive
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Export via Docker&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker run --rm -v &lt;span class=&#34;k&#34;&gt;$(&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;pwd&lt;/span&gt;&lt;span class=&#34;k&#34;&gt;)&lt;/span&gt;:/out ruvnet/wifi-densepose:latest --export-rvf /out/model.rvf
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;Built on the &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/rvf&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;rvf&lt;/a&gt; crate family (rvf-types, rvf-wire, rvf-manifest, rvf-index, rvf-quant, rvf-crypto, rvf-runtime). See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-023-trained-densepose-model-ruvector-pipeline.md&#34; &gt;ADR-023&lt;/a&gt;.&lt;/p&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;training--fine-tuning&#34;&gt;&lt;/a&gt;&lt;strong&gt;🧬 Training &amp;amp; Fine-Tuning&lt;/strong&gt; — MM-Fi/Wi-Pose pre-training, SONA adaptation&lt;/summary&gt;
&lt;p&gt;The training pipeline implements 8 phases in pure Rust (7,832 lines, zero external ML dependencies). It trains a graph transformer with cross-attention to map CSI feature matrices to 17 COCO body keypoints and DensePose UV coordinates — following the approach from the CMU &amp;ldquo;DensePose From WiFi&amp;rdquo; paper (&lt;a class=&#34;link&#34; href=&#34;https://arxiv.org/abs/2301.00250&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;arXiv:2301.00250&lt;/a&gt;). RuVector crates provide the core building blocks: &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-attention&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvector-attention&lt;/a&gt; for cross-attention layers, &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-mincut&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvector-mincut&lt;/a&gt; for multi-person matching, and &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-temporal-tensor&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvector-temporal-tensor&lt;/a&gt; for CSI buffer compression.&lt;/p&gt;
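The cross-attention step above (CSI features → body keypoints) can be sketched numerically. This is a minimal NumPy illustration, not the actual `graph_transformer.rs` implementation: 17 keypoint query vectors attend over per-subcarrier CSI feature tokens via scaled dot-product attention, and the embedding dimensions here are assumptions chosen for the sketch.

```python
import numpy as np

def cross_attention(queries, keys, values):
    """Scaled dot-product cross-attention: queries attend over keys/values."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)          # (n_queries, n_tokens)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ values                         # (n_queries, d_value)

rng = np.random.default_rng(0)
kp_queries = rng.normal(size=(17, 64))   # one learned query per COCO keypoint
csi_tokens = rng.normal(size=(56, 64))   # one feature token per subcarrier
pose_feats = cross_attention(kp_queries, csi_tokens, csi_tokens)
print(pose_feats.shape)  # (17, 64)
```

Each of the 17 rows of the result is a pose feature built from a learned mixture of subcarrier evidence; a keypoint head would then regress coordinates from it.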
&lt;p&gt;&lt;strong&gt;Three-tier data strategy:&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Tier&lt;/th&gt;
          &lt;th&gt;Method&lt;/th&gt;
          &lt;th&gt;Purpose&lt;/th&gt;
          &lt;th&gt;RuVector Integration&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;1. Pre-train&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;MM-Fi + Wi-Pose public datasets&lt;/td&gt;
          &lt;td&gt;Cross-environment generalization (multi-subject, multi-room)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvector-temporal-tensor&lt;/code&gt; compresses CSI windows (114→56 subcarrier resampling)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;2. Fine-tune&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;ESP32 CSI + camera pseudo-labels&lt;/td&gt;
          &lt;td&gt;Environment-specific multipath adaptation&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvector-solver&lt;/code&gt; for Fresnel geometry, &lt;code&gt;ruvector-attn-mincut&lt;/code&gt; for subcarrier gating&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;3. SONA adapt&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Micro-LoRA (rank-4) + EWC++&lt;/td&gt;
          &lt;td&gt;Continuous on-device learning without catastrophic forgetting&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/sona&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;SONA&lt;/a&gt; architecture (Self-Optimizing Neural Architecture)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
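The 114→56 subcarrier resampling mentioned in tier 1 can be sketched as plain linear interpolation over the subcarrier axis. This is an illustrative approximation — the actual `dataset.rs` loader may use a different scheme:

```python
import numpy as np

def resample_subcarriers(csi_row, n_out=56):
    """Resample one CSI amplitude vector onto n_out subcarriers via
    linear interpolation (one plausible way to do 114->56 or 30->56)."""
    x_in = np.linspace(0.0, 1.0, len(csi_row))
    x_out = np.linspace(0.0, 1.0, n_out)
    return np.interp(x_out, x_in, csi_row)

frame = np.sin(np.linspace(0, np.pi, 114))   # synthetic 114-subcarrier frame
resampled = resample_subcarriers(frame)
print(resampled.shape)  # (56,)
```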
&lt;p&gt;&lt;strong&gt;Training pipeline components:&lt;/strong&gt;&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Phase&lt;/th&gt;
          &lt;th&gt;Module&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;RuVector Crate&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;1&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;dataset.rs&lt;/code&gt; (850 lines)&lt;/td&gt;
          &lt;td&gt;MM-Fi &lt;code&gt;.npy&lt;/code&gt; + Wi-Pose &lt;code&gt;.mat&lt;/code&gt; loaders, subcarrier resampling (114→56, 30→56), windowing&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-temporal-tensor&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvector-temporal-tensor&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;2&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;graph_transformer.rs&lt;/code&gt; (855 lines)&lt;/td&gt;
          &lt;td&gt;COCO BodyGraph (17 kp, 16 edges), AntennaGraph, multi-head CrossAttention, GCN message passing&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-attention&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvector-attention&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;3&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;trainer.rs&lt;/code&gt; (881 lines)&lt;/td&gt;
          &lt;td&gt;6-term composite loss (MSE, CE, UV, temporal, bone, symmetry), SGD+momentum, cosine+warmup, PCK/OKS&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-mincut&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvector-mincut&lt;/a&gt; (person matching)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;4&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;sona.rs&lt;/code&gt; (639 lines)&lt;/td&gt;
          &lt;td&gt;LoRA adapters (A×B delta), EWC++ Fisher regularization, EnvironmentDetector (3-sigma drift)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/sona&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;sona&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;5&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;sparse_inference.rs&lt;/code&gt; (753 lines)&lt;/td&gt;
          &lt;td&gt;NeuronProfiler hot/cold partitioning, SparseLinear (skip cold rows), INT8/FP16 quantization&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-sparse-inference&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvector-sparse-inference&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;6&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;rvf_pipeline.rs&lt;/code&gt; (1,027 lines)&lt;/td&gt;
          &lt;td&gt;Progressive 3-layer loader, HNSW index, OverlayGraph, &lt;code&gt;RvfModelBuilder&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-core&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;ruvector-core&lt;/a&gt; (HNSW)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;7&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;rvf_container.rs&lt;/code&gt; (914 lines)&lt;/td&gt;
          &lt;td&gt;Binary container format, 6+ segment types, CRC32 integrity&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/rvf&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;rvf&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;8&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;main.rs&lt;/code&gt; integration&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;--train&lt;/code&gt;, &lt;code&gt;--model&lt;/code&gt;, &lt;code&gt;--progressive&lt;/code&gt; CLI flags, REST endpoints&lt;/td&gt;
          &lt;td&gt;—&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
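Phase 3's composite loss includes two anatomical terms — bone length and left/right symmetry — that can be sketched as below. The edge list, weights, and test skeleton are illustrative assumptions, not the actual `trainer.rs` definitions:

```python
import numpy as np

# Illustrative COCO arm bones (index pairs into a (17, 2) keypoint array):
# 5/6 = shoulders, 7/8 = elbows, 9/10 = wrists
BONES = [(5, 7), (7, 9), (6, 8), (8, 10)]
SYMMETRIC = [((5, 7), (6, 8)), ((7, 9), (8, 10))]   # left/right limb pairs

def bone_lengths(kp):
    return np.array([np.linalg.norm(kp[a] - kp[b]) for a, b in BONES])

def bone_loss(pred, target):
    """Penalize predicted bone lengths that deviate from ground truth."""
    return float(np.mean((bone_lengths(pred) - bone_lengths(target)) ** 2))

def symmetry_loss(pred):
    """Penalize mirrored limbs whose lengths disagree."""
    diffs = [
        (np.linalg.norm(pred[a1] - pred[b1])
         - np.linalg.norm(pred[a2] - pred[b2])) ** 2
        for (a1, b1), (a2, b2) in SYMMETRIC
    ]
    return float(np.mean(diffs))

target = np.zeros((17, 2))
target[[5, 6]] = [[-1, 0], [1, 0]]       # shoulders
target[[7, 8]] = [[-1, -1], [1, -1]]     # elbows
pred = target.copy()
pred[7] += [0.0, -0.5]                   # stretch the left upper arm
print(bone_loss(pred, target) > 0, symmetry_loss(target) == 0.0)
```

The full loss would add the MSE, cross-entropy, UV, and temporal terms with per-term weights.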
&lt;p&gt;&lt;strong&gt;SONA (Self-Optimizing Neural Architecture)&lt;/strong&gt; — the continuous adaptation system:&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Component&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Why It Matters&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Micro-LoRA (rank-4)&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Trains small A×B weight deltas instead of full weights&lt;/td&gt;
          &lt;td&gt;100x fewer parameters to update → feasible on ESP32-S3-class hardware&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;EWC++ (Fisher matrix)&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Penalizes changes to important weights from previous environments&lt;/td&gt;
          &lt;td&gt;Prevents catastrophic forgetting when moving between rooms&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;EnvironmentDetector&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Monitors CSI feature drift with 3-sigma threshold&lt;/td&gt;
          &lt;td&gt;Auto-triggers adaptation when the model is moved to a new space&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Best-epoch snapshot&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Saves best validation loss weights, restores before export&lt;/td&gt;
          &lt;td&gt;Prevents shipping overfit final-epoch parameters&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
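The Micro-LoRA and EWC++ rows reduce to a few lines of linear algebra. A hedged sketch — the dimensions and the diagonal-Fisher simplification are chosen for illustration and are not the `sona.rs` code:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 64, 64, 4

W = rng.normal(size=(d_out, d_in))            # frozen base weight
A = rng.normal(size=(d_out, rank)) * 0.01     # trainable LoRA factor
B = rng.normal(size=(rank, d_in)) * 0.01      # trainable LoRA factor

def lora_forward(x):
    """Adapted layer: base weights plus a low-rank A @ B delta."""
    return (W + A @ B) @ x

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC regularizer: quadratic penalty on drift from previous-task
    weights, scaled by a (diagonal) Fisher importance estimate."""
    return lam * np.sum(fisher * (theta - theta_star) ** 2)

x = rng.normal(size=d_in)
y = lora_forward(x)
# Trainable parameter count: rank * (d_in + d_out) vs full d_in * d_out
print(rank * (d_in + d_out), "vs", d_in * d_out)  # 512 vs 4096
```

Only A and B are updated during adaptation, which is where the large reduction in trainable parameters comes from; the EWC penalty is added to the adaptation loss to keep important weights near their previous-environment values.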
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;9
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Pre-train on MM-Fi dataset&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --train --dataset data/ --dataset-type mmfi --epochs &lt;span class=&#34;m&#34;&gt;100&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Train and export to RVF in one step&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --train --dataset data/ --epochs &lt;span class=&#34;m&#34;&gt;100&lt;/span&gt; --save-rvf model.rvf
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Via Docker (no toolchain needed)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker run --rm -v &lt;span class=&#34;k&#34;&gt;$(&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;pwd&lt;/span&gt;&lt;span class=&#34;k&#34;&gt;)&lt;/span&gt;/data:/data ruvnet/wifi-densepose:latest &lt;span class=&#34;se&#34;&gt;\
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  --train --dataset /data --epochs &lt;span class=&#34;m&#34;&gt;100&lt;/span&gt; --export-rvf /data/model.rvf
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-023-trained-densepose-model-ruvector-pipeline.md&#34; &gt;ADR-023&lt;/a&gt; · &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/sona&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;SONA crate&lt;/a&gt; · &lt;a class=&#34;link&#34; href=&#34;https://arxiv.org/abs/2301.00250&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;arXiv:2301.00250&lt;/a&gt;&lt;/p&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;ruvector-crates&#34;&gt;&lt;/a&gt;&lt;strong&gt;🔩 RuVector Crates&lt;/strong&gt; — 11 vendored signal intelligence crates from &lt;a href=&#34;https://github.com/ruvnet/ruvector&#34;&gt;github.com/ruvnet/ruvector&lt;/a&gt;&lt;/summary&gt;
&lt;p&gt;&lt;strong&gt;5 directly used crates&lt;/strong&gt; (v2.0.4, declared in &lt;code&gt;Cargo.toml&lt;/code&gt;, 7 integration points):&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Crate&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Where It&amp;rsquo;s Used in WiFi-DensePose&lt;/th&gt;
          &lt;th&gt;Source&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-attention&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-attention&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Scaled dot-product attention, MoE routing, sparse attention&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;model.rs&lt;/code&gt; (spatial attention), &lt;code&gt;bvp.rs&lt;/code&gt; (sensitivity-weighted velocity profiles)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-attention&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-mincut&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-mincut&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Dynamic min-cut, O(n^1.5 log n)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;metrics.rs&lt;/code&gt; (DynamicPersonMatcher — multi-person assignment), &lt;code&gt;subcarrier_selection.rs&lt;/code&gt; (sensitive/insensitive split)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-mincut&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-attn-mincut&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-attn-mincut&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Attention-gated spectrogram noise suppression&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;model.rs&lt;/code&gt; (antenna attention gating), &lt;code&gt;spectrogram.rs&lt;/code&gt; (gate noisy time-frequency bins)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-attn-mincut&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-solver&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-solver&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Sparse Neumann series solver O(sqrt(n))&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;fresnel.rs&lt;/code&gt; (TX-body-RX geometry), &lt;code&gt;triangulation.rs&lt;/code&gt; (3D localization), &lt;code&gt;subcarrier.rs&lt;/code&gt; (sparse interpolation 114→56)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-solver&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-temporal-tensor&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-temporal-tensor&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Tiered temporal compression (8/7/5/3-bit)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;dataset.rs&lt;/code&gt; (CSI buffer compression), &lt;code&gt;breathing.rs&lt;/code&gt; + &lt;code&gt;heartbeat.rs&lt;/code&gt; (compressed vital sign spectrograms)&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-temporal-tensor&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&lt;strong&gt;6 additional vendored crates&lt;/strong&gt; (used by training pipeline and inference):&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Crate&lt;/th&gt;
          &lt;th&gt;What It Does&lt;/th&gt;
          &lt;th&gt;Source&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-core&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-core&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;VectorDB engine, HNSW index, SIMD distance functions, quantization codebooks&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-core&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-gnn&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-gnn&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Graph neural network layers, graph attention, EWC-regularized training&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-gnn&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-graph-transformer&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-graph-transformer&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Proof-gated graph transformer with cross-attention&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-graph-transformer&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-sparse-inference&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-sparse-inference&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;PowerInfer-style hot/cold neuron partitioning, skip cold rows at runtime&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-sparse-inference&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-nervous-system&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-nervous-system&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;PredictiveLayer, OscillatoryRouter, Hopfield associative memory&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-nervous-system&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector/tree/main/crates/ruvector-coherence&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;code&gt;ruvector-coherence&lt;/code&gt;&lt;/a&gt;&lt;/td&gt;
          &lt;td&gt;Spectral coherence monitoring, HNSW graph health, Fiedler connectivity&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/ruvector-coherence&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crate&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;The full RuVector ecosystem includes 90+ crates. See &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;github.com/ruvnet/ruvector&lt;/a&gt; for the complete library, and &lt;a class=&#34;link&#34; href=&#34;vendor/ruvector/&#34; &gt;&lt;code&gt;vendor/ruvector/&lt;/code&gt;&lt;/a&gt; for the vendored source in this project.&lt;/p&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;ruv-neural&#34;&gt;&lt;/a&gt;&lt;strong&gt;🧠 rUv Neural&lt;/strong&gt; — Brain topology analysis ecosystem for neural decoding and medical sensing&lt;/summary&gt;
&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;rust-port/wifi-densepose-rs/crates/ruv-neural/README.md&#34; &gt;&lt;strong&gt;rUv Neural&lt;/strong&gt;&lt;/a&gt; is a 12-crate Rust ecosystem that extends RuView&amp;rsquo;s signal processing into brain network topology analysis. It transforms neural magnetic field measurements from quantum sensors (NV diamond magnetometers, optically pumped magnetometers) into dynamic connectivity graphs, using minimum cut algorithms to detect cognitive state transitions in real time. The ecosystem includes crates for signal processing (&lt;code&gt;ruv-neural-signal&lt;/code&gt;), graph construction (&lt;code&gt;ruv-neural-graph&lt;/code&gt;), HNSW-indexed pattern memory (&lt;code&gt;ruv-neural-memory&lt;/code&gt;), graph embeddings (&lt;code&gt;ruv-neural-embed&lt;/code&gt;), cognitive state decoding (&lt;code&gt;ruv-neural-decoder&lt;/code&gt;), and ESP32/WASM edge targets. Medical and research applications include early neurological disease detection via topology signatures, brain-computer interfaces, clinical neurofeedback, and non-invasive biomedical sensing &amp;ndash; bridging RuView&amp;rsquo;s RF sensing architecture with the emerging field of quantum biomedical diagnostics.&lt;/p&gt;
&lt;/details&gt;
&lt;hr&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;🏗️ System Architecture&lt;/strong&gt; — End-to-end data flow from CSI capture to REST/WebSocket API&lt;/summary&gt;
&lt;h3 id=&#34;end-to-end-pipeline&#34;&gt;End-to-End Pipeline
&lt;/h3&gt;&lt;pre class=&#34;mermaid&#34;&gt;
  graph TB
    subgraph HW [&amp;#34;📡 Hardware Layer&amp;#34;]
        direction LR
        R1[&amp;#34;WiFi Router 1&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;CSI Source&amp;lt;/small&amp;gt;&amp;#34;]
        R2[&amp;#34;WiFi Router 2&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;CSI Source&amp;lt;/small&amp;gt;&amp;#34;]
        R3[&amp;#34;WiFi Router 3&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;CSI Source&amp;lt;/small&amp;gt;&amp;#34;]
        ESP[&amp;#34;ESP32-S3 Mesh&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;20 Hz · 56 subcarriers&amp;lt;/small&amp;gt;&amp;#34;]
        WIN[&amp;#34;Windows WiFi&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;RSSI scanning&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph INGEST [&amp;#34;⚡ Ingestion&amp;#34;]
        AGG[&amp;#34;Aggregator&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;UDP :5005 · ADR-018 frames&amp;lt;/small&amp;gt;&amp;#34;]
        BRIDGE[&amp;#34;Bridge&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;I/Q → amplitude + phase&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph SIGNAL [&amp;#34;🔬 Signal Processing — RuVector v2.0.4&amp;#34;]
        direction TB
        PHASE[&amp;#34;Phase Sanitization&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;SpotFi conjugate multiply&amp;lt;/small&amp;gt;&amp;#34;]
        HAMPEL[&amp;#34;Hampel Filter&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Outlier rejection · σ=3&amp;lt;/small&amp;gt;&amp;#34;]
        SUBSEL[&amp;#34;Subcarrier Selection&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;ruvector-mincut · sensitive/insensitive split&amp;lt;/small&amp;gt;&amp;#34;]
        SPEC[&amp;#34;Spectrogram&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;ruvector-attn-mincut · gated STFT&amp;lt;/small&amp;gt;&amp;#34;]
        FRESNEL[&amp;#34;Fresnel Geometry&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;ruvector-solver · TX-body-RX distance&amp;lt;/small&amp;gt;&amp;#34;]
        BVP[&amp;#34;Body Velocity Profile&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;ruvector-attention · weighted BVP&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph ML [&amp;#34;🧠 Neural Pipeline&amp;#34;]
        direction TB
        GRAPH[&amp;#34;Graph Transformer&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;17 COCO keypoints · 16 edges&amp;lt;/small&amp;gt;&amp;#34;]
        CROSS[&amp;#34;Cross-Attention&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;CSI features → body pose&amp;lt;/small&amp;gt;&amp;#34;]
        SONA[&amp;#34;SONA Adapter&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;LoRA rank-4 · EWC++&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph VITAL [&amp;#34;💓 Vital Signs&amp;#34;]
        direction LR
        BREATH[&amp;#34;Breathing&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;0.1–0.5 Hz · FFT peak&amp;lt;/small&amp;gt;&amp;#34;]
        HEART[&amp;#34;Heart Rate&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;0.8–2.0 Hz · FFT peak&amp;lt;/small&amp;gt;&amp;#34;]
        MOTION[&amp;#34;Motion Level&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Variance + band power&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph API [&amp;#34;🌐 Output Layer&amp;#34;]
        direction LR
        REST[&amp;#34;REST API&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Axum :3000 · 6 endpoints&amp;lt;/small&amp;gt;&amp;#34;]
        WS[&amp;#34;WebSocket&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;:3001 · real-time stream&amp;lt;/small&amp;gt;&amp;#34;]
        ANALYTICS[&amp;#34;Analytics&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Fall · Activity · START triage&amp;lt;/small&amp;gt;&amp;#34;]
        UI[&amp;#34;Web UI&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Three.js · Gaussian splats&amp;lt;/small&amp;gt;&amp;#34;]
    end

    R1 &amp;amp; R2 &amp;amp; R3 --&amp;gt; AGG
    ESP --&amp;gt; AGG
    WIN --&amp;gt; BRIDGE
    AGG --&amp;gt; BRIDGE
    BRIDGE --&amp;gt; PHASE
    PHASE --&amp;gt; HAMPEL
    HAMPEL --&amp;gt; SUBSEL
    SUBSEL --&amp;gt; SPEC
    SPEC --&amp;gt; FRESNEL
    FRESNEL --&amp;gt; BVP
    BVP --&amp;gt; GRAPH
    GRAPH --&amp;gt; CROSS
    CROSS --&amp;gt; SONA
    SONA --&amp;gt; BREATH &amp;amp; HEART &amp;amp; MOTION
    BREATH &amp;amp; HEART &amp;amp; MOTION --&amp;gt; REST &amp;amp; WS &amp;amp; ANALYTICS
    WS --&amp;gt; UI

    style HW fill:#1a1a2e,stroke:#e94560,color:#eee
    style INGEST fill:#16213e,stroke:#0f3460,color:#eee
    style SIGNAL fill:#0f3460,stroke:#533483,color:#eee
    style ML fill:#533483,stroke:#e94560,color:#eee
    style VITAL fill:#2d132c,stroke:#e94560,color:#eee
    style API fill:#1a1a2e,stroke:#0f3460,color:#eee
&lt;/pre&gt;

&lt;h3 id=&#34;signal-processing-detail&#34;&gt;Signal Processing Detail
&lt;/h3&gt;&lt;pre class=&#34;mermaid&#34;&gt;
  graph LR
    subgraph RAW [&amp;#34;Raw CSI Frame&amp;#34;]
        IQ[&amp;#34;I/Q Samples&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;56–192 subcarriers × N antennas&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph CLEAN [&amp;#34;Phase Cleanup&amp;#34;]
        CONJ[&amp;#34;Conjugate Multiply&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Remove carrier freq offset&amp;lt;/small&amp;gt;&amp;#34;]
        UNWRAP[&amp;#34;Phase Unwrap&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Remove 2π discontinuities&amp;lt;/small&amp;gt;&amp;#34;]
        HAMPEL2[&amp;#34;Hampel Filter&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Remove impulse noise&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph SELECT [&amp;#34;Subcarrier Intelligence&amp;#34;]
        MINCUT[&amp;#34;Min-Cut Partition&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;ruvector-mincut&amp;lt;/small&amp;gt;&amp;#34;]
        GATE[&amp;#34;Attention Gate&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;ruvector-attn-mincut&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph EXTRACT [&amp;#34;Feature Extraction&amp;#34;]
        STFT[&amp;#34;STFT Spectrogram&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Time-frequency decomposition&amp;lt;/small&amp;gt;&amp;#34;]
        FRESNELZ[&amp;#34;Fresnel Zones&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;ruvector-solver&amp;lt;/small&amp;gt;&amp;#34;]
        BVPE[&amp;#34;BVP Estimation&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;ruvector-attention&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph OUT [&amp;#34;Output Features&amp;#34;]
        AMP[&amp;#34;Amplitude Matrix&amp;#34;]
        PHASE2[&amp;#34;Phase Matrix&amp;#34;]
        DOPPLER[&amp;#34;Doppler Shifts&amp;#34;]
        VITALS[&amp;#34;Vital Band Power&amp;#34;]
    end

    IQ --&amp;gt; CONJ --&amp;gt; UNWRAP --&amp;gt; HAMPEL2
    HAMPEL2 --&amp;gt; MINCUT --&amp;gt; GATE
    GATE --&amp;gt; STFT --&amp;gt; FRESNELZ --&amp;gt; BVPE
    BVPE --&amp;gt; AMP &amp;amp; PHASE2 &amp;amp; DOPPLER &amp;amp; VITALS

    style RAW fill:#0d1117,stroke:#58a6ff,color:#c9d1d9
    style CLEAN fill:#161b22,stroke:#58a6ff,color:#c9d1d9
    style SELECT fill:#161b22,stroke:#d29922,color:#c9d1d9
    style EXTRACT fill:#161b22,stroke:#3fb950,color:#c9d1d9
    style OUT fill:#0d1117,stroke:#8b949e,color:#c9d1d9
&lt;/pre&gt;
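The Hampel filter stage in the phase-cleanup chain above can be sketched directly. An illustrative implementation, assuming a sliding median/MAD window with the σ=3 threshold from the diagram:

```python
import numpy as np

def hampel(x, window=5, n_sigma=3.0):
    """Hampel filter: replace samples more than n_sigma scaled-MAD away
    from the local median with that median (impulse-noise rejection)."""
    x = np.asarray(x, dtype=float)
    out = x.copy()
    k = 1.4826  # scales MAD to a Gaussian sigma estimate
    half = window // 2
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        med = np.median(x[lo:hi])
        mad = np.median(np.abs(x[lo:hi] - med))
        if np.abs(x[i] - med) > n_sigma * k * mad:
            out[i] = med
    return out

sig = np.ones(20)
sig[7] = 50.0                  # impulse outlier in an otherwise flat CSI trace
clean = hampel(sig)
print(clean[7])  # 1.0
```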

&lt;h3 id=&#34;deployment-topology&#34;&gt;Deployment Topology
&lt;/h3&gt;&lt;pre class=&#34;mermaid&#34;&gt;
  graph TB
    subgraph EDGE [&amp;#34;Edge (ESP32-S3 Mesh)&amp;#34;]
        E1[&amp;#34;Node 1&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Kitchen&amp;lt;/small&amp;gt;&amp;#34;]
        E2[&amp;#34;Node 2&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Living room&amp;lt;/small&amp;gt;&amp;#34;]
        E3[&amp;#34;Node 3&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Bedroom&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph SERVER [&amp;#34;Server (Rust · 132 MB Docker)&amp;#34;]
        SENSE[&amp;#34;Sensing Server&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;:3000 REST · :3001 WS · :5005 UDP&amp;lt;/small&amp;gt;&amp;#34;]
        RVF[&amp;#34;RVF Model&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Progressive 3-layer load&amp;lt;/small&amp;gt;&amp;#34;]
        STORE[&amp;#34;Time-Series Store&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;In-memory ring buffer&amp;lt;/small&amp;gt;&amp;#34;]
    end

    subgraph CLIENT [&amp;#34;Clients&amp;#34;]
        BROWSER[&amp;#34;Browser&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;Three.js UI · Gaussian splats&amp;lt;/small&amp;gt;&amp;#34;]
        MOBILE[&amp;#34;Mobile App&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;WebSocket stream&amp;lt;/small&amp;gt;&amp;#34;]
        DASH[&amp;#34;Dashboard&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;REST polling&amp;lt;/small&amp;gt;&amp;#34;]
        IOT[&amp;#34;Home Automation&amp;lt;br/&amp;gt;&amp;lt;small&amp;gt;MQTT bridge&amp;lt;/small&amp;gt;&amp;#34;]
    end

    E1 --&amp;gt;|&amp;#34;UDP :5005&amp;lt;br/&amp;gt;ADR-018 frames&amp;#34;| SENSE
    E2 --&amp;gt;|&amp;#34;UDP :5005&amp;#34;| SENSE
    E3 --&amp;gt;|&amp;#34;UDP :5005&amp;#34;| SENSE
    SENSE &amp;lt;--&amp;gt; RVF
    SENSE &amp;lt;--&amp;gt; STORE
    SENSE --&amp;gt;|&amp;#34;WS :3001&amp;lt;br/&amp;gt;real-time JSON&amp;#34;| BROWSER &amp;amp; MOBILE
    SENSE --&amp;gt;|&amp;#34;REST :3000&amp;lt;br/&amp;gt;on-demand&amp;#34;| DASH &amp;amp; IOT

    style EDGE fill:#1a1a2e,stroke:#e94560,color:#eee
    style SERVER fill:#16213e,stroke:#533483,color:#eee
    style CLIENT fill:#0f3460,stroke:#0f3460,color:#eee
&lt;/pre&gt;

&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Component&lt;/th&gt;
          &lt;th&gt;Crate / Module&lt;/th&gt;
          &lt;th&gt;Description&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Aggregator&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-hardware&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;ESP32 UDP listener, ADR-018 frame parser, I/Q → amplitude/phase bridge&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Signal Processor&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-signal&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;SpotFi phase sanitization, Hampel filter, STFT spectrogram, Fresnel geometry, BVP&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Subcarrier Selection&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvector-mincut&lt;/code&gt; + &lt;code&gt;ruvector-attn-mincut&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Dynamic sensitive/insensitive partitioning, attention-gated noise suppression&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Fresnel Solver&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ruvector-solver&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Sparse Neumann series O(sqrt(n)) for TX-body-RX distance estimation&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Graph Transformer&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-train&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;COCO BodyGraph (17 keypoints, 16 edges), cross-attention CSI→pose, GCN message passing&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;SONA&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;sona&lt;/code&gt; crate&lt;/td&gt;
          &lt;td&gt;Micro-LoRA (rank-4) adaptation, EWC++ catastrophic forgetting prevention&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Vital Signs&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-signal&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;FFT-based breathing (0.1-0.5 Hz) and heartbeat (0.8-2.0 Hz) extraction&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;REST API&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-sensing-server&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Axum server: &lt;code&gt;/api/v1/sensing&lt;/code&gt;, &lt;code&gt;/health&lt;/code&gt;, &lt;code&gt;/vital-signs&lt;/code&gt;, &lt;code&gt;/bssid&lt;/code&gt;, &lt;code&gt;/sona&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;WebSocket&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-sensing-server&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Real-time pose, sensing, and vital sign streaming on &lt;code&gt;:3001&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Analytics&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;wifi-densepose-mat&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Fall detection, activity recognition, START triage (WiFi-Mat disaster module)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Web UI&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;ui/&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Three.js scene, Gaussian splat visualization, signal dashboard&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
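The vital-sign extraction described in the table above is, at its core, a band-limited FFT peak search over the CSI amplitude signal. A minimal sketch of that idea on synthetic data (the 20 Hz frame rate and function names are illustrative, not the crate's actual API):

```python
import numpy as np

def dominant_freq(signal, fs, band):
    """Strongest frequency (Hz) inside [band[0], band[1]] of a real signal."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    lo, hi = np.searchsorted(freqs, band)   # freqs is sorted ascending
    return freqs[lo:hi][np.argmax(spectrum[lo:hi])]

fs = 20.0                       # illustrative CSI frame rate (Hz)
t = np.arange(0, 30, 1.0 / fs)  # 30-second analysis window
# synthetic CSI amplitude: 0.25 Hz breathing plus a weaker 1.2 Hz heartbeat
x = np.sin(2 * np.pi * 0.25 * t) + 0.2 * np.sin(2 * np.pi * 1.2 * t)

breathing = dominant_freq(x, fs, (0.1, 0.5))   # close to 0.25 Hz
heartbeat = dominant_freq(x, fs, (0.8, 2.0))   # close to 1.2 Hz
```

The 30-second window gives roughly 0.033 Hz frequency resolution, which is why a longer buffer improves breathing-rate estimates more than heart-rate ones.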
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-cli-usage&#34;&gt;🖥️ CLI Usage
&lt;/h2&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Rust Sensing Server&lt;/strong&gt; — Primary CLI interface&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;15
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;16
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;17
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;18
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;19
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;20
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Start with simulated data (no hardware)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --source simulate --ui-path ../../ui
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Start with ESP32 CSI hardware&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --source esp32 --udp-port &lt;span class=&#34;m&#34;&gt;5005&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Start with Windows WiFi RSSI&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --source wifi
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Run vital sign benchmark&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --benchmark
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Export RVF model package&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --export-rvf model.rvf
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Train a model&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --train --dataset data/ --epochs &lt;span class=&#34;m&#34;&gt;100&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Load trained model with progressive loading&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --model wifi-densepose-v1.rvf --progressive
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Flag&lt;/th&gt;
          &lt;th&gt;Description&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--source&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Data source: &lt;code&gt;auto&lt;/code&gt;, &lt;code&gt;wifi&lt;/code&gt;, &lt;code&gt;esp32&lt;/code&gt;, &lt;code&gt;simulate&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--http-port&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;HTTP port for UI and REST API (default: 8080)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--ws-port&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;WebSocket port (default: 8765)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--udp-port&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;UDP port for ESP32 CSI frames (default: 5005)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--benchmark&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Run vital sign benchmark (1000 frames) and exit&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--export-rvf&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Export RVF container package and exit&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--load-rvf&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Load model config from RVF container&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--save-rvf&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Save model state on shutdown&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--model&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Load trained &lt;code&gt;.rvf&lt;/code&gt; model for inference&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--progressive&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Enable progressive loading (Layer A instant start)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--train&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Train a model and exit&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--dataset&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Path to dataset directory (MM-Fi or Wi-Pose)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;code&gt;--epochs&lt;/code&gt;&lt;/td&gt;
          &lt;td&gt;Training epochs (default: 100)&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;rest-api--websocket&#34;&gt;&lt;/a&gt;&lt;strong&gt;REST API &amp; WebSocket&lt;/strong&gt; — Endpoints reference&lt;/summary&gt;
&lt;h4 id=&#34;rest-api-rust-sensing-server&#34;&gt;REST API (Rust Sensing Server)
&lt;/h4&gt;&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;GET  /api/v1/sensing              &lt;span class=&#34;c1&#34;&gt;# Latest sensing frame&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;GET  /api/v1/vital-signs          &lt;span class=&#34;c1&#34;&gt;# Breathing, heart rate, confidence&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;GET  /api/v1/bssid                &lt;span class=&#34;c1&#34;&gt;# Multi-BSSID registry&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;GET  /api/v1/model/layers         &lt;span class=&#34;c1&#34;&gt;# Progressive loading status&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;GET  /api/v1/model/sona/profiles  &lt;span class=&#34;c1&#34;&gt;# SONA profiles&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;POST /api/v1/model/sona/activate  &lt;span class=&#34;c1&#34;&gt;# Activate SONA profile&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;WebSocket: &lt;code&gt;ws://localhost:3001/ws/sensing&lt;/code&gt; (real-time sensing + vital signs)&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Default ports (Docker): HTTP 3000, WS 3001. Binary defaults: HTTP 8080, WS 8765. Override with &lt;code&gt;--http-port&lt;/code&gt; / &lt;code&gt;--ws-port&lt;/code&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;a id=&#34;hardware-support-1&#34;&gt;&lt;/a&gt;&lt;strong&gt;Hardware Support&lt;/strong&gt; — Devices, cost, and guides&lt;/summary&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Hardware&lt;/th&gt;
          &lt;th&gt;CSI&lt;/th&gt;
          &lt;th&gt;Cost&lt;/th&gt;
          &lt;th&gt;Guide&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;ESP32-S3&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;Native&lt;/td&gt;
          &lt;td&gt;~$8&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues/34&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Tutorial #34&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Intel 5300&lt;/td&gt;
          &lt;td&gt;Firmware mod&lt;/td&gt;
          &lt;td&gt;~$15&lt;/td&gt;
          &lt;td&gt;Linux &lt;code&gt;iwl-csi&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Atheros AR9580&lt;/td&gt;
          &lt;td&gt;ath9k patch&lt;/td&gt;
          &lt;td&gt;~$20&lt;/td&gt;
          &lt;td&gt;Linux only&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Any Windows WiFi&lt;/td&gt;
          &lt;td&gt;RSSI only&lt;/td&gt;
          &lt;td&gt;$0&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues/36&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Tutorial #36&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Any macOS WiFi&lt;/td&gt;
          &lt;td&gt;RSSI only (CoreWLAN)&lt;/td&gt;
          &lt;td&gt;$0&lt;/td&gt;
          &lt;td&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-025-macos-corewlan-wifi-sensing.md&#34; &gt;ADR-025&lt;/a&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Any Linux WiFi&lt;/td&gt;
          &lt;td&gt;RSSI only (&lt;code&gt;iw&lt;/code&gt;)&lt;/td&gt;
          &lt;td&gt;$0&lt;/td&gt;
          &lt;td&gt;Requires &lt;code&gt;iw&lt;/code&gt; + &lt;code&gt;CAP_NET_ADMIN&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;QEMU Firmware Testing (ADR-061) — 9-Layer Platform&lt;/strong&gt;&lt;/summary&gt;
&lt;p&gt;Test ESP32-S3 firmware without physical hardware using Espressif&amp;rsquo;s QEMU fork. The platform provides 9 layers of testing capability:&lt;/p&gt;
&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Layer&lt;/th&gt;
          &lt;th&gt;Capability&lt;/th&gt;
          &lt;th&gt;Script / Config&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;1&lt;/td&gt;
          &lt;td&gt;Mock CSI generator (10 physics-based scenarios)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;firmware/esp32-csi-node/main/mock_csi.c&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;2&lt;/td&gt;
          &lt;td&gt;Single-node QEMU runner + UART validation (16 checks)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;scripts/qemu-esp32s3-test.sh&lt;/code&gt;, &lt;code&gt;scripts/validate_qemu_output.py&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;3&lt;/td&gt;
          &lt;td&gt;Multi-node TDM mesh simulation (TAP networking)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;scripts/qemu-mesh-test.sh&lt;/code&gt;, &lt;code&gt;scripts/validate_mesh_test.py&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;4&lt;/td&gt;
          &lt;td&gt;GDB remote debugging (VS Code integration)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;.vscode/launch.json&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;5&lt;/td&gt;
          &lt;td&gt;Code coverage (gcov/lcov via apptrace)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;firmware/esp32-csi-node/sdkconfig.coverage&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;6&lt;/td&gt;
          &lt;td&gt;Fuzz testing (libFuzzer + ASAN/UBSAN)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;firmware/esp32-csi-node/test/fuzz_*.c&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;7&lt;/td&gt;
          &lt;td&gt;NVS provisioning matrix (14 configs)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;scripts/generate_nvs_matrix.py&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;8&lt;/td&gt;
          &lt;td&gt;Snapshot regression (sub-second VM restore)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;scripts/qemu-snapshot-test.sh&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;9&lt;/td&gt;
          &lt;td&gt;Chaos testing (fault injection + health monitoring)&lt;/td&gt;
          &lt;td&gt;&lt;code&gt;scripts/qemu-chaos-test.sh&lt;/code&gt;, &lt;code&gt;scripts/inject_fault.py&lt;/code&gt;, &lt;code&gt;scripts/check_health.py&lt;/code&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;15
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Quick start: build + run + validate&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; firmware/esp32-csi-node
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;idf.py -D &lt;span class=&#34;nv&#34;&gt;SDKCONFIG_DEFAULTS&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;sdkconfig.defaults;sdkconfig.qemu&amp;#34;&lt;/span&gt; build
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Single-node test (builds, merges flash, runs QEMU, validates output)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;bash scripts/qemu-esp32s3-test.sh
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Multi-node mesh test (3 QEMU instances with TDM)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;sudo bash scripts/qemu-mesh-test.sh &lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Fuzz testing (60 seconds per target)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; firmware/esp32-csi-node/test &lt;span class=&#34;o&#34;&gt;&amp;amp;&amp;amp;&lt;/span&gt; make all &lt;span class=&#34;nv&#34;&gt;CC&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;clang &lt;span class=&#34;o&#34;&gt;&amp;amp;&amp;amp;&lt;/span&gt; make run_serialize &lt;span class=&#34;nv&#34;&gt;FUZZ_DURATION&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;60&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Chaos testing (fault injection resilience)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;bash scripts/qemu-chaos-test.sh --faults all --duration &lt;span class=&#34;m&#34;&gt;120&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;10 test scenarios&lt;/strong&gt;: empty room, static person, walking, fall, multi-person, channel sweep, MAC filter, ring overflow, boundary RSSI, zero-length frames.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;14 NVS configs&lt;/strong&gt;: default, WiFi-only, full ADR-060, edge tiers 0/1/2, TDM mesh, WASM signed/unsigned, 5GHz, boundary max/min, power-save, empty-strings.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;CI&lt;/strong&gt;: GitHub Actions workflow runs 7 NVS matrix configs, 3 fuzz targets, and NVS binary validation on every push to &lt;code&gt;firmware/&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-061-qemu-esp32s3-firmware-testing.md&#34; &gt;ADR-061&lt;/a&gt; for the full architecture.&lt;/p&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;QEMU Swarm Configurator (ADR-062)&lt;/strong&gt;&lt;/summary&gt;
&lt;p&gt;Test multiple ESP32-S3 nodes simultaneously using a YAML-driven orchestrator. Define node roles, network topologies, and validation assertions in a config file.&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Quick smoke test (2 nodes, 15 seconds)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python3 scripts/qemu_swarm.py --preset smoke
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Standard 3-node test (coordinator + 2 sensors)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python3 scripts/qemu_swarm.py --preset standard
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# See all presets&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python3 scripts/qemu_swarm.py --list-presets
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Preview without running&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python3 scripts/qemu_swarm.py --preset standard --dry-run
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;p&gt;&lt;strong&gt;Topologies&lt;/strong&gt;: star (sensors → coordinator), mesh (fully connected), line (relay chain), ring (circular).&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Node roles&lt;/strong&gt;: sensor (generates CSI), coordinator (aggregates), gateway (bridges to host).&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;7 presets&lt;/strong&gt;: smoke, standard, ci-matrix, large-mesh, line-relay, ring-fault, heterogeneous.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;9 swarm assertions&lt;/strong&gt;: boot check, crash detection, TDM collision, frame production, coordinator reception, fall detection, frame rate, boot time, heap health.&lt;/p&gt;
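The YAML config file driving the orchestrator could be sketched roughly as follows (all field names here are hypothetical; the real schema is defined by the ADR-062 configurator):

```yaml
# Hypothetical swarm config sketch, not the actual ADR-062 schema.
name: standard
topology: star          # star | mesh | line | ring
duration_s: 60
nodes:
  - role: coordinator   # aggregates frames from sensors
    count: 1
  - role: sensor        # generates mock CSI
    count: 2
assertions:
  - boot_check
  - crash_detection
  - frame_production
```

A preset like `smoke` or `standard` is just a bundled config of this shape selected via `--preset`.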
&lt;p&gt;See &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-062-qemu-swarm-configurator.md&#34; &gt;ADR-062&lt;/a&gt; and the &lt;a class=&#34;link&#34; href=&#34;docs/user-guide.md#testing-firmware-without-hardware-qemu&#34; &gt;User Guide&lt;/a&gt; for step-by-step instructions.&lt;/p&gt;
&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Python Legacy CLI&lt;/strong&gt; — v1 API server commands&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;8
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;wifi-densepose start                    &lt;span class=&#34;c1&#34;&gt;# Start API server&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;wifi-densepose -c config.yaml start     &lt;span class=&#34;c1&#34;&gt;# Custom config&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;wifi-densepose -v start                 &lt;span class=&#34;c1&#34;&gt;# Verbose logging&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;wifi-densepose status                   &lt;span class=&#34;c1&#34;&gt;# Check status&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;wifi-densepose stop                     &lt;span class=&#34;c1&#34;&gt;# Stop server&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;wifi-densepose config show              &lt;span class=&#34;c1&#34;&gt;# Show configuration&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;wifi-densepose db init                  &lt;span class=&#34;c1&#34;&gt;# Initialize database&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;wifi-densepose tasks list               &lt;span class=&#34;c1&#34;&gt;# List background tasks&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;/details&gt;
&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Documentation Links&lt;/strong&gt;&lt;/summary&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class=&#34;link&#34; href=&#34;docs/user-guide.md&#34; &gt;User Guide&lt;/a&gt; — installation, first run, API, hardware setup, QEMU testing&lt;/li&gt;
&lt;li&gt;&lt;a class=&#34;link&#34; href=&#34;docs/wifi-mat-user-guide.md&#34; &gt;WiFi-Mat User Guide&lt;/a&gt; | &lt;a class=&#34;link&#34; href=&#34;docs/ddd/wifi-mat-domain-model.md&#34; &gt;Domain Model&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-061-qemu-esp32s3-firmware-testing.md&#34; &gt;ADR-061&lt;/a&gt; QEMU platform | &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-062-qemu-swarm-configurator.md&#34; &gt;ADR-062&lt;/a&gt; Swarm configurator&lt;/li&gt;
&lt;li&gt;&lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-021-vital-sign-detection-rvdna-pipeline.md&#34; &gt;ADR-021&lt;/a&gt; | &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-022-windows-wifi-enhanced-fidelity-ruvector.md&#34; &gt;ADR-022&lt;/a&gt; | &lt;a class=&#34;link&#34; href=&#34;docs/adr/ADR-023-trained-densepose-model-ruvector-pipeline.md&#34; &gt;ADR-023&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-testing&#34;&gt;🧪 Testing
&lt;/h2&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;542+ tests across 7 suites&lt;/strong&gt; — zero mocks, hardware-free simulation&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;14
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;15
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Rust tests (primary — 542+ tests)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; rust-port/wifi-densepose-rs
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo &lt;span class=&#34;nb&#34;&gt;test&lt;/span&gt; --workspace
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Sensing server tests (229 tests)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo &lt;span class=&#34;nb&#34;&gt;test&lt;/span&gt; -p wifi-densepose-sensing-server
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Vital sign benchmark&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./target/release/sensing-server --benchmark
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Python tests&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python -m pytest v1/tests/ -v
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Pipeline verification (no hardware needed)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;./verify
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Suite&lt;/th&gt;
          &lt;th&gt;Tests&lt;/th&gt;
          &lt;th&gt;What It Covers&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;sensing-server lib&lt;/td&gt;
          &lt;td&gt;147&lt;/td&gt;
          &lt;td&gt;Graph transformer, trainer, SONA, sparse inference, RVF&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;sensing-server bin&lt;/td&gt;
          &lt;td&gt;48&lt;/td&gt;
          &lt;td&gt;CLI integration, WebSocket, REST API&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;RVF integration&lt;/td&gt;
          &lt;td&gt;16&lt;/td&gt;
          &lt;td&gt;Container build, read, progressive load&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Vital signs integration&lt;/td&gt;
          &lt;td&gt;18&lt;/td&gt;
          &lt;td&gt;FFT detection, breathing, heartbeat&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;wifi-densepose-signal&lt;/td&gt;
          &lt;td&gt;83&lt;/td&gt;
          &lt;td&gt;SOTA algorithms, Doppler, Fresnel&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;wifi-densepose-mat&lt;/td&gt;
          &lt;td&gt;139&lt;/td&gt;
          &lt;td&gt;Disaster response, triage, localization&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;wifi-densepose-wifiscan&lt;/td&gt;
          &lt;td&gt;91&lt;/td&gt;
          &lt;td&gt;8-stage RSSI pipeline&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
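The per-suite counts above add up to the headline figure quoted in the test commands; a quick shell sanity check (counts copied straight from the table):

```shell
# Sum the per-suite test counts from the table above; the total should
# match the "542+ tests" figure used elsewhere in this README.
total=$((147 + 48 + 16 + 18 + 83 + 139 + 91))
echo "total tests: $total"   # prints "total tests: 542"
```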
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-deployment&#34;&gt;🚀 Deployment
&lt;/h2&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Docker deployment&lt;/strong&gt; — Production setup with docker-compose&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;13
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Rust sensing server (132 MB)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker pull ruvnet/wifi-densepose:latest
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker run -p 3000:3000 -p 3001:3001 -p 5005:5005/udp ruvnet/wifi-densepose:latest
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Python pipeline (569 MB)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker pull ruvnet/wifi-densepose:python
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker run -p 8765:8765 -p 8080:8080 ruvnet/wifi-densepose:python
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Both via docker-compose&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; docker &lt;span class=&#34;o&#34;&gt;&amp;amp;&amp;amp;&lt;/span&gt; docker compose up
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Export RVF model&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;docker run --rm -v &lt;span class=&#34;k&#34;&gt;$(&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;pwd&lt;/span&gt;&lt;span class=&#34;k&#34;&gt;)&lt;/span&gt;:/out ruvnet/wifi-densepose:latest --export-rvf /out/model.rvf
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;h3 id=&#34;environment-variables&#34;&gt;Environment Variables
&lt;/h3&gt;&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;4
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nv&#34;&gt;RUST_LOG&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;info                    &lt;span class=&#34;c1&#34;&gt;# Logging level&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nv&#34;&gt;WIFI_INTERFACE&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;wlan0             &lt;span class=&#34;c1&#34;&gt;# WiFi interface for RSSI&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nv&#34;&gt;POSE_CONFIDENCE_THRESHOLD&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;0.7    &lt;span class=&#34;c1&#34;&gt;# Minimum confidence&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nv&#34;&gt;POSE_MAX_PERSONS&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;10&lt;/span&gt;              &lt;span class=&#34;c1&#34;&gt;# Max tracked individuals&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;/details&gt;
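The variables above can also be passed to the Docker image from the deployment section. A sketch, assuming the container reads them from its environment via standard `-e` flags (check the image docs for a config-file alternative); it prints the composed command rather than running it, so it is safe to inspect first:

```shell
# Collect the documented overrides and the published ports, then print
# the resulting `docker run` command for inspection before launching.
ENV_FLAGS="-e RUST_LOG=info -e WIFI_INTERFACE=wlan0"
ENV_FLAGS="$ENV_FLAGS -e POSE_CONFIDENCE_THRESHOLD=0.7 -e POSE_MAX_PERSONS=10"
PORT_FLAGS="-p 3000:3000 -p 3001:3001 -p 5005:5005/udp"

echo docker run $ENV_FLAGS $PORT_FLAGS ruvnet/wifi-densepose:latest
```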
&lt;hr&gt;
&lt;h2 id=&#34;-performance-metrics&#34;&gt;📊 Performance Metrics
&lt;/h2&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Measured benchmarks&lt;/strong&gt; — Rust sensing server, validated via cargo bench&lt;/summary&gt;
&lt;h3 id=&#34;rust-sensing-server&#34;&gt;Rust Sensing Server
&lt;/h3&gt;&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Metric&lt;/th&gt;
          &lt;th&gt;Value&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;Vital sign detection&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;11,665 fps&lt;/strong&gt; (86 µs/frame)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Full CSI pipeline&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;54,000 fps&lt;/strong&gt; (18.47 µs/frame)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Motion detection&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;186 ns&lt;/strong&gt; (~5,400x vs Python)&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Docker image&lt;/td&gt;
          &lt;td&gt;132 MB&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Memory usage&lt;/td&gt;
          &lt;td&gt;~100 MB&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Test count&lt;/td&gt;
          &lt;td&gt;542+&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id=&#34;python-vs-rust&#34;&gt;Python vs Rust
&lt;/h3&gt;&lt;table&gt;
  &lt;thead&gt;
      &lt;tr&gt;
          &lt;th&gt;Operation&lt;/th&gt;
          &lt;th&gt;Python&lt;/th&gt;
          &lt;th&gt;Rust&lt;/th&gt;
          &lt;th&gt;Speedup&lt;/th&gt;
      &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
      &lt;tr&gt;
          &lt;td&gt;CSI Preprocessing&lt;/td&gt;
          &lt;td&gt;~5 ms&lt;/td&gt;
          &lt;td&gt;5.19 µs&lt;/td&gt;
          &lt;td&gt;1,000x&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Phase Sanitization&lt;/td&gt;
          &lt;td&gt;~3 ms&lt;/td&gt;
          &lt;td&gt;3.84 µs&lt;/td&gt;
          &lt;td&gt;780x&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Feature Extraction&lt;/td&gt;
          &lt;td&gt;~8 ms&lt;/td&gt;
          &lt;td&gt;9.03 µs&lt;/td&gt;
          &lt;td&gt;890x&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;Motion Detection&lt;/td&gt;
          &lt;td&gt;~1 ms&lt;/td&gt;
          &lt;td&gt;186 ns&lt;/td&gt;
          &lt;td&gt;5,400x&lt;/td&gt;
      &lt;/tr&gt;
      &lt;tr&gt;
          &lt;td&gt;&lt;strong&gt;Full Pipeline&lt;/strong&gt;&lt;/td&gt;
          &lt;td&gt;~15 ms&lt;/td&gt;
          &lt;td&gt;18.47 µs&lt;/td&gt;
          &lt;td&gt;&lt;strong&gt;810x&lt;/strong&gt;&lt;/td&gt;
      &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;
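The Speedup column is just the Python latency divided by the Rust latency, rounded to the table's precision; for instance, the full pipeline: 15 ms ÷ 18.47 µs ≈ 810x. A quick awk check of two rows (latencies converted to nanoseconds):

```shell
# speedup = python_latency / rust_latency, both in nanoseconds.
awk 'BEGIN {
  printf "full pipeline: %.0fx\n", 15000000 / 18470   # ~812, reported as 810x
  printf "motion detect: %.0fx\n", 1000000  / 186     # ~5376, reported as 5400x
}'
```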
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-contributing&#34;&gt;🤝 Contributing
&lt;/h2&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Dev setup, code standards, PR process&lt;/strong&gt;&lt;/summary&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt; 1
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 2
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 3
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 4
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 5
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 6
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 7
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 8
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt; 9
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;10
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;11
&lt;/span&gt;&lt;span class=&#34;lnt&#34;&gt;12
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;git clone https://github.com/ruvnet/RuView.git
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; RuView
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Rust development&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;cd&lt;/span&gt; rust-port/wifi-densepose-rs
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo build --release
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cargo &lt;span class=&#34;nb&#34;&gt;test&lt;/span&gt; --workspace
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# Python development&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;python -m venv venv &lt;span class=&#34;o&#34;&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;source&lt;/span&gt; venv/bin/activate
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;pip install -r requirements-dev.txt &lt;span class=&#34;o&#34;&gt;&amp;amp;&amp;amp;&lt;/span&gt; pip install -e .
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;pre-commit install
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Fork&lt;/strong&gt; the repository&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Create&lt;/strong&gt; a feature branch (&lt;code&gt;git checkout -b feature/amazing-feature&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Commit&lt;/strong&gt; your changes&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Push&lt;/strong&gt; and open a Pull Request&lt;/li&gt;
&lt;/ol&gt;
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-changelog&#34;&gt;📄 Changelog
&lt;/h2&gt;&lt;details&gt;
&lt;summary&gt;&lt;strong&gt;Release history&lt;/strong&gt;&lt;/summary&gt;
&lt;h3 id=&#34;v320--2026-03-03&#34;&gt;v3.2.0 — 2026-03-03
&lt;/h3&gt;&lt;p&gt;Edge intelligence: 24 hot-loadable WASM modules for on-device CSI processing on ESP32-S3.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;ADR-041 Edge Intelligence Modules&lt;/strong&gt; — 24 &lt;code&gt;no_std&lt;/code&gt; Rust modules compiled to &lt;code&gt;wasm32-unknown-unknown&lt;/code&gt;, loaded via WASM3 on ESP32; 8 categories covering signal intelligence, adaptive learning, spatial reasoning, temporal analysis, AI security, quantum-inspired, autonomous systems, and exotic algorithms&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Vendor Integration&lt;/strong&gt; — Algorithms ported from &lt;code&gt;midstream&lt;/code&gt; (DTW, attractors, Flash Attention, min-cut, optimal transport) and &lt;code&gt;sublinear-time-solver&lt;/code&gt; (PageRank, HNSW, sparse recovery, spiking NN)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;On-device gesture learning&lt;/strong&gt; — User-teachable DTW gesture recognition with 3-rehearsal protocol and 16 template slots&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Lifelong learning (EWC++)&lt;/strong&gt; — Elastic Weight Consolidation prevents catastrophic forgetting when learning new tasks&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI security modules&lt;/strong&gt; — FNV-1a replay detection, injection/jamming detection, 6D behavioral anomaly profiling with Mahalanobis scoring&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Self-healing mesh&lt;/strong&gt; — 8-node mesh with health tracking, degradation/recovery hysteresis, and coverage redistribution&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Common utility library&lt;/strong&gt; — &lt;code&gt;vendor_common.rs&lt;/code&gt; shared across all 24 modules: CircularBuffer, EMA, WelfordStats, DTW, FixedPriorityQueue, vector math&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;243 tests passing&lt;/strong&gt; — All modules include comprehensive inline tests; 0 failures&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Security audit&lt;/strong&gt; — 15 findings addressed (1 critical, 3 high, 6 medium, 5 low)&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&#34;v310--2026-03-02&#34;&gt;v3.1.0 — 2026-03-02
&lt;/h3&gt;&lt;p&gt;Multistatic sensing, persistent field model, and cross-viewpoint fusion — the biggest capability jump since v2.0.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Project RuvSense (ADR-029)&lt;/strong&gt; — Multistatic mesh: TDM protocol, channel hopping (ch1/6/11), multi-band frame fusion, coherence gating, 17-keypoint Kalman tracker with re-ID; 10 new signal modules (5,300+ lines)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;RuvSense Persistent Field Model (ADR-030)&lt;/strong&gt; — 7 exotic sensing tiers: field normal modes (SVD), RF tomography, longitudinal drift detection, intention prediction, cross-room identity, gesture classification, adversarial detection&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Project RuView (ADR-031)&lt;/strong&gt; — Cross-viewpoint attention with geometric bias, Geometric Diversity Index, viewpoint fusion orchestrator; 5 new ruvector modules (2,200+ lines)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;TDM Hardware Protocol&lt;/strong&gt; — ESP32 sensing coordinator: sync beacons, slot scheduling, clock drift compensation (±10ppm), 20 Hz aggregate rate&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Channel-Hopping Firmware&lt;/strong&gt; — ESP32 firmware extended with hop table, timer-driven channel switching, NDP injection stub; NVS config for all TDM parameters; fully backward-compatible&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;DDD Domain Model&lt;/strong&gt; — 6 bounded contexts, ubiquitous language, aggregate roots, domain events, full event bus specification&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;code&gt;ruvector-crv&lt;/code&gt; 6-stage CRV signal-line integration (ADR-033)&lt;/strong&gt; — Maps Coordinate Remote Viewing methodology to WiFi CSI: gestalt classification, sensory encoding, GNN topology, SNN coherence gating, differentiable search, MinCut partitioning; cross-session convergence for multi-room identity continuity&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;ADR-032 multistatic mesh security hardening&lt;/strong&gt; — HMAC-SHA256 beacon auth, SipHash-2-4 frame integrity, NDP rate limiter, coherence gate timeout, bounded buffers, NVS credential zeroing, atomic firmware state&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;ADR-032a QUIC transport layer&lt;/strong&gt; — &lt;code&gt;midstreamer-quic&lt;/code&gt; TLS 1.3 AEAD for aggregator nodes, dual-mode security (ManualCrypto/QuicTransport), QUIC stream mapping, connection migration, congestion control&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;ADR-033 CRV signal-line sensing integration&lt;/strong&gt; — Architecture decision record for the 6-stage CRV pipeline mapping to ruvector components&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Temporal gesture matching&lt;/strong&gt; — &lt;code&gt;midstreamer-temporal-compare&lt;/code&gt; DTW/LCS/edit-distance gesture classification with quantized feature comparison&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Attractor drift analysis&lt;/strong&gt; — &lt;code&gt;midstreamer-attractor&lt;/code&gt; Takens&amp;rsquo; theorem phase-space embedding with Lyapunov exponent regime detection (Stable/Periodic/Chaotic)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;v0.3.0 published&lt;/strong&gt; — All 15 workspace crates published to &lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-core&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;crates.io&lt;/a&gt; with updated dependencies&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;28,000+ lines of new Rust code&lt;/strong&gt; across 26 modules with 400+ tests&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Security hardened&lt;/strong&gt; — Bounded buffers, NaN guards, no panics in public APIs, input validation at all boundaries&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&#34;v300--2026-03-01&#34;&gt;v3.0.0 — 2026-03-01
&lt;/h3&gt;&lt;p&gt;Major release: AETHER contrastive embedding model, AI signal processing backbone, cross-platform adapters, Docker Hub images, and comprehensive README overhaul.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Project AETHER (ADR-024)&lt;/strong&gt; — Self-supervised contrastive learning for WiFi CSI fingerprinting, similarity search, and anomaly detection; 55 KB model fits on ESP32&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI Backbone (&lt;code&gt;wifi-densepose-ruvector&lt;/code&gt;)&lt;/strong&gt; — 7 RuVector integration points replacing hand-tuned thresholds with attention, graph algorithms, and smart compression; &lt;a class=&#34;link&#34; href=&#34;https://crates.io/crates/wifi-densepose-ruvector&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;published to crates.io&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Cross-platform RSSI adapters&lt;/strong&gt; — macOS CoreWLAN and Linux &lt;code&gt;iw&lt;/code&gt; Rust adapters with &lt;code&gt;#[cfg(target_os)]&lt;/code&gt; gating (ADR-025)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Docker images published&lt;/strong&gt; — &lt;code&gt;ruvnet/wifi-densepose:latest&lt;/code&gt; (132 MB Rust) and &lt;code&gt;:python&lt;/code&gt; (569 MB)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Project MERIDIAN (ADR-027)&lt;/strong&gt; — Cross-environment domain generalization: gradient reversal, geometry-conditioned FiLM, virtual domain augmentation, contrastive test-time training; zero-shot room transfer&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;10-phase DensePose training pipeline (ADR-023/027)&lt;/strong&gt; — Graph transformer, 6-term composite loss, SONA adaptation, RVF packaging, hardware normalization, domain-adversarial training&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Vital sign detection (ADR-021)&lt;/strong&gt; — FFT-based breathing (6-30 BPM) and heartbeat (40-120 BPM), 11,665 fps&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;WiFi scan domain layer (ADR-022/025)&lt;/strong&gt; — 8-stage signal intelligence pipeline for Windows, macOS, and Linux&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;700+ Rust tests&lt;/strong&gt; — All passing, zero mocks&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id=&#34;v200--2026-02-28&#34;&gt;v2.0.0 — 2026-02-28
&lt;/h3&gt;&lt;p&gt;Complete Rust sensing server, SOTA signal processing, WiFi-Mat disaster response, ESP32 hardware, RuVector integration, guided installer, and security hardening.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Rust sensing server&lt;/strong&gt; — Axum REST API + WebSocket, 810x speedup over Python, 54K fps pipeline&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;RuVector integration&lt;/strong&gt; — 11 vendored crates for HNSW, attention, GNN, temporal compression, min-cut, solver&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;6 SOTA signal algorithms (ADR-014)&lt;/strong&gt; — SpotFi, Hampel, Fresnel, spectrogram, subcarrier selection, BVP&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;WiFi-Mat disaster response&lt;/strong&gt; — START triage, 3D localization, priority alerts — 139 tests&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;ESP32 CSI hardware&lt;/strong&gt; — Binary frame parsing, $54 starter kit, 20 Hz streaming&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Guided installer&lt;/strong&gt; — 7-step hardware detection, 8 install profiles&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Three.js visualization&lt;/strong&gt; — 3D body model, 17 joints, real-time WebSocket&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Security hardening&lt;/strong&gt; — 10 vulnerabilities fixed&lt;/li&gt;
&lt;/ul&gt;
&lt;/details&gt;
&lt;hr&gt;
&lt;h2 id=&#34;-license&#34;&gt;📄 License
&lt;/h2&gt;&lt;p&gt;MIT License — see &lt;a class=&#34;link&#34; href=&#34;LICENSE&#34; &gt;LICENSE&lt;/a&gt; for details.&lt;/p&gt;
&lt;h2 id=&#34;-support&#34;&gt;📞 Support
&lt;/h2&gt;&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/issues&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;GitHub Issues&lt;/a&gt; | &lt;a class=&#34;link&#34; href=&#34;https://github.com/ruvnet/RuView/discussions&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;Discussions&lt;/a&gt; | &lt;a class=&#34;link&#34; href=&#34;https://pypi.org/project/wifi-densepose/&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;PyPI&lt;/a&gt;&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;strong&gt;WiFi DensePose&lt;/strong&gt; — Privacy-preserving human pose estimation through WiFi signals.&lt;/p&gt;
</description>
        </item>
        
    </channel>
</rss>
