Software AI is saturated.
Physical AI is the frontier.

"Bridging the physical AI gap requires more than software. It demands an environment where algorithms breathe, fail, and learn in the real world."

Get started

01. The hardware

Modular Car Kit

A robust, extensible hardware platform designed for the rigors of physical AI training. Built to withstand real-world friction, interference, and dynamic environments.

Sensory Array
LIDAR, Vision, IMU
Compute Unit
Edge TPU Integration
Autonomous modular robotic car chassis on a laboratory surface.
Overhead view of a structured dark testing arena with grid lines.

02. The environment

Smart Game Field

A dynamic, sensor-rich environment that provides ground-truth data. It acts as the physical counterpart to simulated training grounds, offering unpredictable, real-world variables.

Explore specs

03. The intelligence

Real-time Dashboard

The neural center of the AIoScout ecosystem. Visualize telemetry, adjust hyperparameters on the fly, and monitor the learning curve as physical agents interact with the Smart Game Field.

Dark mode telemetry dashboard with charts and robotic agent feed.

Built for the Classroom

Junior Form

  1. Introduction to robotics

    Master the fundamentals of differential-drive mechanics. Students use block coding to translate logic into precise physical movement and actuator states.

  2. Sensing the world

    Enable autonomous interaction with the environment through ultrasonic and IR sensors. Introduces cloud-based IoT logic through real-time traffic-signal synchronization.

  3. Act like a human

    Bridge the gap to Computer Vision. Students implement Edge-AI models to recognize road signs and execute autonomous navigation based on visual input.
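The differential-drive mechanics introduced in module 01 reduce to one kinematic update: averaging the wheel speeds gives forward motion, and their difference gives rotation. A minimal sketch of that idea (function name, wheel base, and speeds are illustrative assumptions, not part of the AIoScout toolkit):

```python
import math

def diff_drive_step(x, y, theta, v_left, v_right, wheel_base, dt):
    """Advance a differential-drive pose by one timestep.

    v_left / v_right are wheel linear speeds (m/s); wheel_base is the
    distance between the wheels (m). Standard unicycle-model update.
    """
    v = (v_left + v_right) / 2.0             # forward speed
    omega = (v_right - v_left) / wheel_base  # turn rate (rad/s)
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Equal wheel speeds drive the robot straight ahead:
x, y, theta = diff_drive_step(0.0, 0.0, 0.0, 0.2, 0.2, 0.15, 1.0)
```

Opposite wheel speeds produce a pure rotation in place, which is the behavior students first explore with block coding.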

Senior Form

  1. Advanced control

    Move beyond basic logic to industrial-grade movement. Students implement PID control loops to manage mechanical error and ensure high-precision navigation.

  2. Sensor fusion

    Advanced IoT integration. Students combine telemetry data from LiDAR and IMU sensors into a live dashboard for fleet monitoring and predictive modeling.

  3. Self-driving robot

    The capstone project. Integration of path-planning algorithms and vision systems to navigate a dynamic game field with zero manual intervention.
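The PID control loops in the senior track combine three correction terms: one proportional to the current error, one integrating past error, and one tracking the error's rate of change. A minimal sketch, assuming a hypothetical heading-correction use case and example gains (this is not the platform's actual API):

```python
class PID:
    """Minimal PID loop: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        # No derivative term on the very first sample.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: steer toward a 90-degree heading from a measured 80 degrees.
pid = PID(kp=1.2, ki=0.1, kd=0.05)
correction = pid.update(setpoint=90.0, measurement=80.0, dt=0.02)
```

Tuning the three gains against real mechanical error, rather than a simulated one, is the point of running this on physical hardware.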
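Sensor fusion can be illustrated with its simplest instance: a complementary filter that blends a fast-but-drifting rate sensor with a noisy-but-absolute reference. This is a deliberately simplified one-dimensional sketch (the blend factor and signal names are assumptions), not the platform's LiDAR/IMU pipeline:

```python
def complementary_filter(angle, gyro_rate, ref_angle, dt, alpha=0.98):
    """Fuse a drifting gyro with an absolute angle reference.

    Trust the integrated gyro (angle + gyro_rate * dt) for short-term
    changes, and the absolute reference for long-term stability.
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * ref_angle

# With a silent gyro, the estimate slowly converges to the reference:
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0, ref_angle=10.0, dt=0.01)
```

The same blend-two-sources idea, scaled up to full pose estimates, is what feeds the fleet-monitoring dashboard described in module 02.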

Sustainability and Scale

Hardware Acquisition

Schools purchase the physical ecosystem (Car Kits, Game Fields) once.

Platform Subscription

Ongoing access to the AI Dashboard and telemetry tools. Tiered by site or per student.

Content Ecosystem

On-demand purchase of specialized courses and teaching packages.

Development Trajectory

  1. Q4 2024

    Alpha Hardware Revision

    Finalizing the sensor suite and edge-compute integration for the initial batch of modular kits.

  2. Q2 2025

    Beta Dashboard Deployment

    Cloud infrastructure for real-time telemetry and fleet management enters closed beta.

  3. July 2026

    HKUST Workshop Showcase

    Full system deployment at the Hong Kong University of Science and Technology. First public demonstration of synchronized fleet learning.

About Us

Portrait of Prof. SONG Sheung Hui.

Prof. SONG Sheung Hui

Technical consultant

Former deep-learning researcher with a focus on sustainable AI architectures and neural efficiency.

Portrait of Elena Moretti.

Elena Moretti

Hardware engineering lead

Specializes in edge-computing hardware and low-latency systems designed for educational environments.

Portrait of Dr. Julian Vance.

Dr. Julian Vance

Curriculum specialist

Pioneered pedagogical frameworks for complex system thinking and human-AI collaborative learning.