iLiDAR is an iOS application that transforms your iPhone into a powerful multi-modal visual sensor by leveraging the built-in LiDAR scanner and RGB camera. The app enables real-time streaming of both depth maps and color images to a PC server for data collection and analysis.
iLiDAR provides an intuitive interface for capturing and streaming LiDAR depth data alongside RGB video. The app features:
- Real-time Depth Mapping: Captures high-resolution depth data using iPhone's LiDAR scanner
- Synchronized RGB Video: Records color images simultaneously with depth data
- Network Streaming: Streams data in real-time to a PC server over Wi-Fi
- Configurable Settings: Adjustable transmission frequency and depth filtering
- Server Management: Built-in IP address management for easy server connection
Only specific iPhone models equipped with LiDAR are supported. Ensure your device is listed in the supported devices list. Generally, all iPhone Pro models from iPhone 12 Pro and later, as well as the latest iPad Pro models, are supported.
- Open `iLiDAR.xcworkspace` in Xcode (macOS required)
- Configure your development account and handle any permission requests
- Connect your iPhone (Pro model) via USB or wirelessly
- Build and transfer the app to your iPhone
Important: When prompted with "Allow APP to use camera" or "Allow APP to find local network devices?", select "Allow."
- Ensure both your iPhone and PC are connected to the same Wi-Fi network
- Set up the Python environment and start the server:

```bash
conda create -n ilidar python=3.10 -y
conda activate ilidar
pip install -r requirements.txt
cd Server
python ios_driver.py
```

The server will listen on port 5678 and create an uploads folder to store incoming data. You should see:

```
[*] Server listening on 0.0.0.0:5678
```

- Open the iLiDAR app on your iPhone
- Enter your PC's IP address (e.g., `192.168.1.10`)
- Tap "Connect" to establish the connection
- Tap "Enable Network Transfer" to begin streaming
If successful, you'll see logs like:

```
[>] Received Packet - Filename: 20241208_223229_20241208_223232_76_frame000316.jpg, Type: JPG, Seq: 55, IsLast: False, Size: 1024 bytes
```

Note: The app cannot run on the iOS Simulator because it requires the LiDAR API, which is unavailable in the simulator.
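The wire protocol used by `ios_driver.py` isn't documented here, but the receiving side can be pictured as a plain TCP listener on port 5678. The sketch below is a hypothetical, minimal stand-in (not the bundled server): it accepts a single connection and dumps the raw byte stream to disk.

```python
import socket

def receive_stream(host="0.0.0.0", port=5678, out_path="dump.bin"):
    """Accept one client on (host, port) and write all received bytes
    to out_path. Illustrative only; the real ios_driver.py also parses
    per-packet metadata."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        print(f"[*] Server listening on {host}:{port}")
        conn, addr = srv.accept()
        with conn, open(out_path, "wb") as f:
            # recv() returns b"" once the client closes the connection
            while chunk := conn.recv(4096):
                f.write(chunk)
    return out_path
```

The actual server additionally interprets packet headers (filename, sequence number, last-packet flag, as seen in the log above) so that it can reassemble the incoming chunks into individual files under the uploads folder.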
Use the provided scripts in Server/read_depth_data.py to analyze the received depth and RGB data. Sample files are available in Server/example_data/ for testing.
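As a sketch of what such analysis might look like, the snippet below loads a raw depth frame with NumPy. The 256×192 float32 layout is an assumption (a common LiDAR depth-map resolution on recent iPhones), not a documented fact about this project; check `Server/read_depth_data.py` for the authoritative format.

```python
import numpy as np

def load_depth(path, width=256, height=192, dtype=np.float32):
    """Load a raw depth frame from a .bin file.

    width, height, and dtype are assumptions; verify them against
    Server/read_depth_data.py before trusting the result.
    """
    depth = np.fromfile(path, dtype=dtype)
    return depth.reshape(height, width)
```

From here you can visualize the frame (e.g., `plt.imshow(load_depth(p))`) or pair it with the matching RGB image by filename.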
We release components immediately after validation to make polished code and reproducible experiments available as soon as possible. Recent updates include:
- Added homepage picture
- Enhanced local processing scripts
- Local IP address management
- Improved network status indicators
- Released iLiDAR app
- Updated network support
- Enhanced UI support
For detailed modification instructions, refer to the Full Tutorial.
The system follows a structured naming convention:

- Event Timestamp: Each transfer session is marked with `yyyyMMdd_HHmmss`
- Frame Timestamp: Individual frames use `yyyyMMdd_HHmmss_SS` with sequential IDs
- File Formats:
  - RGB images: `[event_timestamp]_[frame_timestamp]_frame%08d.jpg`; the counter restarts after reaching 99,999,999.
  - Depth data: `[event_timestamp]_[frame_timestamp]_frame%08d.bin`; the counter restarts after reaching 99,999,999.
  - Camera parameters: `[event_timestamp].csv`
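The naming convention above can be parsed back into its components. The sketch below does so with a regular expression; the two-digit fractional-second field is inferred from the sample log line earlier in this document, and the sequence field accepts any width since the observed filenames use fewer than eight digits.

```python
import re

# event = yyyyMMdd_HHmmss, frame = yyyyMMdd_HHmmss_SS, seq = frame counter
FRAME_RE = re.compile(
    r"(?P<event>\d{8}_\d{6})_"
    r"(?P<frame>\d{8}_\d{6}_\d{2})_"
    r"frame(?P<seq>\d+)\.(?P<ext>jpg|bin)$"
)

def parse_frame_name(name):
    """Split a frame filename into event timestamp, frame timestamp,
    sequence number, and extension."""
    m = FRAME_RE.match(name)
    if not m:
        raise ValueError(f"unrecognized frame filename: {name}")
    parts = m.groupdict()
    parts["seq"] = int(parts["seq"])
    return parts
```

This is handy for grouping frames by session or matching each `.jpg` with its corresponding `.bin`.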
- Image Compression: RGB images are compressed to 0.4 quality by default. Modify `DataStorage.compressionQuality` based on your network conditions.
- Frame Rate: The default transmission rate is 30 FPS, but you can adjust it in the settings.
This project is built upon the Apple Official Depth Camera Example.
This project is licensed under the MIT License - see the LICENSE file for details.


