- General purpose indoor positioning sensor, good for robots, drones, etc.
- 3D position accuracy: currently ~10 mm; less than 2 mm should be achievable with additional work.
- Update frequency: 30 Hz
- Output formats: text; MAVLink ATT_POS_MOCAP via serial; u-blox GPS emulation (in the works)
- HTC Vive base station visibility requirements: full top hemisphere from the sensor; both stations must be visible.
- Positioning volume: same as an HTC Vive setup, up to approx. 4×4×3 meters.
- Cost: ~$10 + Teensy 3.2 ($20) (+ Lighthouse stations (2x $135))
- Skills to build: low-complexity soldering; embedded C++ recommended for integrating with your project.
- License: MIT
(Demo videos: raw XYZ position output; indoor position hold for a drone.)
## How it works
The Lighthouse position tracking system consists of:
- two stationary infrared-emitting base stations (we'll use an existing HTC Vive setup),
- an IR receiving sensor and processing module (this is what we'll build).
The base stations are usually placed high in the corners of the room and "overlook" it. Each station has an IR LED array and two rotating laser planes, one horizontal and one vertical. Each cycle, after the LED array flashes (the sync pulse), a laser plane sweeps the room horizontally or vertically at a constant rotation speed. The time between the sync pulse and the laser plane "touching" the sensor is therefore proportional to the horizontal/vertical angle from the base station's center direction. Using this timing information, we can calculate a 3D line from each base station to the sensor; the intersection of these lines yields the sensor's 3D coordinates (see calculation details). A great property of this approach is that it doesn't depend on light intensity and can be made very precise with cheap hardware.
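In practice the two 3D lines never cross exactly, so the "intersection" is usually taken as the midpoint of the shortest segment connecting them. A minimal sketch of that computation; the `Vec3` type and function names here are illustrative, not the firmware's actual code:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 mul(Vec3 a, double k) { return {a.x * k, a.y * k, a.z * k}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Closest point between rays P1 + s*D1 and P2 + t*D2:
// the midpoint of the shortest segment connecting the two lines.
Vec3 ray_midpoint(Vec3 p1, Vec3 d1, Vec3 p2, Vec3 d2) {
    Vec3 w = sub(p1, p2);
    double a = dot(d1, d1), b = dot(d1, d2), c = dot(d2, d2);
    double d = dot(d1, w),  e = dot(d2, w);
    double denom = a * c - b * b;        // 0 if the rays are parallel
    double s = (b * e - c * d) / denom;  // parameter on ray 1
    double t = (a * e - b * d) / denom;  // parameter on ray 2
    Vec3 q1 = add(p1, mul(d1, s));
    Vec3 q2 = add(p2, mul(d2, t));
    return mul(add(q1, q2), 0.5);
}
```

The distance between `q1` and `q2` is also a useful quality metric: a large gap means the angle measurements (or the hardcoded station poses) are off.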
The sensor we're building is the receiving side of the Lighthouse system. It receives and recognizes the IR pulses, calculates the angles, and produces 3D coordinates.
## How it works: details
The base stations are synchronized and work in tandem (they see each other's pulses). In each cycle only one laser plane sweeps the room, so a full 3D position update takes 4 cycles (2 stations × horizontal/vertical sweeps). Cycles are 8.333 ms long, i.e. exactly 120 Hz. The laser plane rotation speed is exactly 180° per cycle.
Each cycle, as received by the sensor, has the following pulse structure:
| Pulse start, µs | Pulse length, µs | Source station | Meaning |
|---|---|---|---|
| 0 | 65–135 | A | Sync pulse (LED array, omnidirectional) |
| 400 | 65–135 | B | Sync pulse (LED array, omnidirectional) |
| 1222–6777 | ~10 | A or B | Laser plane sweep pulse (center = 4000 µs) |
| 8333 | | | End of cycle |
The sync pulse lengths encode which of the 4 cycles we're receiving, plus the station id and calibration data (see description).
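Community reverse-engineering of the Lighthouse protocol describes the sync pulse length as encoding three bits (skip, data, axis) in ~10.4 µs steps starting at 62.5 µs. A sketch under that assumption; the constants come from those external write-ups, not from this project's code:

```cpp
#include <cmath>

struct SyncBits {
    bool skip;  // this station's laser will not sweep this cycle
    bool data;  // one bit of the station's calibration data stream
    bool axis;  // 0 = horizontal sweep, 1 = vertical sweep
};

// Decode a sync pulse length into its 3 payload bits, assuming the
// community-documented encoding: index = (len - 62.5 us) / 10.4 us.
SyncBits decode_sync(double pulse_len_us) {
    int idx = static_cast<int>(std::lround((pulse_len_us - 62.5) / 10.4));
    return { ((idx >> 2) & 1) != 0, ((idx >> 1) & 1) != 0, (idx & 1) != 0 };
}
```

Note that the 8 possible codes (62.5 µs through ~135 µs) match the 65–135 µs sync pulse range in the table above.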
The complete tracking module consists of two parts:
- IR Sensor and amplifier (custom board)
- Timing & processing module (we use Teensy 3.2)
To detect the infrared pulses we need an IR sensor. After a couple of attempts, I settled on BPV22NF photodiodes. The main reasons:
- Built-in optical IR filter (790–1050 nm), which blocks most sunlight but passes the 850 nm the stations use.
- High sensitivity and speed
- Wide 120 degree field of view
To cover the whole top hemisphere, we place 3 photodiodes 120° apart in the horizontal plane, each tilted up 30° from horizontal. I used a small 3D-printed part to hold them, but it's not required.
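As a sanity check on that geometry: each diode's normal then points 30° above the horizon, so its 60° half-FOV just reaches the zenith, and the three azimuths 120° apart cover the full circle. A quick illustrative sketch (names are mine):

```cpp
#include <cmath>

struct Dir { double x, y, z; };

// Normal of diode i (0..2): azimuths 120 degrees apart, each tilted
// 30 degrees up from the horizontal plane.
Dir diode_normal(int i) {
    const double pi = std::acos(-1.0);
    double az = i * 2.0 * pi / 3.0;          // azimuth
    double el = 30.0 * pi / 180.0;           // elevation
    return { std::cos(el) * std::cos(az),
             std::cos(el) * std::sin(az),
             std::sin(el) };
}

// Angle between a diode normal and straight up (the zenith), degrees.
double zenith_angle_deg(Dir d) {
    return std::acos(d.z) * 180.0 / std::acos(-1.0);
}
```

Each normal sits 60° from the zenith, exactly the half-angle of the BPV22NF's 120° field of view, so the straight-up direction is (just barely) covered by all three diodes.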
IR photodiodes produce a very small current, so we need to amplify it before feeding it to the processing module. I use the TLV2462IP, a modern general-purpose rail-to-rail op-amp with good bandwidth; there are also two of them per chip, which is convenient.
One more thing we need to add is a simple high-pass filter, to reject slow changes in the background illumination level.
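For a first-order RC high-pass, the cutoff is f_c = 1/(2πRC). The component values in the usage note below are hypothetical, purely to show the scale: a cutoff in the tens of kHz still passes the ~10 µs sweep pulses while rejecting slowly varying ambient light.

```cpp
#include <cmath>

// Cutoff frequency of a first-order RC high-pass filter, in Hz.
double rc_highpass_cutoff_hz(double r_ohm, double c_farad) {
    const double pi = std::acos(-1.0);
    return 1.0 / (2.0 * pi * r_ohm * c_farad);
}
```

For example, a (hypothetical) 10 kΩ / 1 nF pair gives a cutoff near 16 kHz, well below the ~100 kHz frequency content of a 10 µs pulse but far above mains flicker and daylight drift.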
(Board photos: top view and bottom view.)
| Designator | Part | Count | Price |
|---|---|---|---|
| D1, D2, D3 | BPV22NF | 3 | 3 × $1.11 |
Sample oscilloscope videos:
(Photos: Teensy connections and the full position tracker.)
Note: Teensy's RX1/TX1 UART interface can be used to output position instead of USB.
We use a hardware comparator interrupt, with the ISR called on both rising and falling edges of the signal. The ISR (`cmp0_isr`) records the timing in microseconds and processes the pulses depending on their lengths. We track the sync pulse lengths to determine which cycle corresponds to which base station and sweep. Once tracking is established, we convert the time delays to angles and calculate the 3D lines and the 3D position (see geometry.cpp). Once the position is determined, we report it as text to the USB console and as a MAVLink ATT_POS_MOCAP message to the UART port (see mavlink.cpp).
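The length-based dispatch inside the ISR can be sketched as follows. This is a simplified illustration, not the actual `cmp0_isr` code, and the 50 µs threshold separating sweep from sync pulses is my assumption based on the timing table above:

```cpp
#include <cstdint>

enum class PulseType { kSweep, kSync, kUnknown };

// Classify a pulse from its rise/fall edge timestamps (microseconds).
PulseType classify_pulse(uint32_t rise_us, uint32_t fall_us) {
    uint32_t len = fall_us - rise_us;
    if (len < 50)   return PulseType::kSweep;  // ~10 us laser sweep pulse
    if (len <= 135) return PulseType::kSync;   // 65-135 us LED sync flash
    return PulseType::kUnknown;                // noise / reflections
}
```

Keeping the ISR down to timestamping and classification like this, and doing the geometry in the main loop, is the usual way to keep interrupt latency low on a microcontroller.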
NOTE: Currently, base station positions and direction matrices are hardcoded in geometry.cpp (`lightsources`). You'll need to adjust them for your setup. See #2.
## Installation on macOS, Linux
- GNU ARM Embedded toolchain. Can be installed on macOS with `brew cask install gcc-arm-embedded`. I'm developing with version 5_4-2016q3, but other versions should work too.
- CMake 3.5+: `brew install cmake`
- Command line uploader/monitor: ty. See build instructions in the repo.
- I recommend CLion as the IDE; it made my life a lot easier and can compile/upload right from the IDE.
Getting the code:
```bash
$ git clone https://github.com/ashtuchkin/vive-diy-position-sensor.git
$ cd vive-diy-position-sensor
$ git submodule update --init
```
Compilation/upload command line (example, using CMake out-of-source build in build/ directory):
```bash
$ cd build
$ cmake .. -DPLATFORM=Teensy
$ make                                  # Build firmware
$ make vive-diy-position-sensor-upload  # Upload to Teensy
$ tyc monitor                           # Serial console to Teensy
```
## Installation on Windows
I haven't been able to make it work in Visual Studio, so I'm providing a command-line build instead.
- GNU ARM Embedded toolchain. Windows 32 bit installer is fine.
- CMake 3.5+. Make sure it's in your %PATH%.
- Ninja build tool. Copy the binary to any location in your %PATH%.
Getting the code is the same as above; the GitHub client for Windows makes it even easier.
```bash
cd build
cmake -G Ninja .. -DPLATFORM=Teensy
ninja  # Build firmware; generates "vive-diy-position-sensor.hex" in the current directory.
```