AR-Rift (Part 1)
The overall aim of this work was to build a stereo camera rig to support immersive video see-through augmented reality (AR) for the Oculus Rift. An immersive experience implies two requirements: the video frame captured by the cameras must match both the extents and the distribution of the Rift's field of view (FOV), so that virtual and video spaces are perceptually aligned, and this full frame must be augmentable. The vision of the project is an immersive AR experience in which the lines between what is real and what is virtual are blurred.
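To give a sense of the FOV-matching requirement, the angular field of view a camera delivers follows from its sensor size and focal length under the standard pinhole model. The sketch below uses hypothetical sensor and lens values purely for illustration; they are not the parameters of the actual rig, which later parts of the series cover in detail.

```python
import math

def fov_deg(sensor_mm, focal_mm):
    """Angular FOV (degrees) along one sensor dimension, pinhole camera model."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

# Hypothetical example values (not the real rig's): a 4.8 mm-wide sensor
# behind a 2.1 mm wide-angle lens gives roughly a 98-degree horizontal FOV,
# the kind of coverage a Rift-scale display demands.
print(f"Horizontal FOV: {fov_deg(4.8, 2.1):.1f} degrees")
```

Matching the FOV *distribution* (how the image is warped across that angular range) is a separate problem from matching its extents, and is the subject of Part 5.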
This is the first in a series of seven articles detailing the process of designing and building a stereo camera rig for the Oculus Rift and an AR showcase demonstrating the types of interactions the system makes possible. The video above gives an overview of the work and showcases the demonstrations.
The articles are split as follows:
- Camera Selection (Part 2): Camera requirements for a stereo rig for head-mounted video see-through AR for the Oculus Rift. Modified Logitech C310 cameras selected.
- Building the Stereo Camera (Part 3): Building and mounting the cameras. A discussion of stereo convergence methods.
- Wireless Rift & 6DoF Tracking (Part 4): Details of a wireless solution for the Rift and 6DoF tracking for head and body.
- Aligning Tracking & Video Spaces (Part 5): Aligning the tracking coordinate system & camera images. Verifying and mapping camera FOV. Matching physical and virtual camera FOV distributions. Real-time camera undistortion on the GPU.
- 3DUI for Immersive AR (Part 6): Design of a bimanual 3D user interface (3DUI) for immersive head-mounted AR.
- AR Showcase (Part 7): Realisation of the system.