Research on the Office of the Future and related issues is carried out by the NSF Science and Technology Center for Graphics and Visualization at UNC and the National Tele-immersion Initiative effort at UNC.
Deepak Bandyopadhyay, Wei-Chao Chen, Greg Coombe, David Gotz, Justin Hensley, Sang-Uok Kum, Scott Larsen, Kok-Lim Low, Aditi Majumder, Andrew Nashel, Srihari Sukumaran, Ruigang Yang
Henry Fuchs (PI), Herman Towles, Greg Welch
Gary Bishop, Mike Brown, Anselmo Lastra, Lars Nyland, Ramesh Raskar, Brent Seales
Stephen Brumback, Kurtis Keller
Jim Mahaney, John Thomas
Matt Cutts, Adam Lake, David Marshburn, Gopi Meenakshisundaram, Lev Stesin
Ramesh Raskar, Greg Welch, Matt Cutts, Adam Lake, Lev Stesin and Henry Fuchs,
“The Office of the Future: A Unified Approach to Image-Based Modeling and Spatially Immersive Displays,”
ACM SIGGRAPH 1998, Orlando, FL. PDF copy, FTP directory for figures and images, slides from the SIGGRAPH talk (left, right)
Ramesh Raskar, Matt Cutts, Greg Welch, Wolfgang Stuerzlinger,
“Efficient Image Generation for Multiprojector and Multisurface Displays,”
Ninth Eurographics Rendering Workshop, June 1998. (Appeared in Drettakis, G., Max, N. (eds.), Rendering Techniques ’98, Proceedings of the Eurographics Workshop in Vienna, Austria, June 29-July 1, 1998. ISBN 3-211-83213-0. August 1998.) PDF copy, a more detailed tech report
Ramesh Raskar, Greg Welch, Henry Fuchs,
“Seamless Projection Overlaps Using Image Warping and Intensity Blending,”
Fourth International Conference on Virtual Systems and Multimedia, Gifu, Japan, November 1998. PDF version, slides
Ramesh Raskar, Greg Welch, Henry Fuchs,
“Spatially Augmented Reality,”
First International Workshop on Augmented Reality, November 1998. PDF version, slides from the talk
Brent Seales, Greg Welch, Chris Jaynes,
“Real-Time Depth Warping for 3-D Scene Reconstruction”,
IEEE Aerospace Conference 1999, Snowmass at Aspen, CO, March 6-13, 1999. PDF version
Ramesh Raskar,
“Oblique Projector Rendering on Planar Surfaces for a Tracked User”,
SIGGRAPH 1999 Technical Sketch, Los Angeles, CA. PDF version
Ramesh Raskar, Michael S. Brown, Ruigang Yang, Wei-Chao Chen, Greg Welch, Herman Towles, Brent Seales, Henry Fuchs. 1999.
“Multi-Projector Displays Using Camera-Based Registration,”
Proceedings of IEEE Visualization 99, San Francisco, CA, October 24-29, 1999. (PDF version)
Ramesh Raskar, Greg Welch, Wei-Chao Chen.
“Table-Top Spatially-Augmented Reality: Bringing Physical Models to Life with Projected Imagery,”
Second International Workshop on Augmented Reality (IWAR’99), October 20-21, 1999, San Francisco, CA. PDF version
Ramesh Raskar,
“Immersive Planar Displays using Roughly Aligned Projectors,”
Appears in IEEE VR 2000, March 2000, New Brunswick, NJ. PDF version
Greg Welch, Henry Fuchs, Ramesh Raskar, Michael Brown, and Herman Towles,
“Projected Imagery In Your Office in the Future,”
IEEE Computer Graphics and Applications, July/August 2000: 62-67. (PDF version)
Gary Bishop and Greg Welch,
“Working in the Office of ‘Real Soon Now’,”
IEEE Computer Graphics and Applications, July/August 2000: 76-78. (PDF version)
Ramesh Raskar, Greg Welch, Kok-Lim Low, Deepak Bandyopadhyay,
“Shader Lamps,”
Eurographics Rendering Workshop, June 2001. (PDF version)
More recent papers from the UNC Office of the Future group
(newer work influenced by the Office of the Future project):
Ramesh Raskar, Jeroen van Baar, Paul Beardsley, Thomas Willwacher, Srinivas Rao, Clifton Forlines,
“iLamps: Geometrically Aware and Self-Configuring Projectors,”
SIGGRAPH 2003. (Link)
Abstract of the paper: The Office of the Future: A Unified Approach to Image-Based Modeling and Spatially Immersive Displays
We introduce ideas, proposed technologies, and initial results for an office of the future that is based on a unified application of computer vision and computer graphics in a system that combines and builds upon the notions of the CAVE™, tiled display systems, and image-based modeling. The basic idea is to use real-time computer vision techniques to dynamically extract per-pixel depth and reflectance information for the visible surfaces in the office, including walls, furniture, objects, and people, and then to either project images on the surfaces, render images of the surfaces, or interpret changes in the surfaces. In the first case, one could designate everyday (potentially irregular) real surfaces in the office to be used as spatially immersive display surfaces, and then project high-resolution graphics and text onto those surfaces. In the second case, one could transmit the dynamic image-based models over a network for display at a remote site. Finally, one could interpret dynamic changes in the surfaces for the purposes of tracking, interaction, or augmented reality applications.
To accomplish the simultaneous capture and display we envision an office of the future where the ceiling lights are replaced by computer controlled cameras and “smart” projectors that are used to capture dynamic image-based models with imperceptible structured light techniques, and to display high-resolution images on designated display surfaces. By doing both simultaneously on the designated display surfaces, one can dynamically adjust or autocalibrate for geometric, intensity, and resolution variations resulting from irregular or changing display surfaces, or overlapped projector images.
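The structured-light idea behind the capture side can be illustrated with a minimal sketch (not the authors' implementation, and ignoring the imperceptibility trick): each projector column is encoded as a binary bit string flashed over successive frames, and a camera pixel's observed on/off sequence identifies which projector column illuminates it, yielding per-pixel correspondence for depth recovery.

```python
# Hypothetical binary structured-light decode with synthetic data.
import numpy as np

N_BITS = 4                                 # resolves 2**4 = 16 projector columns
true_cols = np.array([3, 7, 12, 0, 15])    # synthetic ground truth: the projector
                                           # column seen by each of five camera pixels

# "Capture": in frame b, a camera pixel is lit iff bit b of its column index is 1.
frames = [(true_cols >> b) & 1 for b in range(N_BITS)]

# Decode: accumulate the observed bits back into a per-pixel column index.
decoded = np.zeros_like(true_cols)
for b, frame in enumerate(frames):
    decoded |= frame << b

print(decoded)   # → [ 3  7 12  0 15]
```

With N_BITS patterns one distinguishes 2^N_BITS projector columns, which is why such schemes can reach interactive rates: the number of frames grows only logarithmically in the projector resolution.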
Our current approach to dynamic image-based modeling is to use an optimized structured light scheme that can capture per-pixel depth and reflectance at interactive rates. Our system implementation is not yet imperceptible, but we can demonstrate the approach in the laboratory. Our approach to rendering on the designated (potentially irregular) display surfaces is to employ a two-pass projective texture scheme to generate images that when projected onto the surfaces appear correct to a moving head-tracked observer. We present here an initial implementation of the overall vision, in an office-like setting, and preliminary demonstrations of our dynamic modeling and display techniques.
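The core mapping in the two-pass scheme can be sketched as follows (matrices and numbers here are illustrative, not from the paper): pass one renders the scene from the head-tracked viewer's viewpoint; pass two treats that image as a projective texture, so each display-surface point receives, from the projector pixel that hits it, the color the viewer should see along the ray through that point.

```python
# Hypothetical two-pass projective-texture correspondence for one surface point.
import numpy as np

def projection(f, cx, cy, R, t):
    """Build a 3x4 pinhole projection K [R|t] with focal length f and
    principal point (cx, cy)."""
    K = np.array([[f, 0, cx], [0, f, cy], [0, 0, 1]], float)
    return K @ np.hstack([R, t.reshape(3, 1)])

def project(P, X):
    """Project 3D point X through 3x4 matrix P to 2D pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

R = np.eye(3)
viewer    = projection(800, 320, 240, R, np.array([0.0, 0.0, 0.0]))
projector = projection(900, 512, 384, R, np.array([0.3, 0.0, 0.0]))  # offset mount

X = np.array([0.1, -0.05, 2.0])        # a point on the display surface

uv_viewer = project(viewer, X)         # pass 1: where X lands in the viewer's
                                       # rendered image (the texture coordinate)
uv_proj   = project(projector, X)      # pass 2: the projector pixel that must
                                       # carry the color sampled at uv_viewer
print(uv_viewer, uv_proj)
```

In a real renderer this per-point mapping is what the projective-texture hardware path evaluates for every fragment, so the warped image appears geometrically correct to the moving observer even on irregular surfaces.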