We present an alternative method for solving the motion
stereo problem for two views in a variational framework.
Instead of directly solving for the depth, we simultaneously
estimate the optical flow and the 3D structure by minimizing
a joint energy function consisting of an optical flow constraint
and a 3D constraint.
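A joint energy of this kind can be sketched as follows (the notation is assumed here for illustration: w = (u, v) is the flow, Z the depth, w3D the flow induced by the depth and the known camera motion, and lambda, mu are weights):

```latex
E(\mathbf{w}, Z) =
  \underbrace{\int_{\Omega} \bigl| I_2(\mathbf{x} + \mathbf{w}) - I_1(\mathbf{x}) \bigr| \, d\mathbf{x}}_{\text{optical flow (brightness constancy) constraint}}
  + \lambda \underbrace{\int_{\Omega} \bigl\| \mathbf{w}(\mathbf{x}) - \mathbf{w}_{3D}(\mathbf{x}; Z) \bigr\| \, d\mathbf{x}}_{\text{3D constraint}}
  + \mu \, R(\mathbf{w}, Z)
```

The second term couples the two unknowns: the 2D flow must agree with the flow predicted by the 3D structure under the camera motion, while R is a smoothness regularizer. This is a sketch of the general form, not the exact energy of the paper.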
We propose a novel omnidirectional video completion framework based on depth estimation.
First, we recover the depth of the scene from a pixel motion model constrained by known camera pose.
The depth map is further improved by a structure-aware refinement.
The refined depth map is then employed for color propagation into the holes.
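The color-propagation step can be sketched as a depth-guided reprojection: each hole pixel is back-projected to 3D with the refined depth, transformed by the known relative camera pose, and sampled from a source frame. The function below is an illustrative sketch (names, shared intrinsics, and nearest-neighbour sampling are assumptions), not the paper's exact method.

```python
import numpy as np

def propagate_color(hole_uv, depth, K, R, t, src_img):
    """Fill a hole pixel by reprojecting it into a source frame.

    hole_uv : (u, v) pixel coordinates of the hole in the target frame
    depth   : refined depth estimate at that pixel
    K       : 3x3 camera intrinsics (assumed shared by both frames)
    R, t    : rotation / translation from the target to the source camera
    src_img : source frame as an HxWx3 array
    """
    u, v = hole_uv
    # Back-project the pixel to a 3D point using the refined depth.
    p3d = depth * np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Transform into the source camera and project back to pixels.
    q = K @ (R @ p3d + t)
    us, vs = q[0] / q[2], q[1] / q[2]
    # Nearest-neighbour sampling (bilinear would be used in practice).
    return src_img[int(round(vs)), int(round(us))]
```

In practice this is done for every hole pixel, with visibility checks and blending over several source frames.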
One of the challenges in mixed reality (MR) applications is
handling contradictory occlusions between real and virtual objects.
We propose a solution to the occlusion problem
that does not require precise foreground-background segmentation.
In our method, a virtual object is blended with the
real scene so that the virtual object is perceived as being
behind the foreground region.
Outdoor environments make occlusion handling difficult due to unpredictable illumination changes.
We propose outdoor illumination constraints for resolving the foreground occlusion
problem in outdoor environments.
In addition, we introduce an effective method for resolving the shadow occlusion problem
by detecting and recasting shadows with a spherical vision camera.
- L. B. Vinh, T. Kakuta, R. Kawakami, T. Oishi, K. Ikeuchi, "Foreground and Shadow Occlusion Handling for Outdoor Augmented Reality," In Proc. 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2010), pp. 109-118, 13-16 Oct. 2010, Seoul.
- T. Kakuta, L. B. Vinh, R. Kawakami, T. Oishi, K. Ikeuchi, "Detection of Moving Objects and Cast Shadows Using a Spherical Vision Camera for Outdoor Mixed Reality," ACM Symp. on Virtual Reality Software and Technology (VRST 2008), Oct. 2008, pp. 219-222.
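The idea of blending rather than hard segmentation can be sketched with a soft foreground probability: where a pixel is likely real foreground, the real image dominates, so the virtual object appears to lie behind it. This is an illustrative sketch under assumed inputs, not the authors' exact formulation.

```python
import numpy as np

def blend_with_foreground(real, virtual, fg_prob):
    """Blend a rendered virtual object with the real image.

    real, virtual : HxWx3 float images in [0, 1]
    fg_prob       : HxW probability that each pixel belongs to the
                    real foreground (no hard segmentation needed)

    High fg_prob keeps the real pixel, so the virtual object is
    perceived as occluded by the foreground region.
    """
    a = fg_prob[..., None]          # broadcast over color channels
    return a * real + (1.0 - a) * virtual
```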
There are many situations in which virtual objects are presented
half-transparently over a background in real-time applications.
In such cases, we often need to show the object with constant
visibility. To achieve this, we present a framework for blending
images based on a subjective metric of visibility. In our method, a
blending parameter is locally and adaptively optimized so that
the visibility at each location reaches the targeted level.
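The local optimization can be sketched as a per-pixel search for the blending weight at which a visibility metric of the composite hits the target. The sketch below uses a simple luminance-difference metric as a stand-in for the subjective metric, and solves for alpha by bisection; all names and choices here are illustrative assumptions.

```python
import numpy as np

def luminance_diff(comp, bg):
    """Visibility proxy: absolute luminance difference to the background."""
    w = np.array([0.299, 0.587, 0.114])
    return np.abs((comp - bg) @ w)

def adaptive_alpha(bg, fg, target_visibility, metric, iters=20):
    """Per-location blending weights such that metric(composite, bg)
    reaches the target level (assumes the metric grows with alpha)."""
    lo = np.zeros(bg.shape[:2])
    hi = np.ones(bg.shape[:2])
    for _ in range(iters):                      # per-pixel bisection
        a = 0.5 * (lo + hi)
        comp = a[..., None] * fg + (1.0 - a[..., None]) * bg
        vis = metric(comp, bg)
        hi = np.where(vis > target_visibility, a, hi)   # too visible
        lo = np.where(vis <= target_visibility, a, lo)  # not visible enough
    return 0.5 * (lo + hi)
```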
In outdoor Mixed Reality (MR), objects distant from the observer
suffer from an effect called aerial perspective, which fades the colors
of the objects and blends them with the environmental light color.
We present a turbidity-based
method for rendering a virtual object with the aerial perspective
effect in an MR application.
- M. Carlos, T. Oishi, K. Ikeuchi, "Real-time rendering of aerial perspective effect based on turbidity estimation," IPSJ Transactions on Computer Vision and Applications (TCVA), vol. 9, no. 1, 2017.
- M. Carlos, T. Oishi, K. Ikeuchi, "Turbidity-based Aerial Perspective Rendering for Mixed Reality," In Proc. 12th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2014), pp. 283-284, Munich, Sept. 2014.
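The fading effect can be sketched with the classic attenuation model L = L0 * exp(-beta * d) + A * (1 - exp(-beta * d)): with distance, the object color decays toward the airlight. In the paper the extinction coefficient is derived from the estimated turbidity; in this sketch beta is simply an assumed per-channel parameter.

```python
import numpy as np

def aerial_perspective(obj_color, airlight, beta, distance):
    """Fade a virtual object's color toward the environmental light.

    obj_color : object color without atmosphere (RGB in [0, 1])
    airlight  : environmental light color (RGB)
    beta      : extinction coefficient(s); turbidity-derived in practice
    distance  : distance from the observer to the object
    """
    t = np.exp(-np.asarray(beta) * distance)          # transmittance
    return np.asarray(obj_color) * t + np.asarray(airlight) * (1.0 - t)
```

At distance zero the object keeps its own color; far away it converges to the airlight color, which is exactly the fading described above.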
Localization / tracking
A robust image-based alignment method for use in outdoor
environments is proposed.
In the proposed method, the albedo of real objects is estimated
in advance using the 3D shapes of these objects,
and their appearance is reproduced from the albedo and the current light
environment. Because the reproduced image closely matches the current
appearance of the real objects, robust image-based alignment is achieved.
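The appearance-reproduction step can be sketched as relighting the pre-estimated albedo under the current illumination and scoring the match photometrically. The sketch assumes a single directional light with Lambertian shading for illustration; the alignment itself would minimize this cost over the camera pose.

```python
import numpy as np

def reproduce_appearance(albedo, normals, light_dir, light_color):
    """Predict the current appearance of a real object from its
    pre-estimated albedo and the current light environment.

    albedo      : HxWx3 per-pixel albedo (estimated offline from 3D shape)
    normals     : HxWx3 unit surface normals
    light_dir   : unit direction toward the light (assumed directional)
    light_color : RGB intensity of the current light
    """
    shading = np.clip(normals @ light_dir, 0.0, None)   # Lambertian n . l
    return albedo * shading[..., None] * light_color

def alignment_cost(reproduced, observed):
    """Photometric cost between the reproduced image and the camera
    image; alignment minimizes this over the camera pose."""
    return np.mean((reproduced - observed) ** 2)
```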
Virtual Reconstruction of Cultural Heritage Assets
We developed Mixed Reality (MR) contents that reconstructed the ancient capital of Asuka-Kyo and applied a fast shading and
shadowing method based on shadowing planes. We conducted a subjective evaluation experiment with a head-mounted display,
which showed that displaying these contents increased the audience's knowledge of both Asuka-Kyo and MR technologies. We also
conducted impression evaluation tests with and without shading and shadowing.
- T. Kakuta, T. Oishi, K. Ikeuchi, "Development and Evaluation of Asuka-Kyo MR Contents with Fast Shading and Shadowing," Proc. Int. Society on Virtual Systems and MultiMedia (VSMM 2008), pp.254-260, Oct. 2008.
- T. Kakuta, T. Oishi, K. Ikeuchi, "Virtual Kawaradera: Fast Shadow Texture for Augmented Reality," Proc. the Tenth International Conference on Virtual System and Multimedia (VSMM 2004), pp. 141-150, November 2004.