Details

Computational Imaging for Scene Understanding

Transient, Spectral, and Polarimetric Analysis
1st edition

By: Takuya Funatomi, Takahiro Okabe

€142.99

Publisher: Wiley
Format: EPUB
Published: April 15, 2024
ISBN/EAN: 9781394284429
Language: English
Number of pages: 352

DRM-protected eBook; you will need e.g. Adobe Digital Editions and an Adobe ID to read it.

Description

Most cameras are designed to mimic what is seen by the human eye: they have three RGB channels and capture at up to around 30 frames per second (FPS).

However, some cameras are designed to capture other modalities: spectra from the near UV to the near IR rather than RGB, polarization, the travel time of light, and so on. These modalities may be less familiar, but they can collect rich and robust data about the scene being captured.

This book focuses on the emerging computer vision techniques known as computational imaging, which encompass capturing, processing and analyzing such modalities for various applications of scene understanding.

Introduction xiii
Takuya FUNATOMI and Takahiro OKABE

Part 1 Transient Imaging and Processing 1

Chapter 1 Transient Imaging 3
Adrian JARABO
1.1. Introduction 3
1.2. Mathematical formulation 5
1.2.1. Analysis of transient light transport propagation 7
1.2.2. Sparsity of the impulse response function T(x, t) 8
1.3. Capturing light in flight 9
1.3.1. Single-photon avalanche diodes (SPAD) 11
1.4. Applications 14
1.4.1. Range imaging 14
1.4.2. Material estimation and classification 14
1.4.3. Light transport decomposition 15
1.5. Non-line-of-sight imaging 15
1.5.1. Backprojection 17
1.5.2. Confocal NLOS and the light-cone transform 17
1.5.3. Surface-based methods 18
1.5.4. Virtual waves and phasor fields 19
1.5.5. Discussion 21
1.6. Conclusion 22
1.7. References 22

Chapter 2 Transient Convolutional Imaging 29
Felix HEIDE
2.1. Introduction 29
2.2. Time-of-flight imaging 30
2.2.1. Correlation image sensors 32
2.2.2. Convolutional ToF depth imaging 32
2.2.3. Multi-path interference 34
2.3. Transient convolutional imaging 35
2.3.1. Global convolutional transport 37
2.3.2. Transient imaging using correlation image sensors 37
2.3.3. Spatio-temporal modulation 40
2.4. Transient imaging in scattering media 41
2.5. Present and future directions 43
2.6. References 43

Chapter 3 Time-of-Flight and Transient Rendering 45
Adithya Kumar PEDIREDLA
3.1. Introduction 45
3.2. Mathematical modeling 46
3.2.1. Mathematical modeling for time-of-flight cameras 47
3.3. How to render time-of-flight cameras? 50
3.3.1. Challenges and solutions in time-of-flight rendering 51
3.4. Open-source implementations 56
3.5. Applications of transient rendering 57
3.6. Future directions 61
3.7. References 62

Part 2 Spectral Imaging and Processing 69

Chapter 4 Hyperspectral Imaging 71
Nathan HAGEN
4.1. Introduction 71
4.2. 2D (raster scanning) architectures 75
4.2.1. Czerny-Turner grating spectrometers 76
4.2.2. Transmission grating/prism spectrometers 78
4.2.3. Coded aperture spectrometers 79
4.2.4. Echelle spectrometers 80
4.3. 1D scanning architectures 81
4.3.1. Dispersive spectrometers 82
4.3.2. Interferometric methods 83
4.3.3. Interferometric filter methods 83
4.3.4. Polarization-based filter methods 86
4.3.5. Active illumination methods 88
4.4. Snapshot architectures 88
4.4.1. Bowen-Walraven image slicer 89
4.4.2. Image slicing and image mapping 90
4.4.3. Integral field spectrometry with coherent fiber bundles (IFS-F) 93
4.4.4. Integral field spectroscopy with lenslet arrays (IFS-L) 94
4.4.5. Filter array camera (FAC) 94
4.4.6. Computed tomography imaging spectrometry (CTIS) 96
4.4.7. Coded aperture snapshot spectral imager (CASSI) 97
4.5. Comparison of snapshot techniques 98
4.5.1. The disadvantages of snapshot 100
4.6. Conclusion 101
4.7. References 102

Chapter 5 Spectral Modeling and Separation of Reflective-Fluorescent Scenes 109
Ying FU, Antony LAM, Imari SATO, Takahiro OKABE, and Yoichi SATO
5.1. Introduction 109
5.2. Related work 111
5.3. Separation of reflection and fluorescence 113
5.3.1. Reflection and fluorescence models 113
5.3.2. Separation using high-frequency illumination 114
5.3.3. Discussion on the illumination frequency 116
5.3.4. Error analysis 118
5.4. Estimating the absorption spectra 119
5.5. Experiment results and analysis 122
5.5.1. Experimental setup 122
5.5.2. Quantitative evaluation of recovered spectra 122
5.5.3. Visual separation and relighting results 126
5.5.4. Separation by using high-frequency filters 130
5.5.5. Ambient illumination 134
5.6. Limitations and conclusion 137
5.7. References 137

Chapter 6 Shape from Water 141
Yuta ASANO, Yinqiang ZHANG, Ko NISHINO, and Imari SATO
6.1. Introduction 141
6.2. Related works 143
6.3. Light absorption in water 145
6.4. Bispectral light absorption for depth recovery 146
6.4.1. Bispectral depth imaging 146
6.4.2. Depth accuracy and surface reflectance 147
6.5. Practical shape from water 148
6.5.1. Non-collinear/perpendicular light-camera configuration 148
6.5.2. Perspective camera with a point source 150
6.5.3. Non-ideal narrow-band filters 151
6.6. Co-axial bispectral imaging system and experiment results 151
6.6.1. System configuration and calibration 151
6.6.2. Depth and shape accuracy 152
6.6.3. Complex static and dynamic objects 154
6.7. Trispectral light absorption for depth recovery 155
6.7.1. Trispectral depth imaging 156
6.7.2. Evaluation on the reflectance spectra database 157
6.8. Discussions 157
6.9. Conclusion 158
6.10. References 158

Chapter 7 Far Infrared Light Transport Decomposition and Its Application for Thermal Photometric Stereo 161
Kenichiro TANAKA
7.1. Introduction 161
7.1.1. Contributions 162
7.2. Related work 163
7.2.1. Light transport decomposition 163
7.2.2. Computational thermal imaging 164
7.2.3. Photometric stereo 165
7.3. Far infrared light transport 165
7.4. Decomposition and application 171
7.4.1. Far infrared light transport decomposition 171
7.4.2. Separating the ambient component 172
7.4.3. Separating reflection and radiation 172
7.4.4. Separating diffuse and global radiations 172
7.4.5. Other options 173
7.4.6. Thermal photometric stereo 173
7.5. Experiments 174
7.5.1. Decomposition result 175
7.5.2. Surface normal estimation 177
7.6. Conclusion 179
7.7. References 180

Chapter 8 Synthetic Wavelength Imaging: Utilizing Spectral Correlations for High-Precision Time-of-Flight Sensing 187
Florian WILLOMITZER
8.1. Introduction 187
8.2. Synthetic wavelength imaging 189
8.3. Synthetic wavelength interferometry 193
8.4. Synthetic wavelength holography 197
8.4.1. Imaging around corners with synthetic wavelength holography 199
8.4.2. Imaging through scattering media with synthetic wavelength holography 200
8.4.3. Discussion and comparison with the state of the art 203
8.5. Fundamental performance limits of synthetic wavelength imaging 205
8.6. Conclusion and future directions 210
8.7. Acknowledgment 210
8.8. References 211

Part 3 Polarimetric Imaging and Processing 219

Chapter 9 Polarization-Based Shape Estimation 221
Daisuke MIYAZAKI
9.1. Fundamental theory of polarization 221
9.2. Reflection component separation 225
9.3. Phase angle of polarization 226
9.4. Surface normal estimation from the phase angle 228
9.5. Degree of polarization 233
9.6. Surface normal estimation from the degree of polarization 236
9.7. Stokes vector 236
9.8. Surface normal estimation from the Stokes vector 237
9.9. References 239

Chapter 10 Shape from Polarization and Shading 241
Thanh-Trung NGO, Hajime NAGAHARA, and Rin-ichiro TANIGUCHI
10.1. Introduction 241
10.2. Related works 243
10.2.1. Shading and polarization fusion 243
10.2.2. Shape estimation under uncalibrated light sources 244
10.3. Problem setting and assumptions 245
10.4. Shading stereoscopic constraint 246
10.5. Polarization stereoscopic constraint 248
10.6. Normal estimation with two constraints 249
10.6.1. Algorithm 1: Recovering individual surface points 250
10.6.2. Algorithm 2: Recovering shape and light directions 251
10.7. Experiments 252
10.7.1. Simulation experiments with weights for two constraints 253
10.7.2. Real-world experiments 254
10.8. Conclusion and future works 263
10.9. References 263

Chapter 11 Polarization Imaging in the Wild Beyond the Unpolarized World Assumption 269
Jérémy Maxime RIVIERE
11.1. Introduction 269
11.2. Mueller calculus 271
11.3. Polarizing filters 273
11.3.1. Linear polarizers 273
11.3.2. Reflectors 274
11.4. Polarization imaging 275
11.5. Image formation model 277
11.5.1. Partially linearly polarized incident illumination 277
11.5.2. Unpolarized incident illumination 279
11.5.3. Discussion 280
11.6. Polarization imaging reflectometry in the wild 282
11.7. Digital Single-Lens Reflex (DSLR) setup 283
11.7.1. Data acquisition 283
11.7.2. Calibration 285
11.7.3. Polarization processing pipeline 285
11.8. Reflectance recovery 287
11.8.1. Surface normal estimation 287
11.8.2. Diffuse albedo estimation 288
11.8.3. Specular component estimation 288
11.9. Results and analysis 291
11.9.1. Results 291
11.9.2. Discussion and error analysis 293
11.10. References 296

Chapter 12 Multispectral Polarization Filter Array 299
Kazuma SHINODA
12.1. Introduction 299
12.2. Multispectral polarization filter array with a photonic crystal 302
12.3. Generalization of imaging and demosaicking with multispectral polarization filter arrays 306
12.4. Demonstration 311
12.5. Conclusion 313
12.6. References 313

List of Authors 317
Index 319

Takuya Funatomi is an associate professor in the Division of Information Science at the Nara Institute of Science and Technology (NAIST), Japan. He received a bachelor's degree in Engineering and a master's degree and PhD in Informatics from Kyoto University in 2002, 2004 and 2007, respectively.

Takahiro Okabe is a professor at the Kyushu Institute of Technology, Japan. He received a bachelor's and a master's degree in Physics and a PhD in Information Science and Technology from the University of Tokyo in 1997, 1999 and 2011, respectively.

You might also be interested in these products:

Critical Systems Thinking
by: Michael C. Jackson
PDF ebook
€34.99
Circuitos lógicos digitales 4ed
by: Javier Vázquez del Real
EPUB ebook
€28.99