BRDF Representation and Acquisition

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 35, No. 2, pp. 625-650
Main Authors: Guarnera, D., Guarnera, G.C., Ghosh, A., Denk, C., Glencross, M.
Format: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd, 01.05.2016

Summary: Photorealistic rendering of real-world environments is important in a range of different areas, including Visual Special Effects, Interior/Exterior Modelling, Architectural Modelling, Cultural Heritage, Computer Games and Automotive Design. Currently, rendering systems are able to produce photorealistic simulations of the appearance of many real-world materials. In the real world, a viewer's perception of an object depends on the lighting, on the characteristics of the object, its material and its surface, on the way the surface interacts with light (how light is reflected, scattered and absorbed), and on the impact these characteristics have on material appearance. In order to reproduce this, it is necessary to understand how materials interact with light, which is why the representation and acquisition of material models have become such an active research area. This survey of the state of the art of BRDF representation and acquisition presents an overview of BRDF (Bidirectional Reflectance Distribution Function) models used to represent surface/material reflection characteristics, and describes current acquisition methods for the capture and rendering of photorealistic materials.
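
For orientation (this formula is not part of the publisher's record): the BRDF referred to in the summary is conventionally defined as the ratio of differential reflected radiance to differential incident irradiance, with \omega_i and \omega_o the incoming and outgoing directions and \theta_i the angle between \omega_i and the surface normal:

    f_r(\omega_i, \omega_o) = \frac{\mathrm{d}L_o(\omega_o)}{L_i(\omega_i)\,\cos\theta_i\,\mathrm{d}\omega_i},
    \qquad
    L_o(\omega_o) = \int_{\Omega} f_r(\omega_i, \omega_o)\, L_i(\omega_i)\, \cos\theta_i \,\mathrm{d}\omega_i

The integral on the right is the local reflection equation, the form in which a represented or acquired BRDF is typically evaluated by a renderer.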
ISSN: 0167-7055, 1467-8659
DOI: 10.1111/cgf.12867