gpuRIR: A python library for room impulse response simulation with GPU acceleration

Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 80, No. 4, pp. 5653-5671
Main Authors: Diaz-Guerra, David; Miguel, Antonio; Beltran, Jose R.
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.02.2021

Summary: The Image Source Method (ISM) is one of the most widely used techniques for computing acoustic Room Impulse Responses (RIRs); however, its computational complexity grows rapidly with the reverberation time of the room, and its computation time can become prohibitive for applications that need a huge number of RIRs. In this paper, we present a new implementation that dramatically improves the computation speed of the ISM by using Graphics Processing Units (GPUs) to parallelize both the simulation of multiple RIRs and the computation of the image sources inside each RIR. Additional speedups were achieved by exploiting the mixed-precision capabilities of newer GPUs and by using lookup tables. We provide a Python library under GNU license that can be easily used without any knowledge of GPU programming, and we show that it is about 100 times faster than other state-of-the-art CPU libraries. It may become a powerful tool for many applications that need to perform a large number of acoustic simulations, such as training machine learning systems for audio signal processing, or for real-time room acoustics simulation in immersive multimedia systems such as augmented or virtual reality.
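To illustrate the kind of workflow the abstract describes, the sketch below simulates a batch of RIRs for one room on the GPU. It is a minimal example assuming the function and parameter names published in the gpuRIR repository (simulateRIR, beta_SabineEstimation, att2t_SabineEstimator, t2n); these names are not given in this record and should be checked against the installed version of the library.

import numpy as np
import gpuRIR  # assumed import name from the library's public examples

room_sz = [3.0, 4.0, 2.5]                      # Room dimensions [m]
pos_src = np.array([[1.0, 2.9, 0.5]])          # Source position [m]
pos_rcv = np.array([[0.5, 1.0, 0.5],
                    [1.5, 1.0, 0.5]])          # Receiver positions [m]
T60 = 0.7                                      # Target reverberation time [s]
fs = 16000                                     # Sampling frequency [Hz]
att_diff = 15.0                                # Attenuation [dB] at which a diffuse model may take over
att_max = 60.0                                 # Attenuation [dB] at which the simulation stops

# Reflection coefficients and time limits estimated from T60 (Sabine-based helpers,
# names assumed from the library's documentation)
beta = gpuRIR.beta_SabineEstimation(room_sz, T60)
Tdiff = gpuRIR.att2t_SabineEstimator(att_diff, T60)
Tmax = gpuRIR.att2t_SabineEstimator(att_max, T60)
nb_img = gpuRIR.t2n(Tdiff, room_sz)            # Number of image sources per dimension

# Expected result: an array of shape (n_sources, n_receivers, n_samples)
RIRs = gpuRIR.simulateRIR(room_sz, beta, pos_src, pos_rcv, nb_img, Tmax, fs, Tdiff=Tdiff)

In this sketch every source/receiver pair is simulated in a single GPU call, which is the batching pattern that the paper's reported speedups rely on; per-RIR loops on the CPU would forgo most of that advantage.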
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-020-09905-3