TerraMobilita/iQmulus urban point cloud analysis benchmark

Bibliographic Details
Published in Computers & Graphics, Vol. 49, pp. 126–133
Main Authors Vallet, Bruno; Brédif, Mathieu; Serna, Andres; Marcotegui, Beatriz; Paparoditis, Nicolas
Format Journal Article
Language English
Published Elsevier Ltd, 01.06.2015
Series Computers and Graphics
Abstract The objective of the TerraMobilita/iQmulus 3D urban analysis benchmark is to evaluate the current state of the art in urban scene analysis from mobile laser scanning (MLS) at large scale. A very detailed semantic tree for urban scenes is proposed. We call analysis the capacity of a method to separate the points of the scene into these categories (classification) and to separate the different objects of the same type for object classes (detection). A very large ground truth is produced manually in two steps using advanced editing tools developed especially for this benchmark. Based on this ground truth, the benchmark aims at evaluating the classification, detection and segmentation quality of the submitted results.
Highlights
• Very rich data: high accuracy, high resolution, many attributes.
• Massive data: 160 million annotated points, thanks to a high-performance web-based annotation tool (and many hours of work).
• Rich semantics organized in a semantic tree with various levels of generalization.
• Objective evaluation metrics.
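The record does not spell out the benchmark's scoring formulas, but the classification quality mentioned in the abstract can be illustrated by a per-class precision/recall/F1 computation over point-wise labels. The Python sketch below is illustrative only; the class ids, array layout and the helper name per_class_scores are assumptions, not part of the benchmark's released tooling.

# Illustrative sketch: per-class precision/recall/F1 on point-wise labels,
# the kind of classification score a point cloud benchmark might report.
import numpy as np

def per_class_scores(gt, pred, labels):
    """gt, pred: equal-length 1-D integer label arrays (one label per point)."""
    scores = {}
    for c in labels:
        tp = np.sum((pred == c) & (gt == c))   # points correctly assigned to class c
        fp = np.sum((pred == c) & (gt != c))   # points wrongly assigned to class c
        fn = np.sum((pred != c) & (gt == c))   # class-c points missed by the method
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        scores[c] = {"precision": precision, "recall": recall, "f1": f1}
    return scores

# Toy usage with made-up labels (0 = ground, 1 = building, 2 = car).
gt = np.array([0, 0, 1, 1, 2, 2, 2])
pred = np.array([0, 1, 1, 1, 2, 0, 2])
print(per_class_scores(gt, pred, labels=[0, 1, 2]))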
Author Vallet, Bruno (bruno.vallet@ign.fr), Université Paris-Est, IGN Recherche, SRIG, MATIS, 73 avenue de Paris, 94160 Saint Mandé, France
Brédif, Mathieu, Université Paris-Est, IGN Recherche, SRIG, MATIS, 73 avenue de Paris, 94160 Saint Mandé, France
Serna, Andres (ORCID 0000-0003-2348-3079), Centre de Morphologie Mathématique (CMM), 35 rue Saint Honoré, 77305 Fontainebleau, France
Marcotegui, Beatriz, Centre de Morphologie Mathématique (CMM), 35 rue Saint Honoré, 77305 Fontainebleau, France
Paparoditis, Nicolas, Université Paris-Est, IGN Recherche, SRIG, MATIS, 73 avenue de Paris, 94160 Saint Mandé, France
BackLink https://hal.science/hal-01167995 (View record in HAL)
CitedBy_id crossref_primary_10_1109_ACCESS_2021_3062547
crossref_primary_10_1007_s42489_025_00185_1
crossref_primary_10_3390_rs16224240
crossref_primary_10_3390_rs15153787
crossref_primary_10_3390_s19194188
crossref_primary_10_1049_cit2_12349
crossref_primary_10_1109_ACCESS_2019_2894533
crossref_primary_10_1109_TGRS_2020_3005960
crossref_primary_10_1109_TITS_2024_3469546
crossref_primary_10_3390_rs12111875
crossref_primary_10_1016_j_isprsjprs_2023_01_009
crossref_primary_10_1016_j_isprsjprs_2018_05_004
crossref_primary_10_1007_s00138_024_01543_1
crossref_primary_10_1016_j_isprsjprs_2020_03_011
crossref_primary_10_1080_01431161_2023_2297177
crossref_primary_10_1016_j_isprsjprs_2019_05_007
crossref_primary_10_1080_17538947_2024_2407943
crossref_primary_10_1007_s10489_022_03930_5
crossref_primary_10_1049_cvi2_12250
crossref_primary_10_1080_2150704X_2022_2163203
crossref_primary_10_1016_j_autcon_2019_103058
crossref_primary_10_1109_ACCESS_2020_2992612
crossref_primary_10_1016_j_isprsjprs_2019_01_004
crossref_primary_10_1109_LGRS_2021_3063290
crossref_primary_10_3390_rs10101531
crossref_primary_10_1016_j_isprsjprs_2023_05_018
crossref_primary_10_1139_geomatica_2018_0001
crossref_primary_10_24012_dumf_1067736
crossref_primary_10_1016_j_optlaseng_2022_107240
crossref_primary_10_1016_j_eswa_2023_121842
crossref_primary_10_1109_JSTARS_2020_3035274
crossref_primary_10_1080_17538947_2023_2228298
crossref_primary_10_1109_LGRS_2020_3037484
crossref_primary_10_1109_ACCESS_2023_3335607
crossref_primary_10_3390_s24061853
crossref_primary_10_1016_j_cag_2015_01_006
crossref_primary_10_1016_j_inffus_2024_102575
crossref_primary_10_3390_rs9030277
crossref_primary_10_3390_rs13153021
crossref_primary_10_3390_app13169146
crossref_primary_10_1109_TPAMI_2020_3005434
crossref_primary_10_1016_j_isprsjprs_2020_08_011
crossref_primary_10_3390_rs15163973
crossref_primary_10_1016_j_isprsjprs_2018_08_009
crossref_primary_10_3390_rs16142518
crossref_primary_10_3390_rs12111729
crossref_primary_10_3390_rs14051090
crossref_primary_10_3390_rs13204029
crossref_primary_10_3103_S1060992X20040062
crossref_primary_10_1016_j_buildenv_2024_112310
crossref_primary_10_3390_robotics12040100
crossref_primary_10_1109_TGRS_2023_3342807
crossref_primary_10_1155_2022_3707985
crossref_primary_10_1016_j_isprsjprs_2015_09_008
crossref_primary_10_3390_s19040810
crossref_primary_10_1109_ACCESS_2021_3094127
crossref_primary_10_3390_rs13183621
crossref_primary_10_1016_j_isprsjprs_2015_07_005
crossref_primary_10_1007_s00371_023_03237_7
crossref_primary_10_3390_rs13224713
crossref_primary_10_3390_rs14163848
crossref_primary_10_1109_TNNLS_2020_3015992
crossref_primary_10_20965_ijat_2017_p0657
crossref_primary_10_3390_rs11121499
crossref_primary_10_1109_TMM_2022_3233256
crossref_primary_10_1061__ASCE_CP_1943_5487_0000929
crossref_primary_10_3390_rs15112735
crossref_primary_10_1007_s11263_021_01504_5
crossref_primary_10_3390_ijgi5010006
crossref_primary_10_3390_rs11232727
crossref_primary_10_1109_JSTARS_2019_2897987
crossref_primary_10_3788_LOP232716
crossref_primary_10_1080_2150704X_2018_1444286
crossref_primary_10_1016_j_autcon_2017_06_026
crossref_primary_10_1016_j_autcon_2024_105870
crossref_primary_10_1016_j_isprsjprs_2024_02_007
crossref_primary_10_1109_JSEN_2019_2927269
crossref_primary_10_3390_rs10040531
crossref_primary_10_1016_j_cag_2015_04_006
crossref_primary_10_1016_j_isprsjprs_2021_07_008
crossref_primary_10_1016_j_isprsjprs_2017_08_010
crossref_primary_10_1080_01431161_2023_2266121
crossref_primary_10_3390_rs17020298
crossref_primary_10_1177_0278364918767506
crossref_primary_10_1080_10106049_2022_2032392
crossref_primary_10_1016_j_eswa_2023_120672
crossref_primary_10_1109_ACCESS_2022_3226211
crossref_primary_10_1109_TGRS_2025_3543174
crossref_primary_10_1016_j_isprsjprs_2022_02_007
crossref_primary_10_1016_j_jag_2018_02_016
crossref_primary_10_1007_s41064_021_00148_x
crossref_primary_10_3390_rs16061079
crossref_primary_10_1080_17538947_2024_2310083
crossref_primary_10_3788_CJL231000
crossref_primary_10_1139_geomat_2018_0001
crossref_primary_10_1109_TITS_2021_3076844
crossref_primary_10_1364_AO_423599
crossref_primary_10_1007_s12524_021_01358_x
crossref_primary_10_1109_TITS_2022_3167957
crossref_primary_10_1007_s11263_021_01554_9
crossref_primary_10_1109_MGRS_2019_2937630
crossref_primary_10_1109_JSTARS_2020_3001978
crossref_primary_10_3390_rs10010002
crossref_primary_10_1109_JSTARS_2021_3090502
crossref_primary_10_3390_su15021157
crossref_primary_10_3390_rs15061493
crossref_primary_10_1016_j_biosystemseng_2023_05_009
crossref_primary_10_1016_j_cag_2022_07_010
crossref_primary_10_3390_rs13163146
Cites_doi 10.1109/3DIMPVT.2011.10
10.5194/isprsannals-II-5-W2-313-2013
10.1109/ICCV.2009.5459471
10.1016/j.isprsjprs.2013.07.001
10.1109/CVPRW.2009.5206590
10.52638/rfpt.2012.63
10.1007/978-3-642-38294-9_18
10.1016/j.isprsjprs.2014.03.015
ContentType Journal Article
Copyright 2015 Elsevier Ltd
Distributed under a Creative Commons Attribution 4.0 International License
Copyright_xml – notice: 2015 Elsevier Ltd
– notice: Distributed under a Creative Commons Attribution 4.0 International License
DOI 10.1016/j.cag.2015.03.004
DatabaseName CrossRef
Computer and Information Systems Abstracts
Technology Research Database
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Hyper Article en Ligne (HAL)
Hyper Article en Ligne (HAL) (Open Access)
DatabaseTitle CrossRef
Computer and Information Systems Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Advanced Technologies Database with Aerospace
ProQuest Computer Science Collection
Computer and Information Systems Abstracts Professional
DatabaseTitleList

Computer and Information Systems Abstracts
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
Computer Science
EISSN 1873-7684
EndPage 133
ExternalDocumentID oai_HAL_hal_01167995v1
10_1016_j_cag_2015_03_004
S009784931500028X
ISSN 0097-8493
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Keywords Benchmark
Mobile mapping
Laser scanning
Segmentation
Urban scene
Classification
Detection
Analysis
Language English
License Distributed under a Creative Commons Attribution 4.0 International License: http://creativecommons.org/licenses/by/4.0
LinkModel DirectLink
ORCID 0000-0003-2348-3079
0000-0002-3619-6661
0000-0002-9492-5180
0000-0002-2825-7292
0000-0003-0228-1232
OpenAccessLink https://hal.science/hal-01167995
PQID 1753541375
PQPubID 23500
PageCount 8
ParticipantIDs hal_primary_oai_HAL_hal_01167995v1
proquest_miscellaneous_1753541375
crossref_primary_10_1016_j_cag_2015_03_004
crossref_citationtrail_10_1016_j_cag_2015_03_004
elsevier_sciencedirect_doi_10_1016_j_cag_2015_03_004
PublicationCentury 2000
PublicationDate 2015-06-01
PublicationDecade 2010
PublicationSeriesTitle Computers and Graphics
PublicationTitle Computers & Graphics
PublicationYear 2015
Publisher Elsevier Ltd
Elsevier
Publisher_xml – name: Elsevier Ltd
– name: Elsevier
References
– Golovinskiy A, Kim VG, Funkhouser T. Shape-based recognition of 3D point clouds in urban environments. In: International Conference on Computer Vision (ICCV); 2009. doi:10.1109/ICCV.2009.5459471
– Munoz D, Bagnell JA, Vandapel N, Hebert M. Contextual classification with functional max-margin Markov networks. In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR); June 2009. doi:10.1109/CVPRW.2009.5206590
– Hernández J, Marcotegui B. Filtering of artifacts and pavement segmentation from mobile LiDAR data. In: Bretar F, Pierrot-Deseilligny M, Vosselman MG, editors. ISPRS Workshop Laser Scanning '09, vol. XXXVIII-3/W8 of The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Paris, France; 2009. p. 329–33.
– Shapovalov R, Velizhev A, Barinova O. Non-associative Markov networks for 3D point cloud classification. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXVIII, Part 3A; 2010. p. 103–8.
– Kaartinen H, Kukko A, Hyyppä J, Lehtomäki M. EuroSDR benchmarking of mobile mapping algorithms and systems. EuroSDR report; 2012.
– Paparoditis N, Papelard J-P, Cannelle B, Devaux A, Soheilian B, David N, et al. Stereopolis II: a multi-purpose and multi-sensor 3D mobile mapping system for street visualisation and 3D metrology. Revue Française de Photogrammétrie et de Télédétection 2012;200:69–79. doi:10.52638/rfpt.2012.63
– Serna A, Marcotegui B. Urban accessibility diagnosis from mobile laser scanning data. ISPRS J Photogramm Remote Sens 2013;84:23–32. doi:10.1016/j.isprsjprs.2013.07.001
– Serna A, Marcotegui B. Attribute controlled reconstruction and adaptive mathematical morphology. In: Eleventh International Symposium on Mathematical Morphology, Uppsala, Sweden; 2013. p. 205–16. doi:10.1007/978-3-642-38294-9_18
– Weinmann M, Jutzi B, Mallet C. Feature relevance assessment for the semantic interpretation of 3D point cloud data. ISPRS Workshop on Laser Scanning, Antalya; 2013. doi:10.5194/isprsannals-II-5-W2-313-2013
– Serna A, Marcotegui B, Goulette F, Deschaud J-E. Paris-rue-Madame database: a 3D mobile laser scanner dataset for benchmarking urban detection, segmentation and classification methods. ICPRAM 2014.
– Serna A, Marcotegui B. Detection, segmentation and classification of 3D urban objects using mathematical morphology and supervised learning. ISPRS J Photogramm Remote Sens 2014;93:243–55. doi:10.1016/j.isprsjprs.2014.03.015
SourceID hal
proquest
crossref
elsevier
SourceType Open Access Repository
Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 126
SubjectTerms Benchmark
Benchmarking
Categories
Classification
Computer Science
Ground truth
Image Processing
Laser scanning
Lasers
Mobile mapping
Scene analysis
Segmentation
Semantics
Three dimensional models
Urban scene
Title TerraMobilita/iQmulus urban point cloud analysis benchmark
URI https://dx.doi.org/10.1016/j.cag.2015.03.004
https://www.proquest.com/docview/1753541375
https://hal.science/hal-01167995
Volume 49