Blending ultra-high resolution optical sensor and lidar data to identify mobile living creatures

Fatwa Ramdani, Nasrulloh R.B.S. Loka, Alfiyan Arief, Ika Qutsiati Utami

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

The urban environment is characterized by a wider variety of materials and land cover surfaces than is found in common scenes. Data obtained from satellite imagery have been used to map the urban environment in detail. However, to date there is very limited scientific documentation of procedures for identifying living creatures in urban areas. This study describes the identification of objects with small-scale geometric properties (i.e., humans) by blending ultra-high resolution optical and LIDAR data. We additionally illustrate how new remote sensing technologies can enhance our capacity to map urban areas in ultra-high spatial and thematic detail. The results appear promising for detecting mobile living creatures. This study used only Free and Open Source Software (FOSS) in order to promote FOSS in urban remote sensing studies.
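The paper's full processing chain is not reproduced on this page, but the keywords (DTM, LIDAR, data fusion) point to a normalized-surface-model workflow. The following is a minimal FOSS sketch of that general idea, not the authors' published method: the file names (dsm.tif, dtm.tif, optical.tif), the height range, and the footprint areas are illustrative assumptions. It uses the open-source Python packages numpy, rasterio, and scipy.

```python
# A minimal FOSS sketch (NOT the authors' published pipeline): derive a
# normalized DSM from LiDAR, then fuse it with an ultra-high resolution
# optical band to keep human-sized candidate objects. All file names and
# thresholds below are illustrative assumptions.
import numpy as np
import rasterio
from scipy import ndimage

# Hypothetical, co-registered input rasters on the same grid.
with rasterio.open("dsm.tif") as src:        # LiDAR digital surface model
    dsm = src.read(1).astype(np.float32)
with rasterio.open("dtm.tif") as src:        # LiDAR digital terrain model
    dtm = src.read(1).astype(np.float32)
with rasterio.open("optical.tif") as src:    # ultra-high resolution band
    optical = src.read(1).astype(np.float32)
    pixel_size = src.res[0]                  # ground sampling distance (m)

# Normalized DSM: object height above bare ground.
ndsm = dsm - dtm

# Data fusion step: keep pixels at plausible standing-human heights
# (assumed 0.5-2.2 m) that are also bright enough in the optical band
# to exclude shadow noise (assumed 10th-percentile cutoff).
mask = (ndsm > 0.5) & (ndsm < 2.2) & (optical > np.percentile(optical, 10))

# Keep connected components whose ground footprint matches a person
# (assumed 0.1-1.5 m^2).
labels, n = ndimage.label(mask)
areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1)) * pixel_size ** 2
keep = np.flatnonzero((areas > 0.1) & (areas < 1.5)) + 1
candidates = np.isin(labels, keep)

print(f"{keep.size} human-sized candidate objects")
```

Separating mobile creatures from static street furniture of similar size would additionally require comparing repeat acquisitions, which this single-epoch sketch does not attempt.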

Original language: English
Title of host publication: 2017 International Symposium on Geoinformatics, ISyG 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 9-12
Number of pages: 4
ISBN (Electronic): 9781538616833
DOIs
Publication status: Published - 2 Jul 2017
Externally published: Yes
Event: 2017 International Symposium on Geoinformatics, ISyG 2017 - Malang, Indonesia
Duration: 24 Nov 2017 – 25 Nov 2017

Publication series

Name: 2017 International Symposium on Geoinformatics, ISyG 2017
Volume: 2018-January

Conference

Conference: 2017 International Symposium on Geoinformatics, ISyG 2017
Country/Territory: Indonesia
City: Malang
Period: 24/11/17 – 25/11/17

Keywords

  • DTM
  • LIDAR
  • data fusion
  • living creatures
  • ultra-high resolution
