Charles K. Toth
Research Professor
Department of Civil, Environmental
and Geodetic Engineering
(Ohio State University)
PROFILE
Studies at the Technical University of Budapest:
⚫ M.Sc. in Electrical Engineering
⚫ Ph.D. in Electrical Engineering and Geo-Information Sciences
Research interests and expertise: Broad areas of spatial information sciences and systems, including photogrammetry and computer vision, navigation and georeferencing, multi-sensor geospatial data acquisition systems, such as GNSS/IMU and other sensor integration for navigation in GNSS-challenged environments, sensors and algorithms for indoor/personal navigation, image-based navigation using artificial intelligence (AI) methods, UAS and mobile mapping technologies.
Publications: Charles has published more than 400 journal and conference papers, several book chapters, and contributed to over 60 research projects.
Awards: Academic Research Award, 2021 ION Captain P.V.H. Weems Award, 2018 ASPRS Outstanding Service Award, 2016 ISPRS Schwidefsky Medal and many more.
Memberships: He is a member of ASPRS and ISPRS.
Our new blog category 'Expert Talk' provides a platform for experts from the geospatial industry and science to air their thoughts. This will allow us to cover a whole range of topics around aerial surveying, photogrammetry, georeferencing, mapping and more that go beyond the UltraCam (product) family and encourage a wide variety of discussions in the aerial survey industry. We invite you to read interesting articles, gain exciting insights into current scientific findings, and dive into theoretical visions of the future.
But now, Charles, the floor is yours!
What to write about?
When my long-time friend Michael Gruber, Chief Scientist Photogrammetry at Vexcel Imaging, asked me to contribute to their blog, I wasn't sure what thoughts I could share about Vexcel's top-of-the-line camera sensor suite; obviously, though, there was no choice but to accept his invitation. Both of us are quite passionate about photogrammetry and its future, and we witnessed the great transition from analytical to digital photogrammetry, which culminated in the introduction of digital sensors about 20 years ago. However, the changes back then are dwarfed by the phenomenal technological developments of recent years. So where is photogrammetry heading, in light of the two major trends dominating geospatial data acquisition?
'Crowdsensing' becomes an emerging approach in data acquisition
Mapping has become mainstream with the introduction of Google Maps, Bing Maps, and (later) Apple Maps, in parallel with the collaboration-based OpenStreetMap project; these are among the most fundamental applications on smart devices in a world where spatial awareness is becoming increasingly important. The Internet giants operate large fleets of mobile mapping vehicles to acquire street-level data and buy airborne and satellite imagery to continuously update their databases. In contrast, OpenStreetMap relies on volunteered geographic information: data acquired by ordinary users, who not only collect it but can also enter and edit the database. While there are many more relevant differences between the two approaches, the message here is clear: the user of map data can also be the source of the map data. As sensing capabilities continue to grow, crowdsourcing (or crowdsensing) will gradually become a strong competitor to professionally acquired geospatial data. Currently, the competition is limited to mobile mapping data, acquired mainly in urban areas, but as the use of UAS expands, large-scale airborne data will soon be available in volume. This will potentially provide coverage for most areas where mapping data is needed for personal and vehicle navigation, as well as for location-based services in general. For example, the performance of crowdsourced traffic data simply cannot be matched by traditional traffic data produced by transportation departments.
Autonomous vehicles for mapping purposes
Autonomous vehicle (AV) technologies are advancing rapidly and currently attract the largest R&D investments. Progress has been remarkably strong on both sides: sensors, mainly lidar, and perception, using AI. With respect to mapping, it is generally agreed that HD maps (high-definition maps, as defined in the automotive context) are essential to keep the vehicle on the road and to assist the sensing, tracking, and scene interpretation tasks. By now, all production vehicles (Levels 2 and 3) use some sort of proprietary maps, mostly acquired by conventional mapping, though there are already signs that data acquired by the vehicles themselves is used to learn a particular route, such as for a shuttle, or to update the map database. Compared to smart devices, an AV has significantly more powerful sensing capabilities and substantially more processing power; in terms of communication abilities, however, there is no real difference. Of course, the mapping performance of an AV is well below that of a professional mobile mapping vehicle, but the difference is likely to shrink over time. More importantly, AVs can endlessly map the same area, and this extremely high redundancy could provide a basis for obtaining fairly accurate map data, clearly paving the way to a future when HD maps will likely be acquired by AVs themselves.
Outlooks in photogrammetry
Photogrammetry, as the discipline of accurately reconstructing the 3D object space from images (and lidar), will continue to serve as a core component in both conventional and crowdsourced mapping. Chances are that urban mapping will be dominated by crowdsourcing, while other areas will be mapped by traditional airborne and spaceborne photogrammetry. It is worth noting that, driven by AV and other mobility applications, there are many startups that use photogrammetric techniques at the component level.
Would you like to talk about a certain topic and contribute your expert knowledge to our blog?