Browsing by Author "Ortega, Lidia M."
Now showing 1 - 3 of 3
Item: CEIG 2021: Frontmatter (Eurographics Association, 2021)
Authors: Ortega, Lidia M.; Chica, Antonio

Item: Guided Modeling of Natural Scenarios: Vegetation and Terrain (The Eurographics Association, 2022)
Authors: Collado, José Antonio; López, Alfonso; Pérez, Juan Roberto Jiménez; Ortega, Lidia M.; Jurado, Juan M.; Feito, Francisco; Posada, Jorge; Serrano, Ana
Abstract: The generation of realistic natural scenarios is a longstanding and ongoing challenge in Computer Graphics. LiDAR (Laser Imaging Detection and Ranging) point clouds have been gaining interest for the representation and analysis of real-world scenarios. However, the output of these sensors is conditioned by several parameters, including, but not limited to, the distance to the scanning target, the aperture angle, the number of laser beams, and systematic and random errors in the acquisition process. Hence, LiDAR point clouds may present inaccuracies and low density, hindering their visualization. In this work, we propose reconstructing the surveyed environments to enhance the point cloud density and provide a 3D representation of the scenario. To this end, ground and vegetation layers are detected and parameterized to allow their reconstruction. As a result, point clouds of any required density can be modeled, as well as realistic 3D natural scenarios whose parameterization may enable procedural generation.

Item: Modeling and Enhancement of LiDAR Point Clouds from Natural Scenarios (The Eurographics Association, 2022)
Authors: Collado, José Antonio; López, Alfonso; Jiménez-Pérez, J. Roberto; Ortega, Lidia M.; Feito, Francisco R.; Jurado, Juan Manuel; Sauvage, Basile; Hasic-Telalovic, Jasminka
Abstract: The generation of realistic natural scenarios is a longstanding and ongoing challenge in Computer Graphics. A common source of real-world environmental scenarios is open point cloud datasets acquired by LiDAR (Laser Imaging Detection and Ranging) devices. However, these data have low density and cannot provide sufficiently detailed environments. In this study, we propose a method to reconstruct real-world environments from data acquired by LiDAR devices that overcomes this limitation and generates rich environments, including ground and high vegetation. Additionally, our proposal segments the original data to distinguish among different kinds of trees. The results show that the method is capable of generating realistic environments with the chosen density, including specimens of each of the identified tree types.