Localized Gaussians as Self-Attention Weights for Point Clouds Correspondence

dc.contributor.author: Riva, Alessandro (en_US)
dc.contributor.author: Raganato, Alessandro (en_US)
dc.contributor.author: Melzi, Simone (en_US)
dc.contributor.editor: Caputo, Ariel (en_US)
dc.contributor.editor: Garro, Valeria (en_US)
dc.contributor.editor: Giachetti, Andrea (en_US)
dc.contributor.editor: Castellani, Umberto (en_US)
dc.contributor.editor: Dulecha, Tinsae Gebrechristos (en_US)
dc.date.accessioned: 2024-11-11T12:48:31Z
dc.date.available: 2024-11-11T12:48:31Z
dc.date.issued: 2024
dc.description.abstract: Current data-driven methodologies for point cloud matching demand extensive training time and computational resources, presenting significant challenges for model deployment and application. In the point cloud matching task, recent advancements with an encoder-only Transformer architecture have revealed the emergence of semantically meaningful patterns in the attention heads, particularly resembling Gaussian functions centered on each point of the input shape. In this work, we further investigate this phenomenon by integrating these patterns as fixed attention weights within the attention heads of the Transformer architecture. We evaluate two variants: one utilizing predetermined variance values for the Gaussians, and another in which the variance values are treated as learnable parameters. Additionally, we analyze performance on noisy data and explore a possible way to improve robustness to noise. Our findings demonstrate that fixing the attention weights not only accelerates the training process but also enhances the stability of the optimization. Furthermore, we conduct an ablation study to identify the specific layers where the infused information is most impactful and to understand the network's reliance on this information. (en_US)
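The abstract describes replacing learned self-attention weights with Gaussians centered on each input point. A minimal sketch of that idea is given below, assuming Euclidean pairwise distances and a single shared variance; the function name and the row-normalization choice are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def gaussian_attention_weights(points, sigma=0.1):
    """Fixed self-attention weights: one Gaussian centered on each point.

    points: (N, 3) array of point coordinates.
    sigma: Gaussian standard deviation, fixed here; the paper also
           evaluates a variant where the variance is learnable.
    Returns an (N, N) row-stochastic attention matrix.
    """
    # Pairwise squared Euclidean distances between all points.
    diff = points[:, None, :] - points[None, :, :]
    sq_dist = np.sum(diff ** 2, axis=-1)
    # Unnormalized Gaussian scores, then normalize each row so it
    # behaves like a softmax-style attention distribution.
    scores = np.exp(-sq_dist / (2.0 * sigma ** 2))
    return scores / scores.sum(axis=1, keepdims=True)
```

Each row peaks at its own point (distance zero) and decays with distance, giving the localized attention pattern the abstract refers to.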
dc.description.sectionheaders: Shape Analysis
dc.description.seriesinformation: Smart Tools and Applications in Graphics - Eurographics Italian Chapter Conference
dc.identifier.doi: 10.2312/stag.20241345
dc.identifier.isbn: 978-3-03868-265-3
dc.identifier.issn: 2617-4855
dc.identifier.pages: 10 pages
dc.identifier.uri: https://doi.org/10.2312/stag.20241345
dc.identifier.uri: https://diglib.eg.org/handle/10.2312/stag20241345
dc.publisher: The Eurographics Association (en_US)
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: CCS Concepts: Computing methodologies → Machine learning; Shape analysis; Theory of computation → Computational geometry
dc.subject: Computing methodologies → Machine learning
dc.subject: Shape analysis
dc.subject: Theory of computation → Computational geometry
dc.title: Localized Gaussians as Self-Attention Weights for Point Clouds Correspondence (en_US)
Files
Original bundle (1 of 1)
Name: stag20241345.pdf
Size: 18.49 MB
Format: Adobe Portable Document Format