
Browsing by Author "Yoshimura, Atsushi"

Now showing 1 - 1 of 1
    Item
    Local Positional Encoding for Multi-Layer Perceptrons
    (The Eurographics Association, 2023) Fujieda, Shin; Yoshimura, Atsushi; Harada, Takahiro; Chaine, Raphaëlle; Deng, Zhigang; Kim, Min H.
    A multi-layer perceptron (MLP) is a type of neural network with a long history of research that has recently been studied actively in the computer vision and graphics fields. One well-known limitation of an MLP is its difficulty in expressing high-frequency signals from low-dimensional inputs. Several studies have proposed input encodings that improve the reconstruction quality of an MLP by pre-processing the input data. This paper proposes a novel input encoding method, local positional encoding, which is an extension of positional and grid encodings. Our proposed method combines these two encoding techniques so that a small MLP can learn high-frequency signals: positional encoding with fewer frequencies is applied within a lower-resolution grid, capturing the local position and scale inside each grid cell. We demonstrate the effectiveness of our proposed method by applying it to common 2D and 3D regression tasks, where it shows higher-quality results than positional and grid encodings, and results comparable to hierarchical variants of grid encoding, such as multi-resolution grid encoding, with an equivalent memory footprint.
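    The sketch below illustrates the general idea described in the abstract: a coarse grid locates the cell containing an input point, and a few-frequency sinusoidal positional encoding is applied to the point's local coordinate inside that cell. This is a minimal, hypothetical reconstruction for illustration only; the function name, grid resolution, and frequency count are assumptions, not values or code from the paper.

```python
import numpy as np

def local_positional_encoding(x, grid_res=16, num_freqs=4):
    """Hypothetical sketch of a 'local' positional encoding.

    x is a point in [0, 1)^d. The point is mapped to a coarse grid cell,
    and the sinusoidal positional encoding is computed on its local
    coordinate within that cell rather than on the global coordinate.
    grid_res and num_freqs are illustrative parameters, not taken from the paper.
    """
    x = np.asarray(x, dtype=np.float64)
    cell = np.floor(x * grid_res)        # index of the grid cell containing x
    local = x * grid_res - cell          # local position in [0, 1) inside that cell

    # Few-frequency sinusoidal encoding of the local coordinate
    freqs = (2.0 ** np.arange(num_freqs)) * np.pi
    angles = local[..., None] * freqs    # shape (..., d, num_freqs)
    enc = np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)
    return cell, enc.reshape(*x.shape[:-1], -1)

# Example: encode a 2D point. In a full method, the cell index would
# typically select learned grid features that are combined with this
# encoding before being fed to a small MLP.
cell_idx, features = local_positional_encoding(np.array([0.37, 0.81]))
print(cell_idx, features.shape)
```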
