Functional representations are important for memory and for time-series prediction.
Why shouldn't we learn an optimal functional basis?
We compare against an FFT representation of CIFAR-10, using the same number of basis functions. A SIREN network is used to represent each basis function. Currently the model isn't effectively parallelized (i.e., there is a separate SIREN model per basis function, and each must be evaluated in a Python for loop).
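The per-basis-function loop described above can be sketched roughly as follows. This is a minimal illustration, not the exact model used here: the `Siren` class, layer sizes, `omega_0`, and the number of basis functions `K` are all assumptions for the sake of the example.

```python
import torch
import torch.nn as nn

class Siren(nn.Module):
    """Minimal SIREN-style MLP: sine activations scaled by omega_0 (hypothetical sizes)."""
    def __init__(self, in_dim=2, hidden=32, out_dim=1, omega_0=30.0):
        super().__init__()
        self.omega_0 = omega_0
        self.l1 = nn.Linear(in_dim, hidden)
        self.l2 = nn.Linear(hidden, hidden)
        self.l3 = nn.Linear(hidden, out_dim)

    def forward(self, x):
        x = torch.sin(self.omega_0 * self.l1(x))
        x = torch.sin(self.omega_0 * self.l2(x))
        return self.l3(x)

def eval_basis(models, coords):
    """Evaluate each per-basis SIREN in a Python loop (the unparallelized path)."""
    # Each model maps (N, 2) pixel coordinates -> (N, 1); stack into (N, 1, K).
    return torch.stack([m(coords) for m in models], dim=-1)

# K separate SIREN models, one per basis function, over a 32x32 pixel grid.
K = 4
models = [Siren() for _ in range(K)]
ys, xs = torch.meshgrid(
    torch.linspace(-1, 1, 32), torch.linspace(-1, 1, 32), indexing="ij"
)
coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)   # (1024, 2)
basis = eval_basis(models, coords).squeeze(1)           # (1024, K)

# An image is then reconstructed as a linear combination of the K basis functions.
coeffs = torch.randn(K)
img = (basis @ coeffs).reshape(32, 32)
```

Because each model is evaluated sequentially, the loop in `eval_basis` is the bottleneck this note refers to; batching the K networks into one vectorized forward pass would remove it.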
Nevertheless, the results are promising. Using L1 and LPIPS losses, we can implement a neural basis that outperforms the FFT representation on the task of image compression and reconstruction.
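For reference, the FFT baseline being compared against can be sketched as keeping only the `k` largest-magnitude Fourier coefficients of an image and inverting, so the neural basis and the FFT use the same budget of basis functions. The function name and coefficient-selection rule here are assumptions, not the exact baseline used in the experiments.

```python
import numpy as np

def fft_compress(img, k):
    """Keep the k largest-magnitude 2D FFT coefficients, zero the rest, reconstruct."""
    F = np.fft.fft2(img)
    mags = np.abs(F).ravel()
    thresh = np.sort(mags)[-k]          # magnitude of the k-th largest coefficient
    mask = np.abs(F) >= thresh          # ties may keep slightly more than k
    return np.real(np.fft.ifft2(F * mask))

rng = np.random.default_rng(0)
img = rng.standard_normal((32, 32))

rec_few = fft_compress(img, 64)         # compressed: 64 of 1024 coefficients
rec_all = fft_compress(img, 32 * 32)    # all coefficients: exact up to float error

err_few = np.linalg.norm(img - rec_few)
err_all = np.linalg.norm(img - rec_all)
```

With all coefficients kept the reconstruction is exact up to floating-point error, while truncating to 64 coefficients leaves a measurable residual; the claim above is that a learned SIREN basis achieves lower reconstruction error than this truncated-FFT baseline at the same coefficient count.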