Weight sharing for single-channel LMS

Research output: Contribution to conference › Paper › peer-review



Constraining a group of taps of an adaptive filter to a single value may seem like a futile task, as weight sharing reduces the degrees of freedom of the algorithm, and there are no obvious advantages to implementing such an update scheme. On the other hand, weight sharing is popular in deep learning and underpins the success of convolutional neural networks (CNNs) in numerous applications. To this end, we investigate the advantages of weight sharing in single-channel least mean square (LMS), and propose weight sharing LMS (WSLMS) and partial weight sharing LMS (PWS). In particular, we illustrate how weight sharing can lead to numerous benefits, such as an enhanced robustness to noise and a computational cost that is independent of the filter length. Simulations support the analysis.
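The idea described above can be sketched in a few lines: when a group of taps shares one coefficient, the shared weight multiplies the *sum* of the tap inputs in its group, so both filtering and the LMS update operate on one value per group rather than per tap. The sketch below is illustrative only; the function and parameter names (`ws_lms`, `group_size`, `mu`) are assumptions for this example, not the paper's notation.

```python
import numpy as np

def ws_lms(x, d, filter_len, group_size, mu=0.01):
    """Weight-sharing LMS sketch.

    Taps are constrained in contiguous groups of `group_size`, with each
    group sharing a single adaptive coefficient. The per-sample update then
    involves one weight per group; the grouped regressor can be maintained
    with running sums, so the update cost need not grow with `filter_len`.
    """
    n_groups = filter_len // group_size
    g = np.zeros(n_groups)                 # one shared weight per group
    e = np.zeros(len(d))                   # error signal
    for n in range(filter_len, len(x)):
        u = x[n - filter_len:n][::-1]      # tap-input (regressor) vector
        # Shared weights multiply the within-group sums of the regressor,
        # collapsing it to n_groups values.
        u_g = u.reshape(n_groups, group_size).sum(axis=1)
        y = g @ u_g                        # filter output
        e[n] = d[n] - y                    # estimation error
        g += mu * e[n] * u_g               # LMS update on the shared weights
    return g, e
```

As a quick sanity check, identifying an unknown system whose impulse response is constant within each group recovers the shared coefficients exactly, since the constrained model then matches the true system.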
Original language: English
Number of pages: 5
Publication status: Published - 2 Jul 2023
