The input signal to this filter is a linear frequency modulated (LFM) signal. The high-pass filter cut-off should increase linearly with time up to a known time limit. The main idea is to filter out a time-shifted version of an LFM signal.
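One common way to realize such a time-varying high-pass filter is short-time Fourier transform (STFT) masking: zero the bins below a cutoff that rises linearly with the frame time, then resynthesize by overlap-add. The sketch below is illustrative only; the function name `stft_mask_highpass`, the frame/hop sizes, and the `cutoff_fn` helper are all assumptions, not something given in the question.

```python
import numpy as np

def stft_mask_highpass(x, fs, frame_len, hop, cutoff_fn):
    """Sketch of a time-varying high-pass filter via STFT masking.
    cutoff_fn(t) returns the cutoff in Hz at frame-centre time t
    (hypothetical helper, not from the original post)."""
    win = np.hanning(frame_len)
    n_frames = 1 + (len(x) - frame_len) // hop
    y = np.zeros(len(x))
    norm = np.zeros(len(x))
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / fs)
    for m in range(n_frames):
        start = m * hop
        seg = x[start:start + frame_len] * win
        spec = np.fft.rfft(seg)
        t_centre = (start + frame_len / 2) / fs
        spec[freqs < cutoff_fn(t_centre)] = 0.0  # time-varying high-pass mask
        seg_f = np.fft.irfft(spec, frame_len) * win
        y[start:start + frame_len] += seg_f      # weighted overlap-add
        norm[start:start + frame_len] += win ** 2
    norm[norm == 0] = 1.0
    return y / norm
```

With a cutoff slightly below the chirp rate line, the original LFM passes almost unchanged while a time-shifted copy (which lies below the cutoff at every instant) is strongly attenuated.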
You should indeed use a wavelet transform for the time-frequency decomposition of the signal, but keep in mind that wavelet packets decompose the signal into discrete frequency bands only. What you need afterwards is accurate time filtering to suppress unwanted components at unexpected times. We used this method for noise removal from otoacoustic emission (OAE) signals; an OAE signal is essentially an LFM. Unfortunately, our paper is in Persian. If you are interested, I can help in a collaborative manner.
We have published TF adaptive filtering methods based on the short-time Fourier transform in several venues. I am attaching two papers, published in IEEE Transactions on Signal Processing and in Signal Processing. The basis of the method is covered in Section VI of the first paper and is described in more detail in the second paper. I am also attaching a MATLAB script that computes the STFT with the correct group delay, using a two-tap recursion.
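The attached MATLAB script is not reproduced here, so the following is only a generic sketch of the kind of per-sample recursion being described: the classic sliding DFT, which updates each bin with a two-term recursion per sample, S[n] = W * (S[n-1] + x[n] - x[n-N]). The function name and parameters are assumptions, and any group-delay correction from the actual script is not included.

```python
import numpy as np

def sliding_dft_bin(x, N, k):
    """Sliding DFT of bin k over a length-N rectangular window, updated with
    the two-term recursion S[n] = W * (S[n-1] + x[n] - x[n-N]),
    W = exp(2j*pi*k/N). A generic sketch, not the attached MATLAB script.
    Returns one DFT value per fully-filled window (n = N-1 .. len(x)-1)."""
    W = np.exp(2j * np.pi * k / N)
    S = 0.0 + 0.0j
    out = []
    xp = np.concatenate([np.zeros(N), x])  # so that x[n-N] = 0 for n < N
    for n in range(len(x)):
        S = W * (S + xp[n + N] - xp[n])    # two-term update per sample
        if n >= N - 1:
            out.append(S)
    return np.array(out)
```

Each output equals the standard length-N DFT bin of the most recent N samples, obtained at O(1) cost per sample instead of recomputing a full FFT per hop.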