A few hundred years ago, I wrote something in Fortran to look in a time series for the pattern of an instantaneous peak followed by a fairly quick exponential decay, then some time later another peak and another exponential decay, and so on, with some healthy "burst" noise on top. By blurring your eyes you can easily distinguish periods of "signal-free" data from the "good" intervals with very sharp peaks and decays. Peaks have different amplitudes, and the decay rates are similar but not identical. The solution back then was to correlate the data against a prescribed curve that I translated in time, plus about two dozen IF statements to handle the different scenarios of spikes, etc.
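To make that concrete, the old Fortran was essentially a matched filter. A minimal sketch of the same idea in Python (the decay rate, template length, and sample spacing here are placeholders, not values from my data):

    import numpy as np
    from scipy.signal import fftconvolve

    def matched_filter_score(signal, dt, k=1.0, n_template=200):
        """Slide a unit-energy exponential-decay template over the
        series; high scores mark candidate peak onsets."""
        template = np.exp(-k * dt * np.arange(n_template))
        template /= np.linalg.norm(template)
        # Correlation == convolution with the time-reversed template.
        return fftconvolve(signal, template[::-1], mode="valid")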
The same question has now landed on my plate again, and surely this must be the domain of A.I. by now. I figure it's also a good chance to learn how scipy would like to handle it... except I am stuck at the starting gate. Everything I try takes me down the road of just optimizing | Signal - sum_i A_i exp(-k_i (t - t_i)) | with the constraint t_i > t_{i-1}. Which is fine, except I was hoping for something more "modern".
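For concreteness, here is roughly where I keep ending up: a plain scipy pipeline that finds candidate peaks and then least-squares fits A*exp(-k*(t - t0)) on each stretch between successive peaks. This is just a sketch of that approach; the prominence threshold and the initial guesses are made up, not tuned to my data.

    import numpy as np
    from scipy.signal import find_peaks
    from scipy.optimize import curve_fit

    def fit_bursts(t, signal, prominence=1.0):
        """Find sharp peaks, then fit A*exp(-k*(t - t0)) on the
        segment between each peak and the next. Fitting segments
        in peak order gives the t_i > t_{i-1} constraint for free."""
        idx, _ = find_peaks(signal, prominence=prominence)
        edges = list(idx) + [len(t)]
        fits = []
        for start, stop in zip(edges[:-1], edges[1:]):
            seg_t, seg_y = t[start:stop], signal[start:stop]
            t0 = seg_t[0]
            try:
                popt, _ = curve_fit(
                    lambda tt, A, k: A * np.exp(-k * (tt - t0)),
                    seg_t, seg_y,
                    p0=[seg_y[0], 1.0],  # guesses: peak height, unit rate
                    maxfev=2000,
                )
                fits.append((t0, *popt))  # (t_i, A_i, k_i)
            except RuntimeError:
                pass  # segment too noisy to fit; skip it
        return fits

It works, more or less, but it is exactly the "translate a curve and fit it" idea from the Fortran days with nicer syntax.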
Any suggestions? Thanks, T.