19 November 2023

Hi folks!

Let's say that I have two lists/vectors, "t_list" and "y_list", representing the relationship y(t). I have also numerically computed dy/dt and stored it in "dy_dt_list".

The problem is that "dy_dt_list" contains a lot of fluctuations, even though I know from physical theory that it MONOTONICALLY DECREASES.

1) Is there a simple way in R or Python to carry out a spline regression that reproduces the numerical values of dy/dt(t) in "dy_dt_list" as well as it can UNDER THE CONSTRAINT that the fit keeps decreasing? I thus want to get a monotonically decreasing (dy/dt)_spline as the output (see the first sketch after question 2).

2) Is there a simple way in R or Python to carry out a spline regression that reproduces the numerical values of y(t) as well as it can UNDER THE CONSTRAINT that (dy/dt)_spline keeps decreasing? I thus want to get y_spline as the output, given that the above constraint is fulfilled (see the second sketch below).
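
To make question 1) concrete, here is a rough Python sketch of the kind of constrained fit I have in mind (the function name, number of knots, etc. are placeholders I made up for illustration, not a polished solution). It uses the fact that non-increasing B-spline coefficients are a sufficient condition for a non-increasing spline, and enforces that with SLSQP. I understand that in R the "scam" package (shape-constrained smooths such as bs = "mpd") or the "cobs" package (constraint = "decrease") offer ready-made monotone-decreasing spline fits, but I have not verified this myself.

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize

def fit_decreasing_spline(t, z, n_knots=8, k=3):
    """Least-squares B-spline fit to (t, z) whose coefficients are forced
    to be non-increasing -- a sufficient condition for the fitted curve
    itself to be non-increasing."""
    t = np.asarray(t, dtype=float)
    z = np.asarray(z, dtype=float)
    # clamped knot vector with a handful of interior knots
    interior = np.linspace(t.min(), t.max(), n_knots)[1:-1]
    knots = np.concatenate(([t.min()] * (k + 1), interior, [t.max()] * (k + 1)))
    m = len(knots) - k - 1                      # number of basis functions
    # design matrix: column j is the j-th B-spline basis function evaluated at t
    A = np.column_stack(
        [BSpline(knots, (np.arange(m) == j).astype(float), k)(t) for j in range(m)]
    )

    def loss(c):
        return np.sum((A @ c - z) ** 2)

    # c[i] - c[i+1] >= 0 for all i  =>  non-increasing spline
    cons = [{"type": "ineq", "fun": lambda c, i=i: c[i] - c[i + 1]}
            for i in range(m - 1)]
    c0 = np.linalg.lstsq(A, z, rcond=None)[0]   # unconstrained start guess
    res = minimize(loss, c0, method="SLSQP", constraints=cons)
    return BSpline(knots, res.x, k)
```

Calling something like fit_decreasing_spline(t_list, dy_dt_list) would then return a smoothed, non-increasing approximation of the noisy derivative (the number of knots would of course have to be tuned to the data).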
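For question 2), the constraint that (dy/dt)_spline keeps decreasing means y_spline should be concave. Here is a similar rough sketch (again only an illustration with made-up names), using the fact that a cubic spline's second derivative is piecewise linear, so it is enough to force it to be <= 0 at the breakpoints.

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize

def fit_concave_spline(t, y, n_knots=8, k=3):
    """Least-squares cubic spline fit to (t, y) whose derivative never
    increases, i.e. y_spline is concave.  For a cubic spline the second
    derivative is piecewise linear, so forcing y_spline'' <= 0 at every
    breakpoint enforces it on the whole interval."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    interior = np.linspace(t.min(), t.max(), n_knots)[1:-1]
    knots = np.concatenate(([t.min()] * (k + 1), interior, [t.max()] * (k + 1)))
    m = len(knots) - k - 1
    A = np.column_stack(
        [BSpline(knots, (np.arange(m) == j).astype(float), k)(t) for j in range(m)]
    )
    check = np.unique(knots)                    # breakpoints of the spline

    def loss(c):
        return np.sum((A @ c - y) ** 2)

    def concave(c):
        # must be >= 0 everywhere: minus the second derivative at the breakpoints
        return -BSpline(knots, c, k).derivative(2)(check)

    c0 = np.linalg.lstsq(A, y, rcond=None)[0]
    res = minimize(loss, c0, method="SLSQP",
                   constraints=[{"type": "ineq", "fun": concave}])
    return BSpline(knots, res.x, k)             # its .derivative() never increases
```

The returned spline's .derivative() would then be the non-increasing (dy/dt)_spline I am after.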

I'd like to avoid having to reinvent the wheel!

P.S.: I added an example to clarify things!
