The question is whether (A) an analog system, or (B) a digital system, reflects the reality we see.
Solution A can be represented by (1) derivatives, and (2) includes the hypothesis of continuity. The two go together: without continuity there is no derivative, because the classical derivative is a limit taken over a continuum. Both were used by Isaac Newton in his theory of "fluxions" in calculus; a fluxion is "the term for derivative in Newton's calculus" [1].
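To make the link explicit (a standard derivation, stated here only as a reminder): the derivative is defined by a limit over a continuum, and wherever that limit exists, continuity follows at once, since

$$
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
\quad\Longrightarrow\quad
f(x+h) - f(x) = h \cdot \frac{f(x+h) - f(x)}{h} \;\longrightarrow\; 0 \ \text{ as } h \to 0 .
$$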
Solution B can be represented neither by derivatives, nor does it include the hypothesis of continuity. Again, the absence of (1) and the absence of (2) match each other. This fact (that the absence of (1) and (2) does not matter) remained hidden for centuries behind the spurious priority dispute over calculus that followed, which Newton, as president of the Royal Society, fanned against Leibniz in 1713.
But around 1830 Galois took up a problem that had stood for 350 years and determined a necessary and sufficient condition for a polynomial to be solvable by radicals. His work gave us the finite integer fields now called Galois fields, over which calculus can be done, eliminating the need for continuity in calculus.
How? The usual calculus requires continuity for the existence of derivatives and is built on the four operations of arithmetic. Requiring continuity does seem necessary when, as Cauchy did in analysis, one works over the field of real numbers. Over a field of finite integers, however, such as a Galois field, calculus can be defined exactly, without requiring continuity.
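As an illustration (a minimal sketch; the prime p = 7 and the sample polynomial are purely illustrative choices, not taken from anything above), the derivative of a polynomial over GF(p) is the formal derivative, computed term by term with modular arithmetic alone, with no limits and no continuity:

```python
# Formal (algebraic) derivative of a polynomial over GF(p): no limits, no continuity.
# The prime p and the sample polynomial below are illustrative assumptions.

p = 7  # any prime p defines the finite field GF(p) = {0, 1, ..., p-1}

def formal_derivative(coeffs, p):
    """coeffs[k] is the coefficient of x**k; return the coefficients of the
    formal derivative, with every coefficient reduced mod p."""
    return [(k * c) % p for k, c in enumerate(coeffs)][1:]

def evaluate(coeffs, x, p):
    """Evaluate the polynomial at x, doing all arithmetic in GF(p)."""
    return sum(c * pow(x, k, p) for k, c in enumerate(coeffs)) % p

f = [3, 2, 0, 5]              # f(x) = 3 + 2x + 5x^3 over GF(7)
df = formal_derivative(f, p)  # -> [2, 0, 1], i.e. f'(x) = 2 + x^2 (mod 7)
print(df, [evaluate(df, x, p) for x in range(p)])
```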
Continuity is therefore an artifact of the formulation, and should be avoided. This accords with quantum mechanics and with the work of Léon Brillouin in 1956. It is a fiction to posit continuity in mathematics, physics, computer science, or code. We are led today to use finite integer fields, such as Galois fields, in calculus, and to eschew the so-called "real numbers," which include the irrationals and therefore cannot be counted. The sum of two numbers in a Galois field is always a number in that Galois field. The sum of two real numbers is never an infinitesimal; infinitesimals can neither be created nor exist.
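The closure claim can be checked concretely (again a small sketch, with p = 7 chosen only for illustration): one verifies exhaustively, in finitely many steps, that sums and products of elements of GF(p) stay inside GF(p).

```python
# Exhaustive closure check for GF(p) with p = 7 (an illustrative choice):
# every sum and every product of two field elements is again a field element.
p = 7
elements = set(range(p))
closed = all((a + b) % p in elements and (a * b) % p in elements
             for a in elements for b in elements)
print(closed)  # True
```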
The conclusion is that digital signal processing is the reality, not analog processing. There is no effective quantization in digital processing; the quantum nature simply asserts itself. And this changes how we should view calculus: continuity is not required if one uses Galois fields. What is your opinion?
[1] Weisstein, Eric W. "Fluxion." Wolfram MathWorld. https://mathworld.wolfram.com/Fluxion.html