Alan Turing — "Those who can imagine anything, can create the impossible."
Please keep in mind the following important problem as we develop some methods to factor composites efficiently.
Big Problem (BP):
Given a 2n-digit composite (a product of two unknown and unequal prime factors each with n digits where n ≥ 450), develop an algorithm which factors the composite in polynomial time.
As we work our way up Mount BP, let us make some discoveries and innovations to expedite our journey successfully. Amen!
TENTATIVE CONCLUSION: For n ≥ 450, our big problem is practically unsolvable in real time. Why?
We recall our Big Problem (BP):
Given c, a 2n-decimal digit composite (a product of two unknown and unequal prime factors, a and b, each with n decimal digits where n ≥ 450), develop an algorithm which factors the composite in polynomial time.
For n ≥ 450, it will generally take at least 10^(2n - 2) random calculations to compute a and b, since there are on the order of 10^(n - 1) candidates for each n-digit factor and hence roughly 10^(2n - 2) candidate pairs (a, b).
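To make that count concrete, here is a small illustrative sketch (my own, not part of the original argument): for a toy case with n = 2, an exhaustive scan over ordered pairs of n-digit candidates examines at most (9 · 10^(n-1))^2 pairs, i.e. on the order of 10^(2n - 2). The example semiprime 3127 = 53 × 59 is my own choice.

```python
# Illustrative brute-force search over ordered pairs of n-digit
# candidates.  The pair space has (9 * 10**(n - 1))**2 elements,
# i.e. on the order of 10**(2*n - 2) -- hopeless for n >= 450.
def brute_force_pairs(c, n):
    lo, hi = 10 ** (n - 1), 10 ** n   # the n-digit range [lo, hi)
    trials = 0
    for a in range(lo, hi):
        for b in range(lo, hi):
            trials += 1
            if a * b == c:
                return a, b, trials
    return None

# Toy case n = 2: c = 53 * 59 = 3127 (my own example semiprime).
a, b, trials = brute_force_pairs(3127, 2)
print(a, b, trials)   # finds 53 and 59 within (9 * 10)**2 = 8100 trials
```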
Can we significantly reduce our calculations?
HYPOTHESIS: If we can show that, for k ≥ 2, the correct point (a_k, b_k) does not affect the previously determined correct points with subscripts less than k, then we can significantly reduce our calculations and, hopefully, solve our problem in polynomial time.
Let's conduct an experiment to test our hypothesis.
Let c = 6,432,934,577, a composite of two unknown prime factors, a and b (so here n = 5).
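As a baseline check on this small experimental value (my own sketch, not the method under development here), plain trial division up to √c recovers the factorization; it only succeeds because n = 5 is tiny, since trial division costs on the order of 10^n divisions.

```python
# Baseline sketch: factor c by trial division up to sqrt(c).
# This runs in about 10**n steps for a 2n-digit c, so it is
# feasible only because this experimental c is small (n = 5).
import math

def trial_division(c):
    for d in range(2, math.isqrt(c) + 1):
        if c % d == 0:
            return d, c // d      # smallest factor and its cofactor
    return None                   # c is prime

c = 6_432_934_577
a, b = trial_division(c)
print(a, b)   # prints the smallest prime factor of c and its cofactor
```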
We have 2n - 2 unknowns and 2n - 2 equations. Therefore, our problem should be solvable in polynomial time. We need to generate an appropriate data set for each appropriate pair of equations and then select the unique solution from those data sets. In practice, this may or may not be difficult. But in theory, it should work...
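One possible concrete reading of "generate data sets and select the unique solution" (my own hypothetical interpretation, not necessarily the intended construction) is to work modulo 10^k: the low-order digit columns of the long multiplication force a·b ≡ c (mod 10^k). The sketch below builds the data set of all 5-digit residue pairs consistent with c modulo 10^5 and then selects the pair whose full product equals c. Note that the candidate set still grows with k, so this does not by itself give a polynomial-time algorithm; it only illustrates the selection step on the experimental c.

```python
# Hypothetical sketch of the "equations" idea: the last k digit
# columns of the long multiplication constrain a*b = c (mod 10**k).
# For each candidate a coprime to 10, b is then determined mod 10**k;
# the data set is every consistent pair, and we select the pair
# whose exact product is c.
import math

c = 6_432_934_577
k = 5
m = 10 ** k                       # modulus covering the low k digit columns

solutions = []
for a in range(10 ** (k - 1), m):            # 5-digit candidates for a
    if math.gcd(a, 10) != 1:
        continue                              # need a invertible mod 10**k
    b = (c * pow(a, -1, m)) % m               # unique b with a*b = c (mod 10**k)
    if 10 ** (k - 1) <= b < m and a <= b and a * b == c:
        solutions.append((a, b))

print(solutions)   # the selected pair(s) (a, b) with a*b exactly equal to c
```

The modular inverse via `pow(a, -1, m)` requires Python 3.8 or later; since c ends in 7, neither prime factor can be 2 or 5, so restricting to candidates coprime to 10 loses nothing.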