I am working on solving a system of linear equations of the form Ax = b, where the matrix A is extremely ill-conditioned. I have been struggling to find a stable solution: I have tried several numerical approaches, including genetic algorithms and other optimization methods, but I have not been able to obtain a reliable result.

Here are some additional details about the problem:

1. The system is ill-conditioned, leading to significant numerical instability.

2. I know the boundaries of the solution space for x.

3. I also have the real solution for x that can be used for comparison.
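Since both the bounds on x and the true solution are available, one option worth trying is bounded least squares via SciPy's `lsq_linear`, which restricts the search to the known box and can be checked against the reference solution. A minimal sketch, using a Hilbert matrix as a stand-in for the real ill-conditioned A (the bounds and true solution here are assumptions for illustration only):

```python
import numpy as np
from scipy.optimize import lsq_linear

# Hypothetical stand-in for the real system: a Hilbert matrix is a
# classic ill-conditioned example.
n = 6
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = np.ones(n)        # assumed known "real" solution for comparison
b = A @ x_true

# Bounded least squares: constrain x to the known solution-space box.
res = lsq_linear(A, b, bounds=(0.0, 2.0))

print("condition number:", np.linalg.cond(A))
print("max error vs. known solution:", np.max(np.abs(res.x - x_true)))
```

Constraining x to a box that is known to contain the solution often stabilizes the solve, because the optimizer cannot wander off along the near-null directions that make the unconstrained problem unstable.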

I have been working on this problem for 4 months and would greatly appreciate any advice or suggestions on possible methods I could use to solve this problem. Below is the code I have been working on:

import numpy as np

# Example of a linear system Ax = b
# (the actual A and b are in the attached file)

  • I calculated the condition number of the matrix using cond_number = np.linalg.cond(A), and the result is 5.956181138617321e+23, which indicates a highly ill-conditioned matrix.
  • So far, I have attempted to solve the problem using SVD and genetic algorithms.
  • I am trying to simulate the approach described in [this research paper][1]. The simulation involves estimating a line using moments (specifically, 15 moments). The challenge is that the higher-order moments generate extremely large coefficients (on the order of 10^28) in the later equations, while the coefficients of the earlier equations are relatively small (on the order of 10). This disparity makes the system of equations very difficult to solve.
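Since plain SVD was already tried, a common refinement is truncated SVD (TSVD) regularization: discard the singular values below a relative threshold instead of inverting them, trading a small bias for a large gain in stability. A minimal sketch, using a Vandermonde matrix as a hypothetical stand-in, since moment systems have a similar power structure:

```python
import numpy as np

def tsvd_solve(A, b, rcond=1e-12):
    """Least-squares solve via SVD, discarding singular values
    below rcond * (largest singular value)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rcond * s[0]
    # Invert only the retained singular values.
    s_inv = np.zeros_like(s)
    s_inv[keep] = 1.0 / s[keep]
    return Vt.T @ (s_inv * (U.T @ b))

# Hypothetical example: a Vandermonde-style matrix, ill-conditioned
# in the same way moment matrices are (columns are rising powers).
t = np.linspace(1.0, 2.0, 10)
A = np.vander(t, 10, increasing=True)
x_true = np.ones(10)
b = A @ x_true
x = tsvd_solve(A, b)
print("relative error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The `rcond` cutoff is the tuning knob: too small and the noise-amplifying directions survive, too large and genuine solution components are discarded. Sweeping it and comparing against the known true x is a practical way to pick it here.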

Any suggestions on how to handle such an ill-conditioned matrix and the wide range of coefficients would be appreciated.
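One concrete way to tame a coefficient spread like 10 vs. 10^28 is to equilibrate the system before solving: scale each row (and then each column) to unit maximum magnitude, solve the scaled system, and undo the column scaling afterwards. A hedged sketch (the scaling scheme and the synthetic system are assumptions for illustration, not taken from the paper):

```python
import numpy as np

def equilibrated_solve(A, b):
    """Scale rows and columns to unit max magnitude, solve the
    scaled least-squares problem, then undo the column scaling."""
    r = np.max(np.abs(A), axis=1)    # row scales
    A1 = A / r[:, None]
    b1 = b / r
    c = np.max(np.abs(A1), axis=0)   # column scales
    A2 = A1 / c[None, :]
    y, *_ = np.linalg.lstsq(A2, b1, rcond=None)
    return y / c                     # recover x in original variables

# Hypothetical system whose rows span many orders of magnitude,
# mimicking moment equations with coefficients from ~10 up to ~1e28.
rng = np.random.default_rng(0)
A = rng.standard_normal((15, 15)) * np.logspace(1, 28, 15)[:, None]
x_true = rng.standard_normal(15)
b = A @ x_true
x = equilibrated_solve(A, b)
print("max abs error:", np.max(np.abs(x - x_true)))
```

Equilibration does not change the underlying problem, but it can shrink the condition number dramatically when the ill-conditioning comes mostly from scaling rather than from genuine near-rank-deficiency, which seems to be part of what is happening with the moment equations here.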

[1]: https://ieeexplore.ieee.org/document/566816
