Gilbert Strang, Linear Algebra and Learning from Data
Exercise 2.1.9
Answers
With S = [[2, -1, 0], [-1, 2, -1], [0, -1, 2]] and b = (1, 0, 0)^T, elimination on Sx = b gives x = (3/4, 1/2, 1/4)^T, so we have x = (0.75, 0.5, 0.25)^T.
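As a quick cross-check (using NumPy's dense solver rather than conjugate gradients), solving Sx = b directly reproduces the same vector:

```python
import numpy as np

# Direct solve of S x = b as a sanity check on the hand computation.
S = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])
b = np.array([1.0, 0.0, 0.0])
x = np.linalg.solve(S, b)
print(x)  # approximately [0.75 0.5  0.25]
```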
A conjugate gradient program confirms the hand computation.
import numpy as np

def conjugate_gradient(S, b, N):
    """Solve S x = b for symmetric positive definite S by conjugate gradients."""
    x = np.zeros(b.shape)
    r = b - np.matmul(S, x)      # initial residual (x0 = 0, so r0 = b)
    d = r                        # initial search direction
    for k in range(N):
        rTr = np.matmul(r.transpose(), r)
        if rTr < 1e-14:          # converged; avoid dividing by a zero residual
            break
        Sd = np.matmul(S, d)
        alpha = rTr / np.matmul(d.transpose(), Sd)   # step length
        x = x + alpha * d
        r = r - alpha * Sd                           # new residual
        beta = np.matmul(r.transpose(), r) / rTr     # Fletcher-Reeves coefficient
        d = r + beta * d                             # new search direction
    return x

S = np.array([[2, -1, 0], [-1, 2, -1], [0, -1, 2]])
b = np.array([1, 0, 0]).reshape(-1, 1)
x = conjugate_gradient(S, b, 1000)
print('Conjugate Gradient Solution:', x)
Conjugate Gradient Solution: [[0.75] [0.5 ] [0.25]]
2020-03-20 00:00