I find that writing functions which ask the user to define and then pass in another function is a very natural design pattern for me. For example,
def gradient_descent(x0, grad_f):
    x = x0
    for _ in range(100):
        x -= 0.1 * grad_f(x)
    return x
implements a generic gradient descent routine; all the user has to do is supply the gradient function for f. This is basically the interface used by scipy.optimize, and the programs I write tend to use function closures and dynamically defined functions in a similar way.
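For instance, minimizing f(x) = x**2 only requires passing its gradient 2*x (a toy example, just to show the calling convention):

x_min = gradient_descent(5.0, lambda x: 2 * x)
print(x_min)  # shrinks by a factor of 0.8 per step, so this is very close to 0.0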
However, I have run into serious difficulties when trying to take advantage of parallelism with multiprocessing, since lambdas, closures, and other dynamically defined functions can't be pickled by the standard pickle module. I know that there are ways around this, but it makes me question whether programming like this is even a "pythonic" way to do things.
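Here is a minimal sketch of the kind of failure I mean (the names are just for illustration): Pool.map is happy with a module-level function, but choke on a dynamically defined one.

import multiprocessing as mp

def square(x):
    # A module-level function pickles by reference, so Pool.map accepts it.
    return x * x

if __name__ == "__main__":
    with mp.Pool(2) as pool:
        print(pool.map(square, range(4)))  # works: [0, 1, 4, 9]

        grad_f = lambda x: 2 * x           # dynamically defined, like the functions above
        try:
            pool.map(grad_f, range(4))     # the lambda must be pickled to reach the workers
        except Exception as exc:           # on CPython this surfaces as a PicklingError
            print(type(exc).__name__, exc)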
Is this a natural design pattern in Python? Is there a better way to design programs that will likely need to be refactored to use multiple processes?