Comments (2)
I've slightly modified your code, and it seems to work:
import time
import math
import os
import sys
import numpy as np
import sympy as sym
from sympy.stats import Normal, density, E, std, cdf, skewness
from sympy import lambdify
from scipy.optimize import minimize
from pathos.pools import ProcessPool as Pool
import pathos
#The following creates the log-likelihood function for 10 Gaussian observations.
CS1_Onsets = np.array([11.65239205, 23.67586247, 31.96652543, 40.59708088, 48.44196047, 57.50153782, 68.16787918, 76.90944557, 88.71767575, 99.527949 ])
CS_Wait_Times = np.array(CS1_Onsets[0]) # This takes care of the fact that the first observation cannot be obtained by differencing.
CS_Wait_Times = np.append(CS_Wait_Times,np.diff(CS1_Onsets))
del CS1_Onsets
u_CS = sym.Symbol('u_CS',positive=True)
var_CS = sym.Symbol('var_CS',positive=True)
x_CS = sym.Symbol('x_CS',positive=True)
Gaussian_CS = density(Normal("x_CS", u_CS, var_CS**(1/2)))
log_likelihood = 0
#Compute log-likelihood:
for i in range(len(CS_Wait_Times)):
    log_likelihood = log_likelihood + sym.log(Gaussian_CS(CS_Wait_Times[i]))
######################################### Optimising the log-likelihood function #####################################
Neg_log_likelihood = -1*log_likelihood
del log_likelihood
Neg_Score = [sym.diff(Neg_log_likelihood,u_CS),sym.diff(Neg_log_likelihood,var_CS)]
##The following implements the optimisation
bounds = [(0.0000000001,86400),(0.0000000001,86400)] #Bounds for u_CS and var_CS.
options = {"maxiter":400}
Results = [] #Store the results for different starting points.
Initial_Parameter_Estimates = [(1,1),(10,10),(100,100),(1000,1000),(10000,10000),(8,1),(80,10),(800,100),(8000,1000),(80000,10000)]
start_time = time.perf_counter()
Objective = lambdify([(u_CS, var_CS)], Neg_log_likelihood,'mpmath') #This gives the negative log-likelihood in the desired form:
Gradient = lambdify([(u_CS, var_CS)], Neg_Score,'mpmath')
##Parallel version (imports are repeated inside MINI so each worker process can rebuild its namespace).
def MINI(start_loc):
    import math
    import numpy as np
    import sympy as sym
    from sympy.stats import Normal, density, E, std, cdf, skewness
    from sympy import lambdify
    from scipy.optimize import minimize
    fun = Objective
    x0 = start_loc
    method = 'SLSQP'
    jac = Gradient
    bounds = [(0.0000000001, 86400), (0.0000000001, 86400)]
    options = {"maxiter": 400}
    result = minimize(fun=fun, x0=x0, method=method, jac=jac, bounds=bounds, options=options)
    return result
if __name__ == '__main__':
    from pathos.helpers import freeze_support
    freeze_support()
    pool = Pool()
    start_time = time.perf_counter()
    RESULTS = pool.map(MINI, Initial_Parameter_Estimates)
    finish = time.perf_counter()
    Running_parallel = finish - start_time
    print('Running time of parallel execution is:', Running_parallel)
    r = list(map(MINI, Initial_Parameter_Estimates)) # Serial version, for comparison (timed separately from the parallel run above).
    # cleanup
    pool.close()
    pool.join()
    pool.clear()
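As a quick sanity check on what the optimiser should return (not part of the original script): for i.i.d. Gaussian observations the negative log-likelihood has a closed-form minimum at the sample mean and the 1/n sample variance, so every SLSQP run above should converge near these values regardless of starting point.

```python
import numpy as np

# Same data as above: onsets, converted to wait times by differencing
# (the first observation is kept as-is).
CS1_Onsets = np.array([11.65239205, 23.67586247, 31.96652543, 40.59708088,
                       48.44196047, 57.50153782, 68.16787918, 76.90944557,
                       88.71767575, 99.527949])
CS_Wait_Times = np.append(CS1_Onsets[0], np.diff(CS1_Onsets))

# Closed-form Gaussian MLE: sample mean and the biased (1/n) variance.
mu_hat = CS_Wait_Times.mean()
var_hat = CS_Wait_Times.var()  # ddof=0 gives the 1/n MLE variance

print(mu_hat, var_hat)
```

Note that the wait times telescope, so `mu_hat` is just the last onset divided by the number of observations (99.527949 / 10).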
The primary change is to use `ProcessPool` instead of `ParallelPool`:

from pathos.pools import ProcessPool as Pool

`ProcessPool` uses object serialization (using `dill`), while `ParallelPool` uses serialization by source extraction (using `dill.source`). The former is faster and more robust. The issue is that the sympy-generated `log` function can't be found by `ppft`, as it's a dynamically-generated function. `ppft` checks the namespace of the function `MINI`... but fails to find `log` in some cases (as you noted). For functions that aren't dynamically generated, you can include the importing module, and it does generally find the function. `ProcessPool` is using `multiprocess`, which I imagine is what you were wanting to use in the first place.
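To see why the serialization strategy matters here, a minimal stdlib-only sketch (no `dill` required; the function name `neg_log_pdf` is made up for illustration): the standard `pickle` module serializes functions *by reference* (module plus qualified name), so a function generated at runtime by `exec`, which is essentially how `lambdify` builds its functions, cannot be located and pickling fails. `dill` instead serializes the underlying code object *by value*, which is why `ProcessPool` succeeds.

```python
import pickle

# Dynamically generate a function at runtime, roughly how sympy.lambdify
# works internally (it builds source text and exec's it).
source = "def neg_log_pdf(x):\n    return 0.5 * x * x\n"
namespace = {}
exec(source, namespace)
fn = namespace["neg_log_pdf"]

print(fn(2.0))  # the function itself works fine: 2.0

# Reference-based pickling can't locate a function that lives only in a
# temporary namespace, so serialization fails.
try:
    pickle.dumps(fn)
    picklable = True
except Exception:
    picklable = False
print(picklable)  # False
```

The same round-trip succeeds with `dill.dumps`/`dill.loads`, since `dill` ships the code object itself to the worker process.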
I'm going to close the issue, but if it doesn't work for you, then please reopen.
That did it, cheers mike.