Python multiprocessing doesn't seem to use more than one core


I want to use Python multiprocessing to run a grid search for a predictive model. When I look at the core usage, it seems to be using only one core. Any idea what I'm doing wrong?

import multiprocessing
import itertools
from operator import itemgetter
from sklearn import svm

# First read the data:
# X is a 2D numpy array of features
# y is a 1D numpy array of labels

# Define the grid
C = [0.1, 1]
gamma = [0.0]
params = [C, gamma]
grid = list(itertools.product(*params))
grid_hx = []

def worker(par, grid_list):
    # Define the sklearn model
    clf = svm.SVC(C=par[0], gamma=par[1], probability=True, random_state=seed)
    # Run the cross-validation function: returns the error
    ll = my_cross_validation_function(X, y, model=clf, n=1, test_size=0.2)
    print(par, ll)
    grid_list.append((par, ll))

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    grid_hx = manager.list()
    jobs = []
    for g in grid:
        p = multiprocessing.Process(target=worker, args=(g, grid_hx))
        jobs.append(p)
        p.start()
        p.join()

    print("\n-------------------")
    print("sorted list")
    print("-------------------")
    results = sorted(grid_hx, key=itemgetter(1))
    for res in results[:5]:
        print(res)

Your problem is that you join each job right after starting it:

for g in grid:
    p = multiprocessing.Process(target=worker, args=(g, grid_hx))
    jobs.append(p)
    p.start()
    p.join()

join blocks until the respective process has finished working. That means your code starts only one process at a time, waits until it has finished, and then starts the next one.
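To see the effect in isolation, here is a minimal, self-contained sketch (not from the original post) that times both patterns with a dummy worker that just sleeps; the exact numbers depend on your machine:

import multiprocessing
import time

def dummy_worker(i):
    time.sleep(1)  # stand-in for the expensive cross-validation call

if __name__ == '__main__':
    # Pattern from the question: join() inside the loop, so only one
    # process is ever alive at a time.
    start = time.time()
    for i in range(4):
        p = multiprocessing.Process(target=dummy_worker, args=(i,))
        p.start()
        p.join()  # blocks until this process finishes
    print("serial:  ", time.time() - start)   # roughly 4 seconds

    # Corrected pattern: start all processes first, then join them all.
    start = time.time()
    jobs = [multiprocessing.Process(target=dummy_worker, args=(i,)) for i in range(4)]
    for p in jobs:
        p.start()
    for p in jobs:
        p.join()
    print("parallel:", time.time() - start)   # roughly 1 second with >= 4 cores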

In order for the processes to run in parallel, you need to first start them all and then join them all:

jobs = []
for g in grid:
    p = multiprocessing.Process(target=worker, args=(g, grid_hx))
    jobs.append(p)
    p.start()

for j in jobs:
    j.join()

See the multiprocessing documentation for details.
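As an aside, for a fixed grid like this a multiprocessing.Pool is often simpler than managing Process objects and a Manager list by hand. The sketch below is only an illustration and reuses the names from the question (my_cross_validation_function, X, y and seed are assumed to be defined elsewhere, as in the original code):

import itertools
import multiprocessing
from operator import itemgetter
from sklearn import svm

def evaluate(par):
    # Same idea as worker(), but the result is returned instead of being
    # appended to a shared list; Pool collects the return values for us.
    clf = svm.SVC(C=par[0], gamma=par[1], probability=True, random_state=seed)
    ll = my_cross_validation_function(X, y, model=clf, n=1, test_size=0.2)
    return par, ll

if __name__ == '__main__':
    grid = list(itertools.product([0.1, 1], [0.0]))
    with multiprocessing.Pool() as pool:   # one worker process per core by default
        results = pool.map(evaluate, grid)
    for res in sorted(results, key=itemgetter(1))[:5]:
        print(res)

Pool() with no arguments creates one worker per available core, and pool.map returns the results in grid order, so no shared Manager list is needed.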

