Parallel Python: destroying job_servers

Remember to destroy job servers after they are done.

My nice animation generator crashed sometime over the weekend with the following death rattle:

Traceback (most recent call last):
  File "animate_array_job.py", line 168, in
    tscale = tscale, dpi = dpi)
  File "animate_array_job.py", line 92, in process_file
    job_server = pp.Server(ppservers=ppservers)
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/pp.py", line 336, in __init__
    self.set_ncpus(ncpus)
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/pp.py", line 494, in set_ncpus
    range(ncpus - len(self.__workers))])
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/pp.py", line 141, in __init__
    self.start()
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/pp.py", line 148, in start
    shell=True)
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/subprocess.py", line 594, in __init__
    errread, errwrite)
  File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/subprocess.py", line 1009, in _execute_child
    errpipe_read, errpipe_write = os.pipe()
OSError: [Errno 24] Too many open files
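
The traceback tells the story: process_file() creates a fresh pp.Server for every file, and each Server forks worker processes connected to the parent over pipes. I never destroyed the old servers, so the leftover workers' file descriptors piled up until os.pipe() hit the per-process open-file limit. The cure is the one in the title: call destroy() on the job server once you are done with it. Here is a minimal sketch of the pattern; tally() and the file list are hypothetical stand-ins for the real per-file work, and ppservers=() just means local workers.

import pp

ppservers = ()  # () -> use local worker processes only

def tally(n):
    """Hypothetical stand-in for the real per-file computation."""
    return sum(range(n))

def process_file(fname):
    # Each pp.Server() forks worker processes, each holding open pipes.
    job_server = pp.Server(ppservers=ppservers)
    try:
        job = job_server.submit(tally, (10,))
        return job()  # block until the job finishes, then fetch the result
    finally:
        # Without this the workers (and their pipes) outlive the Server
        # object, and os.pipe() eventually fails with Errno 24.
        job_server.destroy()

for fname in ['a.npy', 'b.npy', 'c.npy']:
    print process_file(fname)

Better still, create one Server outside the loop and reuse it for every file; that avoids the fork-and-teardown churn entirely.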
