pool.map() from Python's multiprocessing library raises a JSON error
I'm using the multiprocessing library, which makes the error very hard to debug when it occurs. The simplified structure of the code is:
    import multiprocessing
    from multiprocessing import Pool
    from contextlib import closing
    import traceback

    def myFunction(row):
        try:
            # Create some log strings from row
            string = "[time]: " + row[0]
            # Send log strings to a file
        except Exception as e:
            print "Caught Exception"
            traceback.print_exc()
            print()
            raise e

    no_processors = multiprocessing.cpu_count()
    with closing(Pool(processes=no_processors)) as pool:
        pool.map(myFunction, list)
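Since `pool.map` re-raises only the exception value in the parent process and the worker's original traceback is lost, one common debugging aid is to wrap the worker so the child process formats its own traceback before re-raising. A minimal sketch of that idea (the decorator name and the sample worker body are illustrative, not part of the original code):

```python
import traceback
from functools import wraps

def trace_unhandled_exceptions(func):
    # Decorator: format the full traceback inside the worker process,
    # where the real stack still exists, before re-raising.
    @wraps(func)
    def wrapped(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            print("Exception in worker:")
            print(traceback.format_exc())
            raise
    return wrapped

@trace_unhandled_exceptions
def my_function(row):
    # Hypothetical stand-in for the real worker body
    return "[time]: " + row[0]

# Used with a pool exactly like the original code:
#   pool.map(my_function, rows)
```

This does not fix the underlying error, but it makes the child-side traceback visible instead of only the re-raised exception value.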
However, I get this error when it runs:
    Caught exception in worker thread (row = 0):
    Traceback (most recent call last):
      File "filename.multi2.py", line 190, in myFunction
        tx.execute()
      File "/usr/local/lib/python2.7/dist-packages/py2neo/cypher.py", line 233, in execute
        return self._post(self._execute or self._begin)
      File "/usr/local/lib/python2.7/dist-packages/py2neo/cypher.py", line 208, in _post
        j = rs.json
      File "/usr/local/lib/python2.7/dist-packages/py2neo/packages/httpstream/http.py", line 562, in json
        raise TypeError("Content is not JSON")
    TypeError: Content is not JSON
    ()
      File "/usr/local/lib/python2.7/dist-packages/py2neo/packages/httpstream/http.py", line 634, in read
        data = self._response.read()
      File "/usr/lib/python2.7/httplib.py", line 543, in read
        return self._read_chunked(amt)
      File "/usr/lib/python2.7/httplib.py", line 597, in _read_chunked
        raise IncompleteRead(''.join(value))
    IncompleteRead: IncompleteRead(198 bytes read)
    ()
        pool.map(myFunction, list)
      File "/usr/lib/python2.7/multiprocessing/pool.py", line 251, in map
        return self.map_async(func, iterable, chunksize).get()
      File "/usr/lib/python2.7/multiprocessing/pool.py", line 558, in get
        raise self._value
    TypeError: Content is not JSON
Does anyone know the cause or a solution?