Error while running run_magic_imputation #113
This appears to be an issue with the versioning of different Python modules. Can you please try again in a fresh conda environment? We also suggest using a Python version > 3.8.
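In case it helps to verify what a fresh environment is actually picking up, here is a minimal check (a sketch only; it assumes `scipy`, `joblib`, and `palantir` all import cleanly). A mismatch between a user-site scipy and the cluster's Python build is one common cause of the `No module named 'scipy.sparse._csr'` error inside joblib workers, since that private module only exists from scipy 1.8 onward.

```python
import sys
import scipy
import joblib
import palantir

# Print the interpreter and library versions the notebook is actually using.
# If worker processes resolve a different scipy than the driver, pickled sparse
# matrices can fail to round-trip with "No module named 'scipy.sparse._csr'".
print("interpreter:", sys.executable)
print("python     :", sys.version.split()[0])
print("scipy      :", scipy.__version__)
print("joblib     :", joblib.__version__)
print("palantir   :", palantir.__version__)
```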
Thanks for the suggestions. I tried using v3.9 and v3.10 in a virtual environment, but now I am unable to run PCA (and all steps after that) on my adata object.

```
---------------------------------------------------------------------------
File ~/.local/lib/python3.10/site-packages/scanpy/preprocessing/_pca.py:200, in pca(data, n_comps, zero_center, svd_solver, random_state, return_info, use_highly_variable, dtype, copy, chunked, chunk_size)

File ~/.local/lib/python3.10/site-packages/scanpy/preprocessing/_pca.py:303, in _pca_with_sparse(X, npcs, solver, mu, random_state)

File ~/.local/lib/python3.10/site-packages/scipy/sparse/linalg/_eigen/_svds.py:443, in svds(A, k, ncv, tol, which, v0, maxiter, return_singular_vectors, solver, random_state, options)

File ~/.local/lib/python3.10/site-packages/scipy/sparse/linalg/_eigen/_svds.py:40, in _iv(A, k, ncv, tol, which, v0, maxiter, return_singular, solver, random_state)

ValueError:
```
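The truncated ValueError above is raised inside scipy's `svds` input validation (the `_iv` frame). One common trigger is requesting more components than the data support, since `svds` requires `k` to be strictly smaller than the smaller matrix dimension. A minimal sketch of a guard, assuming the data are read from a placeholder path (`your_data.h5ad` is not from this thread):

```python
import scanpy as sc

# Placeholder path; substitute the dataset used in this thread.
adata = sc.read_h5ad("your_data.h5ad")

# svds (used by scanpy's sparse PCA path) requires 1 <= k < min(n_obs, n_vars),
# so cap the number of components below the smaller matrix dimension.
n_comps = min(50, min(adata.n_obs, adata.n_vars) - 1)
sc.pp.pca(adata, n_comps=n_comps, svd_solver="arpack")
```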
Hello,
I am trying to run Palantir on my dataset so that I can eventually use CellRank 2. I am getting the error message pasted below. I have loaded the scipy module and imported everything the tutorial mentions. How can I fix this? Thanks.
```
---------------------------------------------------------------------------
_RemoteTraceback                          Traceback (most recent call last)
_RemoteTraceback:
"""
Traceback (most recent call last):
  File "/ihome/crc/install/python/ondemand-jupyter-python3.8/lib/python3.8/site-packages/joblib/externals/loky/process_executor.py", line 616, in wait_result_broken_or_wakeup
    result_item = result_reader.recv()
  File "/ihome/crc/install/python/ondemand-jupyter-python3.8/lib/python3.8/multiprocessing/connection.py", line 251, in recv
    return _ForkingPickler.loads(buf.getbuffer())
ModuleNotFoundError: No module named 'scipy.sparse._csr'
"""

The above exception was the direct cause of the following exception:

BrokenProcessPool                         Traceback (most recent call last)
in
----> 1 imputed_X = palantir.utils.run_magic_imputation(adata)

~/.local/lib/python3.8/site-packages/palantir/utils.py in run_magic_imputation(data, dm_res, n_steps, sim_key, expression_key, imputation_key, n_jobs)
    570
    571     # Run the dot product in parallel on chunks
--> 572     res = Parallel(n_jobs=n_jobs)(
    573         delayed(_dot_helper_func)(T_steps, X[:, chunks[i - 1] : chunks[i]])
    574         for i in range(1, len(chunks))

/ihome/crc/install/python/ondemand-jupyter-python3.8/lib/python3.8/site-packages/joblib/parallel.py in __call__(self, iterable)
   1052
   1053         with self._backend.retrieval_context():
-> 1054             self.retrieve()
   1055         # Make sure that we get a last message telling us we are done
   1056         elapsed_time = time.time() - self._start_time

/ihome/crc/install/python/ondemand-jupyter-python3.8/lib/python3.8/site-packages/joblib/parallel.py in retrieve(self)
    931         try:
    932             if getattr(self._backend, 'supports_timeout', False):
--> 933                 self._output.extend(job.get(timeout=self.timeout))
    934             else:
    935                 self._output.extend(job.get())

/ihome/crc/install/python/ondemand-jupyter-python3.8/lib/python3.8/site-packages/joblib/_parallel_backends.py in wrap_future_result(future, timeout)
    540         AsyncResults.get from multiprocessing."""
    541         try:
--> 542             return future.result(timeout=timeout)
    543         except CfTimeoutError as e:
    544             raise TimeoutError from e

/ihome/crc/install/python/ondemand-jupyter-python3.8/lib/python3.8/concurrent/futures/_base.py in result(self, timeout)
    437                 raise CancelledError()
    438             elif self._state == FINISHED:
--> 439                 return self.__get_result()
    440             else:
    441                 raise TimeoutError()

/ihome/crc/install/python/ondemand-jupyter-python3.8/lib/python3.8/concurrent/futures/_base.py in __get_result(self)
    386     def __get_result(self):
    387         if self._exception:
--> 388             raise self._exception
    389         else:
    390             return self._result

BrokenProcessPool: A result has failed to un-serialize. Please ensure that the objects returned by the function are always picklable.
```
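The chain above shows the crash happening while a joblib worker's result is unpickled, which points at an environment mismatch rather than at the data itself. Besides rebuilding the environment, a possible stopgap is to run the imputation in a single process so nothing needs to be pickled between processes. This uses the `n_jobs` parameter visible in the `run_magic_imputation` signature above and assumes `adata` is the AnnData object already prepared earlier in the session; it is a workaround, not a fix for the underlying scipy version mismatch.

```python
import palantir

# `adata` is assumed to be the AnnData prepared earlier in the session.
# Single-process fallback: avoids shipping results through the broken worker pool.
imputed_X = palantir.utils.run_magic_imputation(adata, n_jobs=1)
```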