
Incompatible glibc on conda install #216

Open
McAllister-NOAA opened this issue Nov 29, 2022 · 4 comments
@McAllister-NOAA

Hi! I am having a problem installing GetOrganelle from conda. When I run conda install -c bioconda getorganelle I get the following error during installation:

Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: \ 
Found conflicts! Looking for incompatible packages.
This can take several minutes.  Press CTRL-C to abort.
failed                                                                                                                             

UnsatisfiableError: The following specifications were found to be incompatible with each other:

Output in format: Requested package -> Available versions

The following specifications were found to be incompatible with your system:

  - feature:/linux-64::__glibc==2.17=0
  - feature:|@/linux-64::__glibc==2.17=0

Your installed version is: 2.17

I have the most current conda install (22.9.0) on Linux x86_64. Any help on how to bypass or remedy this problem would be much appreciated. I am trying the conda install method because I have run into some other problems with install option 2 (setup.py) that I'm hoping the conda environment will solve.
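As an aside (this is a generic system check, not a GetOrganelle command): the __glibc==2.17 entries in the error refer to conda's virtual package, which mirrors the C library of the host system, and the loader that ships with glibc reports the same version directly:

```shell
# Print the host glibc version; conda's __glibc virtual package should
# match the version on the first line reported here (2.17 would be
# typical of a CentOS/RHEL 7 host).
ldd --version 2>&1 | head -n 1
```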

The problem I have run into on setup.py is the following:
ERROR: Disentangling failed: Failed in 'from scipy import stats, inf, log'!
This seems to stem from those modules not being installed in the Python environment used during setup. They certainly exist in the default shell Python environment, but I can't figure out where to install them so that disentangling doesn't fail (the other steps in the process work fine).

Thanks so much! Please let me know if you need more info. I quite like your program BTW.

Sincerely,
Sean McAllister

@Kinggerm
Owner

Kinggerm commented Nov 29, 2022

Hi Sean,

Thanks for reaching out!
I'm sorry, but I'm not enough of a conda expert to solve this directly. When I have encountered weird conda errors like this, I would try creating a new environment, e.g. conda create -n getorganelle, and running conda install there. I honestly don't know why this works sometimes and not others.

However, as for the scipy issue, please see #132. I believe you will find your solution by troubleshooting scipy. Please keep me posted.

Best,
Jianjun
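The workflow suggested above can be sketched as follows (a sketch, not a tested recipe: it assumes conda is on PATH and has been initialised for this shell, and the conda-forge-before-bioconda channel order follows the usual bioconda convention):

```shell
# Create a fresh, empty environment so the solver starts from a clean slate.
conda create -y -n getorganelle
conda activate getorganelle   # may require 'conda init <shell>' first
# Install GetOrganelle with explicit channels inside the new environment.
conda install -y -c conda-forge -c bioconda getorganelle
```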

@McAllister-NOAA
Author

McAllister-NOAA commented Feb 7, 2023

I tried removing and re-creating the conda environment as advised, and ran from scipy import stats, inf, log within the Python of that conda environment. Still no luck:

2023-02-07 09:34:56,958 - INFO: Disentangling T368_getOrganelle_round5/extended_spades/K127/assembly_graph.fastg.extend-animal_mt.fastg as a circular genome ... 
2023-02-07 09:34:57,049 - ERROR: Disentangling failed: Failed in 'from scipy import stats, inf, log'!
2023-02-07 09:34:57,050 - INFO: Extracting animal_mt from the assemblies failed.

The odd thing is that it doesn't fail this way every time: I ran 90 blue whale genomes without any errors. But this sample may have insufficient coverage and not be able to completely circularize. Is there a different path for disentangling when this is the case? Thanks for your feedback!

Sean

@Kinggerm
Owner

Kinggerm commented Feb 7, 2023

What did you get by running from scipy import stats, inf, log within the Python of the same environment in which you ran GetOrganelle?

Please also attach the complete log file of T368_getOrganelle_round5

@McAllister-NOAA
Author

There wasn't any output when I ran from scipy import stats, inf, log; it just returned to the command prompt after a few seconds.

Log:

GetOrganelle v1.7.7.0

get_organelle_from_reads.py assembles organelle genomes from genome skimming data.
Find updates in https://github.com/Kinggerm/GetOrganelle and see README.md for more information.

Python 3.8.16 (default, Jan 17 2023, 23:13:24)  [GCC 11.2.0]
PLATFORM: Linux poseidon.pmel.noaa.gov 3.10.0-1160.81.1.el7.x86_64 #1 SMP Fri Dec 16 17:29:43 UTC 2022 x86_64 x86_64
PYTHON LIBS: GetOrganelleLib 1.7.7.0; numpy 1.23.5; sympy 1.11.1; scipy 1.10.0
DEPENDENCIES: Bowtie2 2.4.1; SPAdes 3.13.1; Blast 2.5.0
GETORG_PATH=/home/poseidon/mcallister/.GetOrganelle
LABEL DB: animal_mt 0.0.1
WORKING DIR: /scratch/mcallister/20221101_Chloe_Mitogenomes/getOrganelle
/home/poseidon/mcallister/anaconda3/envs/getorganelle/bin/get_organelle_from_reads.py -1 ../qc_files/T368_qc_adaptorTrim_Only/reads_R1.fastq.gz -2 ../qc_files/T368_qc_adaptorTrim_Only/reads_R2.fastq.gz -u ../qc_files/T368_qc_adaptorTrim_Only/reads_U.fastq.gz -o T368_getOrganelle_round5 -R 30 -F animal_mt -k 21,55,85,115,127 -t 20 --reduce-reads-for-coverage inf --max-reads inf -w 79 --max-n-words 1E9 -s ../seed_ref/Calanidae_seed.fasta --disentangle-time-limit 200000000

2023-02-07 00:40:49,554 - INFO: Pre-reading fastq ...
2023-02-07 00:40:49,555 - INFO: Unzipping reads file: ../qc_files/T368_qc_adaptorTrim_Only/reads_R1.fastq.gz (3098258996 bytes)
2023-02-07 00:42:59,158 - INFO: Unzipping reads file: ../qc_files/T368_qc_adaptorTrim_Only/reads_R2.fastq.gz (3217095574 bytes)
2023-02-07 00:45:17,732 - INFO: Unzipping reads file: ../qc_files/T368_qc_adaptorTrim_Only/reads_U.fastq.gz (40174152 bytes)
2023-02-07 00:45:19,274 - INFO: Counting read qualities ...
2023-02-07 00:45:19,564 - INFO: Identified quality encoding format = Sanger
2023-02-07 00:45:19,564 - INFO: Phred offset = 33
2023-02-07 00:45:19,569 - INFO: Trimming bases with qualities (0.00%): 33..33  !
2023-02-07 00:45:19,623 - INFO: Mean error rate = 0.0025
2023-02-07 00:45:19,626 - INFO: Counting read lengths ...
2023-02-07 00:47:07,792 - INFO: Mean = 149.8 bp, maximum = 150 bp.
2023-02-07 00:47:07,793 - INFO: Reads used = 49216893+49216893+869348
2023-02-07 00:47:07,793 - INFO: Pre-reading fastq finished.

2023-02-07 00:47:07,793 - INFO: Making seed reads ...
2023-02-07 00:47:08,166 - INFO: Making seed - bowtie2 index ...
2023-02-07 00:47:08,526 - INFO: Making seed - bowtie2 index finished.
2023-02-07 00:47:08,526 - INFO: Mapping reads to seed bowtie2 index ...
2023-02-07 00:55:37,362 - INFO: Mapping finished.
2023-02-07 00:55:37,362 - INFO: Seed reads made: T368_getOrganelle_round5/seed/animal_mt.initial.fq (5098417 bytes)
2023-02-07 00:55:37,365 - INFO: Making seed reads finished.

2023-02-07 00:55:37,365 - INFO: Checking seed reads and parameters ...
2023-02-07 00:55:37,961 - INFO: Estimated animal_mt-hitting base-coverage = 184.25
2023-02-07 00:55:38,314 - INFO: Setting '--max-extending-len inf'
2023-02-07 00:55:38,351 - INFO: Checking seed reads and parameters finished.

2023-02-07 00:55:38,352 - INFO: Making read index ...
2023-02-07 01:04:19,265 - INFO: 72484301 candidates in all 99303134 reads
2023-02-07 01:04:19,265 - INFO: Pre-grouping reads ...
2023-02-07 01:04:19,265 - INFO: Setting '--pre-w 79'
2023-02-07 01:04:25,171 - INFO: 200000/17923944 used/duplicated
2023-02-07 01:04:45,909 - INFO: 6825 groups made.
2023-02-07 01:04:54,960 - INFO: Making read index finished.

2023-02-07 01:04:54,960 - INFO: Extending ...
2023-02-07 01:04:54,960 - INFO: Adding initial words ...
2023-02-07 01:04:55,305 - INFO: AW 213290
2023-02-07 01:10:22,882 - INFO: Round 1: 72484301/72484301 AI 14322 AW 270870
2023-02-07 01:16:02,998 - INFO: Round 2: 72484301/72484301 AI 18468 AW 321282
2023-02-07 01:21:32,071 - INFO: Round 3: 72484301/72484301 AI 20639 AW 363232
2023-02-07 01:27:35,524 - INFO: Round 4: 72484301/72484301 AI 506355 AW 13508540
2023-02-07 01:39:31,061 - INFO: Round 5: 72484301/72484301 AI 9332694 AW 239049830
2023-02-07 01:52:13,479 - INFO: Round 6: 72484301/72484301 AI 17266512 AW 464885964
2023-02-07 02:02:38,861 - INFO: Round 7: 72484301/72484301 AI 21551616 AW 603818976
2023-02-07 02:12:30,534 - INFO: Round 8: 72484301/72484301 AI 23684772 AW 682059000
2023-02-07 02:21:26,571 - INFO: Round 9: 72484301/72484301 AI 24788101 AW 725191158
2023-02-07 02:30:25,257 - INFO: Round 10: 72484301/72484301 AI 25419660 AW 750518462
2023-02-07 02:39:25,319 - INFO: Round 11: 72484301/72484301 AI 25793477 AW 765778586
2023-02-07 02:47:51,276 - INFO: Round 12: 72484301/72484301 AI 26028983 AW 775563688
2023-02-07 02:56:01,549 - INFO: Round 13: 72484301/72484301 AI 26185363 AW 782033532
2023-02-07 03:05:30,017 - INFO: Round 14: 72484301/72484301 AI 26289236 AW 786330106
2023-02-07 03:15:04,328 - INFO: Round 15: 72484301/72484301 AI 26360842 AW 789288738
2023-02-07 03:24:43,024 - INFO: Round 16: 72484301/72484301 AI 26412778 AW 791437870
2023-02-07 03:34:10,349 - INFO: Round 17: 72484301/72484301 AI 26450604 AW 792994052
2023-02-07 03:43:46,195 - INFO: Round 18: 72484301/72484301 AI 26479493 AW 794181128
2023-02-07 03:53:11,795 - INFO: Round 19: 72484301/72484301 AI 26501347 AW 795063316
2023-02-07 04:02:49,000 - INFO: Round 20: 72484301/72484301 AI 26517214 AW 795705164
2023-02-07 04:12:19,527 - INFO: Round 21: 72484301/72484301 AI 26529788 AW 796213442
2023-02-07 04:21:46,605 - INFO: Round 22: 72484301/72484301 AI 26539709 AW 796610728
2023-02-07 04:31:10,528 - INFO: Round 23: 72484301/72484301 AI 26547732 AW 796927482
2023-02-07 04:40:34,429 - INFO: Round 24: 72484301/72484301 AI 26554330 AW 797184312
2023-02-07 04:49:57,070 - INFO: Round 25: 72484301/72484301 AI 26559206 AW 797375214
2023-02-07 04:59:29,222 - INFO: Round 26: 72484301/72484301 AI 26562798 AW 797518406
2023-02-07 05:08:54,855 - INFO: Round 27: 72484301/72484301 AI 26565278 AW 797618134
2023-02-07 05:18:24,143 - INFO: Round 28: 72484301/72484301 AI 26567370 AW 797701440
2023-02-07 05:27:48,736 - INFO: Round 29: 72484301/72484301 AI 26569351 AW 797779278
2023-02-07 05:37:33,437 - INFO: Round 30: 72484301/72484301 AI 26571025 AW 797845880
2023-02-07 05:37:33,439 - INFO: Hit the round limit 30 and terminated ...
2023-02-07 05:44:45,271 - INFO: Extending finished.

2023-02-07 05:44:51,916 - INFO: Separating extended fastq file ... 
2023-02-07 05:46:47,199 - INFO: Setting '-k 21,55,85,115,127'
2023-02-07 05:46:47,199 - INFO: Assembling using SPAdes ...
2023-02-07 05:46:50,360 - INFO: spades.py -t 20  --phred-offset 33 -1 T368_getOrganelle_round5/extended_1_paired.fq -2 T368_getOrganelle_round5/extended_2_paired.fq --s1 T368_getOrganelle_round5/extended_1_unpaired.fq --s2 T368_getOrganelle_round5/extended_2_unpaired.fq --s3 T368_getOrganelle_round5/extended_3.fq -k 21,55,85,115,127 -o T368_getOrganelle_round5/extended_spades
2023-02-07 09:17:07,615 - INFO: Insert size = 308.863, deviation = 95.1702, left quantile = 194, right quantile = 432
2023-02-07 09:17:07,616 - INFO: Assembling finished.

2023-02-07 09:34:56,815 - INFO: Slimming T368_getOrganelle_round5/extended_spades/K127/assembly_graph.fastg finished!
2023-02-07 09:34:56,816 - INFO: Slimming assembly graphs finished.

2023-02-07 09:34:56,816 - INFO: Extracting animal_mt from the assemblies ...
2023-02-07 09:34:56,958 - INFO: Disentangling T368_getOrganelle_round5/extended_spades/K127/assembly_graph.fastg.extend-animal_mt.fastg as a circular genome ... 
2023-02-07 09:34:57,049 - ERROR: Disentangling failed: Failed in 'from scipy import stats, inf, log'!
2023-02-07 09:34:57,050 - INFO: Extracting animal_mt from the assemblies failed.


Total cost 32052.63 s
Thank you!

The mitogenome is recovered using get_organelle_from_assembly.py based on a SPAdes assembly of all reads, and while it is near complete, it isn't circularized. I'm hoping to adjust the parameters to close the mitogenome.

Thanks,
Sean
