Hello,
I'm running into the following errors whenever I try to run the pipeline on macOS; any help is appreciated!
==================================== Bulk data analysis pipeline will run ==============================================================
Input FASTQ folder: /Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/exampleData
Sample name: GATA1_D7_30min_chr11
Workdir folder: /Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11
Experiment name: GATA1_D7_30min_chr11
Experiment type: CUT&RUN
Reference genome: hg38
Spike-in genome: FALSE
Spike-in normalization: FALSE
Fragment 120 filtration: TRUE
=================================================================================================================================
[info] Input file is GATA1_D7_30min_chr11_R1_001.fastq.gz and GATA1_D7_30min_chr11_R2_001.fastq.gz
Sun Jan 7 16:46:07 EST 2024
[info] Trimming file GATA1_D7_30min_chr11 ...
Sun Jan 7 16:46:07 EST 2024
[info] Use Truseq adaptor as default
[info] Second stage trimming GATA1_D7_30min_chr11 ...
Sun Jan 7 16:46:14 EST 2024
[info] Aligning file GATA1_D7_30min_chr11 to reference genome...
Sun Jan 7 16:46:24 EST 2024
[info] Bowtie2 command: --dovetail --phred33
[info] The dovetail mode is enabled [as parameter frag_120 is on]
[info] FASTQ files won't be aligned to the spike-in genome
[info] Filtering unmapped fragments... GATA1_D7_30min_chr11.bam
Sun Jan 7 16:46:24 EST 2024
[info] Sorting BAM... GATA1_D7_30min_chr11.bam
Sun Jan 7 16:46:24 EST 2024
INFO 2024-01-07 16:46:24 SortSam
********** NOTE: Picard's command line syntax is changing.
********** For more information, please see:
********** https://github.com/broadinstitute/picard/wiki/Command-Line-Syntax-Transition-For-Users-(Pre-Transition)
********** The command line looks like this in the new syntax:
********** SortSam -INPUT sorted/GATA1_D7_30min_chr11.step1.bam -OUTPUT sorted/GATA1_D7_30min_chr11.bam -SORT_ORDER coordinate -VALIDATION_STRINGENCY SILENT -TMP_DIR /Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/logs
16:46:24.792 INFO NativeLibraryLoader - Loading libgkl_compression.dylib from jar:file:/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/install/picard-2.8.0.jar!/com/intel/gkl/native/libgkl_compression.dylib
16:46:24.814 WARN NativeLibraryLoader - Unable to load libgkl_compression.dylib from native/libgkl_compression.dylib (Can't load library: /Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/logs/libgkl_compression3199524088769265004.dylib)
16:46:24.814 INFO NativeLibraryLoader - Loading libgkl_compression.dylib from jar:file:/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/install/picard-2.8.0.jar!/com/intel/gkl/native/libgkl_compression.dylib
16:46:24.819 WARN NativeLibraryLoader - Unable to load libgkl_compression.dylib from native/libgkl_compression.dylib (Can't load library: /Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/logs/libgkl_compression8273045536062371726.dylib)
[Sun Jan 07 16:46:24 EST 2024] SortSam INPUT=sorted/GATA1_D7_30min_chr11.step1.bam OUTPUT=sorted/GATA1_D7_30min_chr11.bam SORT_ORDER=coordinate TMP_DIR=[/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/logs] VALIDATION_STRINGENCY=SILENT VERBOSITY=INFO QUIET=false COMPRESSION_LEVEL=5 MAX_RECORDS_IN_RAM=500000 CREATE_INDEX=false CREATE_MD5_FILE=false GA4GH_CLIENT_SECRETS=client_secrets.json USE_JDK_DEFLATER=false USE_JDK_INFLATER=false
[Sun Jan 07 16:46:24 EST 2024] Executing as [email protected] on Mac OS X 14.1.2 aarch64; Java HotSpot(TM) 64-Bit Server VM 1.8.0_391-b13; Deflater: Jdk; Inflater: Jdk; Provider GCS is not available; Picard version: 2.21.7-SNAPSHOT
16:46:24.837 WARN IntelDeflaterFactory - IntelInflater is not supported, using Java.util.zip.Inflater
16:46:24.849 WARN IntelDeflaterFactory - IntelDeflater is not supported, using Java.util.zip.Deflater
INFO 2024-01-07 16:46:24 SortSam Finished reading inputs, merging and writing to output now.
[Sun Jan 07 16:46:24 EST 2024] picard.sam.SortSam done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=257425408
[info] Marking duplicates... GATA1_D7_30min_chr11.bam
Sun Jan 7 16:46:24 EST 2024
[info] Removing duplicates... GATA1_D7_30min_chr11.bam
Sun Jan 7 16:46:25 EST 2024
[info] Filtering to <120bp... dup.marked and dedup BAMs
Sun Jan 7 16:46:25 EST 2024
[info] Creating bam index files... GATA1_D7_30min_chr11.bam
Sun Jan 7 16:46:25 EST 2024
[info] Reads shifting
Sun Jan 7 16:46:25 EST 2024
[info] Your data won't be shifted as the experiment_type is specified as CUT&RUN...
[info] Peak calling using MACS2... GATA1_D7_30min_chr11.bam
[info] Logs are stored in /Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/logs
Sun Jan 7 16:46:25 EST 2024
[info] Peak calling with BAM file with NO duplications
[info] macs2 narrow peak calling
[info] macs2 broad peak calling
[info] Getting broad peak summits
Traceback (most recent call last):
File "/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/install/get_summits_broadPeak.py", line 14, in
f = open(sys.argv[1])
FileNotFoundError: [Errno 2] No such file or directory: '/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/peakcalling/macs2.broad/GATA1_D7_30min_chr11_peaks.broadPeak'
[info] SEACR stringent peak calling
Traceback (most recent call last):
File "/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/install/change.bdg.py", line 12, in
f = open(sys.argv[1])
FileNotFoundError: [Errno 2] No such file or directory: '/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/peakcalling/seacr/GATA1_D7_30min_chr11_treat_pileup.bdg'
Calling enriched regions without control file
Proceeding without normalization of control to experimental bedgraph
Using stringent threshold
Creating experimental AUC file: Sun Jan 7 16:46:26 EST 2024
/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/install/SEACR_1.1.sh: line 103: 0I?e1?z۳0???.auc.bed: Illegal byte sequence
Unable to access /Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/peakcalling/seacr/GATA1_D7_30min_chr11_treat.stringent.bed
Traceback (most recent call last):
File "/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/install/get_summits_seacr.py", line 14, in
f = open(sys.argv[1])
FileNotFoundError: [Errno 2] No such file or directory: '/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/peakcalling/seacr/GATA1_D7_30min_chr11_treat.stringent.bed'
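For what it's worth, everything from the broadPeak traceback onward looks like fallout from MACS2 not writing any output: the missing _peaks.broadPeak, _treat_pileup.bdg and _peaks.narrowPeak files are all MACS2 products, and the SEACR, bigwig and cat steps below just consume them. A minimal first check would be to read the MACS2 logs in the directory the pipeline reported above (a sketch only; the *macs2* filename pattern is an assumption, adjust to whatever is actually in that directory):

```bash
# Sketch: inspect the MACS2 logs the pipeline said it wrote.
# LOGDIR is copied from the "[info] Logs are stored in ..." line above;
# the *macs2* glob is a guess at the log file naming.
LOGDIR="/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/logs"

ls -lt "$LOGDIR"                                     # newest log files first
tail -n 40 "$LOGDIR"/*macs2* 2>/dev/null             # end of any MACS2 log, if one exists
grep -Eil "error|exception" "$LOGDIR"/* 2>/dev/null  # logs that mention an error
```

Separately, the "Illegal byte sequence" coming out of SEACR_1.1.sh is the usual behaviour of BSD sort/awk on macOS under a UTF-8 locale; if it still appears once a valid bedGraph is present, exporting LC_ALL=C before running the pipeline is a common workaround.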
[info] Generating the normalized signal file with BigWig format...
Sun Jan 7 16:46:26 EST 2024
cp: /Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/peakcalling/macs2.narrow/GATA1_D7_30min_chr11.cpm.norm.bw: No such file or directory
cp: /Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/peakcalling/macs2.narrow/GATA1_D7_30min_chr11.cpm.norm.bw: No such file or directory
[info] Your bigwig file won't be normalized with spike-in reads
[info] Input file is /Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/bulk-example-test/GATA1_D7_30min_chr11/peakcalling/macs2.narrow/GATA1_D7_30min_chr11_peaks.narrowPeak
cat: GATA1_D7_30min_chr11_peaks.narrowPeak: No such file or directory
cat: GATA1_D7_30min_chr11_summits.bed: No such file or directory
[info] Get randomized [1000] peaks from the top [2000] peaks...
[info] Filtering the blacklist regions for the selected peak files
[info] Getting Fasta sequences
[info] Start MEME analysis for de novo motif finding ...
[info] Up to 10 will be output ...
Log::Log4perl configuration looks suspicious: No loggers defined at /Users/shivanikushwaha/opt/anaconda3/envs/meme/lib/site_perl/5.26.2/Log/Log4perl/Config.pm line 325.
Starting getsize: getsize random1000/MEME_GATA1_D7_30min_chr11_shuf/GATA1_D7_30min_chr11_summits_padded.fa 1> $metrics
getsize ran successfully in 0.026079 seconds
Starting fasta-most: fasta-most -min 50 < random1000/MEME_GATA1_D7_30min_chr11_shuf/GATA1_D7_30min_chr11_summits_padded.fa 1> $metrics
fasta-most ran successfully in 0.074794 seconds
Starting fasta-center: fasta-center -dna -len 100 < random1000/MEME_GATA1_D7_30min_chr11_shuf/GATA1_D7_30min_chr11_summits_padded.fa 1> random1000/MEME_GATA1_D7_30min_chr11_shuf/seqs-centered
fasta-center ran successfully in 0.081879 seconds
Starting fasta-shuffle-letters: fasta-shuffle-letters random1000/MEME_GATA1_D7_30min_chr11_shuf/seqs-centered random1000/MEME_GATA1_D7_30min_chr11_shuf/seqs-shuffled -kmer 2 -tag -dinuc -dna -seed 1
fasta-shuffle-letters ran successfully in 0.025716 seconds
Starting fasta-get-markov: fasta-get-markov -nostatus -nosummary -dna -m 1 random1000/MEME_GATA1_D7_30min_chr11_shuf/GATA1_D7_30min_chr11_summits_padded.fa random1000/MEME_GATA1_D7_30min_chr11_shuf/background
fasta-get-markov ran successfully in 0.024623 seconds
Starting meme: meme random1000/MEME_GATA1_D7_30min_chr11_shuf/seqs-centered -oc random1000/MEME_GATA1_D7_30min_chr11_shuf/meme_out -mod zoops -nmotifs 10 -minw 6 -maxw 30 -bfile random1000/MEME_GATA1_D7_30min_chr11_shuf/background -dna -revcomp -nostatus
No sequences found in file `random1000/MEME_GATA1_D7_30min_chr11_shuf/seqs-centered'. Check file format.
meme exited with error code 1
Starting dreme: dreme -verbosity 1 -oc random1000/MEME_GATA1_D7_30min_chr11_shuf/dreme_out -png -dna -p random1000/MEME_GATA1_D7_30min_chr11_shuf/seqs-centered -n random1000/MEME_GATA1_D7_30min_chr11_shuf/seqs-shuffled -m 10
File "/Users/shivanikushwaha/opt/anaconda3/envs/meme/bin/dreme", line 765
print "Finding secondary RE in left flank..."
^
SyntaxError: Missing parentheses in call to 'print'. Did you mean print("Finding secondary RE in left flank...")?
dreme exited with error code 1
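The dreme failure, at least, is independent of the missing peak files: the print statement at line 765 is Python 2 syntax, so the dreme script in the meme environment is being executed by a Python 3 interpreter. A quick check, assuming the conda environment is the "meme" one visible in the paths above:

```bash
# Sketch: confirm which interpreter dreme resolves to inside the meme env.
conda activate meme
head -n 1 "$(which dreme)"   # shebang line of the dreme script
python --version             # interpreter version the env provides
```

If that shebang resolves to a Python 3 interpreter, dreme will keep exiting with this SyntaxError, dreme_out/dreme.txt will never be written, and the read.meme.py traceback further down follows directly from that.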
Starting meme-chip_html_to_tsv: meme-chip_html_to_tsv random1000/MEME_GATA1_D7_30min_chr11_shuf/meme-chip.html random1000/MEME_GATA1_D7_30min_chr11_shuf/summary.tsv "meme-chip -oc random1000/MEME_GATA1_D7_30min_chr11_shuf -dreme-m 10 -meme-nmotifs 10 random1000/padded.fa/GATA1_D7_30min_chr11_summits_padded.fa" 5.0.5 "Mon Mar 18 20:12:19 2019 -0700"
meme-chip_html_to_tsv ran successfully in 0.132533 seconds
[info] De Novo motifs can be found: random1000/MEME_GATA1_D7_30min_chr11_shuf ...
[info] Loading the De Novo motifs ...
Traceback (most recent call last):
File "/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/install/read.meme.py", line 94, in
dreme_matrices = read_dreme(this_dir + "/dreme_out/dreme.txt")
File "/Users/shivanikushwaha/Documents/Documents_Shivani/apps/CUT-RUNTools-2.0/install/read.meme.py", line 47, in read_dreme
f = open(n)
FileNotFoundError: [Errno 2] No such file or directory: 'random1000/MEME_GATA1_D7_30min_chr11_shuf/dreme_out/dreme.txt'
[info] The signficance cutoff of Fimo scaning is 0.0005...
[info] Motif files can be found: random1000/MEME_GATA1_D7_30min_chr11_shuf/motifs
[info] Filtering the blacklist regions for the selected peak files
[info] Getting Fasta sequences
[info] Scaning the De Novo motifs for each peak
ls: random1000/MEME_GATA1_D7_30min_chr11_shuf/motifs: No such file or directory
[info] Output can be found: fimo.result/GATA1_D7_30min_chr11
Congrats! The bulk data analysis is complete!