Sam:LabNotes/Microbiome-new/2010-12-13
==Analysis==
*Estimate the number of reads needed for the X-axis (input unique-mappable reads). [[File:sam121010-umr calculatio for ID1to8 libraries.jpg|800px]]
*Starting from the whole-read bowtie output format (map format):
**This is the same dataset used in the previous down-sampled data analysis (http://genome-tech.ucsd.edu/LabNotes/index.php/Sam:LabNotes/Microbiome-new/2010-12-9).
**The original raw data are from SE sequencing, but I don't think using SE or PE matters for the current analysis.
*Use the UNIX command "shuf" to randomly sample data lines from the bowtie output down to the desired numbers ('''this is wrong!''' — see the approach correction below).
**'''NOTE: the shuf command only handles single-line records, so this method can't be used directly on FASTQ format''' (a possible workaround is sketched after the script below).
===Approach correction===
*I found that '''the bowtie output (the so-called .map format) only reports the (uniquely) mappable reads'''.
*When the "-m 1" setting is used, the output reports only uniquely mappable reads; the non-specifically mappable reads are suppressed and not reported.
**I used the E. coli 1000-read dataset to run a test and confirmed this conclusion.
*In this case, I can directly shuffle the bowtie output dataset to get 100,000 reads, 200,000 reads, 300,000 reads ... up to 1,000,000 reads. There is no need to use the table above to determine the desired numbers of shuffled reads.
*In UNIX, use the command "shuf" to down-sample the bowtie-aligned dataset.
**e.g. shuf -n 100000 [input file] > [output file] &
**The down-sampled numbers run from 100,000, 200,000, 300,000 ... to 1,000,000 reads for each of the libraries (ID-1, ID-2, ... ID-8).
**I used a bash file, 121710-shuf-sampling-ID1.sh, to do this shuffling in parallel:
 shuf -n 100000 /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1.txt.bowtie.out > /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1_cut_100000.txt.bowtie.out &
 shuf -n 200000 /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1.txt.bowtie.out > /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1_cut_200000.txt.bowtie.out &
 shuf -n 300000 /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1.txt.bowtie.out > /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1_cut_300000.txt.bowtie.out &
 shuf -n 400000 /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1.txt.bowtie.out > /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1_cut_400000.txt.bowtie.out &
 shuf -n 500000 /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1.txt.bowtie.out > /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1_cut_500000.txt.bowtie.out &
 shuf -n 600000 /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1.txt.bowtie.out > /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1_cut_600000.txt.bowtie.out &
 shuf -n 700000 /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1.txt.bowtie.out > /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1_cut_700000.txt.bowtie.out &
 shuf -n 800000 /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1.txt.bowtie.out > /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1_cut_800000.txt.bowtie.out &
 shuf -n 900000 /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1.txt.bowtie.out > /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1_cut_900000.txt.bowtie.out &
 shuf -n 1000000 /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1.txt.bowtie.out > /media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets/s_4_ID1_cut_1000000.txt.bowtie.out &
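The ten commands above differ only in the sample size, so the same parallel job could be written as a short loop. This is a minimal sketch of a hypothetical equivalent (not the script actually run), assuming GNU coreutils shuf and the same paths as the script above:
 #!/bin/bash
 # Sketch: loop form of 121710-shuf-sampling-ID1.sh (hypothetical rewrite).
 DIR=/media/disk-2/samchiang/Raw-Read-backup/101123_HL083/HC-MDA-Ecoli-SAGs-Ind1to8/Down-sample-sets
 for n in 100000 200000 300000 400000 500000 600000 700000 800000 900000 1000000; do
   # each sample size runs as a background job, matching the original "&" usage
   shuf -n "$n" "$DIR/s_4_ID1.txt.bowtie.out" > "$DIR/s_4_ID1_cut_${n}.txt.bowtie.out" &
 done
 wait  # block until all ten shuf jobs have finished
As for the FASTQ caveat noted above: a common workaround is to flatten each four-line record onto one line before shuffling, then restore the layout afterwards. A sketch, assuming records are exactly four lines, read IDs contain no tab characters, and placeholder file names reads.fastq / reads_cut_100000.fastq:
 # Join every 4 lines with tabs, sample whole records, then split back into 4-line records.
 paste - - - - < reads.fastq | shuf -n 100000 | tr '\t' '\n' > reads_cut_100000.fastq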
*Output the data, e.g. s_4_ID1_cut_100000.txt.bowtie.out.
*Run the perl script to calculate the genome coverage of the down-sampled datasets from the last step.
**Script: Bowtiemap2coverage-hc2
**Copy and paste the genome coverage (bp) results (a minimal sketch of this calculation is given after the plotting section).
===Plotting===
[[File:sam121710-genome coverage analysis.bmp|800px]]
[[Media:New coverage plot - all - 121710.xls|Excel file of genome coverage analysis and plot]]
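For reference, the quantity Bowtiemap2coverage-hc2 reports (genome positions covered by at least one aligned read) can be approximated directly from the map format. This is an illustrative sketch, not the actual perl script, and it assumes GNU awk (for length() on an array) plus the standard bowtie map columns: reference name in column 3, 0-based leftmost offset in column 4, read sequence in column 5:
 # Count distinct reference positions covered by at least one aligned read.
 awk -F'\t' '{
     for (i = $4; i < $4 + length($5); i++)   # positions spanned by this read
         covered[$3 "\t" i] = 1               # key on reference name + position
 } END {
     print length(covered), "bp covered"      # length(array) is a GNU awk extension
 }' s_4_ID1_cut_100000.txt.bowtie.out
The in-memory array holds at most one key per covered base, which is manageable for a genome the size of E. coli (~4.6 Mbp).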