Index FilteredNT on Pegasus for BLAST #18
On Mon, 22 Apr 2024 10:43:44 -0400, [email protected] wrote:
Hi Raja and Charlies, Thank you for contacting us about your request to use Pegasus to run your NCBI BLAST computation! @CHARLIES, if you are going to work with huge data, please make sure to select the "defq" partition, which allows up to 14 days of runtime for your jobs. Best,
New task for @penningtonea
From Joe:
Next step for the NT project: index NT with BLAST. Find documentation online for BLAST indexing and submit it as a Slurm job.
Blockers:
BLAST command to make the database: makeblastdb -in mydb.fsa -dbtype nucl -parse_seqids
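The makeblastdb command above can be wrapped in a Slurm batch script for submission on Pegasus. A minimal sketch follows; the memory request, module name, and log file name are assumptions, not values from this thread (the defq partition and 14-day limit come from the email earlier in this ticket):

```shell
#!/bin/bash
#SBATCH --job-name=makeblastdb-nt
#SBATCH --partition=defq           # allows up to 14 days of runtime, per the HPC email above
#SBATCH --time=14-00:00:00
#SBATCH --mem=64G                  # assumed value; size this for the ~1.1 TB filtered NT
#SBATCH --output=makeblastdb_%j.log

# Assumes BLAST+ is provided as an environment module on the cluster;
# skip this line if the binaries are already on PATH
module load blast+

makeblastdb -in mydb.fsa -dbtype nucl -parse_seqids
```

Submit with `sbatch makeblastdb.sh` and watch progress with `squeue -u $USER`.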
Next task:
It looks like the makeblastdb command works on Keeney's computer, so @penningtonea please run it via Slurm. Ask Hadley or Keeney for help if you are stuck. For the BLAST search, use blastn -db databaseName -query queryFileName. Also check whether there is a -output option you can use.
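On the output question: blastn writes to stdout by default, and BLAST+ provides an -out flag to redirect results to a file. A sketch of the search invocation, with placeholder file names and an assumed thread count:

```shell
# Tabular output (-outfmt 6) is convenient for downstream parsing;
# databaseName and queryFileName are placeholders from the comment above
blastn -db databaseName \
       -query queryFileName \
       -out results.tsv \
       -outfmt 6 \
       -num_threads 8   # assumed; match this to the Slurm CPU allocation
```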
Note for SCP transferring:
Progress update:
Progress update: the script did not work. I received the same error message as the earlier call (see the output file). QUESTION: How can I change my account from Account=watkinslab and my group ID from MG-watkinslab(1111) to our group? I no longer need access to the watkinslab account.
Downloading the preformatted BLAST database will remove the need to index NT on Pegasus. Log in to the HIVE API and execute the following command, received from NLM, to download preformatted NT:
perl update_blastdb.pl --decompress nt
@penningtonea you should still go ahead and index filtered-NT on Pegasus in parallel. It will be good training for you, and I am also not 100% sure the preformatted NT will work.
Instructions to download indexed NT from NCBI: Hi, Thanks for writing to us. Make sure you have enough disk space. After installing standalone BLAST+, you can use update_blastdb.pl to download and extract all files needed for nt: perl update_blastdb.pl --decompress nt. Run perl update_blastdb.pl --help to see the available switches. Regards,
Documenting email response: "If you do not, you will need to prefix the nt with the direct path."
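The quoted advice refers to how BLAST+ locates databases: it looks in the directory named by the BLASTDB environment variable, so if that variable is not set you must pass the database name with its full path. A sketch with placeholder paths:

```shell
# Option 1: set BLASTDB so that "-db nt" resolves (directory is a placeholder)
export BLASTDB=/lustre/groups/hivelab/databases
blastn -db nt -query query.fa -out hits.tsv -outfmt 6

# Option 2: no BLASTDB set; prefix the database name with its direct path
blastn -db /lustre/groups/hivelab/databases/nt -query query.fa -out hits.tsv -outfmt 6
```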
Working on downloading and indexing filtered-NT. Afterwards I will scp the database to HIVE3.
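For a transfer of this size (~1 TB), rsync is often more robust than plain scp because it can resume an interrupted copy. A sketch with placeholder hostname and destination paths; the source directory is the one named later in this thread:

```shell
# Resumable, verbose transfer of the indexed database directory to HIVE3
# (user@hive3.example.edu and /data/blastdb are placeholders)
rsync -avP /lustre/groups/hivelab/emily/NT/ user@hive3.example.edu:/data/blastdb/NT/

# Plain scp alternative (no resume if the connection drops)
scp -r /lustre/groups/hivelab/emily/NT user@hive3.example.edu:/data/blastdb/
```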
Mix-up with the task. Will download NT on Pegasus today. Emailed Adam Wong to ask about the best way to index NT via the Slurm scheduler, given the size of the job.
NT was downloaded and indexed on Pegasus. Directory: SMHS/home/epennington/lustre/groups/hivelab/emily/NT. Now downloading and indexing filtered-NT v7.0 on Pegasus.
@penningtonea
GW HPC Pegasus is not working. I am able to log in but unable to submit or allocate jobs. Indexing filtered_nt will resume when Pegasus is back up and running.
I am moving this closed ticket to the July task list to compare it with a similar ticket.
@HadleyKing and @penningtonea I was trying to follow this ticket. Can you tell me which ticket this was moved to?
@rajamazumder This ticket was tied to https://github.com/GW-HIVE/Platform/issues/93
Apr 19, 2024, 5:09 PM
We would like to run an experiment on Pegasus with NCBI's NT database, which was about 1.2 TB the last time I checked. We have a filtered version of it, about 1.1 TB, that I would need to transfer from our HIVE resources to Pegasus. I would like to use NCBI BLAST to create an index for the NT database.
We think this process may take a week or so. I wanted to let someone know before I started it so that the job will not get killed.
Once we have the index, we will move it out of Pegasus and delete the file. We need the indexed file to run some BLAST searches and compare the results with a smaller version of NT (80 GB) that we have prepared. This is for a paper we are working on.