I want to download 15,000 files from my FireCloud bucket, all of which have the same filename but different paths, e.g. gs://fc-mybucket/MyTool/*/MySubtask/output.vcf
It's too slow to make 15,000 calls to gsutil as follows: gsutil cp gs://fc-mybucket/MyTool/ID1/MySubtask/output.vcf ID1.output.vcf
Using the -m option as follows doesn't work:
gsutil -m cp gs://fc-mybucket/MyTool/*/MySubtask/output.vcf mydir/
because every matching file is copied to the same destination name, so each copy overwrites the previous one and I end up with a single file, mydir/output.vcf.
I also tried adding -r:
gsutil -m cp -r gs://fc-mybucket/MyTool/*/MySubtask/output.vcf mydir/
but this also fails, because it doesn't recreate the subdirectories needed to keep the files apart.
I would like to specify two input files: a list of 15,000 source paths and a list of 15,000 destination paths, for example:
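(Hypothetical file names.) sources.txt would hold one GCS path per line, such as
gs://fc-mybucket/MyTool/ID1/MySubtask/output.vcf
and destinations.txt would hold the matching local name on the same line number, such as
ID1.output.vcf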
Any suggestions for how to download and rename all of these files in parallel?
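Something along these lines is what I have in mind (untested sketch, assuming the hypothetical sources.txt and destinations.txt above):

# run up to 16 copies at a time, one gsutil cp per source/destination pair
paste sources.txt destinations.txt | xargs -n 2 -P 16 sh -c 'gsutil cp "$0" "$1"'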