Channel: Ask the FireCloud Team — GATK-Forum

Cannot read a TSV that's bigger than 128K bytes, but I can write it!

I'm trying to create a joint call set, and I have a large TSV that I need to read for processing. I'm getting the following error:

>message: Workflow has invalid declarations: Could not evaluate workflow declarations:
> JointGenotyping. Use of WdlSingleFile(gs://fc-abad691f-3e3e-4ddc-85a6-e399521974bf/
>6f9f789f-746a-4fa7-bc08-30154d86c919/JointGenotyping/d3c4daff-5553-4917-8e4f-0579cd9a826f/
>call-RotateGVCF/output.tsv) failed because the file was too big (100362390 bytes when
>only files of up to 128000 bytes are permissible)

The file was generated in a previous step by Cromwell itself, so it's quite annoying that it's impossible to read it back in.

Would it be possible to increase the limit? Alternatively, being able to iterate over the lines of the file would also work.

In the absence of either option, I would have to do even more processing within the task, instead of scattering with Cromwell.
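One workaround I've been considering (purely a sketch; the limit and file names below are illustrative, not anything Cromwell documents as a supported path): have the task that writes the TSV also split it into chunks that each stay under the 128,000-byte cap, so each chunk could be read back individually at the workflow level.

```python
# Sketch of a task-side helper that splits a large TSV into chunk files,
# each at most `limit` bytes, preserving whole lines. The 128_000 default
# mirrors the cap reported in the Cromwell error message above.
LIMIT = 128_000


def split_tsv(path, limit=LIMIT):
    """Write consecutive lines of `path` into chunk files, each <= limit bytes.

    Returns the list of chunk file paths, in order. Lines are never split,
    so a single line longer than `limit` would still produce an oversized
    chunk (not an issue for typical sample-sheet rows).
    """
    chunks, current, size = [], [], 0
    with open(path) as fh:
        for line in fh:
            nbytes = len(line.encode("utf-8"))
            if current and size + nbytes > limit:
                chunks.append(current)
                current, size = [], 0
            current.append(line)
            size += nbytes
    if current:
        chunks.append(current)

    out_paths = []
    for i, lines in enumerate(chunks):
        out = f"{path}.chunk{i}.tsv"  # illustrative naming scheme
        with open(out, "w") as fh:
            fh.writelines(lines)
        out_paths.append(out)
    return out_paths
```

Each chunk would then be small enough for a per-shard read, at the cost of managing an extra layer of file bookkeeping in the workflow.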

