I have a Snakemake use case where I've integrated a simulation/analysis workflow with an inference framework. The framework randomly selects parameters, writes them to a temporary config file, runs Snakemake, pulls the resulting summary statistics from a file, and then deletes the directory Snakemake writes to. The issue is that each simulation generates thousands of metadata files, and I need to run hundreds of simulations. A lesser but related problem is the accumulation of output files from the job scheduler.
Idea: could the `--cleanup-metadata` flag clean all metadata produced by a Snakemake run on success?
Is there a way to use `onsuccess` to target the metadata and scheduler output? Or should I set the metadata directory and scheduler output location to somewhere inside my output directory, so they get deleted along with it?
I have multiple Snakemake instances running simultaneously, so I can't delete everything after a single instance completes. I'm currently deleting all files older than a day, but that relies on me remembering to run those commands occasionally.
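My current workaround is essentially the following, run by hand (the `scratch` tree below is a placeholder standing in for my shared working area); putting it in a daily cron job would at least remove the manual step, though it still wouldn't tie cleanup to a specific run finishing:

```shell
# Demo tree standing in for the shared scratch area (names are placeholders)
mkdir -p scratch/.snakemake/metadata
touch -d "2 days ago" scratch/.snakemake/metadata/old_record
touch scratch/.snakemake/metadata/new_record

# Delete metadata files not modified within the last day; running this
# from cron daily replaces the manual cleanup commands.
find scratch/.snakemake/metadata -type f -mtime +1 -delete
```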