For some reason FreeFlow Core doesn't like the last character of the path when it is taken directly from sys.argv, so I removed it by using sys.argv[:-1].
I checked the scripts and your solution should indeed solve the issue.
One question about syntax: why is it necessary to write the path this way, sys.argv[:-1], for shutil.copy?
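For what it's worth, the two slices do different things, and seeing them side by side may clarify the syntax. This is just an illustration with made-up argument values, not code from the attached scripts: slicing sys.argv itself drops the last argument from the list, while slicing one element (e.g. sys.argv[1][:-1]) strips the last character of that path string.

```python
# Hypothetical argument vector; the trailing "?" stands in for whatever
# stray character FreeFlow Core appends to the path.
argv = ["split_pdf.py", "C:\\FFCore\\in\\job.pdf?"]

# List slice: drops the last *argument* from the argument list.
without_last_arg = argv[:-1]   # ["split_pdf.py"]

# String slice on one element: drops the last *character* of the path.
clean_path = argv[1][:-1]      # "C:\\FFCore\\in\\job.pdf"
```

Which of the two your script actually needs depends on how it reads its arguments, so check the attached scripts against this.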
Only one pdf is expected in $FFout$.
You could run a script on the hotfolder instead, so that the files are split before they are sent to the workflow. Please see the attached sample script split_pdf: it splits a PDF dropped into the hotfolder into two files and creates a MAX file referencing the two new PDF files. If you start from a MAX file instead, please see the sample script split_max: it splits each PDF file in the submitted MAX file and then creates a new MAX file with the new PDF files.
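I don't have the source of the attached samples in front of me, but a hotfolder pre-split script in that spirit could look roughly like the sketch below: split the incoming PDF into two halves, then write a manifest listing the new files. The pypdf library and the one-column CSV manifest layout are my assumptions; real MAX layouts are configured in FreeFlow Core, so adjust write_max to your manifest profile.

```python
import csv
import os
import sys

def halves(num_pages):
    # Page indices for a two-way split; the first half gets the extra
    # page on odd counts, e.g. 5 pages -> ([0, 1, 2], [3, 4]).
    first = (num_pages + 1) // 2
    return list(range(first)), list(range(first, num_pages))

def split_pdf(src_path, out_dir):
    # pypdf is a third-party library (pip install pypdf).
    from pypdf import PdfReader, PdfWriter
    reader = PdfReader(src_path)
    stem = os.path.splitext(os.path.basename(src_path))[0]
    out_paths = []
    for part, pages in enumerate(halves(len(reader.pages)), start=1):
        writer = PdfWriter()
        for p in pages:
            writer.add_page(reader.pages[p])
        out_path = os.path.join(out_dir, "%s_part%d.pdf" % (stem, part))
        with open(out_path, "wb") as fh:
            writer.write(fh)
        out_paths.append(out_path)
    return out_paths

def write_max(manifest_path, pdf_paths):
    # Placeholder manifest layout: one file name per row. Real MAX
    # layouts may need more columns (quantity, finishing, etc.).
    with open(manifest_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        for path in pdf_paths:
            writer.writerow([os.path.basename(path)])

if __name__ == "__main__" and len(sys.argv) > 1:
    # Hotfolder usage sketch: script receives the dropped PDF's path.
    src = sys.argv[1]
    parts = split_pdf(src, os.path.dirname(src))
    write_max(os.path.join(os.path.dirname(src), "job.max"), parts)
```

The split point and output naming here are arbitrary; the attached split_pdf/split_max samples presumably derive them from the job instead.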
This doesn't seem relevant to my problem - there are no other nodes before External.
External is the first node, which calls a script that splits the pdf file.
No Split nodes are being used. I would have used them, but that's impossible for this particular task: the splitting configuration is defined by multiple variables, while the Split node only seems to work with fixed values.
I have noticed that after some components FreeFlow Core renames the PDF to temp.pdf; please check the stdout.log file. If, in that case, you return what you think the PDF name should be to $FFout$, it will not work, since a file named temp.pdf is expected. One way to handle this is to copy the incoming files to a temp folder with a random name, to avoid the temp.pdf file being overwritten when the job is split (the file will be named temp.pdf for every branch in the Split node). Attached is a sample script that does this: each temp.pdf file is written to a temp folder with a random name before it is copied to $FFout$, and finally the temp folder and its contents are removed.
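The staging approach described above could be sketched roughly like this (a reconstruction of the idea, not the attached script itself; the argument order is my assumption, so match it to your External node setup): tempfile.mkdtemp gives each branch its own collision-free folder, so parallel temp.pdf files can't overwrite each other before they reach $FFout$.

```python
import os
import shutil
import sys
import tempfile

def stage_and_deliver(src_pdf, out_path):
    # mkdtemp creates a directory with a random, collision-free name,
    # so each Split branch stages its temp.pdf separately.
    staging = tempfile.mkdtemp(prefix="ffc_")
    try:
        staged = shutil.copy(src_pdf, staging)   # .../ffc_xxxx/temp.pdf
        os.makedirs(os.path.dirname(out_path) or ".", exist_ok=True)
        shutil.copy(staged, out_path)            # deliver to $FFout$
    finally:
        shutil.rmtree(staging)                   # remove the temp folder

if __name__ == "__main__" and len(sys.argv) > 2:
    # Assumed call shape: script.py <incoming temp.pdf> <$FFout$ path>
    stage_and_deliver(sys.argv[1], sys.argv[2])
```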
I've made a script that splits a PDF file based on some variables extracted from its filename, linked it to an External node, and verified that the splitting itself works, but the resulting files never appear in the workflow past the External node.
Simply moving the resulting files to the location given by $FFout$ doesn't seem to do anything.
How do I correctly pass the script-generated files back into the workflow (preferably as part of the same job group)?