I have figured out the issue. Your question about why the jobs were being grouped got me thinking, and it turned out everything was being grouped on "Order ID". It appears that FreeFlow Core automatically groups on the predefined "Order ID" field. I changed my mapping from the predefined "Order ID" field to a custom field I created called "Order", and after that there was no more grouping when I dropped a CSV into the hotfolder. It is not necessary for us to group by order number, and it looks like if one file in a group fails, the entire group fails (see the sketch below).
Problem solved!
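For anyone who runs into the same thing, here is a rough sketch of the grouping behavior as I understand it, written as a small Python illustration. The column names ("Order ID", "File Name"), the repository path, and the pass/fail logic are all placeholders for whatever your own MAX mapping uses; this is not how FreeFlow Core works internally, just a way to picture why one bad row took the whole group down for us.

```python
import csv
from collections import defaultdict
from pathlib import Path

# Illustrative only: every manifest row sharing the same "Order ID" value
# lands in one job group, which matches what we observed.
# Column names and paths are hypothetical, not the actual MAX schema.
REPOSITORY = Path("/repository")

def group_rows(manifest_path, key="Order ID"):
    groups = defaultdict(list)
    with open(manifest_path, newline="") as f:
        for row in csv.DictReader(f):
            groups[row[key]].append(row)
    return groups

def check_groups(groups, file_column="File Name"):
    # One missing file marks the whole group as failed -- which is why a
    # single bad row cancelled every item in our MAX order.
    for order_id, rows in groups.items():
        missing = [r[file_column] for r in rows
                   if not (REPOSITORY / r[file_column]).exists()]
        status = "FAILED (missing: %s)" % ", ".join(missing) if missing else "OK"
        print(f"Group {order_id}: {len(rows)} item(s) -> {status}")

# check_groups(group_rows("manifest.csv"))
```

Mapping the grouping field to a value that is unique per row (our custom "Order" field) effectively puts each row in its own group, so one missing file no longer drags the other items down with it.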
What I'm interested in is how a job group "test" was formed with a subjob "CL100.E" that does not exist. I'm guessing that a WF node is allowing this to happen -- "Join" perhaps -- but I cannot seem to re-create your scenario.
Below is a screenshot of the MAX setup. As you can see, I am routing to multiple workflows, and each of these workflows is very large because of the many routing decisions being made.
MAX setup
The workflow is too big to give you a screenshot. Is there anything in particular you want to see?
Can you supply a screen snap of the workflow you are using? Thanks.
That was one of the first things that I checked; both of those options are deselected, and it still cancels the entire job.
I am having an issue with MAX file processing. It only happens if there is a file not in the repository. I am submitting a CSV file like this
The status report that it creates looks like this
It shows that all of the files were a success but the last file was not found. This is expected and, without going into a long explanation, is part of our process. Unfortunately, FreeFlow Core is actually cancelling all of the items in the MAX order, and when I look at the cancelled items it looks like this:
Is there any way to allow the rest of the files to process when it finds that one of the files is not in the repository?
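To make the question concrete, here is a rough sketch of what I would like to happen, written as a hypothetical pre-check on the manifest before it reaches the hotfolder. The "File Name" column, the repository and hotfolder paths, and the split logic are placeholders for illustration only; I have not found a FreeFlow Core setting that behaves this way.

```python
import csv
from pathlib import Path

# Hypothetical pre-check: split a MAX manifest into rows whose files are
# already in the repository (submit now) and rows that are not (hold back),
# so one missing file cannot cancel the whole order.
# "File Name" and both paths are placeholders, not the real configuration.
REPOSITORY = Path("/repository")
HOTFOLDER = Path("/hotfolder")

def split_manifest(manifest_path, file_column="File Name"):
    with open(manifest_path, newline="") as f:
        reader = csv.DictReader(f)
        fields = reader.fieldnames
        rows = list(reader)

    ready = [r for r in rows if (REPOSITORY / r[file_column]).exists()]
    held = [r for r in rows if not (REPOSITORY / r[file_column]).exists()]

    if ready:
        # Drop only the resolvable rows into the hotfolder for processing.
        out = HOTFOLDER / Path(manifest_path).name
        with open(out, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=fields)
            writer.writeheader()
            writer.writerows(ready)
    return ready, held

# ready, held = split_manifest("order.csv")
# print(f"submitted {len(ready)} row(s), held back {len(held)}")
```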