When we change a worker, its analysis should be rerun for every document. The analyses that depend on it should also run again.
I think the best approach, at least for now, is to create a fabric task that creates a new pipeline for every document, exactly like the one created when a new document is uploaded. This should be pretty straightforward, but first we need to make sure that rerunning the same worker for a document causes no problems (which should be the case). One issue I can imagine is old and new results getting mixed in a collection, instead of the old results being removed and replaced by the new ones.
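For concreteness, here is a rough sketch of what such a task could look like. The helpers `get_all_documents`, `remove_analyses`, and `create_pipeline` are hypothetical placeholders for whatever the project actually exposes, not real APIs:

```python
# Minimal sketch of the proposed fabric task (helper names are assumptions).
from fabric.api import task


@task
def rerun_worker(worker_name):
    """Re-run `worker_name` and its dependent analyses for every document."""
    for doc in get_all_documents():        # hypothetical helper
        # Drop the old results first, so the rerun replaces entries in the
        # results collection instead of mixing old and new ones.
        remove_analyses(doc, worker_name)  # hypothetical helper
        # Build the same pipeline that runs when a document is first
        # uploaded; dependent analyses are rerun as part of it.
        create_pipeline(doc)               # hypothetical helper
```

Deleting the stale results up front would sidestep the mixing issue, provided the workers themselves are safe to run twice on the same document.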