Hello,
I have created an automation where every night my backups are copied to a Google Drive folder, overwriting existing items so I won't end up with duplicates.
The problem is that the backups are more than 30 GB, and every night this action re-uploads all the files, even though they are already on the server.
I need a way for my workflow to skip the items that already exist. Note that there are thousands of files in a complex folder structure, so writing an AppleScript to find the "new ones" isn't a viable option. The ideal solution would mimic the "skip duplicates" choice that appears when you paste items into a folder and duplicates are found.
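In other words, I'd want something along these lines. This is only a rough sketch of the behaviour I'm after, not my actual setup; the source and Google Drive paths are placeholders, and it assumes the Drive folder is synced locally (e.g. via Google Drive for desktop):

```python
import shutil
from pathlib import Path

# Placeholder paths: the real backup source and the locally synced
# Google Drive folder would go here.
SOURCE = Path("/Users/me/Backups")
DESTINATION = Path("/Users/me/Google Drive/Backups")

def copy_new_files(source: Path, destination: Path) -> None:
    """Copy only files that don't already exist at the destination,
    preserving the folder structure (mimics the 'skip duplicates' choice)."""
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        dest_file = destination / src_file.relative_to(source)
        if dest_file.exists():
            continue  # already uploaded on a previous night, skip it
        dest_file.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, dest_file)

if __name__ == "__main__":
    copy_new_files(SOURCE, DESTINATION)
```

i.e. walk the backup tree, keep the folder structure, and only copy a file if it isn't already present on the Drive side, so unchanged files never get uploaded again.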