My current company has a script that runs and deletes files that haven’t been modified for two years. It doesn’t take into account any other factors, just the modification date. It doesn’t ask for confirmation and doesn’t even inform the end user about it.
You should write a script to touch all the files before their script runs.
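A minimal sketch of that in Python, assuming a single root directory (the path here is made up; point it at whatever the cleanup script scans):

```python
import os
import time
from pathlib import Path

# Hypothetical root; adjust to wherever the cleanup script looks.
ROOT = Path("C:/work/projects")

now = time.time()
for path in ROOT.rglob("*"):
    if path.is_file():
        # Set access and modification times to "now" so the
        # two-year cutoff never triggers.
        os.utime(path, (now, now))
```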
Thought about it, but I use the modification date for sorting, to keep the stuff I’ve recently worked on on top. Instead, I keep the files where the script isn’t looking. The downside is that they’re not backed up, so I might potentially lose them, but if I don’t do that, then I’ll lose them for sure…
You don’t actually have to set all the modification dates to now; you can pick any other timestamp you want. So to preserve the order of the files, you could just have the script sort the list of files by date, then update the modification date of the oldest file to some fixed time ago, the second-oldest to a bit later, and so on.
You could even exclude recently-edited files, because the real modification dates are probably more relevant for those. For example, if you only process files older than 3 months, and update those starting from "6 months old"¹ (sketched below), that just leaves remembering to run the script at least once a year or so. Just pick a date and put a recurring reminder in your calendar.
¹: I picked 6 months there to leave some slack, in case you procrastinate your next run or it’s otherwise delayed because you’re out sick or on vacation or something.
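Something like this Python sketch would do it; the root path and the one-second spacing between files are my own assumptions:

```python
import os
import time
from pathlib import Path

ROOT = Path("C:/work/projects")  # hypothetical root
DAY = 86400
THREE_MONTHS = 90 * DAY
SIX_MONTHS = 180 * DAY

now = time.time()

# Files older than 3 months, oldest first, so relative order survives.
old_files = sorted(
    (p for p in ROOT.rglob("*")
     if p.is_file() and now - p.stat().st_mtime > THREE_MONTHS),
    key=lambda p: p.stat().st_mtime,
)

for i, path in enumerate(old_files):
    # The oldest file becomes exactly "6 months old"; each later file
    # is one second newer, so the sort order is preserved.
    new_mtime = now - SIX_MONTHS + i
    os.utime(path, (new_mtime, new_mtime))
```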
Change the date on all the files by scaling relative to the oldest file, with 1 year as a safe maximum age. So if the oldest file is 1.5 years old, rescale every file that is currently t old to be t/1.5 old.
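In Python, the scaling could look roughly like this (hypothetical root path; the general rule is new_age = age × 1 year / oldest_age):

```python
import os
import time
from pathlib import Path

ROOT = Path("C:/work/projects")  # hypothetical root
YEAR = 365 * 86400

now = time.time()
files = [p for p in ROOT.rglob("*") if p.is_file()]
if files:
    oldest_age = max(now - p.stat().st_mtime for p in files)
    scale = YEAR / oldest_age  # 1/1.5 when the oldest file is 1.5 years old
    if scale < 1:  # only compress ages, never stretch them further back
        for p in files:
            age = now - p.stat().st_mtime
            new_mtime = now - age * scale
            os.utime(p, (new_mtime, new_mtime))
```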
Have you…called attention to this at all?
Have a script that makes a copy of all files that are more than 1.9 years old into a separate folder.
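A rough Python sketch of that rescue copy; both paths are made up:

```python
import shutil
import time
from pathlib import Path

ROOT = Path("C:/work/projects")   # hypothetical source tree
RESCUE = Path("C:/work/rescued")  # hypothetical rescue folder
CUTOFF = 1.9 * 365 * 86400        # ~1.9 years, in seconds

now = time.time()
for p in ROOT.rglob("*"):
    if p.is_file() and now - p.stat().st_mtime > CUTOFF:
        dest = RESCUE / p.relative_to(ROOT)
        dest.parent.mkdir(parents=True, exist_ok=True)
        # shutil.copy gives the duplicate a fresh modification time,
        # so the copy restarts the two-year clock.
        shutil.copy(p, dest)
```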
Create a series of folders labeled with dates. Every day, copy the useful stuff into that day’s folder. Every night, change the modified dates on all files to the current date.
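Roughly, in Python (all paths hypothetical):

```python
import datetime
import os
import shutil
import time
from pathlib import Path

SRC = Path("C:/work/projects")                     # hypothetical working folder
DAILY = Path("C:/work/daily")                      # hypothetical dated-folder root
today = DAILY / datetime.date.today().isoformat()  # e.g. .../2024-05-01

# Daytime: copy the useful stuff into today's dated folder.
today.mkdir(parents=True, exist_ok=True)
for p in SRC.iterdir():
    if p.is_file():
        shutil.copy(p, today / p.name)

# Night: refresh every file's modification time.
now = time.time()
for p in SRC.rglob("*"):
    if p.is_file():
        os.utime(p, (now, now))
```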
What industry are you in? This could be compliance for different reasons. Retention is a very specific thing that should be documented in policies.
I know financial institutions that specifically do not want data just hanging around. This limits liability and exposure if there is a breach, and makes any litigation much easier if the data doesn’t exist by policy.
Should they be more choosy about what gets deleted? Yeah, probably. But I understand why it’s there.
That sounds like a lawyer’s dream… “can’t provide it if it doesn’t exist”… Now granted, if they got a subpoena they’d have to preserve it going forward, but before then, if they’re not bound by something that forces data retention, the less random data lying around the better.
That’s the worst foresight I think I’ve ever heard of; you might as well make it 3 months if you’re just going to trash thousands of labor hours’ worth of files.
Put all your files in a single zip file with no compression. Since Windows handles zip files like folders, you can work like normal, and the zip file will always have a recent timestamp.
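For what it’s worth, building such an uncompressed archive could look like this in Python (paths are made up); ZIP_STORED skips compression so access stays fast:

```python
import zipfile
from pathlib import Path

ROOT = Path("C:/work/projects")         # hypothetical folder to pack
ARCHIVE = Path("C:/work/projects.zip")  # hypothetical archive path

# ZIP_STORED means no compression: files are stored as-is.
with zipfile.ZipFile(ARCHIVE, "w", compression=zipfile.ZIP_STORED) as zf:
    for p in ROOT.rglob("*"):
        if p.is_file():
            zf.write(p, arcname=p.relative_to(ROOT))
```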