
Optimizing Powershell Script to find old files and delete them on DFS replicated folders


Here is the situation. I have a file share that is replicated between two servers in different parts of the world. DFS does not replicate a file's LastAccessTime when the file has only been viewed, so a file that looks stale on one server may have been used recently on the other, and I don't want to delete anything that was accessed within my retention window (7 days). To make sure I don't remove files that are still in use, I have to check the LastAccessTime at both locations.

I currently have this:

Set-ExecutionPolicy RemoteSigned
$limit   = (Get-Date).AddDays(-7)
$PathOne = "FirstPath"
$PathTwo = "SecondPath"

# Collect the files (not folders) under each replica that have not been accessed within the window
$ToBeDeletedPathOne = Get-ChildItem -Path $PathOne -Recurse -Force -File | Where-Object { $_.LastAccessTime -lt $limit }
$ToBeDeletedPathTwo = Get-ChildItem -Path $PathTwo -Recurse -Force -File | Where-Object { $_.LastAccessTime -lt $limit }

# Keep only the files that are stale on BOTH sides ("==" marks entries present in both lists)
$DiffObjects = Compare-Object -ReferenceObject $ToBeDeletedPathOne -DifferenceObject $ToBeDeletedPathTwo -IncludeEqual
$ToBeDeletedOverall = $DiffObjects | Where-Object { $_.SideIndicator -eq "==" }

After this, I loop through and delete the files that are marked for deletion at both locations, as sketched below.
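That loop is essentially the following (a minimal sketch; -WhatIf is a dry-run safeguard to remove once the output looks right, and deleting on one replica is enough because DFS replicates deletions):

# Each matched entry's InputObject is the FileInfo from the first path;
# removing it there is sufficient, since DFS replicates the deletion to the other side
foreach ($entry in $ToBeDeletedOverall) {
    Remove-Item -LiteralPath $entry.InputObject.FullName -Force -WhatIf
}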

Part of the problem is that there are a tremendous number of files, so the scan can take a very long time, and I want to make it faster. My idea is to run the scan as a separate script on each file server and have the main script wait for both to return their output; each machine can enumerate its own disk much faster than a remote enumeration over the network, and the two locations would be scanned simultaneously. A rough sketch of what I have in mind is below.
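Something like this, assuming PowerShell remoting is enabled on both file servers (the server names and local share paths below are placeholders):

$scan = {
    param($Path, $Limit)
    # Enumerate locally on the server and return each path relative to the root,
    # so the two result sets can be compared across machines
    Get-ChildItem -Path $Path -Recurse -Force -File |
        Where-Object { $_.LastAccessTime -lt $Limit } |
        ForEach-Object { $_.FullName.Substring($Path.Length) }
}

# Kick off both scans at once as remote jobs, then wait for both to finish
$jobOne = Invoke-Command -ComputerName "FileServerOne" -ScriptBlock $scan -ArgumentList "D:\Share", $limit -AsJob
$jobTwo = Invoke-Command -ComputerName "FileServerTwo" -ScriptBlock $scan -ArgumentList "D:\Share", $limit -AsJob
Wait-Job $jobOne, $jobTwo | Out-Null

# Compare the relative paths; "==" again marks files stale on both replicas
$staleOnBoth = Compare-Object (Receive-Job $jobOne) (Receive-Job $jobTwo) -IncludeEqual |
    Where-Object { $_.SideIndicator -eq "==" } |
    ForEach-Object { $_.InputObject }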

The other part of the problem is that I haven't figured out how to wire this up end to end yet. I will keep working on it, and if I solve it I will post back here in case anyone finds this useful in the future.

