Commit 9844f07

Add backup retention policy - keep only latest 7 backups on S3
After uploading a new backup, delete old backups exceeding 7 files:
- List all .sql.gz files from the S3 bucket
- Sort by date descending
- Keep the first 7, delete the older ones

This prevents unlimited backup accumulation and reduces storage costs.
1 parent da47dcd commit 9844f07

1 file changed: scripts/backup.sh (22 additions & 0 deletions)
@@ -21,6 +21,28 @@ while true; do
     # Remove local SQL file
     rm *.sql.gz
 
+    # Keep only the latest 7 backups on remote.
+    # s3cmd ls prints "DATE TIME SIZE URL"; awk keeps only the URL column.
+    BACKUP_FILES=$(s3cmd --access_key=$S3_ACCESS_KEY \
+        --secret_key=$S3_SECRET_KEY \
+        --region=$S3_BUCKET_LOCATION \
+        --host=$S3_HOST_BASE \
+        --host-bucket=$S3_HOST_BUCKET \
+        ls s3://${S3_BUCKET}${S3_PREFIX}/ | grep '\.sql\.gz$' | sort -r | awk '{print $4}')
+
+    # Count and delete old backups
+    COUNT=0
+    for FILE in $BACKUP_FILES; do
+        COUNT=$((COUNT + 1))
+        if [ $COUNT -gt 7 ]; then
+            s3cmd --access_key=$S3_ACCESS_KEY \
+                --secret_key=$S3_SECRET_KEY \
+                --region=$S3_BUCKET_LOCATION \
+                --host=$S3_HOST_BASE \
+                --host-bucket=$S3_HOST_BUCKET \
+                del "$FILE"
+        fi
+    done
+
     # Backup 1 day later.
     sleep 86400
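The retention logic in the loop above can be isolated as a single pipeline and tested without touching a bucket. A minimal sketch, assuming s3cmd's `ls` output format of `DATE TIME SIZE URL` and object keys without whitespace; the helper name `old_backups` is hypothetical, not part of the script:

```shell
#!/bin/sh
# old_backups: read s3cmd-style "DATE TIME SIZE s3://bucket/key" lines on
# stdin and print the URLs of every .sql.gz object beyond the newest N,
# i.e. the candidates for deletion.
old_backups() {
    n=$1
    grep '\.sql\.gz$' | sort -r | awk '{print $4}' | tail -n +"$((n + 1))"
}

# Demo with fabricated listing lines (no real bucket involved):
printf '%s\n' \
    '2024-01-03 02:00 100 s3://bucket/db-3.sql.gz' \
    '2024-01-01 02:00 100 s3://bucket/db-1.sql.gz' \
    '2024-01-02 02:00 100 s3://bucket/db-2.sql.gz' |
    old_backups 2
# prints: s3://bucket/db-1.sql.gz
```

In the real script each printed URL would be fed to `s3cmd del`; because `sort -r` keys on the leading ISO date column, the newest backups always survive regardless of how the files are named.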
