I was sick of cluttered download folders, manually sorting invoices, and forgetting to back up critical files. So, I built a Python-powered automation pipeline that cleans, organizes, renames, uploads, zips, and even emails files on schedule. Here’s the whole setup.
1. Why I Needed a File Automation Workflow
Let me paint the picture: Every day, I download PDFs (bank statements, invoices), images (from design tools), and project files that I plan to sort later. That “later” never comes. So I end up with a junkyard of 2,000+ files in my downloads folder.
I wanted a system that:
- Cleans my downloads folder daily
- Detects file types and renames them smartly
- Moves files into labeled folders
- Syncs important ones to the cloud
- Emails reports when done
And I wanted it to run automatically every night. So I turned to the only dependable employee I have: Python.
2. Tools I Used
| Feature | Tool |
| --------------------- | ----------------------------------------------- |
| File handling | `os`, `shutil`, `pathlib` |
| Scheduling | `schedule`, `time` |
| PDF + image metadata | `PyMuPDF`, `Pillow` |
| Zipping and backup | `zipfile`, `os` |
| Email reporting | `smtplib`, `email` |
| Cloud sync (optional) | `boto3` (for S3) or `gspread` for Google Sheets |
Let’s go step-by-step.
3. Cleaning Up the Downloads Folder
from pathlib import Path
import shutil
import time

downloads = Path.home() / "Downloads"
archive = Path.home() / "Documents" / "Archived"

def clean_old_files(folder=downloads, days_old=5):
    for file in folder.iterdir():
        if file.is_file():
            # Age in days, based on the file's last-modified time
            age = (time.time() - file.stat().st_mtime) / 86400
            if age > days_old:
                archive.mkdir(parents=True, exist_ok=True)
                shutil.move(str(file), archive / file.name)

clean_old_files()
This script moves anything older than 5 days to an `Archived` folder. It’s like a Roomba for your files.
4. Renaming and Organizing by File Type
I use smart renaming with timestamps and categorization:
import datetime

def smart_rename_and_organize():
    categories = {
        "pdf": "Documents",
        "jpg": "Images",
        "png": "Images",
        "docx": "WordDocs",
        "xlsx": "Sheets"
    }
    for file in downloads.iterdir():
        if file.is_file():
            ext = file.suffix.lower().strip(".")
            category = categories.get(ext)
            if category:
                target_dir = Path.home() / category
                target_dir.mkdir(exist_ok=True)
                timestamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
                new_name = f"{file.stem}_{timestamp}{file.suffix}"
                shutil.move(str(file), target_dir / new_name)

smart_rename_and_organize()
Now files like `invoice.pdf` become something like `invoice_20250619-233000.pdf`, safely stored in `~/Documents`.
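The tools table also lists `PyMuPDF` and `Pillow` for metadata, which the renaming above doesn’t touch yet. Here’s a rough sketch of how metadata could feed smarter names (assuming both libraries are installed; `describe_file` is just an illustrative helper, not part of the scheduled script):

import fitz  # PyMuPDF
from PIL import Image

def describe_file(path):
    # Use a PDF's embedded title, or an image's dimensions, as a nicer name stem.
    if path.suffix.lower() == ".pdf":
        with fitz.open(str(path)) as doc:
            return doc.metadata.get("title") or path.stem
    if path.suffix.lower() in {".jpg", ".png"}:
        with Image.open(path) as img:
            return f"{path.stem}_{img.width}x{img.height}"
    return path.stem

You could swap `file.stem` in `smart_rename_and_organize()` for `describe_file(file)` to get names based on what’s actually inside the file.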
5. Compressing and Zipping for Backup
At the end of the week, I zip everything for archiving:
from zipfile import ZipFile

def zip_folder(source_folder, zip_path):
    zip_path.parent.mkdir(parents=True, exist_ok=True)  # make sure ~/Backups exists
    with ZipFile(zip_path, "w") as zipf:
        for file in source_folder.rglob("*"):
            if file.is_file():
                zipf.write(file, file.relative_to(source_folder))

zip_path = Path.home() / "Backups" / "weekly_backup.zip"
zip_folder(Path.home() / "Documents", zip_path)
6. Emailing a Daily Report
Want an email that says “Hey, I backed up 20 files for you”? Easy:
import smtplib
from email.message import EmailMessage

def send_email_report(subject, body):
    msg = EmailMessage()
    msg["From"] = "[email protected]"
    msg["To"] = "[email protected]"
    msg["Subject"] = subject
    msg.set_content(body)
    with smtplib.SMTP("smtp.gmail.com", 587) as smtp:
        smtp.starttls()
        smtp.login("[email protected]", "your-app-password")
        smtp.send_message(msg)

send_email_report(
    "Daily Backup Completed",
    "All files processed and backed up as of " + datetime.datetime.now().strftime("%Y-%m-%d %H:%M")
)
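The email above promises a count (“20 files”), but nothing in the script actually counts anything yet. A minimal sketch of one way to add that; `count_files` is my illustrative helper, and counting everything in the `Archived` folder is just one possible choice:

def count_files(folder):
    # Count every regular file under the folder, recursively.
    return sum(1 for f in folder.rglob("*") if f.is_file())

send_email_report(
    "Daily Backup Completed",
    f"Hey, I backed up {count_files(archive)} files for you."
)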
7. Running It All on a Schedule
import schedule
import time

schedule.every().day.at("23:30").do(clean_old_files)
schedule.every().day.at("23:32").do(smart_rename_and_organize)
schedule.every().sunday.at("23:59").do(zip_folder, Path.home() / "Documents", Path.home() / "Backups" / "weekly.zip")
schedule.every().day.at("00:01").do(send_email_report, "Files Processed", "Another day, another backup.")

while True:
    schedule.run_pending()
    time.sleep(1)
And now your personal file assistant runs like a silent ninja every night.
8. Bonus: Upload to Google Drive or S3
For those who want cloud sync:
pip install boto3
import boto3

def upload_to_s3(file_path, bucket_name, key):
    s3 = boto3.client("s3")
    s3.upload_file(str(file_path), bucket_name, key)
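For example, to push the weekly archive after zipping (the bucket name here is a placeholder, and your AWS credentials need to be configured already):

# "my-backup-bucket" is a placeholder; boto3 picks up credentials from your AWS config or environment.
upload_to_s3(Path.home() / "Backups" / "weekly_backup.zip", "my-backup-bucket", "weekly_backup.zip")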
You can also use `gspread` for Google Sheets logs, or `pydrive` to upload zipped folders.
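If you go the `gspread` route, a backup log can be as simple as appending a row per run. A sketch, where the `service_account.json` path and the “Backup Log” sheet name are placeholders for your own setup:

import datetime
import gspread

def log_backup_to_sheet(files_count):
    # Placeholder credentials file and spreadsheet name; share the sheet with the service account first.
    gc = gspread.service_account(filename="service_account.json")
    sheet = gc.open("Backup Log").sheet1
    sheet.append_row([datetime.datetime.now().isoformat(), files_count])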
Final Thoughts: No More Digital Mess
With a few hours of Python, I built a daily automation system that:
- Organizes downloads
- Backs up docs weekly
- Sends me status updates
- Keeps my folders clean and searchable
I don’t even think about file management anymore. It just happens.