Most workflow automation doesn't need to be complicated. A single Python script, a Claude API call, and a cron job will handle 80% of the repetitive tasks most small businesses deal with. This post walks through exactly how to build that — starting from scratch.
## The pattern
Almost every AI automation follows the same shape:
- Pull data from somewhere (a file, a database, an API, an email inbox)
- Send it to an LLM with a clear instruction
- Do something with the output — save it, send it, log it
That's it. The complexity in real systems comes from handling edge cases, retries, and scheduling — not from the core logic.
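In code, that shape is three small functions. Here's a sketch with placeholder names (nothing below is a real library):

```python
# The three-step shape of most AI automations, with placeholder functions.

def fetch_data():
    # Step 1: pull from a file, database, API, or inbox.
    return ["txn 1", "txn 2"]

def ask_llm(instruction, data):
    # Step 2: in a real script this is a single LLM API call.
    return f"{instruction}: reviewed {len(data)} items"

def deliver(output):
    # Step 3: save it, send it, log it.
    print(output)

deliver(ask_llm("Summarize", fetch_data()))
```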
## Setting up

You'll need Python 3.9+ and the Anthropic SDK:

```bash
pip install anthropic
```

Create a `.env` file in your project root:

```
ANTHROPIC_API_KEY=sk-ant-...
```
Load it in your script with python-dotenv, or just set the env variable directly in your shell.
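If you'd rather skip the dependency, a minimal loader is only a few lines. This is a sketch; python-dotenv handles quoting, `export` prefixes, and other edge cases this version ignores:

```python
import os

def load_env(path=".env"):
    """Tiny .env loader: KEY=VALUE lines; blank lines and # comments ignored."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                # Don't overwrite variables already set in the shell.
                os.environ.setdefault(key.strip(), value.strip())
```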
## A real example: daily report summarizer
Say you run a small business and every morning you read through a CSV of the previous day's transactions to spot anything unusual. Takes 20 minutes. Here's how to automate it.
### Step 1 — read the data

```python
import csv
import os
from anthropic import Anthropic

client = Anthropic()

def load_transactions(filepath):
    with open(filepath, newline='') as f:
        reader = csv.DictReader(f)
        rows = list(reader)
    return rows
```
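For reference, the loader assumes a header row with `date`, `description`, and `amount` columns. A quick self-contained check (the file name here is arbitrary):

```python
import csv

# Write a tiny sample file with the expected headers...
with open("sample_transactions.csv", "w", newline="") as f:
    f.write("date,description,amount\n"
            "2025-06-01,Coffee supplies,42.50\n"
            "2025-06-01,POS refund,-18.00\n")

# ...and read it back the same way load_transactions does.
with open("sample_transactions.csv", newline="") as f:
    rows = list(csv.DictReader(f))

print(rows[0]["description"])  # Coffee supplies
```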
### Step 2 — build the prompt

The quality of your output depends heavily on how clearly you frame the task. Be specific about what you want back.

```python
def build_prompt(transactions):
    data = "\n".join(
        f"{row['date']} | {row['description']} | ${row['amount']}"
        for row in transactions
    )
    return f"""You are reviewing daily business transactions.
Below is a list of transactions from yesterday.

{data}

Summarize in 3–5 bullet points:
- Any unusually large or small amounts
- Patterns worth noting
- Anything that looks like it needs follow-up

Be concise. Plain English. No financial advice."""
```
### Step 3 — call the API

```python
def get_summary(prompt):
    message = client.messages.create(
        model="claude-haiku-4-5-20251001",
        max_tokens=512,
        messages=[{"role": "user", "content": prompt}]
    )
    return message.content[0].text
```
Model choice matters for cost. For daily summaries like this, claude-haiku is fast and cheap — fractions of a cent per run. Save Sonnet or Opus for tasks that need deeper reasoning.
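Real runs will also occasionally hit rate limits or transient network errors. The SDK retries some failures on its own, but a small wrapper gives you explicit control. This is a sketch, not part of the Anthropic SDK:

```python
import time

def with_retries(fn, attempts=3, base_delay=2.0):
    """Call fn(); on failure wait 2s, 4s, ... between tries, re-raising after the last."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Usage: summary = with_retries(lambda: get_summary(prompt))
```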
### Step 4 — do something with the output

Saving to a file is the simplest option. From there you can email it, push it to Slack, log it to a database — whatever fits your workflow.

```python
from datetime import date

def save_report(summary):
    filename = f"reports/daily-{date.today()}.txt"
    os.makedirs("reports", exist_ok=True)
    with open(filename, "w") as f:
        f.write(summary)
    print(f"Saved: {filename}")

if __name__ == "__main__":
    rows = load_transactions("data/transactions.csv")
    prompt = build_prompt(rows)
    summary = get_summary(prompt)
    save_report(summary)
    print(summary)
```
## Running it automatically

On Linux (or WSL2), add a cron job to run it every morning at 7am:

```bash
crontab -e
# Add this line:
0 7 * * * /usr/bin/python3 /home/user/reports/daily_summary.py >> /var/log/daily_summary.log 2>&1
```
On Windows, use Task Scheduler. On a server, a systemd timer works well for more control over retry behavior.
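For the systemd route, a minimal service/timer pair looks like this (unit names and paths are placeholders; adjust to your setup):

```ini
# /etc/systemd/system/daily-summary.service
[Unit]
Description=Daily transaction summary

[Service]
Type=oneshot
ExecStart=/usr/bin/python3 /home/user/reports/daily_summary.py

# /etc/systemd/system/daily-summary.timer
[Unit]
Description=Run daily summary at 7am

[Timer]
OnCalendar=*-*-* 07:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now daily-summary.timer`. `Persistent=true` runs a missed job after the machine comes back up, which cron won't do.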
## Where to go from here
This pattern scales to almost anything:
- Customer reviews — pull from Google/Yelp API, summarize sentiment weekly
- Support emails — read from Gmail API, draft reply suggestions
- Inventory alerts — query a database, flag items below reorder threshold with AI explanation
- Competitor monitoring — scrape a few pages, summarize changes
The key is keeping each automation focused on one job. Once it's running reliably, adding a second task takes minutes — you already have the scaffolding.
If you're trying to automate something specific and running into trouble, get in touch — happy to help think through it.