AI-powered email newsletter personalization for media
Mass newsletters that send identical content to the entire database achieve a 15-20% open rate. Personalized digests that select articles matching each reader's individual interests reach 35-50%. Technically this is a pipeline: reader profile → article selection → email subject adaptation.
Personalized digest generation
from anthropic import Anthropic


def generate_personalized_digest(user_profile: dict,
                                 available_articles: list[dict],
                                 n_articles: int = 5) -> dict:
    """
    Select and compile a personalized digest.
    user_profile: {'topics': {'tech': 0.4, 'politics': 0.3}, 'read_ids': set()}
    """
    llm = Anthropic()

    # Filter out articles the reader has already read
    read_ids = user_profile.get('read_ids', set())
    unread = [a for a in available_articles if a['id'] not in read_ids]

    # Score articles: 50% topic affinity, 30% freshness (decays to 0 over 48h),
    # 20% editorial quality
    topics = user_profile.get('topics', {})
    scored = []
    for article in unread:
        topic_score = topics.get(article.get('topic', 'general'), 0.05)
        freshness = max(0, 1.0 - article.get('hours_old', 24) / 48)
        quality = article.get('editorial_score', 0.7)
        scored.append({
            **article,
            'score': topic_score * 0.5 + freshness * 0.3 + quality * 0.2
        })
    top_articles = sorted(scored, key=lambda x: -x['score'])[:n_articles]

    # Generate a personalized email subject line from the top picks
    article_titles = [a['title'] for a in top_articles]
    response = llm.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=80,
        messages=[{
            "role": "user",
            "content": f"""Write a compelling email subject line for a news digest in English.
Articles: {article_titles[:3]}
Reader's main interests: {list(topics.keys())[:3]}
Max 55 chars. No clickbait. Be specific."""
        }]
    )

    return {
        'articles': top_articles,
        'subject': response.content[0].text.strip(),
        'personalization_applied': True
    }
Personalized digests require a minimum of 10-15 read articles to build a quality profile. Below this threshold, fall back to category-level personalization (selecting whole sections) rather than article-level ranking. Typical technical stack: Redis for profile storage, Apache Kafka for event streaming, and batch generation 1-2 hours before send time.
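The threshold logic can be made explicit as a routing step that runs before digest generation. A minimal sketch; the function name and the `generic` tier are illustrative, not from the source:

```python
def choose_personalization_level(user_profile: dict, min_reads: int = 10) -> str:
    """Route thin profiles to coarser personalization (illustrative sketch)."""
    n_read = len(user_profile.get('read_ids', set()))
    if n_read >= min_reads:
        return 'article'    # enough history for per-article scoring
    if user_profile.get('topics'):
        return 'category'   # pick whole sections from declared/inferred topics
    return 'generic'        # brand-new subscriber: send the standard digest
```

In production this check would read the profile from Redis and decide which batch job handles the subscriber; the 10-read cutoff matches the lower bound of the 10-15 range above.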