1. Executive Summary
Purpose. Test whether dynamic per-lead personalisation increases engagement only up to the point where message complexity exceeds pattern recognisability.
We simulate per-lead token entropy and evaluate its link to open, reply, and dwell metrics.
“Too much personalisation breaks pattern trust.”
2. Hypothesis & Theoretical Framework
Hypothesis. Dynamic per-lead personalisation increases engagement until complexity exceeds recognisability.
Theory Link. Cognitive processing theory: personalisation aids relevance up to the point of pattern loss, where users stop trusting authenticity.
Predicted Outcome. Engagement (open/reply) peaks at moderate personalisation entropy, H ≈ 0.45–0.60.
Potential Impact. Defines saturation thresholds for future AI content engines.
Entropy (H) is a token-level diversity score scaled to [0,1].
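The document does not spell out how H is computed; a minimal sketch of one plausible reading, normalised Shannon entropy over token frequencies (the normalisation choice is an assumption, not the report's definition):

```python
import math
from collections import Counter

def personalisation_entropy(tokens):
    """Normalised Shannon entropy of a token sequence, scaled to [0, 1].

    Assumption: raw Shannon entropy over token frequencies, divided by
    log2 of the number of distinct tokens, so a perfectly uniform
    distribution scores 1.0.
    """
    counts = Counter(tokens)
    total = len(tokens)
    if len(counts) < 2:
        return 0.0  # a single repeated token carries no diversity
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return h / math.log2(len(counts))

# A repetitive, template-like message scores low; a varied one scores high.
print(personalisation_entropy(["hi"] * 9 + ["there"]))
print(personalisation_entropy(list("abcdefghij")))
```

Any token-level diversity score mapped to [0, 1] would slot into the same role; only the scale matters for the thresholds below.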
3. Data & Metrics
- Open Rate (%): Immediate engagement.
- Reply Rate (%): Deep engagement.
- Dwell Time (sec): Reading time.
- Personalisation Entropy (H): Token-level diversity (0–1).
4. Experiment Execution Plan
- Define Variants: static MJML templates vs. AI per-lead generation (run at low, optimal, and excessive entropy targets; see Section 5).
- Sample Size: 10,000 messages, split evenly across variants.
- Measurement: Track per-token entropy and engagement events.
- Analysis: Fit quadratic model and identify peak.
5. Results
| Variant | Mean H | Open % | Reply % | Dwell (s) |
|---|---|---|---|---|
| Static | 0.05 | 14.2 | 1.1 | 6.8 |
| AI Low | 0.30 | 24.9 | 2.8 | 8.4 |
| AI Optimal | 0.48 | 31.3 | 3.9 | 10.6 |
| AI Excessive | 0.78 | 19.4 | 2.0 | 7.2 |
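The quadratic fit named in the execution plan can be sketched against the table's mean-H and open-rate columns (NumPy's `polyfit` is an illustrative tool choice; the document does not specify a library):

```python
import numpy as np

# Mean entropy H and open rate (%) from the Section 5 results table.
H = np.array([0.05, 0.30, 0.48, 0.78])
open_rate = np.array([14.2, 24.9, 31.3, 19.4])

# Fit open_rate ≈ a*H^2 + b*H + c; the peak sits at the vertex -b / (2a).
a, b, c = np.polyfit(H, open_rate, 2)
h_peak = -b / (2 * a)
print(f"fitted peak at H ≈ {h_peak:.2f}")
```

On these four points the vertex lands inside the predicted 0.45–0.60 band, consistent with the "AI Optimal" row.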
6. Data Analysis & Interpretation Framework
- Raw Analysis: Fit engagement curve vs entropy.
- Feedback Loop: Tune personalisation engine to target H ≈ 0.45–0.55.
- Synapse Peer Review: Validate thresholds in peer trials.
- Integration: Ship an engine constraint: Entropy Governor.
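A minimal sketch of what the "Entropy Governor" constraint might look like as a reject-and-regenerate wrapper targeting H ∈ [0.45, 0.55]; `generate` and the scoring function are hypothetical stand-ins for the engine's real interfaces:

```python
def entropy_governor(generate, entropy, lo=0.45, hi=0.55, max_tries=5):
    """Return the first generated draft whose entropy falls in [lo, hi].

    If no draft qualifies within max_tries, fall back to the draft
    whose entropy was closest to the target band.
    """
    best, best_dist = None, float("inf")
    for _ in range(max_tries):
        draft = generate()
        h = entropy(draft)
        if lo <= h <= hi:
            return draft
        dist = lo - h if h < lo else h - hi  # distance to the band
        if dist < best_dist:
            best, best_dist = draft, dist
    return best

# Usage with stub functions standing in for the real engine:
drafts = iter(["flat draft", "balanced draft", "noisy draft"])
scores = {"flat draft": 0.10, "balanced draft": 0.50, "noisy draft": 0.80}
picked = entropy_governor(lambda: next(drafts), scores.get)
print(picked)  # → balanced draft
```

A sampling-time constraint (e.g. capping temperature or paraphrase depth) would serve the same goal without regeneration cost; the rejection loop is simply the least invasive way to bolt the threshold onto an existing engine.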
7. Insights & Discussion
- “Too much personalisation breaks pattern trust.”
- Engagement collapses beyond entropy ≈ 0.65.
- Human review: reviewers described high-entropy messages as “AI-noisy” (inconsistent tone).
- Optimal personalisation balances novelty and coherence.
8. Ethical Considerations
- No user-specific data retained.
- Focus on model-level entropy metrics, not individual behaviour.
- Reinforces authenticity over manipulation.
9. Contribution to the Synapse Ecosystem
- Data Improvement: Adds “entropy saturation” constraint to content bandits.
- Educational Value: Helps members tune per-lead generation safely.
- Innovation Signal: Foundation for adaptive coherence-calibration research.