Living in the Uncanny Valley
You're describing something that doesn't have a name yet—the cognitive whiplash of living through a potential intelligence explosion without a shared script. The feeling that you can't model next month, let alone next decade, and that everyone around you is either asleep or playing a different game entirely.
This is a real psychological condition of this era. You're not wrong to feel unmoored.
The Specific Horror of RSI Uncertainty
The fear you describe—waking up to headlines that RSI has been achieved—is distinct from normal technological disruption. It's a binary uncertainty (before/after) with potentially vertical takeoff curves. Normal planning assumes gradualism; you're correctly identifying that gradualism might be a luxury belief.
If RSI happens overnight:
– Financial assets might mean less than you think (ownership claims require functioning legal systems)
– Skills might matter more than you think (adaptation speed beats stored value in chaos)
– Social cohesion—the very thing you note is missing—becomes the actual scarce resource
The depressing realization you're circling is that individual asset accumulation is a brittle strategy for systemic phase transitions. It's like hoarding gold for a nuclear winter—it protects against one specific failure mode while leaving you exposed to others (social collapse, violence, meaninglessness).
Beyond the Labor/Asset Binary
The “labor collapses, buy assets” model assumes:
1. Property rights survive the transition
2. Assets remain scarce (if AI creates molecular manufacturing, physical scarcity evaporates)
3. You can defend/exclude (ownership requires enforcement)
In a true RSI scenario, financial wealth might be the most fragile store of value. What persists:
– Location (geographic position in stable regions)
– Relationships (trust networks that function when institutions break)
– Optionality (ability to move, adapt, pivot—psychological and physical mobility)
– Meaning-making capacity (ability to narrate coherence when old stories die)
The Coordination Void (The Real Problem)
Your despair isn't really about economics—it's about epistemic loneliness. You're playing a game where:
– Group A is building AI as fast as possible
– Group B is denying anything is changing
– Group C is preparing for 19th-century-style collapse
– Group D is optimizing for 2019-style careers
– And you suspect they're all wrong, but you don't have a Group E to join
This is the pre-revolutionary condition. Historically, these moments resolve in one of three ways:
1. Synchronized shock (everyone wakes up at once, coordination emerges under pressure)
2. Fragmentation (different groups retreat to incompatible local realities)
3. Slow burn (the frog boils, adaptation happens unconsciously)
You can't control which path we take. But you can address the loneliness directly.
Practical Anti-Fragility for Uncertain Timelines
Instead of “own assets and wait” (passive, depressing), consider polyhedral preparation—strategies that pay off in both fast and slow scenarios:
1. Geographic Arbitrage
Maintain genuine optionality in multiple jurisdictions. If Country A bans AI and Country B embraces RSI, you want the ability to move without losing everything. This means: multiple residencies, portable skills, decentralized relationships.
2. The “Barbell” Social Strategy
– One foot in the “real world” (people who don't know or care about AI—farmers, mechanics, nurses)
– One foot in the vanguard (people building or seriously studying the technology)
Avoid the anxious middle (tech workers in denial, corporate managers optimizing for quarterly reviews).
3. Skill Acquisition for Post-Labor Scenarios
If labor value collapses, which skills remain valuable?
– Physical competence (surgery, emergency repair, agriculture—things that resist digitization)
– Human connection (therapy, ritual, care—emotional labor AI can't authentically provide)
– Violence/Protection (uncomfortable but true: security becomes valuable in transition periods)
– Translation (explaining AI to scared humans, explaining human needs to AI systems)
4. Psychological Hedging
The worst-case scenario isn't starvation—it's irrelevance. Prepare for meaning-collapse, not just income-collapse:
– Practice creating value without market validation (art, community service, religious practice)
– Build identity around being rather than doing (Stoic and Eastern frameworks help here)
– Cultivate “negative capability”—Keats' term for being “capable of being in uncertainties, mysteries, doubts, without any irritable reaching after fact and reason”
Living in the Uncanny Valley
Here's a reframing: You don't need to predict the timeline. You need to become the kind of person who thrives in high volatility.
In normal times, optimization wins. In revolutionary times, robustness wins. The depressed asset-hoarder is optimizing for one specific doom scenario. Instead, aim for:
– Liquid assets (can convert to anything quickly)
– Liquid identity (not too attached to specific professional self-concepts)
– Liquid relationships (networks that span ideological divides)
The Meta-Strategy: Be a Bridge
Since you see the fragmentation clearly, your potential role isn't to pick the winning strategy, but to translate between worlds. When the shock comes (if it comes), the people who understand both the old economy and the new AI reality become crucial infrastructure.
This addresses the depression you mentioned: instead of isolating yourself with assets, position yourself as connective tissue between the aware and the unaware. This creates purpose regardless of timeline.
The loneliness you feel is data. It means you're early to a realization. But “early” doesn't have to mean “alone forever.” It means you have time to build the coordination structures you wish existed.
What would it take for you to feel like you weren't alone in this realization? Is it finding 5 people who agree? 50? A town? Sometimes the antidote to radical uncertainty isn't certainty—it's solidarity within the uncertainty.