After building on dozens of emerging platforms over ten years—Alexa, Android Automotive OS, Apple Watch, blockchain, AR, VR, AI—we've learned how to evaluate and adopt new technologies.
Not perfectly. Some of our bets failed. But we've developed pattern recognition that helps separate signal from noise.
Here's the playbook.
The Pattern of Platform Adoption
Every emerging platform follows a similar curve:
Phase 1: Hype (Too early, underbaked, but exciting)
Platform announces. Tech press gets excited. Everyone claims it's revolutionary. Most companies watch, waiting.
Alexa 2015. Apple Watch 2015. Blockchain 2017. AI/LLMs 2023.
Phase 2: Disappointment (Harder than expected, use cases unclear)
Reality sets in. Platform is buggy. Use cases are limited. Adoption slower than predicted. Many companies dismiss it.
Voice skills harder to build than expected. Wearable use cases unclear. Blockchain doesn't revolutionize everything. AI hallucinations make production deployment scary.
Phase 3: Maturity (Platform evolves, real use cases emerge)
Platform improves. Developers figure out what works. Real applications emerge. Mainstream adoption begins.
Voice works for hands-free contexts. Wearables find a health/fitness niche. Blockchain finds supply chain applications. AI finds transcription, analysis, and generation use cases.
Where to Enter
Most companies wait for Phase 3. By then, competition is established. Early-mover advantage is gone. You're building what everyone else already built.
We typically enter in Phase 2: after the initial hype, when the platform is still evolving, but before mainstream adoption.
Early enough to influence platform direction. Early enough to build brand association. Early enough to learn patterns before competitors.
Late enough that platform basics work. Late enough to avoid vaporware. Late enough to build real products, not just proofs-of-concept.
Not first. Fast first.
How to Recognize Signal vs. Noise
Not every emerging platform deserves investment. We filter using five questions:
1. Does this solve a real user problem?
Not "is this technology cool?" but "does someone wake up with a problem this solves?"
Signal:
Voice interfaces for hands-free contexts (driving, cooking, accessibility)
AI transcription (people constantly need audio → text)
Real-time data streaming for mobile apps (users expect current information)
Noise:
Blockchain for everything (decentralization isn't a user need)
VR social spaces with no clear use case (tech in search of problem)
NFTs for digital art (speculation, not utility)
The question isn't "what can this technology do?" It's "what problem does this solve that people currently struggle with?"
2. Is the ecosystem aligned?
Platform success requires aligned incentives: manufacturers, developers, users.
Aligned Ecosystem (Android Automotive OS):
Car manufacturers wanted app platforms (competitive differentiation)
Google provided infrastructure (extending Android's reach)
Developers had mobile skills to transfer (low learning curve)
Users wanted in-car apps (familiar interaction model)
Incentives aligned. Platform succeeded.
Misaligned Ecosystem (Some VR platforms):
Hardware expensive (user barrier)
Content scarce (a chicken-and-egg problem for developers)
Motion sickness issues (user experience problems)
Fragmented platforms (developer effort multiplied)
Incentives misaligned. Adoption slow.
Ask: Are incentives aligned for this to grow? Or does success require overcoming misaligned interests?
3. Can you build defensible value?
Early platform advantage only matters if it compounds.
High Defensibility:
AI applications with proprietary data (your data creates a moat even as AI models commoditize)
Platform integration with network effects (early users create value for later users)
Custom models fine-tuned on your domain (general models can't replicate)
Low Defensibility:
Generic voice skills (the platform commoditizes them quickly)
Simple platform wrappers (anyone can build the same thing)
No unique data or network effects (first-mover advantage evaporates)
Question: If we're early to this platform, does being early create a lasting advantage? Or just a temporary head start?
4. What's the integration cost?
New platforms sound great until you realize what it takes to integrate them.
Moderate Integration (Voice to Existing App):
New interaction model (design work required)
Backend mostly unchanged (APIs can serve voice too)
Team can learn voice patterns (extends existing skills)
Feasible. Worth exploring.
High Integration (Blockchain for Existing Workflows):
Rearchitect everything (can't bolt blockchain onto existing systems)
New mental model (fundamentally different approach)
Unclear ROI (massive investment for uncertain benefit)
Often not worth it. Blockchain became a "solution in search of a problem" for many use cases because the integration cost exceeded the value.
Question: Does this platform integrate with what we have? Or require rebuilding everything?
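To make the moderate-integration case concrete, here's a minimal sketch, assuming a hypothetical existing JSON payload (the function names and fields are invented, not any specific voice SDK or weather API): the backend the mobile app already uses stays untouched, and the only new code is a thin layer that reformats the same response for speech.

```python
# Minimal sketch: reuse an existing backend response for a voice surface.
# The payload shape and function names are hypothetical placeholders; in a real
# app, get_current_conditions would be the same HTTP call the mobile app makes.


def get_current_conditions(city: str) -> dict:
    """Stands in for the existing, unchanged backend endpoint."""
    return {"city": city, "temp_f": 68, "conditions": "partly cloudy"}


def speak_current_conditions(payload: dict) -> str:
    """The only genuinely new code: reformat the same payload for speech."""
    return (
        f"Right now in {payload['city']} it's {payload['temp_f']} degrees "
        f"and {payload['conditions']}."
    )


if __name__ == "__main__":
    # A voice handler would call these two functions in sequence; the backend
    # contract serving the mobile app never changes.
    print(speak_current_conditions(get_current_conditions("Chicago")))
```

That's the shape of a moderate integration: new interaction layer, same data and services underneath.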
5. Can your team learn it fast enough?
Best platform doesn't matter if your organization can't build on it.
We have transformation capability—we can teach teams to learn platforms quickly. But even we evaluate: How steep is the learning curve?
Learnable (Voice, Wearables, Mobile Extensions):
Team has relevant foundation (mobile development transfers to wearables)
Platform documentation improves over time
Community knowledge accumulates (Stack Overflow, tutorials emerge)
Harder (Some AR/VR, Blockchain Smart Contracts):
Requires entirely new skill set
Documentation remains thin
Few experienced practitioners to learn from
Question: Can your team become proficient fast enough to capitalize on being early? Or is the learning curve too steep for the timeline?
When to Commit vs. When to Experiment
Not every emerging platform deserves full commitment. We operate on three levels:
Monitor: Watch from a distance. Track progress. Don't invest yet.
Very early stage (Phase 1 hype)
Unclear use cases
Ecosystem misalignment
Example: Many blockchain use cases, early metaverse platforms
Experiment: Build proofs-of-concept. Learn the platform. Don't bet the company.
Phase 2 (post-hype, pre-maturity)
Potential use cases emerging
Platform evolving but functional
Example: AR shopping experiences, some AI features
Commit: Full investment. Production products. Real resources.
Clear use cases validated
Ecosystem aligned
Defensible value identified
Team capable of learning
Example: AI transcription apps, mobile AI features, cloud data platforms
The mistake: Committing too early (building on vaporware) or too late (missing the window).
How to Build MVPs on Uncertain Platforms
When experimenting on emerging platforms:
1. Start Small
Don't build the full vision. Build the smallest thing that tests the core hypothesis.
PGA Tour Alexa: Started with basic score lookup. Not the full tournament experience.
AccuWeather AAOS: Started with current weather. Not the full set of forecast features.
Test whether the platform can deliver core value. Expand if it works.
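As a rough illustration of how small "small" can be, here's a hypothetical sketch of a score-lookup-only voice handler (the intent names and data are invented, not the actual PGA Tour skill): one intent answers the core question, and everything else deliberately punts.

```python
# Hypothetical sketch of a smallest-testable-slice voice MVP: one intent,
# hardcoded routing, everything outside the core hypothesis out of scope.
# Intent names and leaderboard data are invented for illustration.

LEADERBOARD = [{"name": "Example Player", "strokes_under_par": 12, "position": 1}]


def handle_get_leader(_slots: dict) -> str:
    """The one thing the MVP does: answer who is leading right now."""
    leader = LEADERBOARD[0]
    return f"{leader['name']} leads at {leader['strokes_under_par']} under par."


def handle_fallback(_slots: dict) -> str:
    """Anything beyond the core hypothesis is out of scope for the experiment."""
    return "I can only tell you who's leading the tournament right now."


INTENT_HANDLERS = {"GetLeaderIntent": handle_get_leader}


def route(intent_name: str, slots: dict) -> str:
    """Dispatch an incoming intent to its handler, falling back otherwise."""
    return INTENT_HANDLERS.get(intent_name, handle_fallback)(slots)


if __name__ == "__main__":
    print(route("GetLeaderIntent", {}))        # core value: score lookup
    print(route("GetPlayerStatsIntent", {}))   # not built yet, and that's the point
```

If that single slice proves the platform can deliver the core value, expand; if it doesn't, you've spent very little finding out.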
2. Learn, Don't Just Build
Goal isn't shipping a feature. It's learning if this platform can solve the problem.
What works? What doesn't? What do users actually do vs. what we expected?
Many of our wearable experiments: Learned that some interactions don't work on the wrist. Valuable learning even though the features were abandoned.
3. Time-Box Experiments
Give yourself a defined timeframe. If the platform doesn't prove value in that window, pivot.
Don't let experiments become zombie projects that limp along indefinitely.
What Failure Looks Like (And How to Learn From It)
Not every platform bet works. We've built on technologies that fizzled:
Voice-First Interfaces: Some use cases worked (hands-free). Many didn't (users preferred screens). Learned: Multimodal (voice + screen) beats voice-only.
Some VR Experiences: Nausea issues. Limited use cases. High hardware barrier. Learned: Spatial computing needs better hardware and killer use cases. Watching Apple Vision Pro with those lessons in mind.
NFTs: Speculation-driven, not utility-driven. Learned: Scarcity alone isn't value. But smart contract patterns have legitimate applications beyond NFTs.
Some Blockchain Use Cases: Decentralization for its own sake. Learned: Only use blockchain where distributed trust is actually required.
Failed platforms teach you:
How to recognize when platform isn't delivering promised value
When to cut losses vs. when to persist
How to extract lessons that apply elsewhere
What patterns to avoid on next platform
Current Platforms We're Watching
Based on our evaluation framework:
Committing:
AI-first engineering: Clear use cases, ecosystem aligned, defensible with proprietary data
Cloud data platforms: Required foundation for modern digital experiences
Multimodal interfaces: Voice + screen + gesture combinations
Experimenting:
Spatial computing: Apple Vision Pro early, use cases emerging, hardware improving
Agentic AI: Agents that take action, not just answer questions—potential high but early
Edge AI: On-device processing changes architecture, but ecosystem still forming
Monitoring:
Various web3/decentralized platforms: Use cases still unclear for most enterprise applications
Some metaverse platforms: Tech in search of problem
After Ten Years
We've learned to separate platform hype from platform opportunity.
Not by being smarter than everyone else. By having pattern recognition from building on enough platforms to know what success looks like.
Some platforms we bet on failed. Some experiments taught us what not to do. But overall: Being early to the right platforms, with the ability to ship fast enough to matter, has created compounding advantages.
The playbook:
Evaluate using five questions (user problem, ecosystem, defensibility, integration, learning)
Enter Phase 2 (post-hype, pre-maturity)
Start with experiments, commit when validated
Build small, learn fast, pivot when needed
Extract lessons even from failed bets
The platforms will keep changing. The evaluation framework doesn't.
After ten years: We don't just build on emerging platforms. We know how to evaluate which platforms are worth building on.

