Original Title: 17 things we are excited about for crypto in 2026
Original Author: a16z New Media
Translation: Peggy, BlockBeats
Editor's Note: As we enter 2026, the crypto industry is undergoing a profound structural redefinition. From stablecoins and RWAs to AI agent ecosystems, privacy networks, prediction markets, and the reshaping of legal frameworks, these trends outline a pivotal year of technological and institutional convergence. The industry is shifting from competing on chain performance to competing on network effects, from "code is law" to "spec is law," and from transaction-driven to product-driven, while AI drives the evolution of agent economies and prediction systems.
This article brings together 17 forward-looking observations from various a16z teams, providing a framework for understanding the next phase of the crypto narrative and industry direction.
The following is the original text:
This week, a16z partners from teams such as Apps, American Dynamism, Bio, Crypto, Growth, Infra, and Speedrun released the annual "Big Ideas" trend outlook.
The following content compiles insights from multiple a16z crypto team partners (and several guest authors) on future developments, covering topics ranging from smart agents and AI, stablecoins and asset tokenization, financial innovation, to privacy and security, prediction markets, SNARKs, and other use cases... extending to how things will be built in the future.

Last year, stablecoin trading volume was estimated to reach $46 trillion, continuously setting new records. To put this scale in perspective: that figure is over 20 times PayPal's volume; nearly three times the transaction volume of Visa, one of the world's largest payment networks; and fast approaching the annual transaction volume of the U.S. Automated Clearing House (ACH), commonly used for services like direct deposit.
Today, you can complete a stablecoin transfer in under a second for less than one cent. The real unresolved issue, however, is how to connect these digital dollars to the everyday financial networks people use: the stablecoin on/off-ramp.
A wave of new startups is pouring into this space, attempting to bridge stablecoins with local payment systems and fiat currencies. Some use cryptographic proofs to let users privately convert local balances into digital dollars; others integrate with regional payment networks, leveraging capabilities such as QR codes and real-time payment rails to enable interbank payments. Some are even building a truly interoperable global wallet layer, as well as platforms that let users pay directly with stablecoin cards.
Overall, these paths collectively broaden the range of people joining the digital dollar economy, and may facilitate stablecoins being more directly used in mainstream payment scenarios.
As these on/off-ramp infrastructures mature, enabling the digital dollar to seamlessly plug into local payment systems and merchant tools, new patterns of behavior will emerge:
Cross-border workers can settle their wages in real-time;
Merchants can accept global dollars without needing a bank account;
Apps can settle value instantly with global users.
Stablecoins will shift from being a "niche financial instrument" to the foundational settlement layer of the Internet era.
— Jeremy Zhang, a16z crypto Engineering Team
In recent years, banks, fintech companies, and asset managers have increasingly wanted to bring U.S. stocks, commodities, indices, and other traditional assets onto the blockchain. However, much of today's RWA tokenization is plainly skeuomorphic: it maps traditional thinking about real-world assets directly on-chain without leveraging any crypto-native advantages.
On the other hand, synthetic asset forms like perpetual contracts often provide deeper liquidity and a simpler implementation. The leverage structure of perpetual contracts is also easier to understand, so I believe they are the crypto-native derivative with the strongest product-market fit. I also believe emerging-market stocks are among the categories that most deserve to be "perpetualized." For example, the zero-day-to-expiry (0DTE) options market for certain stocks often exhibits higher liquidity than the spot market, making it an ideal target for perpetualization experiments.
Ultimately, this boils down to a question of "perpetualization vs. tokenization." In any case, we will see more crypto-native forms of RWA tokenization in the coming years.
Similarly, in the stablecoin field, 2026 will bring "not just tokenization, but on-chain native issuance." Stablecoins fully entered the mainstream in 2025, and their issuance volume continues to grow.
However, stablecoins without a robust credit infrastructure are essentially akin to a "narrow bank" — they only hold a small portion of highly secure liquid assets. A narrow bank is certainly an effective product, but I don't believe it will become the long-term backbone of the on-chain economy.
We are currently seeing some new asset managers, asset orchestrators, and protocols starting to promote on-chain loans collateralized by off-chain assets. These loans are typically originated off-chain and then tokenized. However, besides facilitating distribution to on-chain users, I don't see much advantage in the off-chain origination and tokenization of loans.
This is why debt-based assets should be originated directly on-chain rather than originated off-chain and then tokenized. On-chain origination can reduce loan servicing and backend infrastructure costs and improve accessibility. The real challenge lies in compliance and standardization, but teams are already working on these issues.
— Guy Wuollet, a16z crypto General Partner
Most banks still run decades-old, developer-unfriendly legacy software: in the 1960s–70s, banks were early adopters of large-scale software systems; second-generation core banking systems emerged around the 1980s–90s (such as Temenos' GLOBUS and Infosys' Finacle). These systems have gradually aged, and their upgrade pace lags far behind today's demands.
Hence, the most critical core ledgers in the banking system — databases that record deposits, collateral, and various financial obligations — often still run on mainframes, coded in COBOL, and rely on batch file interfaces rather than APIs.
The vast majority of global assets are also stored in these equally "decades-old" core ledgers. While these systems have been battle-tested for a long time, are regulatory-approved, and deeply embedded in complex business processes, they also significantly limit the pace of innovation.
For instance, adding functionality like real-time payments (RTP) often takes months or even years and must navigate layers of technical debt and regulatory hurdles.
This is where stablecoins come into play. Over the past few years, stablecoins have found a true product-market fit and entered the mainstream, and this year, traditional financial institutions have reached new heights of acceptance of stablecoins.
Stablecoins, tokenized deposits, tokenized treasury bonds, and on-chain bonds enable banks, fintech firms, and institutions to build new products and serve new customers. More importantly, they do not need to rewrite those core systems that, while old, have been running reliably for a long time. As a result, stablecoins have become a new path for institutional innovation.
— Sam Broner
As smart agents scale up, more and more business interactions will no longer rely on user clicks but will be automatically executed in the background. At that time, the way value moves must also change accordingly.
In a world where systems act on intent rather than executing step-by-step instructions, when AI agents move funds automatically because a need is identified, an obligation is fulfilled, or an outcome is triggered, value must flow as quickly and freely as information. This is where blockchains, smart contracts, and new protocols come into play.
Smart contracts can already settle global dollar transactions within seconds. By 2026, new primitives like x402 will enable this settlement to have programmability and responsiveness:
Agents can instantly and permissionlessly pay each other for data, GPU time, or API fees—without the need for invoices, reconciliation, or batch processing;
Software updates released by developers directly embed payment rules, limits, and audit trails—without the need for integrating fiat systems, merchant onboarding, or bank integration;
Prediction markets can self-settle in real time as events unfold: odds update, agents trade, and profits settle globally within seconds, without custodians or exchanges.
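The agent-to-agent payment flow described above can be sketched as a toy simulation of a "402 Payment Required" loop, in the spirit of x402. This is not the actual x402 specification; the class names, the price, and the receipt scheme are all illustrative assumptions.

```python
import hashlib
import uuid

class StablecoinLedger:
    """Minimal in-memory ledger standing in for on-chain stablecoin settlement."""
    def __init__(self, balances):
        self.balances = dict(balances)
        self.receipts = set()  # settled-payment proofs a server can verify

    def pay(self, payer, payee, amount):
        if self.balances.get(payer, 0) < amount:
            raise ValueError("insufficient funds")
        self.balances[payer] -= amount
        self.balances[payee] = self.balances.get(payee, 0) + amount
        receipt = hashlib.sha256(
            f"{payer}:{payee}:{amount}:{uuid.uuid4()}".encode()).hexdigest()
        self.receipts.add(receipt)
        return receipt

class DataSeller:
    """Server that answers only requests accompanied by a valid payment receipt."""
    PRICE = 0.01  # illustrative: one cent per call

    def __init__(self, ledger, account):
        self.ledger, self.account = ledger, account

    def handle(self, request, receipt=None):
        if receipt not in self.ledger.receipts:
            # Analogue of an HTTP "402 Payment Required" challenge.
            return {"status": 402, "price": self.PRICE, "pay_to": self.account}
        return {"status": 200, "body": f"data for {request}"}

def agent_fetch(agent, seller, ledger, request):
    """Buyer agent: try, pay on 402, retry with the receipt.
    No invoices, no reconciliation, no batch processing."""
    resp = seller.handle(request)
    if resp["status"] == 402:
        receipt = ledger.pay(agent, resp["pay_to"], resp["price"])
        resp = seller.handle(request, receipt=receipt)
    return resp
```

The point of the sketch is the shape of the loop: the payment challenge, the settlement, and the retry all complete machine-to-machine within one request cycle.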
When value can move this way, the payment flow will no longer be a discrete operational layer but a network behavior: banks become part of the underlying internet pipeline, and assets become infrastructure.
If currency becomes a 'data packet' routable by the internet, then the internet supports not just the financial system—it becomes the financial system itself.
— Christian Crowley and Pyrs Carvolth, a16z crypto commercialization team
For a long time, personalized wealth management services have been geared towards high-net-worth clients, as providing tailored advice and portfolio management for different asset classes has been expensive and complex. However, as more assets get tokenized, the crypto network enables these strategies to be executed and rebalanced instantly with AI-generated recommendations and assistance, almost at zero cost.
This is more than robo-advisory: active management, not just passive management, becomes accessible to everyone.
In 2025, traditional financial institutions increased their exposure to crypto assets (directly or through ETPs), but that was just the beginning. In 2026, we will see more platforms focused on 'wealth accumulation' (rather than just wealth preservation)—especially those financial technology companies (like Revolut, Robinhood) and centralized exchanges (like Coinbase) that can leverage their technological stack advantage.
Meanwhile, DeFi tools such as Morpho Vaults can automatically allocate assets to the lending markets with the best risk-adjusted yields, becoming a foundational yield allocation in the portfolio. Holding other liquid assets in stablecoins rather than fiat, or in tokenized money market funds rather than traditional MMFs, further expands yield possibilities.
Lastly, retail investors now have easier access to lower-liquidity private-market assets such as private credit, pre-IPO companies, and private equity. Tokenization enhances accessibility while maintaining necessary compliance and reporting requirements.
As various asset classes in a balanced portfolio (from bonds to stocks to private placements and alternatives) gradually get tokenized, they can also achieve automatic, intelligent rebalancing without the need for inter-bank wire transfers.
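A minimal sketch of what such automatic rebalancing might compute, with illustrative assets and prices (a real system would add risk, tax, and compliance logic before executing any trades on-chain):

```python
def rebalance(holdings, prices, targets):
    """Return the trade (in units of each asset) that restores target weights.

    holdings: {asset: units held}
    prices:   {asset: current price}
    targets:  {asset: portfolio weight, summing to 1}
    Positive values mean buy, negative mean sell.
    """
    total = sum(holdings[a] * prices[a] for a in holdings)
    trades = {}
    for asset, weight in targets.items():
        target_value = total * weight
        current_value = holdings.get(asset, 0) * prices[asset]
        trades[asset] = (target_value - current_value) / prices[asset]
    return trades
```

For example, a portfolio of 150 tokenized bond units at $1 and 5 tokenized stock units at $10 (total $200) with 50/50 targets would sell 50 bond units and buy 5 stock units, with no wire transfers involved.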
— Maggie Hsu, a16z crypto Go-To-Market Team
The bottleneck of the agent economy is shifting from intelligence itself to identity.
In financial services, "non-human identities" now outnumber human employees 96 to 1, yet these identities remain unbanked "bankless ghosts." The most glaring missing primitive today is KYA: Know Your Agent.
Just as humans need a credit score to get a loan, AI agents also need credential signatures to transact — these credentials must bind the agent to its principal, behavioral constraints, and liability boundaries. Until this infrastructure emerges, merchants will continue to block agent access at the firewall level.
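A minimal sketch of what such a credential might look like. All field names are hypothetical, and a shared-secret HMAC is used for brevity; a production system would use public-key signatures and a standardized verifiable-credential format.

```python
import hashlib
import hmac
import json
import time

def issue_credential(principal_key: bytes, agent_id: str,
                     max_spend: float, ttl_s: int):
    """Principal signs a statement binding the agent to itself,
    a spending limit, and an expiry time."""
    claims = {"agent": agent_id, "max_spend": max_spend,
              "expires_at": time.time() + ttl_s}
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(principal_key, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def verify_transaction(principal_key: bytes, credential: dict,
                       amount: float) -> bool:
    """Merchant-side check: signature valid, credential unexpired,
    and the requested amount within the spend limit."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(principal_key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, credential["sig"]):
        return False  # tampered or forged credential
    c = credential["claims"]
    return time.time() < c["expires_at"] and amount <= c["max_spend"]
```

The design choice that matters is that the constraints travel with the signature: a merchant can reject an over-limit or tampered request without ever contacting the principal.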
The industry, which has spent decades building KYC infrastructure, now has only a few months to solve KYA.
— Sean Neville, Circle Co-founder, USDC Architect; Catena Labs CEO
As a mathematical economist, in January this year, I struggled to get consumer-grade AI models to understand my research process; but by November, I could give abstract instructions to the model like guiding a PhD student... and receive sometimes novel and correctly executed answers.
More broadly, we are seeing AI starting to be used for real research activities — especially in the realm of reasoning, where models can not only assist in discovery but can even independently solve Putnam-level mathematical problems (one of the world's hardest university math competitions).
It is still unclear which disciplines will benefit the most and how. However, I believe AI will promote and reward a new kind of "polymathic" research style: moving between ideas, forming hypotheses, and rapidly extrapolating from exploratory interim results.
These answers may not be entirely accurate, but they may still point in the right direction (at least in some topological sense).
In a way, this leverages the model's power to hallucinate: when models are "smart" enough, their back-and-forth collisions in abstract space mostly generate noise, but occasionally they trigger a true breakthrough akin to human nonlinear thinking.
To reason in this way requires a new style of AI workflow—not just collaboration between agents, but "agent-wrapping-agent": multi-layered models assessing early models' attempts and continuously refining the truly valuable parts. I am writing papers this way, while others are using it for patent searches, creating new art forms, or (unfortunately) designing novel smart contract attacks.
However, to truly make this "wrapped reasoning agent cluster" serve research, two issues must be addressed: interoperability between models and how to identify and justly compensate the contributions of each model—both of which could be addressed through cryptographic techniques.
—Scott Kominers, a16z crypto Research Team; Harvard Business School Professor
The rise of AI agents is imposing an invisible tax on open networks, fundamentally shaking their economic foundations.
The disruption stems from the misalignment of the Internet's "Context Layer" and "Execution Layer": at present, AI agents extract data from content websites reliant on ad revenue (Context Layer) to provide users with convenience, systematically bypassing the revenue sources (ads and subscriptions) that support this content.
To prevent the erosion of open networks (and to prevent the weakening of the content ecosystem on which AI itself depends), we need large-scale deployment of technological and economic mechanisms: this may include a new generation of sponsored content models, micro-attribution systems, or other new fund allocation models.
The current AI licensing agreements have proven to be unsustainable—the amounts they pay to content providers often only represent a fraction of the losses caused by AI erosion of traffic.
Open networks require a new technical-economic framework to allow value to flow automatically. The most critical shift in the coming year is moving from static licensing to real-time, usage-based compensation models.
This means we need to test and scale the system—potentially based on blockchain-enabled nanopayments and granular attribution standards—to automatically reward every entity contributing information that leads to a successful agent task.
—Liz Harkavy, a16z crypto Investment Team
Privacy is a key capability driving the global shift to on-chain finance. It is also a feature lacking in nearly all existing blockchains. For most chains, privacy has long been a mere "nice-to-have" feature.
However, today, privacy alone is enough to distinguish a chain from all others. What's even more important: privacy can create chain-level lock-in effects—a "privacy version of network effects," especially in an era where performance competition no longer provides differentiation.
With the existence of cross-chain protocols, as long as everything is public, moving from one chain to another is nearly costless. But once privacy is introduced, the situation changes drastically: transferring tokens across chains is easy, transferring "secrets" between chains is hard.
Any process of moving from a private domain chain to a public chain will expose your identity to those observing the blockchain, mempool, or network traffic. Migration between different private chains also leaks various metadata, such as time correlation or amount correlation, making tracking easier.
In contrast to undifferentiated new chains, whose highly homogeneous blockspace will see costs competed down to zero, privacy chains can build genuinely strong network effects.
The reality is: a "generic chain" without a thriving ecosystem, killer apps, or distribution advantage has little reason to attract users or developers, making it harder to foster loyalty.
When users are on a public chain, the choice of chain matters little as long as chains interoperate freely. But once users enter a private-domain chain, the choice of chain matters a great deal: once inside, migrating out risks exposure, so they are far less willing to leave.
This will create a "winner-takes-most" dynamic.
And because privacy is crucial for most real-world applications, in the end, there may only be a few privacy chains that dominate the majority of the crypto economy.
—Ali Yahya, a16z crypto General Partner
As we move towards the era of quantum computing, many encryption-dependent communication applications (Apple's iMessage, Signal, WhatsApp) have done cutting-edge work. The issue, however, is this: today, all mainstream communication tools rely on privately operated servers run by a single organization.
These servers are vulnerable points that can be shut down by the government, implanted with backdoors, or required to hand over data.
If a country can directly shut down servers; if a company holds the server keys; or if there simply exists a "private server"… then what does quantum-level encryption really mean?
Private servers require "trust me"; but the absence of servers means "you don't have to trust me."
Communication does not need a centralized company standing in the middle. What we need is an open protocol where no one needs to be trusted.
To achieve this, the network must be decentralized: no private servers; no single app; all code open-source; top-tier encryption (including post-quantum).
In an open network, no one, whether an individual, company, nonprofit, or country, can take away our ability to communicate. Even if a country or company shuts down an app, 500 new versions will appear the next day.
Even if a node is shut down, new nodes will immediately join as replacements due to mechanisms like blockchain's economic incentives.
When people control their information with their own keys, as they do their money, everything changes. Apps may come and go, but users always control their messages and identities: users own the messages, not the apps.
This is not just a matter of quantum resistance or encryption but of ownership and decentralization.
Without these two aspects, we are simply building an encryption that is "unbreakable yet can still be shut down".
— Shane Mac, Co-founder & CEO of XMTP Labs
Behind every model, every agent, every automated system lies one thing: data.
But today, most data pipelines (model inputs and outputs) are opaque, mutable, and unauditable.
For some consumer-facing applications, this may suffice, but for industries dealing with sensitive data (such as finance and healthcare), such mechanisms fall far short.
This is also a key obstacle preventing institutions from fully tokenizing real-world assets.
So, how can we achieve security, compliance, self-sovereignty, and global interoperability while maintaining privacy?
We need to start with data access control: Who controls sensitive data? How does data move? Who (or what system) can access it?
In the absence of data access control, anyone looking to protect privacy must rely on centralized services or build complex systems themselves—this is not only time-consuming and costly but also hinders traditional financial institutions from fully leveraging the benefits of on-chain data management.
As intelligent agents begin to autonomously browse, transact, and make decisions, what users and institutions need is not 'best-effort trust' but cryptographic-level assurance.
Therefore, we need "Privacy on Demand": a new technology offering programmable, native data access rules; client-side encryption; and decentralized key management that states explicitly who can decrypt what data, under what conditions, and for how long... all enforced on-chain.
Combined with verifiable data systems, 'privacy' will become the underlying public infrastructure of the internet, rather than a 'post-application patch.'
Privacy will become part of the infrastructure, not a peripheral feature.
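As a toy illustration of "who can decrypt what, under what conditions, and for how long," here is a minimal policy engine. The schema is purely hypothetical and deliberately omits the actual client-side encryption and key management that would enforce it.

```python
import time

def grant(policies, resource, party, purpose, ttl_s):
    """Record a time-limited, purpose-bound access grant."""
    policies.append({"resource": resource, "party": party,
                     "purpose": purpose, "expires_at": time.time() + ttl_s})

def can_decrypt(policies, resource, party, purpose, now=None):
    """True only if an unexpired grant matches the resource,
    the requesting party, and the stated purpose."""
    now = time.time() if now is None else now
    return any(p["resource"] == resource and p["party"] == party
               and p["purpose"] == purpose and now < p["expires_at"]
               for p in policies)
```

In an on-chain deployment, the same checks would gate key release rather than return a boolean, so access decisions become verifiable rather than best-effort.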
—Adeniyi Abiodun, Co-Founder and Chief Product Officer at Mysten Labs
Recent DeFi exploit incidents, even on mature protocols with years of real-world exposure, strong teams, and rigorous audits, have exposed a troubling reality: today's security practices are fundamentally empirical and 'case-by-case.'
To advance DeFi security into a mature stage, we must shift from vulnerability patterns to design-level properties, from 'best-effort' to 'principled' systemic approaches:
Static/pre-deployment security (testing, audits, formal verification)
The future focus is on systematically proving global invariants rather than just verifying a few manually selected local properties.
Several teams are now building AI-assisted proof tools to help write specifications, propose invariants, and automatically handle a large amount of proof engineering work that was previously manual and costly.
Dynamic / Post-Deployment Security (runtime monitoring, runtime enforcement, etc.)
After on-chain deployment, these invariants can become real-time guardrails for the system: serving as the last line of defense.
These guardrails will be encoded as runtime assertions, requiring every transaction to satisfy the relevant security conditions.
In other words, we no longer assume that "all vulnerabilities have been caught before deployment," but rather have the code itself enforce core security properties, automatically rolling back any transactions that violate these properties.
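The execute-check-rollback loop just described can be sketched in a few lines. The ledger and the invariants here are illustrative stand-ins for on-chain state and formally specified properties.

```python
import copy

class GuardedLedger:
    """Apply each transaction to a copy of the state, check every declared
    invariant, and commit only if all of them hold."""
    def __init__(self, balances, invariants):
        self.balances = dict(balances)
        self.invariants = invariants  # list of predicates over balances

    def apply(self, tx):
        """tx mutates a balances dict in place; abort on invariant violation."""
        candidate = copy.deepcopy(self.balances)
        tx(candidate)
        if all(inv(candidate) for inv in self.invariants):
            self.balances = candidate
            return True
        return False  # transaction rolled back: an invariant would break

# Example invariants: no negative balances, total supply conserved.
def no_negative(b):
    return all(v >= 0 for v in b.values())

def supply_conserved(total):
    return lambda b: abs(sum(b.values()) - total) < 1e-9
```

An exploit that mints tokens out of thin air violates `supply_conserved` at runtime and is aborted, regardless of whether the specific bug was known before deployment.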
This is not just theoretical but also practical.
In fact, almost every past attack would have triggered such checks during execution and been aborted.
Thus, the once popular idea of "code is law" is evolving into "spec is law."
Even novel attack vectors must adhere to the same set of security properties in the system design; therefore, the attack surface is compressed, leaving only minimal or highly impractical possibilities.
—Daejun Park, a16z crypto engineering team
Prediction markets have already gone mainstream. In the coming year, they will grow in scale, coverage, and intelligence with the convergence of crypto and AI, bringing new challenges that builders need to address collectively.
Firstly, there will be more types of contracts listed. This means that in the future, we will have real-time odds not only for a major election or geopolitical event but also for various niche outcomes and complex event combinations. As these new contracts continuously disclose information and integrate into the news ecosystem (which is already happening), society will have to face a question: how do we balance the value of this information, and how do we design a more transparent, auditable prediction system?
Cryptographic technology can provide tools for this purpose.
In order to address a larger volume of prediction contracts, we need a new "truth alignment" mechanism to advance contract settlement. While the arbitration mechanism of centralized platforms (such as whether an event has occurred and how to confirm it) is important, cases like the Zelensky Litigation Market and the Venezuela Election Market have exposed its limitations.
Therefore, to expand the scale and utility of prediction markets, a new decentralized governance mechanism and an LLM Oracle will become important tools for resolving disputes and reaching true value.
The possibilities brought by AI are not limited to LLM. AI agents can autonomously trade on prediction platforms, scan the world for signals, and search for short-term advantages. This helps us discover new ways of thinking and also assists in predicting "what will happen next" (projects like Prophet Arena have already demonstrated the early excitement in this area).
Beyond serving as a queryable "advanced political analyst," the emergent strategies of AI agents may even help us reverse-engineer the fundamental predictive factors behind complex social events.
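As a toy example of such agent trading logic, here is the classic Kelly criterion applied to a binary prediction market. This is a standard textbook formula, not the strategy of any particular platform or agent.

```python
def kelly_fraction(p: float, q: float) -> float:
    """Fraction of bankroll to stake on a YES share.

    p: the agent's own probability estimate that the event occurs.
    q: the market price of the YES share (which pays 1 if the event occurs).
    Returns 0 when the market offers no edge.
    """
    if not 0 < q < 1:
        raise ValueError("price must be strictly between 0 and 1")
    b = (1 - q) / q              # net odds received per unit staked
    f = (p * b - (1 - p)) / b    # standard Kelly formula
    return max(f, 0.0)           # never short in this simplified sketch
```

An agent believing an event is 60% likely while the market prices it at 50 cents would stake 20% of its bankroll; at a fair price it stakes nothing.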
Will prediction markets replace polls? No, they will make polls better.
Poll data can even become input for prediction markets. As a political economist, what excites me the most is seeing prediction markets and a healthy, diverse polling ecosystem working together. But to achieve this, we need to leverage new technologies: AI can improve the survey experience; encryption technology can prove respondents are real humans and not bots, bringing more innovation.
—Andy Hall, a16z crypto Research Consultant; Stanford University Political Economics Professor
The traditional media model (especially the assumption of "objectivity") is showing cracks. The Internet has given everyone a voice, and more and more industry players, practitioners, and builders are expressing their views directly to the public. Ironically, audiences often respect them not despite their vested interests, but because of them.
The real change is not social media, but: cryptographic tools enable people to make public, verifiable commitments.
As AI lowers the barrier to creating content infinitely—where any perspective, any identity (whether real or fictional) can be infinitely replicated—simply relying on "what is said" is no longer enough to establish trust.
Tokenized assets, programmable locking, prediction markets, and on-chain history provide a more robust foundation of trust:
Commenters can express their opinions and prove they have skin in the game by putting their money where their mouth is;
Podcasters can lock up tokens to demonstrate they won't engage in pump-and-dump schemes;
Analysts can anchor their predictions to a publicly settled market, creating an auditable record.
This is precisely what I've been calling an early form of staked media: a breed of media that embraces the skin-in-the-game ethos and provides verifiable evidence.
In this model, credibility no longer comes from pretended neutrality or unsubstantiated claims, but from provable stake-taking.
Staked media is not here to replace existing media, but to complement the current ecosystem.
It offers a new signal: not "trust me, I'm neutral," but "look at what risk I'm willing to take, and you can verify whether I'm truthful."
— Robert Hackett, a16z crypto editorial team
For years, SNARKs (cryptographic proofs of verifiable computation) have been almost exclusively used in the blockchain space. The reason is simple: the cost of generating proofs was too high—potentially a million times more expensive than directly computing.
It was worth it when the cost could be spread across thousands of validators, but in other scenarios, it was nearly impossible.
All of this is about to change.
By 2026, zkVM provers will reduce the cost by around 10,000x, with memory footprints in mere hundreds of MBs: fast enough to run on a smartphone and affordable enough to deploy anywhere.
Why might 10,000x be a magic number? Because the parallelism of high-end GPUs is roughly 10,000 times that of a laptop CPU.
By the end of 2026, a single GPU will be able to generate proofs of CPU computations in real-time.
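The arithmetic behind that claim can be made explicit with a back-of-envelope check. The numbers below are illustrative orders of magnitude drawn from the surrounding text, not benchmarks.

```python
def realtime_proving_possible(prover_overhead, gpu_parallelism,
                              per_lane_speed=1.0):
    """True if one GPU can generate proofs as fast as the CPU computes.

    prover_overhead: prover work per CPU operation (historically ~1e6;
                     ~1e2 after a 10,000x software improvement).
    gpu_parallelism: effective parallel lanes of a GPU relative to one
                     laptop CPU core (~1e4).
    per_lane_speed:  relative speed of one GPU lane vs. the CPU core.
    """
    return gpu_parallelism * per_lane_speed >= prover_overhead

# Historically: 1e6 overhead vs 1e4 parallelism -> proving lags far behind.
# After a ~10,000x reduction: 1e2 overhead vs 1e4 parallelism -> real-time.
```

This is why the 10,000x software improvement and the ~10,000x hardware parallelism together are exactly enough to cross the real-time threshold.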
This will unlock a long-standing vision from old papers: verifiable cloud computing.
If your workloads run on cloud CPUs, whether because of low compute requirements, a lack of GPU-suitable tasks, or legacy reasons, you will soon be able to obtain cryptographic proofs of computational integrity at reasonable cost.
The prover itself is optimized for GPUs, and your code does not need to be modified.
—Justin Thaler, a16z crypto Research Team; Assistant Professor of Computer Science at Georgetown University
Today, apart from stablecoins and a few core infrastructures, almost all well-run crypto projects have pivoted to transactional businesses or are in the process of doing so. If "every crypto company eventually becomes a transaction platform," what will the endgame look like?
When a large number of players do the same thing, they squeeze each other out, leaving only a handful of winners in the end.
Companies that pivot to transactions too early or too quickly may miss the opportunity to build a more defensible and sustainable business.
I understand founders constantly exploring to make their financial model work, but chasing "seemingly immediate Product-Market Fit (PMF)" also comes with a cost.
Especially in the crypto space, the unique dynamics of tokenomics and speculative culture can lead founders down the path of "instant gratification," overlooking deeper product issues.
In a sense, this is a "marshmallow test." Transactions themselves are not a problem; they are a crucial market function. But they do not have to be the endgame.
Founders who truly focus on the "product" part of PMF are often the ultimate winners.
—Arianna Simpson, a16z crypto General Partner
Over the past decade, one of the biggest obstacles to building blockchain networks in the U.S. has been legal uncertainty.
Securities laws have been stretched and selectively enforced, forcibly fitting founders into a regulatory framework designed for "companies" rather than "networks."
Over the years, "mitigating legal risk" has replaced "product strategy"; engineers have been replaced by lawyers.
This dynamic has led to many strange distortions:
Founders are advised to avoid transparency;
Token distribution has become legally arbitrary and unnatural;
Governance has devolved into theater;
Organizational structure prioritizes legal arbitrage;
Tokens are forced to be designed to not carry economic value, with no business model;
Worse, those projects that are less compliant seem to move faster.
But now, US crypto market structure legislation is closer to passing than ever before, poised to eliminate these distortions next year.
Once passed, this legislation will: Incentivize transparency; Establish clear standards; Replace today's "regulatory roulette" with a clear, structured path for fundraising, token issuance, and decentralization.
Post-GENIUS Act, stablecoin growth exploded; the changes brought by crypto market structure legislation will be even more profound—this time focusing on the networks themselves.
In other words, this type of regulation will allow blockchain networks to operate as they were meant to be: open, autonomous, composable, trust-minimized, and decentralized.
—Miles Jennings, a16z crypto Policy Team; General Counsel