BlockBeats News, January 27th: Vitalik published a post stating, "The scalability hierarchy of a blockchain can be summarized as computation, data, and state.
Computation is the easiest to scale. It can be parallelized, block producers can be required to supply various hints, and actual computation of arbitrary size can even be replaced with a proof of its result.
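A minimal sketch of the parallelization point, in Python. The transaction format, the gas field, and execute_tx are hypothetical placeholders, not any real execution model; the only claim illustrated is that independent execution work splits cleanly across cores, and a real system could skip re-execution entirely by checking a proof of the result instead.

```python
# Toy sketch: re-executing independent "transactions" in parallel.
# Everything here is a placeholder, not a real blockchain format.
from concurrent.futures import ProcessPoolExecutor

def execute_tx(tx: dict) -> int:
    """Pretend execution: burn CPU proportional to the tx's gas."""
    acc = 0
    for i in range(tx["gas"]):
        acc = (acc + i * i) % 1_000_003
    return acc

def execute_block(txs: list[dict], workers: int = 4) -> list[int]:
    """Execute every transaction; independent txs run on separate cores."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(execute_tx, txs))

if __name__ == "__main__":
    block = [{"gas": 50_000} for _ in range(32)]
    results = execute_block(block)
    print(len(results), "transactions executed")
```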
Data sits in the middle. If data availability must be guaranteed, that guarantee cannot be avoided, but data can be sharded and erasure-coded, and it allows graceful degradation: a node with only 1/10 the data-processing capacity of other nodes can still produce blocks that are 1/10 the size of theirs.
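The erasure-coding idea can be sketched with a toy (k+1, k) XOR parity code. The shard layout and blob below are illustrative only, and this toy tolerates just one missing shard; real data-availability designs use much stronger Reed-Solomon style codes.

```python
# Toy erasure coding: split a blob into k data shards plus one XOR
# parity shard, so any single missing shard can be reconstructed.
from functools import reduce

def encode(blob: bytes, k: int) -> list[bytes]:
    """Return k equal-size data shards plus one parity shard (k+1 total)."""
    size = -(-len(blob) // k)             # ceiling division
    blob = blob.ljust(k * size, b"\x00")  # pad so shards are equal length
    shards = [blob[i * size:(i + 1) * size] for i in range(k)]
    parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), shards)
    return shards + [parity]

def recover(shards: list) -> list:
    """Rebuild the one missing shard (if any) by XOR-ing the others."""
    missing = [i for i, s in enumerate(shards) if s is None]
    if len(missing) > 1:
        raise ValueError("this toy code tolerates only one missing shard")
    if missing:
        present = [s for s in shards if s is not None]
        shards[missing[0]] = reduce(
            lambda a, b: bytes(x ^ y for x, y in zip(a, b)), present)
    return shards

if __name__ == "__main__":
    pieces = encode(b"some rollup data that must remain available", k=4)
    pieces[2] = None                      # simulate one shard going missing
    restored = recover(pieces)
    print(b"".join(restored[:4]).rstrip(b"\x00"))
```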
State is the hardest to scale. Even validating a single transaction requires the complete state. If you replace the state with a tree and keep only the root hash, updating that root still requires the complete state. There are ways to shard state, but they all require architectural changes and are fundamentally not general-purpose.
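A minimal sketch of why the root hash alone does not help. The account encoding and tree layout below are hypothetical, not any production state-tree format; the point is that moving from the old root to the new root after a transaction requires re-hashing from the touched leaves upward, which needs the leaves themselves (or, in stateless designs, a witness of sibling hashes for them).

```python
# Toy Merkle root over a key-value "state". Knowing only the root is not
# enough to apply a state change: recomputing the new root needs either
# the full leaf set (as below) or a witness for the touched leaves.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list) -> bytes:
    """Compute the root of a simple binary Merkle tree."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                       # duplicate last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

if __name__ == "__main__":
    # Full state: every account balance. The root commits to all of it.
    state = [b"alice=10", b"bob=5", b"carol=7", b"dave=3"]
    old_root = merkle_root(state)

    # Apply a transaction (bob pays carol 2). Updating the root means
    # re-hashing upward from the changed leaves, which requires the
    # state or a witness -- the old root alone is not sufficient.
    state[1], state[2] = b"bob=3", b"carol=9"
    new_root = merkle_root(state)
    print(old_root.hex()[:16], "->", new_root.hex()[:16])
```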
Therefore, if you can replace state with data without introducing new centralization risks, seriously considering that should be the default; likewise, if you can replace data with computation without introducing new centralization risks, that too is worth prioritizing."
