The ByteDance Seed team has released Seed3D 2.0, upgrading its geometric precision and material generation architecture.

According to media monitoring, ByteDance's Seed team has released Seed3D 2.0, a 3D model generator that turns a single image into a textured 3D asset. The upgrade focuses on geometric precision and material realism, and the API has been integrated into Volcano Engine's Ark platform.

Geometric generation adopts a coarse-to-fine two-stage strategy: it first establishes a coarse topology with a large-parameter DiT, then recovers sharp edges and fine surface detail. On the material side, an MoE architecture enhances high-resolution detail, and VLM priors are introduced to stabilize material decomposition under unknown lighting conditions. The model outputs a complete set of PBR maps that plug directly into a standard rendering pipeline.
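A standard PBR map set typically comprises base color (albedo), metallic, roughness, and normal maps. As a minimal sketch of what "plugging into a standard rendering pipeline" involves, the snippet below assembles such a set into a renderer-agnostic material record; the file names and directory layout are assumptions for illustration, not Seed3D's documented export format.

```python
from dataclasses import dataclass

# Hypothetical material container; Seed3D's actual export layout may differ.
@dataclass
class PBRMaterial:
    base_color: str   # albedo texture path
    metallic: str     # metalness map path
    roughness: str    # roughness map path
    normal: str       # tangent-space normal map path

def load_pbr_set(asset_dir: str) -> PBRMaterial:
    """Assemble a conventional PBR texture set from a directory (names assumed)."""
    return PBRMaterial(
        base_color=f"{asset_dir}/base_color.png",
        metallic=f"{asset_dir}/metallic.png",
        roughness=f"{asset_dir}/roughness.png",
        normal=f"{asset_dir}/normal.png",
    )

mat = load_pbr_set("seed3d_asset")
print(mat.roughness)  # → seed3d_asset/roughness.png
```

Because these four channels are the common denominator of metallic-roughness PBR workflows, a record like this maps directly onto material slots in engines such as Blender, Unreal, or Unity.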

Sixty evaluators with 3D modeling experience ran blind tests on roughly 200 test cases, comparing Seed3D 2.0 pairwise against Hunyuan3D-2.5/3.1, Tripo 3.0, Rodin Gen2, HiTem v2.0, and the previous-generation Seed3D 1.0. Preference rates for geometric generation ranged from 65.1% to 98.3%, and preference rates for textured 3D assets all exceeded 69%.
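The reported figures are pairwise preference rates: the fraction of blind A/B votes in which evaluators chose Seed3D 2.0 over a given baseline. A minimal sketch of that computation follows; the vote counts are invented for illustration and are not the study's actual data.

```python
def preference_rate(wins: int, total_votes: int) -> float:
    """Pairwise preference rate: votes won divided by total votes cast."""
    return wins / total_votes

# Illustrative numbers only: 60 evaluators x ~200 cases could yield
# on the order of 12,000 votes per model pairing.
votes_for_seed3d = 8400
total = 12000
print(f"{preference_rate(votes_for_seed3d, total):.1%}")  # → 70.0%
```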

For downstream tasks, Seed3D 2.0 can segment a 3D asset into separate parts by function, add joint information, and export URDF-compatible formats for simulation engines such as Isaac Sim, supporting dynamic interaction scenarios like robot grasping. At the scene level, it accepts text, multi-view images, or video as input and composes multiple assets into a complete scene.
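URDF is a plain-XML description format, so a part-segmented asset with joint information maps naturally onto its `<link>` and `<joint>` elements. The sketch below builds a minimal two-part articulated asset with Python's standard `xml.etree` module; the element structure follows the URDF specification, but the part names, mesh paths, and joint limits are invented for illustration and are not Seed3D's actual output.

```python
import xml.etree.ElementTree as ET

# Build a minimal two-link URDF: a base part plus a hinged lid (names hypothetical).
robot = ET.Element("robot", name="seed3d_asset")

for link_name in ("base_part", "lid_part"):
    link = ET.SubElement(robot, "link", name=link_name)
    visual = ET.SubElement(link, "visual")
    geom = ET.SubElement(visual, "geometry")
    # Each segmented part references its own mesh file (paths assumed).
    ET.SubElement(geom, "mesh", filename=f"meshes/{link_name}.obj")

# A revolute joint encodes the articulation between the two parts.
joint = ET.SubElement(robot, "joint", name="lid_hinge", type="revolute")
ET.SubElement(joint, "parent", link="base_part")
ET.SubElement(joint, "child", link="lid_part")
ET.SubElement(joint, "axis", xyz="0 1 0")
ET.SubElement(joint, "limit", lower="0", upper="1.57", effort="10", velocity="1")

urdf_text = ET.tostring(robot, encoding="unicode")
print(urdf_text[:27])  # → <robot name="seed3d_asset">
```

A file of this shape is what simulators like Isaac Sim ingest: links carry the visual/collision geometry of each part, and joints define how the parts move relative to one another.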
