arxiv:2603.25745

Less Gaussians, Texture More: 4K Feed-Forward Textured Splatting

Published on Mar 26 · Submitted by taesiri on Mar 27
Abstract

LGTM is a feed-forward framework that enables high-fidelity 4K novel view synthesis by predicting compact Gaussian primitives with per-primitive textures, decoupling geometric complexity from rendering resolution.

AI-generated summary

Existing feed-forward 3D Gaussian Splatting methods predict pixel-aligned primitives, leading to a quadratic growth in primitive count as resolution increases. This fundamentally limits their scalability, making high-resolution synthesis such as 4K intractable. We introduce LGTM (Less Gaussians, Texture More), a feed-forward framework that overcomes this resolution scaling barrier. By predicting compact Gaussian primitives coupled with per-primitive textures, LGTM decouples geometric complexity from rendering resolution. This approach enables high-fidelity 4K novel view synthesis without per-scene optimization, a capability previously out of reach for feed-forward methods, all while using significantly fewer Gaussian primitives. Project page: https://yxlao.github.io/lgtm/
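The abstract's core argument is about scaling: pixel-aligned prediction ties the Gaussian count to the output resolution, so the count grows quadratically, while LGTM keeps a fixed primitive budget and pushes appearance detail into per-primitive textures. A back-of-the-envelope sketch of that scaling (illustrative numbers only; the 100k primitive budget and 16x16 texture size are assumptions, not figures from the paper):

```python
def pixel_aligned_count(width: int, height: int) -> int:
    """Pixel-aligned feed-forward 3DGS: one Gaussian per input pixel,
    so the primitive count scales with width * height."""
    return width * height

def textured_count(num_primitives: int, texture_res: int) -> tuple[int, int]:
    """LGTM-style decoupling (sketch): a fixed primitive budget, with
    appearance detail stored in a per-primitive texture of
    texture_res x texture_res texels. Returns (primitives, total texels)."""
    return num_primitives, num_primitives * texture_res * texture_res

# Primitive count for pixel-aligned prediction at increasing resolutions:
for w, h in [(960, 540), (1920, 1080), (3840, 2160)]:  # 540p -> 4K
    print(f"{w}x{h}: {pixel_aligned_count(w, h):,} Gaussians")
# At 4K this is 3840 * 2160 = 8,294,400 primitives.

# With a hypothetical budget of 100k textured primitives, the Gaussian
# count is constant across output resolutions; only texture storage
# and sampling cost scale with the chosen texture resolution.
prims, texels = textured_count(100_000, 16)
print(f"textured: {prims:,} Gaussians, {texels:,} texels")
```

This also frames the commenter's VRAM question below: the texture memory is governed by the primitive budget and texture resolution, not by the output resolution.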

Community

The decoupling of geometric complexity from rendering resolution is a smart approach — similar to what we've seen work in neural radiance fields with feature grids. The per-primitive textures remind me of UV atlas techniques from traditional graphics, but applied to feed-forward networks. One question: does the texture prediction add significant memory overhead during inference? 4K synthesis is impressive, but I'm curious about the VRAM footprint compared to pixel-aligned approaches at the same output resolution.



Get this paper in your agent:

hf papers read 2603.25745
Don't have the latest CLI?
curl -LsSf https://hf.co/cli/install.sh | bash

Models citing this paper: 0

No model linking this paper

Cite arxiv.org/abs/2603.25745 in a model README.md to link it from this page.

Datasets citing this paper: 0

No dataset linking this paper

Cite arxiv.org/abs/2603.25745 in a dataset README.md to link it from this page.

Spaces citing this paper: 0

No Space linking this paper

Cite arxiv.org/abs/2603.25745 in a Space README.md to link it from this page.

Collections including this paper: 4