Instructions for using Intel/dpt-large with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Intel/dpt-large with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("depth-estimation", model="Intel/dpt-large")
```

```python
# Load model directly
from transformers import AutoImageProcessor, AutoModelForDepthEstimation

processor = AutoImageProcessor.from_pretrained("Intel/dpt-large")
model = AutoModelForDepthEstimation.from_pretrained("Intel/dpt-large")
```

- Notebooks
- Google Colab
- Kaggle
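The pipeline returns a predicted depth map: a 2-D array of relative depth values. For visualization this is usually min-max normalized to the 0–255 range. A minimal sketch of that post-processing step, independent of the model (the toy array below is a stand-in for the real `predicted_depth` output):

```python
def normalize_depth(depth):
    """Min-max normalize a 2-D depth map to 0..255 integers for display."""
    flat = [v for row in depth for v in row]
    lo, hi = min(flat), max(flat)
    scale = 255.0 / (hi - lo) if hi > lo else 0.0
    return [[round((v - lo) * scale) for v in row] for row in depth]

# Toy 2x2 "depth map"; real values come from the model output.
print(normalize_depth([[0.0, 0.2], [0.8, 1.0]]))  # [[0, 51], [204, 255]]
```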
Some weights of DPTForDepthEstimation were not initialized from the model checkpoint at Intel/dpt-large
#9, opened by jigu
This results from https://github.com/huggingface/transformers/blob/edb170238febf7fc3e3278ed5b9ca0b2c40c70e3/src/transformers/models/dpt/modeling_dpt.py#L791: the weights of the first residual layer are defined in the model but never used, and they are not present in the checkpoint, so `from_pretrained` initializes them randomly and emits the warning.
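Conceptually, the warning is triggered whenever the model class declares parameter names that have no entry in the checkpoint's state dict; those parameters are freshly initialized instead. A toy sketch of that mechanism (not the actual transformers loading code, and the key names are illustrative, not the exact DPT parameter names):

```python
import random

def load_state_dict(model_keys, checkpoint):
    """Toy checkpoint loading: keys the model expects but the checkpoint
    lacks are randomly initialized, which is what produces the
    'Some weights ... were not initialized' warning."""
    loaded, missing = {}, []
    for key in model_keys:
        if key in checkpoint:
            loaded[key] = checkpoint[key]
        else:
            loaded[key] = random.random()  # stand-in for random init
            missing.append(key)
    return loaded, missing

# Illustrative names: the first residual layer exists in the model
# definition but has no entry in the checkpoint.
model_keys = ["fusion.0.residual_layer1.weight",
              "fusion.0.residual_layer2.weight"]
checkpoint = {"fusion.0.residual_layer2.weight": 0.5}
_, missing = load_state_dict(model_keys, checkpoint)
print(missing)  # ['fusion.0.residual_layer1.weight']
```

Since the missing layer is never used in the forward pass, the randomly initialized values have no effect on the model's predictions.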