---
tags:
- time-series
- temporal-point-processes
- hawkes-processes
- scientific-ml
license: mit
library_name: transformers
---

# FIM-PP Model Card

`FIM-PP` is the Foundation Inference Model for marked temporal point processes. It infers conditional intensity functions from a context set of event sequences and supports zero-shot use as well as downstream fine-tuning.

## Loading

Install the `fim` package first, then load the model with Transformers:

```python
from transformers import AutoModel

model = AutoModel.from_pretrained("FIM4Science/FIM-PP", trust_remote_code=True)
model.eval()
```

## Notes

- The released checkpoint is configured for up to 22 event marks.
- The model expects Hawkes-style context and inference tensors as described in the OpenFIM point-process tutorial.
- If needed, the lower-level fallback remains available through `fim.models.hawkes.FIMHawkes.load_model(...)`.

## Reference

If you use this model, please cite:

```bibtex
@inproceedings{fim_pp,
  title={In-Context Learning of Temporal Point Processes with Foundation Inference Models},
  author={David Berghaus and Patrick Seifner and Kostadin Cvejoski and Cesar Ojeda and Ramses J. Sanchez},
  booktitle={The Fourteenth International Conference on Learning Representations},
  year={2026},
  url={https://openreview.net/forum?id=h9HwUAODFP}
}
```
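## Appendix: illustrative sequence sketch

A marked temporal point process represents each sequence as ordered arrival times paired with integer mark labels. The sketch below is illustrative only: the exact tensor layout FIM-PP expects is defined in the OpenFIM point-process tutorial, and these variable names and values are invented for the example.

```python
# Illustrative marked event sequence (not the model's actual tensor format;
# see the OpenFIM point-process tutorial for the real layout).
event_times = [0.13, 0.58, 1.02, 1.47, 2.31]  # strictly increasing arrival times
event_marks = [0, 3, 1, 3, 0]                 # integer mark labels

# Basic sanity checks for a well-formed sequence: times strictly increase,
# marks stay within the checkpoint's supported range of up to 22 marks.
assert all(a < b for a, b in zip(event_times, event_times[1:]))
assert all(0 <= m < 22 for m in event_marks)
assert len(event_times) == len(event_marks)
```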