How to use freedomking/mc-bert with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("freedomking/mc-bert", dtype="auto")
```
MC-BERT is a novel conceptualized representation learning approach for the medical domain. First, we use a different mask generation procedure that masks spans of tokens rather than only random individual tokens. We also introduce two kinds of masking strategies, namely whole entity masking and whole span masking. Finally, for next sentence prediction, MC-BERT splits the input document into segments based on the actual "sentences" provided by the user, uses consecutive segments as positive samples, and samples random sentences from other documents as negative samples.
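To make the masking strategies concrete, here is a minimal sketch of whole entity masking. Everything in it is an illustrative assumption, not taken from the MC-BERT code: the function name, the 15% masking budget, and the idea that entity spans come from an upstream medical NER step.

```python
import random

def whole_entity_mask(tokens, entity_spans, mask_token="[MASK]",
                      mask_ratio=0.15, seed=0):
    """Mask whole entity spans at once, instead of independent random tokens.

    entity_spans is a list of (start, end) half-open token index ranges,
    e.g. produced by a medical NER tagger (an assumption for this sketch).
    """
    rng = random.Random(seed)
    masked = list(tokens)
    budget = max(1, int(len(tokens) * mask_ratio))  # token masking budget
    spans = list(entity_spans)
    rng.shuffle(spans)  # pick entities in random order
    used = 0
    for start, end in spans:
        if used >= budget:
            break
        for i in range(start, end):  # mask the entity as a whole
            masked[i] = mask_token
        used += end - start
    return masked

# "患者出现头痛和发热" (the patient presents with headache and fever),
# with entity spans covering 头痛 (headache) and 发热 (fever)
tokens = ["患", "者", "出", "现", "头", "痛", "和", "发", "热"]
entities = [(4, 6), (7, 9)]
print(whole_entity_mask(tokens, entities))
```

Whole span masking works the same way, except the spans are contiguous token runs that need not align with named entities.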
More details: https://github.com/alibaba-research/ChineseBLUE
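The next-sentence-prediction sampling described above can be sketched as follows. The 50/50 positive/negative split and all names are assumptions for illustration, not the actual MC-BERT pipeline.

```python
import random

def build_nsp_pairs(documents, negative_ratio=0.5, seed=0):
    """Build NSP training pairs from pre-segmented documents.

    Positives (label 1) are consecutive sentences from the same document;
    negatives (label 0) pair a sentence with a random sentence drawn from
    a different document. Assumes at least two documents.
    """
    rng = random.Random(seed)
    pairs = []
    for d, doc in enumerate(documents):
        for i in range(len(doc) - 1):
            if rng.random() < negative_ratio:
                other = rng.randrange(len(documents) - 1)
                if other >= d:  # skip the current document
                    other += 1
                pairs.append((doc[i], rng.choice(documents[other]), 0))
            else:
                pairs.append((doc[i], doc[i + 1], 1))
    return pairs

docs = [["sent a1", "sent a2", "sent a3"], ["sent b1", "sent b2"]]
for first, second, label in build_nsp_pairs(docs):
    print(label, "|", first, "->", second)
```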
