Instructions to use MilaDeepGraph/ProtST-ESM1b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use MilaDeepGraph/ProtST-ESM1b with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="MilaDeepGraph/ProtST-ESM1b", trust_remote_code=True)

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("MilaDeepGraph/ProtST-ESM1b", trust_remote_code=True, dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
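The feature-extraction pipeline returns per-token embeddings, so a common follow-up step is pooling them into one fixed-size vector per sequence. A minimal sketch of mean pooling, using a random array as a stand-in for the pipeline's output (the shape, and ESM-1b's hidden size of 1280, are assumptions here; check the actual output of `pipe(...)` for your sequences):

```python
import numpy as np

# Placeholder for the feature-extraction output on one protein sequence:
# one entry per input, each of shape (num_tokens, hidden_size).
token_embeddings = np.random.rand(1, 12, 1280)

# Mean-pool over the token axis to get a single embedding per sequence.
sequence_embedding = token_embeddings.mean(axis=1)
print(sequence_embedding.shape)  # (1, 1280)
```

Mean pooling is only one choice; depending on the downstream task, taking the first (BOS) token's embedding is another common option.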
Update README.md (#3)
- Update README.md (cf8ba9a1c96d9f5efcb032e336e1f49a9b21a5f7)
Co-authored-by: Jiqing.Feng <Jiqing@users.noreply.huggingface.co>
README.md CHANGED

````diff
@@ -6,7 +6,7 @@ Current protein language models (PLMs) learn protein representations mainly base
 
 ## Example
 The following script shows how to run ProtST with [optimum-intel](https://github.com/huggingface/optimum-intel) optimization on zero-shot classification task.
-```
+```diff
 import logging
 import functools
 from tqdm import tqdm
````
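The zero-shot classification task mentioned in the README works by comparing a protein embedding against text embeddings of candidate label descriptions and picking the most similar one. A minimal sketch of that scoring step with synthetic vectors standing in for the model outputs (the label names, embedding size, and `cosine_similarity` helper are all illustrative, not part of the ProtST API):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two 1-D vectors.
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Synthetic stand-ins for what ProtST's protein and text encoders
# would produce for one protein and two candidate label descriptions.
rng = np.random.default_rng(0)
protein_emb = rng.normal(size=512)
label_embs = {
    "membrane protein": rng.normal(size=512),
    "enzyme": rng.normal(size=512),
}

# Zero-shot prediction: the label whose text embedding is closest
# (by cosine similarity) to the protein embedding wins.
scores = {label: cosine_similarity(protein_emb, emb) for label, emb in label_embs.items()}
prediction = max(scores, key=scores.get)
print(prediction)
```

With real embeddings the scoring step is identical; only the vectors come from the model's protein and text encoders instead of a random generator.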