Added vLLM support
This commit is contained in:
parent
25616d71ea
commit
a02ed32b3f
@@ -51,7 +51,7 @@ Dolphin achieves promising performance across diverse page-level and element-lev
Try our demo on [Demo-Dolphin](http://115.190.42.15:8888/dolphin/).
## 📅 Changelog
-- 🔥 **2025.06.27** Added vLLM support for accelerated inference.
+- 🔥 **2025.06.27** Added vLLM support for accelerated inference!
- 🔥 **2025.06.13** Added multi-page PDF document parsing capability.
- 🔥 **2025.05.21** Our demo is released at [link](http://115.190.42.15:8888/dolphin/). Check it out!
- 🔥 **2025.05.20** The pretrained model and inference code of Dolphin are released.