Docling/docling/models/picture_description_vlm_model.py
Zach Cox cc453961a9 fix: enable cuda_use_flash_attention2 for PictureDescriptionVlmModel (#1496)
fix: enable cuda_use_flash_attention2 for PictureDescriptionVlmModel

Signed-off-by: Zach Cox <zach.s.cox@gmail.com>
2025-04-30 08:02:52 +02:00

File size: 4.1 KiB
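
The commit above enables the cuda_use_flash_attention2 accelerator option for PictureDescriptionVlmModel. The sketch below shows one plausible way such an option can gate Flash Attention 2 when the vision-language model is loaded; it is an illustration under assumptions, not the contents of the file itself: the helper name, the AutoModelForVision2Seq/AutoProcessor classes, the example model id, and the bfloat16 dtype are not taken from the commit.

    import torch
    from transformers import AutoModelForVision2Seq, AutoProcessor


    def load_picture_description_vlm(
        repo_id: str, device: str, cuda_use_flash_attention2: bool
    ):
        # Use Flash Attention 2 only when running on CUDA and the accelerator
        # option explicitly requests it; otherwise fall back to eager attention.
        attn_implementation = (
            "flash_attention_2"
            if device.startswith("cuda") and cuda_use_flash_attention2
            else "eager"
        )
        processor = AutoProcessor.from_pretrained(repo_id)
        model = AutoModelForVision2Seq.from_pretrained(
            repo_id,
            torch_dtype=torch.bfloat16,  # assumed dtype; not specified by the commit
            attn_implementation=attn_implementation,
        ).to(device)
        return processor, model


    # Example usage (model id chosen for illustration only):
    # processor, model = load_picture_description_vlm(
    #     "HuggingFaceTB/SmolVLM-256M-Instruct",
    #     "cuda:0",
    #     cuda_use_flash_attention2=True,
    # )

Flash Attention 2 requires a CUDA device and the flash-attn package, which is why the option is only honored when the device string starts with "cuda".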