llama.cpp/awq-py/requirements.txt

torch>=2.1.1
transformers>=4.32.0