tueit_Transkriptor/api
thomas.kopp c7cad4bb2a feat: add whisper.cpp ROCm backend support for AMD GPU acceleration
- transcription.py: new _transcribe_remote_whispercpp() using /inference endpoint
- transcription.py: backend param routes to openai or whispercpp remote path
- config.py: whisper.backend default 'openai', alt 'whispercpp'
- pipeline.py: passes backend from config to transcribe_file
- settings: backend dropdown (OpenAI-compat / whisper.cpp)
- SETUP.md: whisper.cpp ROCm build and systemd setup instructions
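A minimal sketch of what the new remote path might look like. This is not the project's actual `_transcribe_remote_whispercpp()`; the field names (`file`, `response_format`) and the `/inference` route follow the whisper.cpp example server, and the host/port are taken from the note below. Everything else (function names, timeout) is assumed for illustration.

```python
"""Hypothetical sketch of a whisper.cpp remote transcription call.

Posts a WAV file as multipart/form-data to the whisper.cpp server's
/inference endpoint and extracts the transcript from the JSON reply.
"""
import json
import uuid
from urllib import request


def build_multipart(wav_bytes: bytes, response_format: str = "json"):
    """Hand-encode multipart/form-data with stdlib only (no requests dep)."""
    boundary = uuid.uuid4().hex
    parts = []
    # Audio part: the whisper.cpp example server reads the "file" field.
    parts.append((
        f"--{boundary}\r\n"
        'Content-Disposition: form-data; name="file"; filename="audio.wav"\r\n'
        "Content-Type: audio/wav\r\n\r\n"
    ).encode())
    parts.append(wav_bytes)
    parts.append(b"\r\n")
    # Plain form field selecting the JSON response format.
    parts.append((
        f"--{boundary}\r\n"
        'Content-Disposition: form-data; name="response_format"\r\n\r\n'
        f"{response_format}\r\n"
    ).encode())
    parts.append(f"--{boundary}--\r\n".encode())
    body = b"".join(parts)
    content_type = f"multipart/form-data; boundary={boundary}"
    return body, content_type


def transcribe_remote_whispercpp(wav_path: str,
                                 base_url: str = "http://beastix:8080") -> str:
    """Send the file to /inference and pull "text" out of the JSON reply."""
    with open(wav_path, "rb") as fh:
        body, content_type = build_multipart(fh.read())
    req = request.Request(
        f"{base_url}/inference",
        data=body,
        headers={"Content-Type": content_type},
        method="POST",
    )
    with request.urlopen(req, timeout=300) as resp:
        return json.loads(resp.read())["text"].strip()
```

The hand-rolled multipart encoder keeps the sketch dependency-free; a real implementation would more likely use `requests` with `files={"file": fh}`.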

whisper-cpp-server running on beastix :8080 (ROCm device 0, gfx1030, RX 6800 XT)
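For context, a ROCm build of the whisper.cpp server along these lines would match the setup above. This is a hedged sketch, not the SETUP.md contents: the CMake flag name, the `gfx1030` target, the server binary name, and the model path are assumptions that should be checked against the whisper.cpp docs for the pinned version.

```shell
# Build whisper.cpp with ROCm/HIP acceleration (flag name varies by version;
# older trees use WHISPER_HIPBLAS, newer ones GGML_HIP or GGML_HIPBLAS).
cmake -B build -DGGML_HIPBLAS=ON -DAMDGPU_TARGETS=gfx1030
cmake --build build -j

# Run the HTTP server on all interfaces, port 8080 (binary name and model
# path are illustrative).
./build/bin/whisper-server -m models/ggml-base.bin --host 0.0.0.0 --port 8080
```

Wrapping the second command in a systemd unit (as the commit's SETUP.md describes) keeps the server up across reboots.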
2026-04-02 01:33:32 +02:00