language_models: Add thinking support to LM Studio provider (#32337)
Umesh Yadav
It works similarly to DeepSeek: the thinking is returned as
`reasoning_content`, and we don't have to send the `reasoning_content` back
in the request.
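For context, a streamed chunk from such a model looks roughly like this (an illustrative sketch following the DeepSeek-style `reasoning_content` convention, not output copied from LM Studio or this PR's code):

```jsonc
{
  "choices": [
    {
      "index": 0,
      "delta": {
        // While the model is thinking, the text arrives here...
        "reasoning_content": "First, break the problem down...",
        // ...and the normal reply field stays empty.
        "content": null
      },
      "finish_reason": null
    }
  ]
}
```

Once thinking ends, later deltas carry the reply in `content` instead, and the accumulated `reasoning_content` is not echoed back in follow-up requests.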
This is an experimental feature which can be enabled from settings like this:
<img width="1381" alt="Screenshot 2025-06-08 at 4 26 06 PM"
src="https://github.com/user-attachments/assets/d2f60f3c-0f93-45fc-bae2-4ded42981820"
/>
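For reference, enabling it might look roughly like the sketch below. The `language_models` → `lmstudio` section is Zed's existing LM Studio configuration; the per-model flag name here is only a guess at what the screenshot shows, not necessarily the exact key:

```jsonc
{
  "language_models": {
    "lmstudio": {
      "available_models": [
        {
          "name": "deepseek/deepseek-r1-0528-qwen3-8b",
          "max_tokens": 8192,
          // Hypothetical flag name; see the screenshot above / the settings
          // schema for the actual key that marks a model as a thinking model.
          "supports_thinking": true
        }
      ]
    }
  }
}
```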
Here is how it looks in use (tested with
`deepseek/deepseek-r1-0528-qwen3-8b`):
<img width="528" alt="Screenshot 2025-06-08 at 5 12 33 PM"
src="https://github.com/user-attachments/assets/f7716f52-5417-4f14-82b8-e853de054f63"
/>
Release Notes:
- Add thinking support to LM Studio provider