
Conversation

@khromov
Collaborator

@khromov khromov commented Dec 14, 2025

I know we said that we could just use the AI SDK, but if we want to test fine-tunes we can't use that, so I still think it's good to have a way to do this, since LM Studio can run anything from HF.

@khromov khromov changed the title Add local provider Add LM Studio provider Dec 14, 2025

Copilot AI left a comment

Pull request overview

This PR adds support for local model testing via LM Studio, enabling developers to test fine-tuned models that aren't available through the AI SDK. The changes refactor the existing gateway-based model selection into a provider abstraction and add LM Studio as a second provider option.

Key changes:

  • Added a provider abstraction layer with support for Vercel AI Gateway and LM Studio (a minimal interface sketch follows this list)
  • Refactored model selection and pricing logic into separate provider modules
  • Enhanced result metadata to track provider type and configuration
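
To make the abstraction concrete, here is a minimal sketch of what such a provider interface could look like. The names below (ModelProvider, listModels, getModel) are illustrative guesses, not taken from the diff:

    import type { LanguageModel } from "ai";

    // Hypothetical shape of the provider abstraction described above;
    // the actual names and signatures in the PR may differ.
    interface ProviderModelChoice {
      id: string;    // model identifier passed back to the provider
      label: string; // human-readable name for the selection UI
    }

    interface ModelProvider {
      name: string;                                 // e.g. "ai-gateway" or "lmstudio"
      listModels(): Promise<ProviderModelChoice[]>; // discover selectable models
      getModel(id: string): LanguageModel;          // resolve to an AI SDK model
    }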

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 9 comments.

  • lib/providers/lmstudio.ts: New provider module for LM Studio integration with local model discovery and selection (a connection sketch follows this list)
  • lib/providers/ai-gateway.ts: Extracted and refactored the Vercel AI Gateway logic from index.ts into a dedicated provider module
  • index.ts: Refactored to use the provider abstraction, added a provider selection UI, and updated result metadata to include provider information
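
To illustrate the lmstudio.ts module: LM Studio exposes an OpenAI-compatible server (by default at http://localhost:1234/v1), so model discovery can be a plain GET /v1/models and the AI SDK can talk to it via @ai-sdk/openai-compatible. A rough sketch under those assumptions; the helper names are hypothetical, not the PR's actual code:

    import { createOpenAICompatible } from "@ai-sdk/openai-compatible";

    // List models available in LM Studio via its OpenAI-compatible endpoint;
    // GET /v1/models responds with { data: [{ id: string }, ...] }.
    async function listLMStudioModels(baseURL: string): Promise<string[]> {
      const res = await fetch(`${baseURL}/models`);
      if (!res.ok) {
        throw new Error(`LM Studio not reachable at ${baseURL} (HTTP ${res.status})`);
      }
      const body = (await res.json()) as { data: { id: string }[] };
      return body.data.map((m) => m.id);
    }

    // Wrap a selected local model as an AI SDK language model instance.
    function getLMStudioModel(baseURL: string, modelId: string) {
      const lmstudio = createOpenAICompatible({ name: "lmstudio", baseURL });
      return lmstudio(modelId);
    }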


Member

@paoloricciuti paoloricciuti left a comment


Overall looks good; one minor comment below, but approving already. I agree we need something like this.

Comment on lines +65 to +89
// Ask whether to override the default LM Studio endpoint.
const customUrl = await confirm({
  message: "Use custom LM Studio URL? (default: http://localhost:1234/v1)",
  initialValue: false,
});

if (isCancel(customUrl)) {
  cancel("Operation cancelled.");
  process.exit(0);
}

let baseURL = "http://localhost:1234/v1";

if (customUrl) {
  const urlInput = await text({
    message: "Enter LM Studio server URL",
    placeholder: "http://localhost:1234/v1",
  });

  if (isCancel(urlInput)) {
    cancel("Operation cancelled.");
    process.exit(0);
  }

  // Fall back to the default if the user submits an empty value.
  baseURL = urlInput || "http://localhost:1234/v1";
}

Instead of asking two questions, we could ask for the LM Studio URL and prefill it with the default, no?
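
For reference, a sketch of the collapsed single-prompt version using the initialValue option of @clack/prompts' text (assuming that option is available in the version the PR uses):

    const urlInput = await text({
      message: "Enter LM Studio server URL",
      initialValue: "http://localhost:1234/v1", // prefilled, so Enter accepts the default
    });

    if (isCancel(urlInput)) {
      cancel("Operation cancelled.");
      process.exit(0);
    }

    const baseURL = urlInput;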

