How is it able to answer the 2nd question? Is there an LLM giving this answer?

#1
by ABeyonder - opened
Hugging Face MCP Course org

😃: Prime factorization of 68
🤖: [2, 2, 17]
😃: who is prime minister of india?
🤖: Narendra Modi

In the MCP client code, I don't see any LLM being used, so how is it able to answer questions like:
Who is the prime minister of India?
What is the capital of India?


@ABeyonder By default, the `InferenceClientModel` class uses the Qwen/Qwen2.5-Coder-32B-Instruct model, as its constructor signature shows:

def __init__(
    self,
    model_id: str = "Qwen/Qwen2.5-Coder-32B-Instruct",
    provider: str | None = None,
    token: str | None = None,
    timeout: int = 120,
    client_kwargs: dict[str, Any] | None = None,
    custom_role_conversions: dict[str, str] | None = None,
    api_key: str | None = None,
    bill_to: str | None = None,
    base_url: str | None = None,
    **kwargs,
):
    ...
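To see why the client answers general-knowledge questions even though no model is named anywhere in the MCP client code, here is a minimal standalone sketch of the default-argument pattern. This is not the real smolagents implementation (the class name `ModelSketch` and its body are illustrative only); it only mimics how constructing the model with no arguments falls back to the Qwen default:

```python
# Sketch of the default-model_id pattern used by InferenceClientModel.
# No network calls are made; this only demonstrates argument defaulting.
class ModelSketch:
    def __init__(self, model_id: str = "Qwen/Qwen2.5-Coder-32B-Instruct"):
        # When the MCP client constructs the model without arguments,
        # this default model is the one that actually generates answers.
        self.model_id = model_id

# The MCP client code never mentions a model, so the default applies:
default_model = ModelSketch()
print(default_model.model_id)  # Qwen/Qwen2.5-Coder-32B-Instruct

# Passing an explicit model_id overrides the default:
custom_model = ModelSketch(model_id="meta-llama/Llama-3.1-8B-Instruct")
print(custom_model.model_id)  # meta-llama/Llama-3.1-8B-Instruct
```

So the answers to "who is prime minister of india?" come from that default Qwen model running behind the Hugging Face inference client, not from any logic in the MCP client itself.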
