
FallnAI / LLM-Inference (Sleeping)
Duplicated from huggingface/inference-playground

LLM-Inference / src / lib / components / InferencePlayground
  • 5 contributors
History: 150 commits
Latest commit: mishig (HF Staff) · "window.parent.postMessage({ modelId: model.id }, parentOrigin);" · 74d5501 · 9 months ago
  • InferencePlayground.svelte · 12.5 kB · "Rm last message on error if empty" · 9 months ago
  • InferencePlaygroundCodeSnippets.svelte · 9.52 kB · "format" · 9 months ago
  • InferencePlaygroundConversation.svelte · 2.22 kB · "fix weird jumping behaviour" · 9 months ago
  • InferencePlaygroundGenerationConfig.svelte · 2.08 kB · "handle when /api/model err" · 10 months ago
  • InferencePlaygroundHFTokenModal.svelte · 4.57 kB · "format" · 9 months ago
  • InferencePlaygroundMessage.svelte · 1.58 kB · "[Fix] Autoresize message boxes on browsers other than Chrome" · 9 months ago
  • InferencePlaygroundModelSelector.svelte · 2.5 kB · "window.parent.postMessage({ modelId: model.id }, parentOrigin);" · 9 months ago
  • InferencePlaygroundModelSelectorModal.svelte · 6.17 kB · "misc" · 9 months ago
  • generationConfigSettings.ts · 934 Bytes · "default steps" · 9 months ago
  • inferencePlaygroundUtils.ts · 2.29 kB · "typo maxTokens vs max_tokens" · 9 months ago
  • types.ts · 607 Bytes · "System message as part of Conversation" · 10 months ago
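The latest commit message quotes `window.parent.postMessage({ modelId: model.id }, parentOrigin);`, the standard pattern for an embedded Space to notify its host page of a state change (here, the selected model). A minimal sketch of that pattern follows; the function name `notifyParentOfModel` and the injectable `win` parameter are hypothetical conveniences for testing, not part of the repo. Only the `postMessage` call itself comes from the commit message.

```javascript
// Hedged sketch of the iframe-to-host notification pattern referenced in the
// commit. Passing an explicit `parentOrigin` instead of "*" ensures the
// message is only delivered if the embedding page matches that origin.
function notifyParentOfModel(model, parentOrigin, win = typeof window !== "undefined" ? window : null) {
  // When there is no window, or the page is not embedded (parent === self),
  // there is no host page to notify.
  if (!win || win.parent === win) return null;
  const payload = { modelId: model.id };
  win.parent.postMessage(payload, parentOrigin);
  return payload;
}
```

The host page would listen with `window.addEventListener("message", …)` and check `event.origin` before trusting `event.data.modelId`.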