Add new Llama 4 model options to Groq (#4278)
Add new Groq model options to the `groqChat` section in `models.json`:

* Add `meta-llama/llama-4-maverick-17b-128e-instruct`
* Add `meta-llama/llama-4-scout-17b-16e-instruct`
parent 36870e94d4
commit 5faff52053
models.json
@@ -575,6 +575,14 @@
       {
         "label": "mixtral-8x7b-32768",
         "name": "mixtral-8x7b-32768"
+      },
+      {
+        "label": "meta-llama/llama-4-maverick-17b-128e-instruct",
+        "name": "meta-llama/llama-4-maverick-17b-128e-instruct"
+      },
+      {
+        "label": "meta-llama/llama-4-scout-17b-16e-instruct",
+        "name": "meta-llama/llama-4-scout-17b-16e-instruct"
       }
     ]
   },
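For reference, the `name` values added above are the model IDs Groq serves through its OpenAI-compatible API. A minimal sketch of calling one of the newly added models directly, assuming the `openai` npm client and a `GROQ_API_KEY` environment variable (illustrative only, not part of this commit):

import OpenAI from "openai";

// Groq exposes an OpenAI-compatible endpoint; the model ID below matches one
// of the new entries added to the groqChat section of models.json.
const groq = new OpenAI({
  apiKey: process.env.GROQ_API_KEY,
  baseURL: "https://api.groq.com/openai/v1",
});

async function main() {
  const completion = await groq.chat.completions.create({
    model: "meta-llama/llama-4-maverick-17b-128e-instruct",
    messages: [{ role: "user", content: "Say hello in one sentence." }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);

Since `label` and `name` carry the same string in each entry, the option shown in the model picker presumably maps one-to-one onto the ID sent in the request's `model` field.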