Model frequently hallucinates when answering a question written in a non-English language.
#45
opened by devopsML
When we give this model a Vietnamese prompt, the output frequently contains random words from other languages (Chinese, Thai, Japanese), which makes the answers confusing, hard to read, and sometimes incorrect.
Here is an example:
Please fix this problem as soon as possible.
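For reference, a minimal sketch of how we call the model with a Vietnamese prompt (the model ID and the prompt below are placeholders, not the exact inputs from our run):

```python
# Hypothetical reproduction sketch: "the-model-id" stands in for this
# repository's actual model ID, and the Vietnamese prompt is only illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "the-model-id"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# A Vietnamese prompt ("Please explain what artificial intelligence is.")
prompt = "Hãy giải thích trí tuệ nhân tạo là gì?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=False)

# The decoded answer intermittently contains Chinese, Thai, or Japanese tokens
# mixed into the Vietnamese text.
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```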
devopsML changed the discussion title from "Model frequently hallucinates (i.e., it randomly inserts words from other languages) when answering a question written in a non-English language." to "Model frequently hallucinates when answering a question written in a non-English language."