They literally admit this shit won't work. Why do it then?
Generative AI technology is designed to generate text in response to any prompt, regardless of whether it “knows” the answer or not. However, by asking DuckAssist to summarize information only from Wikipedia and related sources, the probability that it will “hallucinate” — that is, just make something up — is greatly diminished. In all cases, though, a source, usually a Wikipedia article, will be linked below the summary, often pointing you to a specific section within that article so you can learn more.
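The grounding idea described above can be sketched in a few lines. This is only an illustration of the general technique (confining the model to supplied source text); the function name and prompt wording are hypothetical, not DuckAssist's actual implementation:

```python
def build_grounded_prompt(question: str, source_sentences: list[str]) -> str:
    """Assemble a prompt that restricts the model to the supplied text."""
    context = "\n".join(f"- {s}" for s in source_sentences)
    return (
        "Answer the question using ONLY the source sentences below. "
        "If they do not contain the answer, say you don't know.\n\n"
        f"Source sentences:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# Example: the model only ever sees these two sentences as context.
prompt = build_grounded_prompt(
    "When was DuckDuckGo founded?",
    ["DuckDuckGo is an American software company.",
     "It was founded by Gabriel Weinberg in 2008."],
)
print(prompt)
```

Because the instructions tell the model to decline when the context is insufficient, a well-behaved model has much less room to invent facts than with an open-ended prompt.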
Nonetheless, DuckAssist won’t generate accurate answers all of the time; we fully expect it to make mistakes. Because there’s a limit to how much information the feature can summarize, we select the specific Wikipedia sentences we think are most relevant; inaccuracies can happen if our relevancy function is off and unintentionally omits key sentences, or if there’s an error in the underlying source material. DuckAssist may also make mistakes when answering especially complex questions, simply because it would be difficult for any tool to summarize answers in those instances. That’s why it’s so important for our users to share feedback during this beta phase: there’s an anonymous feedback link next to every DuckAssist answer where you can let us know about any problems, so we can identify where things aren’t working well and quickly make improvements.
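The “relevancy function” failure mode mentioned above is easy to see with a toy ranker. A real system would likely use embeddings; this bag-of-words overlap is purely an assumed sketch to show how mis-ranking can drop a key sentence from the context the model gets to summarize:

```python
import re

def top_k_sentences(query: str, sentences: list[str], k: int = 2) -> list[str]:
    """Rank sentences by word overlap with the query; keep the top k."""
    query_words = set(re.findall(r"\w+", query.lower()))
    return sorted(
        sentences,
        key=lambda s: len(query_words & set(re.findall(r"\w+", s.lower()))),
        reverse=True,
    )[:k]

sentences = [
    "The Eiffel Tower is in Paris.",
    "It was completed in 1889.",
    "Paris is the capital of France.",
]
# With k=2, the third sentence is cut; if the cut sentence had held the
# answer, the downstream summary would be incomplete through no fault of
# the language model itself.
print(top_k_sentences("When was the Eiffel Tower completed?", sentences))
```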