BravenNow
Hallucination (artificial intelligence)
🌐 Entity


Erroneous AI-generated content

πŸ“Œ Topics

  • AI reliability (1)
  • Neuro-symbolic integration (1)
  • Domain knowledge grounding (1)
  • Healthcare AI (1)
  • Medical informatics (1)
  • Large language models (1)

🏷️ Keywords

Language models (1) Β· Neuro-symbolic inference (1) Β· OpenMath ontology (1) Β· Retrieval-augmented generation (1) Β· Mathematical domain knowledge (1) Β· AI hallucination (1) Β· MATH benchmark (1) Β· GRAIL framework (1) Β· Electronic health records (1) Β· Clinical prediction (1) Β· LLM hallucination (1) Β· Hyperbolic representations (1) Β· Medical AI (1) Β· Patient trajectories (1)

πŸ“– Key Information

In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation, or delusion) is a response generated by AI that contains false or misleading information presented as fact. The term draws a loose analogy with human psychology, where a hallucination typically involves false percepts. There is a key difference, however: AI hallucination involves erroneously constructed responses (confabulation) rather than perceptual experiences.

πŸ“° Related News (2)

πŸ”— Entity Intersection Graph

Electronic health record (1) Β· Language model (1) Β· Hallucination (artificial intelligence)

