
🟦 Chain of Knowledge (CoK)

🟦 This article is rated medium
Reading Time: 2 minutes
Last updated on March 11, 2025

Valeriia Kuka

Chain of Knowledge (CoK)[1] addresses limitations in existing prompting techniques, particularly the hallucination issues common in Chain-of-Thought (CoT) prompting. CoK introduces a structured approach to knowledge representation and verification through two main components: evidence triples and explanation hints.

Core Components

Evidence Triples (CoK-ET)

Evidence triples are structured knowledge representations in the format (subject, relation, object). These triples provide verifiable atomic facts that serve as the foundation for reasoning.

Example triples:

  • (water, boilingPoint, 100Β°C)
  • (mammals, class, vertebrates)
  • (photosynthesis, requires, sunlight)
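In code, such triples can be modeled as a simple data structure. Here is a minimal sketch in Python; the `EvidenceTriple` class and its `render` method are illustrative names chosen for this example, not part of the CoK paper:

```python
from typing import NamedTuple

class EvidenceTriple(NamedTuple):
    """A structured atomic fact in (subject, relation, object) form."""
    subject: str
    relation: str
    obj: str

    def render(self) -> str:
        # Format the triple the way it appears in a CoK prompt.
        return f"({self.subject}, {self.relation}, {self.obj})"

triples = [
    EvidenceTriple("water", "boilingPoint", "100Β°C"),
    EvidenceTriple("mammals", "class", "vertebrates"),
    EvidenceTriple("photosynthesis", "requires", "sunlight"),
]

for t in triples:
    print(t.render())
```

Representing triples explicitly like this makes it easy to check each atomic fact against a knowledge base before it is placed in the prompt.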

Explanation Hints (CoK-EH)

Explanation hints provide logical connections between evidence triples to construct valid reasoning chains. They explicitly state how the presented evidence supports the final conclusion.

The CoK Prompting Process

Construct In-Context Exemplars

Choose one or several labeled examples (exemplars) that include:

  • The original query.
  • Annotated evidence triples (either manually created or retrieved from a knowledge base).
  • Explanation hints that tie the triples to the answer.

Here's an example you can use in one-shot CoK prompting:

Example


Question: Determine if a plant can grow in a windowless room.


Evidence Triples:

  1. (plants, require, photosynthesis)
  2. (photosynthesis, requires, sunlight)
  3. (windowless room, lacks, sunlight)

Explanation:


Plants require photosynthesis for growth, which needs sunlight. A windowless room lacks sunlight.


Answer: No, a plant cannot grow in a windowless room due to the absence of sunlight required for photosynthesis.

Generating a Response

The final prompt is constructed by concatenating the in-context exemplars with the test query. This "chain-of-knowledge prompt" guides the model to produce:

  • A set of evidence triples.
  • Corresponding explanation hints.
  • The final answer.
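The concatenation step can be sketched as a small helper function. This is a minimal sketch in Python; `build_cok_prompt` and the trailing `Evidence Triples:` cue are illustrative choices for this example, not prescribed by the paper:

```python
# A one-shot exemplar, matching the plant-growth example above.
EXEMPLAR = """\
Question: Determine if a plant can grow in a windowless room.

Evidence Triples:
1. (plants, require, photosynthesis)
2. (photosynthesis, requires, sunlight)
3. (windowless room, lacks, sunlight)

Explanation: Plants require photosynthesis for growth, which needs sunlight. \
A windowless room lacks sunlight.

Answer: No, a plant cannot grow in a windowless room due to the absence of \
sunlight required for photosynthesis."""

def build_cok_prompt(exemplars: list[str], query: str) -> str:
    """Concatenate in-context exemplars with the test query.

    The trailing 'Evidence Triples:' cue nudges the model to begin its
    answer with a set of triples, as the CoK format expects.
    """
    parts = exemplars + [f"Question: {query}\n\nEvidence Triples:"]
    return "\n\n".join(parts)

prompt = build_cok_prompt([EXEMPLAR], "Can ice float on liquid water?")
print(prompt)
```

The resulting string is the complete chain-of-knowledge prompt, ready to be sent to any language model API.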

Template


Question: [Input question]


Evidence:

  1. (subject, relation, object)
  2. (subject, relation, object)

...


Explanation: [Logical connection between evidence]


Answer: [Conclusion]


[Your question]
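Once the model replies in this format, the structured parts can be recovered programmatically for verification. The following is a hedged sketch; `parse_cok_response` is a hypothetical helper name, and the regular expressions assume the model followed the template above:

```python
import re

def parse_cok_response(text: str) -> dict:
    """Split a CoK-formatted completion into triples, explanation, and answer.

    Assumes the completion follows the template:
    evidence triples in (subject, relation, object) form,
    then 'Explanation: ...', then 'Answer: ...'.
    """
    # Each triple is three comma-separated fields inside parentheses.
    triples = re.findall(r"\(([^,()]+),\s*([^,()]+),\s*([^()]+)\)", text)
    expl = re.search(r"Explanation:\s*(.+?)(?:\n\s*Answer:|\Z)", text, re.S)
    ans = re.search(r"Answer:\s*(.+)", text, re.S)
    return {
        "triples": [tuple(part.strip() for part in t) for t in triples],
        "explanation": expl.group(1).strip() if expl else None,
        "answer": ans.group(1).strip() if ans else None,
    }
```

Extracting the triples this way is what makes CoK's verification step practical: each parsed triple can be checked against a knowledge base before the answer is accepted.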

Limitations of CoK

  1. Requires access to an accurate knowledge base for sourcing evidence triples.

  2. Adds processing overhead for generating and verifying the triples.

  3. Increases implementation complexity, such as maintaining the structured output format.

Applications

CoK is particularly effective in:

  • Technical documentation
  • Scientific reasoning
  • Fact-based analysis
  • Educational content verification

Conclusion

Chain of Knowledge provides a structured methodology for enhancing language model reasoning through explicit knowledge representation and verification. While it requires more computational resources than simpler prompting methods, it offers improved reliability and verifiability in knowledge-intensive applications.

Valeriia Kuka

Valeriia Kuka, Head of Content at Learn Prompting, is passionate about making AI and ML accessible. Valeriia previously grew a 60K+ follower AI-focused social media account, earning reposts from Stanford NLP, Amazon Research, Hugging Face, and AI researchers. She has also worked with AI/ML newsletters and global communities with 100K+ members and authored clear and concise explainers and historical articles.

Footnotes

  1. Wang, J., Sun, Q., Li, X., & Gao, M. (2024). Boosting Language Models Reasoning with Chain-of-Knowledge Prompting. https://arxiv.org/abs/2306.06427 ↩