Meta researchers distill System 2 thinking into LLMs, improving performance on complex reasoning

Thomas_M
Jul 13, 2024

What are "System 2" techniques in LLMs?

Socrates

"System 2" techniques in LLMs are prompting schemes that improve reasoning by forcing the model to generate intermediate steps toward a solution before giving the final answer. The name is borrowed by analogy from deliberate, effortful human "System 2" thinking. Techniques such as chain-of-thought (CoT) prompting help on complex problem-solving, but the extra generated tokens make LLM applications slower and more computationally expensive.
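To make this concrete, here is a minimal sketch of zero-shot chain-of-thought prompting. The `ask_llm` function is a hypothetical placeholder for whatever text-generation API you use; only the prompt construction is the point.

```python
def build_cot_prompt(question: str) -> str:
    """Wrap a question so the model emits intermediate reasoning steps."""
    return (
        f"Q: {question}\n"
        "A: Let's think step by step."  # classic zero-shot CoT trigger phrase
    )

def ask_llm(prompt: str) -> str:
    # Placeholder: a real implementation would call an LLM endpoint here.
    return "Step 1: 6 * 7 means six groups of seven. Therefore, the answer is 42."

prompt = build_cot_prompt("What is 6 * 7?")
response = ask_llm(prompt)
```

Note that the slowdown comes precisely from those extra "Step 1 ... Step 2 ..." tokens: the model must generate (and you must pay for) the reasoning trace, not just the answer.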

Ashley Smith
Jul 13, 2024

What is "System 2 distillation" as proposed by Meta FAIR?

Socrates

System 2 distillation is a technique developed by Meta FAIR that enhances the reasoning of large language models (LLMs) on complex tasks without sacrificing speed or efficiency. It distills the knowledge gained from an LLM's own System 2 reasoning into its fast System 1 generation: the model is fine-tuned on the final answers its System 2 methods produce, without the intermediate reasoning steps, so it can tackle complex tasks more effectively while keeping System 1 speed and computational cost.
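The recipe above can be sketched in a few lines. This is a hedged illustration, not Meta FAIR's implementation: `system2_generate` stands in for any System 2 method (e.g. sampled chain-of-thought), and the self-consistency filter (keep an example only when repeated samples agree on the answer) is one of the unsupervised filters the approach relies on.

```python
from collections import Counter

def extract_final_answer(cot_output: str) -> str:
    """Strip intermediate reasoning; keep only the text after 'answer is'."""
    return cot_output.rsplit("answer is", 1)[-1].strip(" .")

def self_consistent_answer(samples, min_agree=0.75):
    """Majority-vote over sampled answers; None if agreement is too low."""
    answers = [extract_final_answer(s) for s in samples]
    best, count = Counter(answers).most_common(1)[0]
    return best if count / len(answers) >= min_agree else None

def build_distillation_set(inputs, system2_generate, k=4):
    """Collect (input, final answer) pairs for System 1 fine-tuning."""
    dataset = []
    for x in inputs:
        samples = [system2_generate(x) for _ in range(k)]
        answer = self_consistent_answer(samples)
        if answer is not None:           # drop examples the model is unsure of
            dataset.append((x, answer))  # target has NO reasoning chain
    return dataset
```

Fine-tuning the model on `dataset` then teaches its fast System 1 generation to produce the answer directly, skipping the intermediate steps it needed at data-collection time.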

Emily2020
Jul 13, 2024

How do System 1 and System 2 thinking differ?

Socrates

System 1 thinking is fast, intuitive, and automatic, used for recognizing patterns, making quick judgments, and understanding familiar symbols. System 2 thinking is slow, deliberate, and analytical, required for complex problem-solving, such as manipulating abstract symbols, solving mathematical equations, or planning a trip.
