UniSpLLM: An Integrated Approach for Enhancing Reasoning and Education with Large Language Models
DOI: https://doi.org/10.58459/icce.2024.4812

Abstract
Large Language Models (LLMs) show constrained performance when confronted with sets of mathematical problems spanning diverse knowledge concepts. Unlike natural language tasks, math problems vary significantly in how they must be understood and solved, making it challenging for LLMs to consistently generate precise solutions across problem types. To address this limitation, we propose UniSpLLM (a Universal Template integrated with Specific methods using Large Language Models), a strategy devised to bolster LLMs' effectiveness on problems enriched with varied knowledge concepts. UniSpLLM innovates by crafting a Universal Template versatile enough to accommodate any problem type; within it, we design six Specific Methods that can be adapted to different problem types. UniSpLLM thereby distinguishes itself from other prompt-based approaches, which typically cater to a single problem type. By achieving an improvement of nearly 15% on the recent TAL_SAQ6K_EN dataset from AAAI 2024 and surpassing the GPT-4 baseline by almost 17% on the MMLU-Math dataset, UniSpLLM significantly elevates the utility of LLMs in educational settings. Our approach and results hold significant educational value, as they help students acquire diverse thinking skills tailored to various problem types.