Reaction representation learning is of paramount importance for applying deep-learning-based chemistry models to real-world tasks such as synthesis planning. Most prevailing models are pretrained with self-supervised objectives that rely solely on chemical structure information. Because structurally similar reactions can possess entirely distinct properties (e.g., reaction yields), and because synthesis-related tasks are highly heterogeneous, existing approaches are inherently limited in their ability to produce a foundation model for reactions. To address this limitation, we propose HiCLR, a knowledge-induced hierarchical contrastive learning framework for chemical reactions that introduces a relational inductive bias to forge chemically meaningful and broadly applicable reaction fingerprints. Critically, the pretraining scheme, which combines a retrosynthesis prediction objective with a contrastive loss, enables HiCLR to tackle generation-based and understanding-based tasks simultaneously. Comprehensive experiments demonstrate that HiCLR organizes the reaction space into hierarchical global semantic clusters that align well with prior knowledge. Consequently, HiCLR is the first foundation model broadly applicable to diverse synthesis-related tasks, achieving state-of-the-art performance in reaction classification, reaction condition recommendation, reaction yield prediction, synthesis planning, and even molecular property prediction. HiCLR demonstrates the clear benefit of incorporating domain knowledge to guide the learning of neural networks, expediting AI-driven advances in chemistry.