The integration of symmetry, such as permutation equivariance, into Quantum Graph Neural Networks (QGNNs), yielding Equivariant Quantum Graph Neural Networks (EQGNNs), markedly improves generalization on graph-structured data. Despite this advancement, current research has not yet extended rotational equivariance to QGNN frameworks. Furthermore, processing large-scale graph data increases computational complexity due to the large number of inter-node connections, significantly raising the required number of qubits. To address these challenges, we propose a novel Rotationally Equivariant Quantum Graph Neural Network (REQGNN) with a trainable compression encoder and an entanglement-enhanced aggregation mechanism. Adopting quantum fidelity as the evaluation metric, we design a quantum autoencoder that compresses feature dimensionality, substantially lowering the model's qubit requirements while preserving essential global structural information. To achieve rotational equivariance, we propose an entanglement-enhanced layer that incorporates inter-node distance and angle information; this layer entangles qubits according to diverse edge information, thereby refining edge feature extraction. Additionally, an auxiliary entanglement layer is introduced to mitigate the over-smoothing issue. Experimental results demonstrate that REQGNN significantly outperforms GIN, Gra+QSVM, and Gra+QCNN on all metrics across four graph classification datasets and achieves higher accuracy than egoGQNN on the PTC dataset. For graph regression, it also outperforms classical models, including EGNN and EquiformerV2, and reduces the MAE on the Cv target by 20% on average compared with the previous quantum model QGCNN. Our approach offers an effective solution for achieving rotational equivariance while providing a novel perspective for exploring symmetry in graph neural networks (GNNs).