Pricing strategy is crucial for improving the revenue of mobility on-demand (MoD) systems by achieving supply-demand equilibrium across city zones. Modern MoD systems commonly employ order ridesharing and vehicle repositioning to improve the order completion rate while supporting this equilibrium, thereby increasing revenue. However, most existing pricing strategies overlook the effects of ridesharing and repositioning, resulting in supply-demand mismatches and revenue loss. To fill this gap, we propose R2Pricing, a multi-agent reinforcement learning (MARL)-based pricing strategy built on a mutual attention mechanism that accounts for the impact of ridesharing and repositioning. First, we formulate pricing with ridesharing and repositioning as an optimization problem that maximizes overall revenue. Then, we transform it into a MARL model in which each zone's agent makes coupled decisions on the order fare under ridesharing and the vehicle income under repositioning. Next, the agents are clustered by their supply-demand observations and rewards to improve training efficiency. Pricing messages between agents are generated based on mutual information theory and aggregated with an attention mechanism to estimate the impact of price differences among zones. Finally, simulations based on real-world data demonstrate the superiority of R2Pricing over benchmark strategies.
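To make the mutual-attention message aggregation concrete, the following is a minimal sketch of how per-zone pricing agents might weight each other's messages by attention; it is not the authors' implementation, and the module name, dimensions, and tensor layout (MutualAttentionAggregator, OBS_DIM, MSG_DIM) are illustrative assumptions, since the abstract does not specify the architecture.

```python
# A minimal sketch (not the paper's implementation) of attention-based
# aggregation of inter-agent pricing messages. All dimensions, names,
# and the observation/message layout are hypothetical assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

OBS_DIM = 16   # hypothetical per-zone supply-demand observation size
MSG_DIM = 8    # hypothetical pricing-message size

class MutualAttentionAggregator(nn.Module):
    """Each zone agent queries the pricing messages of the other zones and
    aggregates them with attention weights, so zones whose prices differ
    more can contribute more to the pricing decision."""
    def __init__(self, obs_dim: int, msg_dim: int):
        super().__init__()
        self.query = nn.Linear(obs_dim, msg_dim)  # query from own observation
        self.key = nn.Linear(msg_dim, msg_dim)    # keys from others' messages
        self.value = nn.Linear(msg_dim, msg_dim)  # values from others' messages

    def forward(self, obs: torch.Tensor, msgs: torch.Tensor) -> torch.Tensor:
        # obs:  (n_zones, obs_dim)  one observation per zone agent
        # msgs: (n_zones, msg_dim) one pricing message per zone agent
        q = self.query(obs)                          # (n_zones, msg_dim)
        k = self.key(msgs)                           # (n_zones, msg_dim)
        v = self.value(msgs)                         # (n_zones, msg_dim)
        scores = q @ k.t() / (k.size(-1) ** 0.5)     # (n_zones, n_zones)
        # Mask the diagonal so each agent attends only to other zones.
        mask = torch.eye(scores.size(0), dtype=torch.bool)
        scores = scores.masked_fill(mask, float("-inf"))
        weights = F.softmax(scores, dim=-1)
        return weights @ v  # aggregated message context per zone

# Usage: aggregate pricing messages for five hypothetical zones.
agg = MutualAttentionAggregator(OBS_DIM, MSG_DIM)
obs = torch.randn(5, OBS_DIM)
msgs = torch.randn(5, MSG_DIM)
context = agg(obs, msgs)  # (5, MSG_DIM), fed to each zone's pricing policy
```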