Recently, S. Arlot and R. Genuer have shown that a random forest model outperforms its single-tree counterpart in the estimation of $\alpha$-Hölder functions for $\alpha\leq 2$. This supports the idea that ensembles of tree estimators are smoother than single trees. On the other hand, most positive optimality results on Bayesian tree-based methods assume $\alpha\leq 1$. Naturally, one wonders whether Bayesian counterparts of forest estimators are optimal on smoother classes, as has been observed for frequentist estimators when $\alpha\leq 2$. We focus on the problem of density estimation and introduce an ensemble estimator built from the classical (truncated) Pólya tree construction of Bayesian nonparametrics. The resulting Bayesian forest estimator is shown to attain optimal posterior contraction rates, up to logarithmic terms, in the Hellinger and $L^1$ distances for probability density functions on $[0,1)$ of arbitrary Hölder regularity $\alpha>0$. This improves upon previous results for constructions related to the Pólya tree prior, whose optimality was proven only for $\alpha\leq 1$. We also introduce an adaptive version of this new prior: its definition does not require knowledge of $\alpha$, yet it still attains optimality.
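For context, the standard truncated construction underlying this work can be recalled in one display; the notation $I_{\varepsilon_1\cdots\varepsilon_L}$, $Y_\varepsilon$, $a_l$ below is generic and chosen only for illustration, and the precise prior and the way trees are aggregated into a forest are those specified in the main text. A density drawn from a Pólya tree truncated at depth $L$ is piecewise constant on the dyadic cells of $[0,1)$ and satisfies, for $x\in I_{\varepsilon_1\cdots\varepsilon_L}$,
\[
f(x)\;=\;2^{L}\prod_{l=1}^{L} Y_{\varepsilon_1\cdots\varepsilon_l},
\qquad
Y_{\varepsilon_1\cdots\varepsilon_{l-1}0}\sim\mathrm{Beta}(a_l,a_l),
\quad
Y_{\varepsilon_1\cdots\varepsilon_{l-1}1}=1-Y_{\varepsilon_1\cdots\varepsilon_{l-1}0},
\]
with the Beta splitting variables independent across nodes of the dyadic tree; a forest-type prior then aggregates several such trees rather than relying on a single one.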