Stein variational gradient descent (SVGD) \citep{DBLP:conf/nips/LiuW16} is a particle-based technique for Bayesian inference. SVGD has recently gained popularity because it combines the ability of variational inference to handle tall data with the modeling power of non-parametric inference. Unfortunately, the number of particles required to adequately represent a model grows exponentially with the dimensionality of the model. Stein mixtures \citep{nalisnick2017variational} alleviate this exponential growth by letting each particle parameterize a component distribution. However, the inference algorithm proposed by \citet{nalisnick2017variational} can be numerically unstable. We show that their algorithm corresponds to inference with the R\'enyi $\alpha$-divergence for $\alpha = 0$ and that using other values of $\alpha$ can lead to more stable inference.
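For context (notation ours), the R\'enyi variational bound of order $\alpha$ for a variational approximation $q(\theta)$ to the posterior of a model $p(\theta, \mathbf{x})$ is
\begin{equation*}
\mathcal{L}_\alpha(q) \;=\; \frac{1}{1-\alpha} \log \mathbb{E}_{\theta \sim q}\!\left[\left(\frac{p(\theta, \mathbf{x})}{q(\theta)}\right)^{1-\alpha}\right],
\end{equation*}
which equals the exact log-evidence $\log p(\mathbf{x})$ at $\alpha = 0$ and recovers the evidence lower bound $\mathbb{E}_{\theta \sim q}\left[\log p(\theta, \mathbf{x}) - \log q(\theta)\right]$ in the limit $\alpha \to 1$.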
We empirically study the performance of Stein mixtures inferred with different $\alpha$ values on various real-world problems, demonstrating significantly improved results when using $\alpha = 1$, which coincides with using the evidence lower bound (ELBO). We call this instance of our algorithm ELBO-within-Stein. A black-box version of the inference algorithm (for arbitrary $\alpha$) is available in the deep probabilistic programming language NumPyro \citep{phan2019}.
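A minimal usage sketch of NumPyro's \texttt{numpyro.contrib.einstein} interface follows; constructor arguments such as \texttt{num\_stein\_particles} have varied across NumPyro versions and should be read as assumptions rather than a fixed API.
\begin{verbatim}
import jax.numpy as jnp
from jax import random

import numpyro
import numpyro.distributions as dist
from numpyro.contrib.einstein import RBFKernel, SteinVI
from numpyro.infer.autoguide import AutoNormal
from numpyro.optim import Adagrad

# Toy model: infer the mean of a Gaussian with known scale.
def model(obs=None):
    mu = numpyro.sample("mu", dist.Normal(0.0, 10.0))
    numpyro.sample("obs", dist.Normal(mu, 1.0), obs=obs)

# Each Stein particle parameterizes one guide component,
# so the ensemble forms a mixture approximation.
guide = AutoNormal(model)

stein = SteinVI(
    model,
    guide,
    Adagrad(step_size=0.1),
    RBFKernel(),              # SVGD-style kernel between particles
    num_stein_particles=5,    # number of mixture components (assumed kwarg)
)

data = jnp.array([0.3, -0.1, 0.8])
result = stein.run(random.PRNGKey(0), 1000, obs=data)
print(result.params)
\end{verbatim}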