We compute lower bounds on the mean-square error of multiple change-point estimation. In this context, the parameters are discrete and the Cramér-Rao bound is not applicable. Consequently, we focus on computing the Barankin bound (BB), the greatest lower bound on the covariance of any unbiased estimator, which remains valid for discrete parameters. In particular, we compute the multi-parameter version of the Hammersley-Chapman-Robbins bound, which is a Barankin-type lower bound. We first give the structure of the so-called Barankin information matrix (BIM) and derive a simplified form of the BB. We show that the particular case of two change points is fundamental to finding the inverse of this matrix. Several closed-form expressions for the elements of the BIM are given for changes in the parameters of Gaussian and Poisson distributions. The computation of the BB requires finding the supremum of a finite set of positive definite matrices with respect to the Loewner partial ordering. Although each matrix in this candidate set is a lower bound on the covariance matrix of the estimator, the existence of a supremum of this set, i.e., of a tightest bound, might not be guaranteed. To overcome this problem, we compute a suitable minimal upper bound of this set, given by the matrix associated with the Loewner-John ellipsoid of the set of hyper-ellipsoids associated with the candidate lower-bound matrices. Finally, we present numerical examples comparing the proposed approximate BB with the performance achieved by the maximum likelihood estimator.
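For concreteness, one common form of the multi-parameter Hammersley-Chapman-Robbins bound built from a finite set of test points $\boldsymbol{\theta}+\mathbf{h}_1,\dots,\boldsymbol{\theta}+\mathbf{h}_K$ can be sketched as follows; the notation here is generic and illustrative, and need not coincide with the paper's own definitions:
\[
  \mathrm{Cov}_{\boldsymbol{\theta}}\!\left(\hat{\boldsymbol{\theta}}\right)
  \;\succeq\;
  \mathbf{H}\,\mathbf{B}^{-1}\mathbf{H}^{T},
  \qquad
  \mathbf{H} = \left[\mathbf{h}_1,\dots,\mathbf{h}_K\right],
\]
\[
  \left[\mathbf{B}\right]_{k\ell}
  = \mathbb{E}_{\boldsymbol{\theta}}\!\left[
      \frac{p\!\left(\mathbf{x};\boldsymbol{\theta}+\mathbf{h}_k\right)\,
            p\!\left(\mathbf{x};\boldsymbol{\theta}+\mathbf{h}_\ell\right)}
           {p\!\left(\mathbf{x};\boldsymbol{\theta}\right)^{2}}
    \right] - 1 ,
\]
where the $\mathbf{h}_k$ are discrete displacements of the change-point locations and $p(\mathbf{x};\boldsymbol{\theta})$ is the likelihood. Each admissible choice of test points yields one candidate lower-bound matrix, and the Barankin bound corresponds to the supremum over such candidates, which motivates the Loewner-John ellipsoid construction mentioned above when that supremum does not exist.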