In this paper, we analyze the performance of a cloud radio access network (CRAN) consisting of multiple randomly distributed remote radio heads (RRHs) and a macro base station (MBS), each equipped with multiple antennas. We use tools from stochastic geometry to model the spatial distribution of RRHs and analyze the resulting system performance. In contrast to previous works on CRAN that adopt the Poisson point process (PPP) model for the spatial distribution of RRHs, we consider a more realistic Matérn hard-core point process (MHCPP) model, which imposes a minimum separation (referred to as the hard-core distance) between any two RRHs so that no two RRHs lie too close to each other. To compare the system performance of CRAN under different transmission strategies, three RRH selection schemes are considered: 1) best RRH selection (BRS); 2) all-RRH participation (ARP); and 3) nearest RRH selection (NRS). Considering downlink transmission, the ergodic capacity, outage probability, and system throughput of CRAN are analytically characterized for each RRH selection scheme. The presented results demonstrate that, compared to the PPP model, increasing the hard-core distance leads to a higher outage probability and degrades the ergodic capacity. Furthermore, when the same total transmit power is consumed, the BRS scheme provides the best outage performance, whereas the ARP scheme performs best when the same transmit SNR at each RRH is assumed. Moreover, it is shown that the hard-core distance has a more significant impact on systems with a higher intensity of PPP-distributed candidate points, and that in the large hard-core-distance regime, increasing the intensity of candidate points yields only a small improvement in outage performance. We extend our analysis to the multiuser case with zero-forcing (ZF) precoding, where it is proven that the multiuser results reduce to the single-user results derived in this work by setting K=1.
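To make the MHCPP model concrete, the following is a minimal sketch of how such a point pattern can be sampled by dependent thinning of a PPP (the standard Matérn type-II construction: keep a candidate point only if no other candidate within the hard-core distance carries a smaller independent mark). All function and parameter names here are illustrative, not from the paper, and the window, intensity, and hard-core distance are arbitrary example values.

```python
import numpy as np

def matern_type2(lam, d, width, height, rng):
    """Sample a Matérn type-II hard-core process on [0,width] x [0,height].

    Candidate RRH locations are drawn from a homogeneous PPP of
    intensity `lam`; each candidate gets an independent uniform mark,
    and a candidate is deleted if any other candidate within the
    hard-core distance `d` has a smaller mark. The retained points
    are therefore guaranteed to be at least `d` apart.
    """
    # Number of candidate points in the window follows a Poisson law
    n = rng.poisson(lam * width * height)
    pts = rng.uniform([0.0, 0.0], [width, height], size=(n, 2))
    marks = rng.uniform(size=n)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        dist = np.linalg.norm(pts - pts[i], axis=1)
        # Delete point i if a candidate within d carries a smaller mark
        rival = (dist < d) & (marks < marks[i])
        rival[i] = False
        if rival.any():
            keep[i] = False
    return pts[keep]
```

Note that thinning is decided against all *candidate* points, not against already-retained ones; this is what distinguishes the type-II process and makes the retained intensity analytically tractable.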