Discovery of causal structure from observational data in the presence of latent variables is an active area of research. Constraint-based causal discovery algorithms iteratively perform many statistical independence tests on the data to constrain the set of structures consistent with the test results; they then output a causal model, which may contain latent variables that influence one or more measured variables. However, these algorithms do not provide probability estimates for the output causal model as a whole or for the causal relationships within it. In contrast, Bayesian methods can provide such probabilities, but they are often computationally infeasible when latent variables must be modeled. We introduce a Bayesian method for estimating the probability that a set of independence constraints is jointly correct, which is directly related to the probability that the causal model consistent with those constraints is correct. This, in turn, helps identify the most probable structure among many candidate structures. The proposed method is evaluated on a set of real Bayesian network structures using simulated data.
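To illustrate the general idea of assigning a Bayesian probability to an independence constraint (rather than the specific method introduced above), the sketch below compares the Dirichlet-multinomial marginal likelihoods of an "independent" model and a "dependent" model for a pair of binary variables. The function names, the symmetric Dirichlet prior `alpha`, and the 0.5 prior on independence are all illustrative assumptions, not details taken from the paper.

```python
import math
import random

def log_marglik(counts, alpha=1.0):
    """Dirichlet-multinomial log marginal likelihood of a vector of counts
    under a symmetric Dirichlet(alpha) prior."""
    a_sum = alpha * len(counts)
    n = sum(counts)
    out = math.lgamma(a_sum) - math.lgamma(a_sum + n)
    for c in counts:
        out += math.lgamma(alpha + c) - math.lgamma(alpha)
    return out

def prob_independent(xs, ys, alpha=1.0, prior_indep=0.5):
    """Posterior probability that binary X and Y are independent,
    obtained by Bayesian comparison of two models:
    M_indep (separate marginals) vs. M_dep (full joint over (X, Y))."""
    joint = [[0, 0], [0, 0]]
    for x, y in zip(xs, ys):
        joint[x][y] += 1
    row = [sum(joint[0]), sum(joint[1])]                          # marginal counts of X
    col = [joint[0][0] + joint[1][0], joint[0][1] + joint[1][1]]  # marginal counts of Y
    ll_indep = log_marglik(row, alpha) + log_marglik(col, alpha)
    ll_dep = log_marglik([c for r in joint for c in r], alpha)
    log_odds = math.log(prior_indep) - math.log(1 - prior_indep) + ll_indep - ll_dep
    return 1.0 / (1.0 + math.exp(-log_odds))

# Illustrative use: independent samples yield a high posterior for
# independence; perfectly correlated samples yield a low one.
random.seed(0)
xs = [random.randint(0, 1) for _ in range(500)]
ys = [random.randint(0, 1) for _ in range(500)]
print(prob_independent(xs, ys))  # high for independent data
print(prob_independent(xs, xs))  # low for dependent data
```

A naive joint probability for a set of such constraints could multiply the individual posteriors, but that ignores their dependence, which is precisely the issue the abstract's method is designed to address.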