The Bayesian framework is essentially a way to improve the parameter estimates of a stochastic process as evidence accumulates. Suppose you have some data and you have assumed that it came from a specific stochastic process (e.g. Gaussian), and you have estimated the parameters using either the method of moments (MOM) or maximum likelihood estimation (MLE). Later, you collect more data on the process and want to improve those estimates. What do you do? Remember that parameter estimates themselves have a probability distribution. From the first data set you obtained one estimate and its distribution. Given that you already have a distribution for the parameter (called the prior distribution), you derive an improved distribution (the posterior distribution) for the parameter, conditioned on the second data set: posterior ∝ likelihood × prior. You can even just assume a prior distribution for the parameters and use the very first data set to build the posterior.
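For intuition, here is a minimal sketch in base R of the conjugate Normal case (mean unknown, standard deviation assumed known), where the posterior from the first data set is reused as the prior for the second. The specific numbers, the assumed sigma, and the helper `update_normal` are illustrative assumptions, not a fixed recipe.

```r
## Minimal sketch: conjugate Normal-Normal updating in base R.
## All values below are illustrative assumptions.

set.seed(42)
sigma <- 2   # standard deviation of the process, assumed known here

## Posterior mean and sd of mu, given a Normal prior and data x
update_normal <- function(prior_mean, prior_sd, x, sigma) {
  n         <- length(x)
  post_prec <- 1 / prior_sd^2 + n / sigma^2                          # posterior precision
  post_mean <- (prior_mean / prior_sd^2 + sum(x) / sigma^2) / post_prec
  c(mean = post_mean, sd = sqrt(1 / post_prec))
}

## First data set: start from a vague prior and obtain a first posterior
x1    <- rnorm(30, mean = 5, sd = sigma)
post1 <- update_normal(prior_mean = 0, prior_sd = 10, x = x1, sigma = sigma)

## Second data set: the first posterior becomes the new prior
x2    <- rnorm(50, mean = 5, sd = sigma)
post2 <- update_normal(prior_mean = post1[["mean"]], prior_sd = post1[["sd"]],
                       x = x2, sigma = sigma)

print(post1)  # distribution of the mean after the first data set
print(post2)  # tighter distribution after folding in the second data set
```

The same idea carries over to non-conjugate models; the closed-form update is just replaced by numerical or simulation-based methods (e.g. MCMC), which is what the packages below provide.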
Plenty of ready-to-use packages for Bayesian estimation are available in R, which is free software. You can explore the CRAN link below: