If the entropy shows a sudden, abrupt drop at the critical point, and if the size of the drop increases with increasing system size, can I conclude that the corresponding system is undergoing a first-order phase transition?
Distinguishing first- and second-order phase transitions is always tricky. Whenever several quantities show a discontinuity at the same point, that is a good first hint of a first-order transition. But it would be rather strange if only the entropy showed such behaviour. This is particularly problematic since, at least for Monte Carlo simulation methods, the entropy is one of the least accessible quantities. It would in any case be good to know what happens with other quantities, and to see whether you can identify the high-temperature phase and the low-temperature phase and somehow get them to coexist.
A phase transition is associated with a discontinuity of the Gibbs surfaces. You can calculate the volume or the entropy as you go from one phase surface to the other as:
V(p,T) = (\partial G / \partial p)_T
S(p,T) = -(\partial G / \partial T)_p
where G is the Gibbs free energy, T the temperature, and p the pressure.
If both surfaces join smoothly, then both S and V are continuous and the phase transition is of second or higher order; if V(T,p) or S(T,p) is discontinuous, the transition is first order.
From the simulation point of view, a curve of V as a function of p at fixed T is preferable, since V and p are conjugate variables and V as a function of p is therefore monotonic. A curve of V versus T at fixed p may also contain enough information, but it is not necessarily monotonic, since V and T are not conjugate variables.
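To make this concrete, here is a minimal sketch of that recipe, assuming you have G tabulated on a (p, T) grid from your simulations; the grid and the toy free-energy surface below are purely illustrative placeholders, not a real model:

```python
import numpy as np

# Minimal sketch: recover V and S from a tabulated Gibbs free energy G(p, T)
# via V = (dG/dp)_T and S = -(dG/dT)_p, then look for jumps across the
# suspected transition line.  The grid and the toy G below are illustrative only.

p = np.linspace(1.0, 2.0, 101)            # pressure grid (arbitrary units)
T = np.linspace(0.5, 1.5, 101)            # temperature grid (arbitrary units)
P, TT = np.meshgrid(p, T, indexing="ij")
G = -TT * np.log(TT) + 0.8 * P            # placeholder for your simulated G(p, T)

V = np.gradient(G, p, axis=0)             # V(p, T) = (dG/dp)_T
S = -np.gradient(G, T, axis=1)            # S(p, T) = -(dG/dT)_p

# A discontinuity in V or S along an isotherm/isobar points to a first-order
# transition; smooth behaviour points to second or higher order.
print(V[50, 50], S[50, 50])
```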
Thank you, Professor Leyvraz, for taking the pains to write an excellent answer. I appreciate it very much. It was nice to hear from you. When I was working on aggregation I read your articles a lot; reading a paper is like getting to know its author, and your papers helped me a lot in pursuing my research. Indeed, the difference between first- and second-order phase transitions is quite tricky. I am now looking at other parameters as well. Take care.
Hello Prof. Kamberaj, thanks a lot for your answer. Unfortunately, I cannot calculate the free energy of the system. Your suggestion is highly appreciated. All the best.
Now I am a bit puzzled: how do you know the entropy if you cannot calculate the free energy? After all, I would believe that, for any reasonable system, you ought to be able to compute the internal energy U, and surely you have the temperature T as well. But if you additionally have the entropy, then
F = U-TS
gives the free energy. Of course, my comment concerning the entropy being a rather inaccessible quantity referred to the fact that it requires rather intricate techniques (thermodynamic integration, or Wang-Landau methods) to be obtained.
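For what it is worth, here is a minimal sketch of the thermodynamic-integration route: integrate the measured internal energy over inverse temperature to get beta*F, then use S = beta*(U - F). The U(beta) data and the crude reference point below are placeholders standing in for what a Monte Carlo run would provide:

```python
import numpy as np

# Minimal sketch of thermodynamic integration over inverse temperature:
# beta*F(beta) = beta0*F(beta0) + integral_{beta0}^{beta} U(beta') dbeta',
# then S = beta * (U - F) in units where k_B = 1.

beta = np.linspace(0.1, 2.0, 200)                # inverse temperatures
U = -np.tanh(beta)                               # toy internal energy, placeholder data

betaF0 = beta[0] * U[0]                          # crude reference value, illustration only
betaF = betaF0 + np.concatenate(([0.0], np.cumsum(
    0.5 * (U[1:] + U[:-1]) * np.diff(beta))))    # trapezoidal integration of U dbeta

F = betaF / beta
S = (U - F) * beta                               # S/k_B = beta * (U - F)

print(S[-1])                                     # entropy at the lowest temperature
```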
Probability is not a physical parameter, whereas the temperature definitely is.
There is also no "Shannon Entropy" (it is nothing more than a joke), whereas the entropy is known to be proportional to the logarithm of a handy algebraic function of temperature. Indeed, the expression for the latter function can be derived using the toolbox of probability theory... But how exactly you derive it has nothing to do with actual physical problems...
In this connection, could you please tell us in more detail A) what your problem is and B) how you plan to solve it?
At the critical point there are no abrupt transitions. You need to give some additional information. You say you are calculating the Shannon entropy: what is your system? Are you confident you know all the microstates needed to calculate their probabilities?
I don't know what makes you say that the Shannon entropy is nothing more than a joke. Have you worked with it? In physics there should not be any room for faith; everything can be put under scrutiny. We can all criticize, but not simply brush things away. I have been working with it and I find it extremely consistent with the expected findings. You know, percolation has been used as a model for continuous phase transitions for more than 60 years, yet we have had trouble defining its entropy. A model for a phase transition without a way to obtain the entropy is a weak model. Of course you cannot define a thermal entropy like the Boltzmann or Gibbs entropy; percolation being a probabilistic model, the best hope is to use the Shannon entropy. However, one has to keep in mind that the entropy should be zero where the order parameter is maximal and vice versa, since entropy measures the degree of disorder and the order parameter quantifies the extent of order; the two cannot both be zero at the same point. Our study proves that the percolation transition is exactly like the paramagnetic-to-ferromagnetic transition (please see the figure above, where we plot this for explosive percolation on an Erdős-Rényi network).
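If it helps the discussion, here is a rough sketch of one common way to define such an entropy (not necessarily the exact prescription used in the study quoted above): take w_s, the probability that a randomly chosen node belongs to a cluster of size s in an Erdős-Rényi graph, and compute H = -sum_s w_s ln w_s while scanning the mean degree. The networkx library is assumed to be available; any graph library would do.

```python
import numpy as np
import networkx as nx   # assumed available; any graph library would do

# Minimal sketch: Shannon entropy of the cluster-size distribution in an
# Erdos-Renyi graph, scanned across the mean degree <k>.
# w_s = n_s * s / N is the probability that a randomly chosen node sits in a
# cluster of size s; H = -sum_s w_s log w_s.

N = 2000
for c in np.linspace(0.2, 2.0, 10):               # mean degree <k> = c
    G = nx.gnp_random_graph(N, c / N, seed=1)
    sizes = np.array([len(comp) for comp in nx.connected_components(G)])
    w = np.bincount(sizes)[1:] * np.arange(1, sizes.max() + 1) / N
    w = w[w > 0]
    H = -np.sum(w * np.log(w))
    print(f"<k> = {c:.2f}   H = {H:.3f}")
```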
Yes, I have been working on the foundations of thermodynamics for about a decade already.
If you mean the formula S = k ln(W) when speaking of "Shannon Entropy", then the formula is old and good. It serves as the mathematical basis for such tried-and-true physical theories as statistical mechanics and quantum physics.
The author of the formula is Ludwig Boltzmann. Later, Max Planck realized that Boltzmann's formula is capable of working correctly. This is why the formula S = k ln(W) is conventionally known as the Boltzmann-Planck formula.
There is but one very significant point in connection with the formula in question.
Ludwig Boltzmann ingeniously guessed the formula, but never formally derived or analyzed it in theoretical-physical terms.
Likewise, Max Planck never tried to bridge this conceptual gap. He solely reiterated Boltzmann's guess that the W under the logarithm sign ought to be a kind of "probability"... He gave no firm physical account of "probability of WHAT" it might be. And neither did his numerous followers...
In connection with this, there is still no conventional answer to the poser of "What is Entropy?"...
Now, let us come back to Claude Shannon, who was working on the mathematical theory of communication.
Shannon formally arrived at a formula similar to "S = k ln(W)", but in the sense of information theory, with S standing for the average amount of information produced by a stochastic source of data:
Shannon entropy (see attached image): S = -\sum_i p_i \log p_i
To dub the latter S an "entropy" was a well-known joke by the well-known mathematician John von Neumann, who was, on the one hand, a close friend of Shannon and, on the other, successful in bridging the numerous logical gaps in the theoretical physics of his time. Von Neumann stated that "nobody knows what entropy in fact is", and that was the ultimately competent estimate of the state of the art...
Hence, to dub the entropy employed in the natural sciences a "Shannon Entropy" is a joke. To call the corresponding formula the "Gibbs entropy" or even the "von Neumann entropy" would be the pertinent choice, to my mind...
von Neumann entropy (see attached image): S = -k Tr(\rho \ln \rho)
With my friendly collegial greetings and best wishes,
You state in your rebuttal to me: "However, one has to keep in mind that entropy should be zero where order parameter is maximum and vice versa since entropy measures degree of disorder and order parameter quantifies the extent of order."
The physical notion of entropy actually has nothing to do with "order/disorder". Entropy is an excellent dub for the ubiquitous hindrances/obstacles/hurdles on the pathway of the realistic process under study.
Moreover, entropy can come to zero only mathematically, not physically/chemically/biologically/etc., for the more progress there is in the process under study, the more hindrances arise.
Zero entropy, i.e., zero hindrances, means nothing more and nothing less than that there is zero progress in your process.
Still, the good news (communicated to us by Carnot, Clapeyron, Clausius and Lord Kelvin) is that the hindrances/obstacles/hurdles to the progress have to reach their maximum.
Then, if the driving force of your process is sufficient to overcome that maximum, the aim your process is heading for might be reached.
You would ask, how might we estimate beforehand, whether we have the driving force enough to achieve the goal of our process?
To answer the poser you ought to analyze the potential energy of your system in detail, for the potential energy ought to be the Energy Stock of the system.
Indeed, what is the potential energy of a system?
It is the energy of interaction between/among the subsystems. If you would like to go to the atomic/molecular level, you'd have to analyze the energy of interatomic/intermolecular interactions...
Further, you state: 'Percolation being probabilistic model, the best hope is to use Shannon entropy.'
First of all, percolation is a physical phenomenon of "filtering through" or "trickling through" some medium.
Sure, it is possible to model this event using the probability theory. But, as far as I know, there is still no consistent analytical model of it, for there is no consistent, detailed and thorough physical-theoretical consideration of this very interesting and important phenomenon.
The computer simulations do help in describing, but regretfully not in understanding of the percolation...
With this entirety in mind, my best hope is that active, proactive young colleagues will soon come and perform a true theoretical-physical analysis of the phenomenon, using mathematical physics rather than physical mathematics, unlike what has happened until now...
I see absolutely no problem with the Shannon entropy. The problem only arises when you try to confuse it with thermal entropy; it has nothing to do with thermal entropy, it is just an equivalent counterpart of it. Consider a system where all the microstates are equally likely, like a fair die: then the Shannon entropy takes its maximum value. On the other hand, if you make the die heavily biased so that you always get the same number, the Shannon entropy is equal to zero. If we move the distribution of probabilities from uniform toward uneven, the entropy interpolates nicely between these two extreme values. When you get the maximum entropy you are most confused, or ignorant, about the outcome; at the other extreme you are least confused, or most knowledgeable, about the outcome. The former you can regard as the disordered system and the latter as the ordered one. It works fine in my opinion, but one has to be careful about mixing it with thermal entropy.
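Just to illustrate that point numerically, here is a minimal sketch; the biased probabilities below are made up purely for the example:

```python
import numpy as np

# Minimal illustration: Shannon entropy is maximal for a fair die and drops
# to zero as the distribution becomes completely biased.

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # 0 * log 0 is taken as 0
    return -np.sum(p * np.log(p))

fair = np.ones(6) / 6                   # fair die: maximum ignorance
biased = np.array([0.95, 0.01, 0.01, 0.01, 0.01, 0.01])   # heavily biased die
certain = np.array([1, 0, 0, 0, 0, 0])  # one outcome only: zero entropy

for name, p in [("fair", fair), ("biased", biased), ("certain", certain)]:
    print(name, shannon_entropy(p))
```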
Many sincere thanks for your prompt and useful answer!
I agree with you 100%, but for one very important point.
What you describe are skillful mathematical tricks to describe your experimental data fully and correctly. What you arrive at as a result is undoubtedly a great finding, because you now possess a valid model for your system.
But it remains unclear how far your model is a truly universal one.
Remarkably, a more general way of proceeding is entirely possible.
In fact, you do not need any artificial probabilistic reasoning, any appeal to some "mystic" order/disorder, or similar very nice but purely emotional moves.
The universal functional dependence of thermodynamic entropy on absolute temperature has been known for about 100 years; it has remained widely unknown, but it has been published and is therefore available.
With this entirety at hand it is entirely possible to formally reconstruct the actual probability distribution function. It turns out to be just the Beta distribution, which can mathematically be boiled down to the Gaussian normal distribution; the latter is just a limiting case of the Beta...
It is a general result, for studying a system consisting of a "huge number" of atoms/molecules (whatever) is essentially a fuzzy problem: you can never arrive at a general definition of what "HUGE" actually means. This is just the so-called Sorites, or Heap, paradox...
It is well known that working with fuzzy systems means working with "possibilities" rather than "probabilities"... Meanwhile, it is formally possible to boil this fuzzy stuff down to formal probability theory. This is how the Beta probability distribution function comes into the mathematical play.
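If it is of interest, the Beta-to-Gaussian limit mentioned above is easy to check numerically. Here is a minimal sketch; scipy is assumed to be available, and the shape parameters are chosen arbitrarily for illustration:

```python
import numpy as np
from scipy import stats   # assumed available for the Beta and normal densities

# Minimal numerical check: for large, equal shape parameters the Beta(a, a)
# density approaches a Gaussian with matching mean and variance.

x = np.linspace(0.01, 0.99, 199)
for a in (5, 50, 500):
    beta_pdf = stats.beta.pdf(x, a, a)
    mean, var = 0.5, 1.0 / (4 * (2 * a + 1))     # mean and variance of Beta(a, a)
    gauss_pdf = stats.norm.pdf(x, loc=mean, scale=np.sqrt(var))
    max_diff = np.max(np.abs(beta_pdf - gauss_pdf))
    print(f"a = {a:4d}   max |Beta - Gauss| = {max_diff:.4f}")
```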
...Howbeit, my intention here is not to speechify about the main problem; it is just to turn your attention to the wealth of possible variants of consideration...