Thursday, June 9, 2016

IB Physics 2015 Higher Level Paper 2 Question 4

Question:
This question is about the entropy of an ideal gas in a cyclic process. Students are expected to explain whether there is any change in the entropy of the gas immediately after it has completed the cyclic process.


Mark Scheme: (1) The entropy is unchanged. (2) The gas returned to its original state. 

Comments
Based on the mark scheme, one may expect a reasonably good answer to be “the entropy is unchanged because the gas has returned to its original state.” However, physics teachers should also accept explanations such as “entropy is a state function, so its change is path independent.” Alternatively, Feynman mentions that “we have found another quantity which is a function of the condition, i.e., the entropy of the substance. Let us try to explain how we compute it, and what we mean when we call it a function of the condition (Feynman et al., 1963, section 44-6 Entropy).” Simply phrased, entropy is a function of the condition of the gas, and the change of entropy depends only on the initial and final conditions of the gas.
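As a simple worked equation (not part of the mark scheme, and assuming an ideal gas with constant molar heat capacity C_V), the entropy change of n moles of gas between an initial state (T_i, V_i) and a final state (T_f, V_f) along any reversible path is

\Delta S = n C_V \ln\frac{T_f}{T_i} + n R \ln\frac{V_f}{V_i}

For the complete cycle ABCDA, the final state coincides with the initial state, so T_f = T_i and V_f = V_i, both logarithms vanish, and \Delta S = 0.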

On the other hand, the cyclic process ABCDA is not a Carnot cycle. Furthermore, the definition of the entropy change as dQ/T holds only if the process is reversible. In general, a reversible process is a quasi-static process whose direction can be “reversed” by means of an infinitesimal change. In Cheng’s (2006) words, “[a]ny arbitrary reversible cyclic process can be approximated by a very large number of infinitesimal Carnot cycles (p. 48).” In other words, the combination of this large number of Carnot cycles is a “good approximation” to the cyclic process when the temperature difference ΔT between the isothermal processes of any two neighboring Carnot cycles approaches zero. Importantly, the adiabatic segments of neighboring Carnot cycles cancel each other because they are traversed in opposite directions.
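Written as equations (a standard construction rather than a quotation from Cheng), each infinitesimal Carnot cycle absorbs heat δQ_H at temperature T_H and rejects heat δQ_C at temperature T_C such that

\frac{\delta Q_H}{T_H} - \frac{\delta Q_C}{T_C} = 0

Summing over the large number of infinitesimal Carnot cycles and letting ΔT approach zero gives the Clausius equality for any reversible cycle,

\oint \frac{dQ_{\text{rev}}}{T} = 0

and it is precisely this equality that allows the entropy S to be defined as a state function.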

Feynman insights?:
There are at least three insights that we can learn from Feynman’s lectures.

1. Entropy is a state function of temperature and volume: In the words of Feynman, “[w]e can, therefore, say that there is a certain function, which we call the entropy of the substance, that depends only on the condition, i.e., only on the volume and temperature (Feynman et al., 1963, section 44–6 Entropy).” More importantly, Feynman elaborates that “[c]onsider the system in two different conditions, much as we had in the experiment where we did the adiabatic and isothermal expansions. (Incidentally, there is no need that a heat engine has only two reservoirs, it could have three or four different temperatures at which it takes in and delivers heats, and so on.) We can move around on a pV diagram all over the place, and go from one condition to another. In other words, we could say the gas is in a certain condition a, and then it goes over to some other condition, b, and we will require that this transition, made from a to b, be reversible. Now suppose that all along the path from a to b we have little reservoirs at different temperatures, so that the heat dQ removed from the substance at each little step is delivered to each reservoir at the temperature corresponding to that point on the path. Then let us connect all these reservoirs, by reversible heat engines, to a single reservoir at the unit temperature. When we are finished carrying the substance from a to b, we shall bring all the reservoirs back to their original condition. Any heat dQ that has been absorbed from the substance at temperature T has now been converted by a reversible machine, and a certain amount of entropy dS has been delivered at the unit temperature as follows: dS = dQ/T (Feynman et al., 1963, section 44–6 Entropy).” Essentially, the entropy of a system is a state function, and establishing this result relies on the idealization of reversible heat engines. (A simple numerical illustration of this path independence appears after the third insight below.)

2. A reversible process is a theoretical idealization: According to Feynman, “we will lose something if the engines contain devices in which there is friction. The best engine will be a frictionless engine. We assume, then, the same idealization that we did when we studied the conservation of energy; that is, a perfectly frictionless engine (Feynman et al., 1963, section 44-3 Reversible engines).” That is, the reversible heat engines should be ideally frictionless. Similarly, in the words of Landau and Lifshitz (1980), “[r]eversible processes are those in which the entropy of the closed system remains constant, and which can, therefore, take place in the reverse direction. A strictly reversible process is, of course, an ideal limiting case; processes actually occurring in Nature can be reversible only to within a certain degree of approximation (p. 33).” In short, a reversible process is an idealized process that cannot be exactly realized in the real world. Thus, one may explain that the entropy of the ideal gas is unchanged because the process is reversible, but this explanation rests on a theoretical idealization and approximation.

3. The entropy of an isolated system or universe increases: Feynman clarifies that “in any process that is irreversible, the entropy of the whole world is increased. Only in reversible processes does the entropy remain constant. Since no process is absolutely reversible, there is always at least a small gain in the entropy; a reversible process is an idealization in which we have made the gain of entropy minimal (Feynman et al., 1963, section 44-6 Entropy).” An important point here is that, in practice, there is always some increase in the entropy of the universe. Furthermore, in Feynman Lectures on Computation, Feynman (1996) elaborates that “[f]or an irreversible process, the equality is replaced by an inequality, ensuring that the entropy of an isolated system can only remain constant or increase (p. 141).” More importantly, the entropy of an isolated system can remain constant or increase depending on whether the process is reversible or irreversible.
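Stated as an equation (standard textbook form, not a quotation from Feynman), the Clausius inequality for an infinitesimal process is

dS \ge \frac{dQ}{T}

with the equality holding only for a reversible process. For an isolated system, dQ = 0, so \Delta S \ge 0: the entropy remains constant if the process is reversible and increases if it is irreversible.

To make the path independence mentioned in the first insight concrete, here is a minimal numerical sketch. The choice of a monatomic ideal gas, the two states, and the two reversible paths are hypothetical choices for illustration; they are not taken from the examination paper or from Feynman. The sketch integrates dQ/T along two different reversible paths between the same pair of states and shows that both integrals agree with the analytic result n C_V ln(T2/T1) + n R ln(V2/V1).

# Numerical check that the integral of dQ/T between two states of an ideal gas
# is independent of the reversible path taken (hypothetical states and paths).
import numpy as np

R = 8.314              # gas constant in J/(mol K)
n = 1.0                # amount of gas in mol (hypothetical)
Cv = 1.5 * R           # molar heat capacity at constant volume (monatomic ideal gas)

def dS_isothermal(T, V_start, V_end, steps=100000):
    # At constant T, dQ = p dV = (nRT/V) dV, so dS = dQ/T = nR dV/V
    V = np.linspace(V_start, V_end, steps)
    return np.trapz(n * R / V, V)

def dS_isochoric(V, T_start, T_end, steps=100000):
    # At constant V, dQ = n Cv dT, so dS = dQ/T = n Cv dT/T
    T = np.linspace(T_start, T_end, steps)
    return np.trapz(n * Cv / T, T)

T1, V1 = 300.0, 0.010  # initial state (K, m^3), hypothetical values
T2, V2 = 450.0, 0.025  # final state (K, m^3), hypothetical values

# Path A: heat the gas at constant volume V1, then expand it isothermally at T2
path_a = dS_isochoric(V1, T1, T2) + dS_isothermal(T2, V1, V2)
# Path B: expand the gas isothermally at T1, then heat it at constant volume V2
path_b = dS_isothermal(T1, V1, V2) + dS_isochoric(V2, T1, T2)
# Analytic entropy change for an ideal gas between the two states
analytic = n * Cv * np.log(T2 / T1) + n * R * np.log(V2 / V1)

print(path_a, path_b, analytic)  # all three values agree: entropy is a state function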

Note:
1. In the words of Clausius (1862), “I prefer going to the ancient languages for the names of important scientific quantities, so that they mean the same thing in all living tongues. I propose, accordingly, to call S the entropy of a body, after the Greek word [τροπή], ‘transformation’. I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance, that an analogy of denominations seems to me helpful.”

2. You may want to take a look at this website:
http://feynman-answer.blogspot.sg/2016/06/entropy-remains-unchanged-or-increases.html

References:
1. Cheng, Y.-C. (2006). Macroscopic and Statistical Thermodynamics. Singapore: World Scientific.
2. Feynman, R. P., Leighton, R. B., & Sands, M. L. (1963). The Feynman Lectures on Physics, Vol I: Mainly mechanics, radiation, and heat. Reading, MA: Addison-Wesley.
3. Feynman, R. P. (1996). Feynman Lectures on Computation. Reading, MA: Addison-Wesley.
4. Landau, L. D., & Lifshitz, E. M. (1980). Statistical Physics, Vol 1. Oxford: Pergamon Press.
5. Clausius, R. (1862). XXIX. On the application of the theorem of the equivalence of transformations to the internal work of a mass of matter. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 24(160), 201-213.
