Entropy (YAC Ch. 6): Presentation transcript

1 Entropy (YAC Ch. 6)
In this chapter, we will:
• Introduce the thermodynamic property called entropy (S); entropy is defined using the Clausius inequality.
• Introduce the Increase of Entropy Principle, which states that the entropy of an isolated system (or a system plus its surroundings) always increases or, at best, remains the same. This is the Second Law stated in terms of entropy.
• Learn to use the entropy balance equation: entropy change = entropy transfer + entropy generation.
• Analyze entropy changes in thermodynamic processes and learn how to use thermodynamic tables.
• Examine entropy relationships (the Tds relations) and the entropy relations for ideal gases.
• Introduce property diagrams involving entropy (T-s and h-s diagrams).

2 Entropy – A Property
Entropy is a thermodynamic property; it can be viewed as a measure of disorder: the more disorganized a system, the higher its entropy. It is defined using the Clausius inequality,
∮ δQ/T ≤ 0,
where δQ is the differential heat transfer and T is the absolute temperature at the boundary where the heat transfer occurs. The Clausius inequality is valid for all cycles, reversible and irreversible.
Consider a reversible Carnot cycle: the cyclic integral ∮ (δQ/T)_rev = 0. Since the integral does not change if you return to the same state, the integrated quantity must be a property. By definition, then, we introduce the thermodynamic property entropy (S) such that
dS = (δQ/T)_rev,
which is true for a reversible process only.
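
As a concrete check (a standard textbook evaluation, restated here rather than recovered from the slide): for a reversible Carnot cycle between reservoirs at T_H and T_L, heat Q_H enters at constant T_H, heat Q_L leaves at constant T_L, and the two adiabatic legs transfer no heat, so
∮ (δQ/T)_rev = Q_H/T_H − Q_L/T_L = 0,
because Q_H/Q_L = T_H/T_L for a Carnot cycle. The vanishing of the cyclic integral is exactly what lets δQ/T integrate to a state property.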

3 Entropy (cont'd)
Since entropy is a thermodynamic property, it has a fixed value at a fixed thermodynamic state. Hence the change ΔS is determined by the initial and final states alone. BUT the change equals ∫ δQ/T only for a reversible process.
[Figure: T-S diagram of a cycle between states 1 and 2, with "any process" from 1 to 2 and a "reversible process" from 2 back to 1.]
Consider such a cycle, where process 2-1 is reversible and process 1-2 may or may not be reversible.
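
A sketch of the standard derivation this sets up (following the usual textbook argument): apply the Clausius inequality to the cycle,
∮ δQ/T = ∫₁² δQ/T + ∫₂¹ (δQ/T)_rev ≤ 0.
By the definition of entropy, the second integral equals S1 − S2, so
ΔS = S2 − S1 ≥ ∫₁² δQ/T,
with equality only when process 1-2 is also reversible. For an isolated system δQ = 0, giving ΔS ≥ 0, which is the Increase of Entropy Principle of the next slide.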

4 Increase of Entropy Principle (YAC Ch. 6-3)
The entropy balance (entropy change = entropy transfer due to heat + entropy generation) leads to the Increase of Entropy Principle. The principle states that, for an isolated system, a closed adiabatic system, or a system plus its surroundings, a process can only take place such that S_gen ≥ 0, where S_gen = 0 for a reversible process only; S_gen can never be less than zero.
Implications:
• Entropy, unlike energy, is non-conservative, since it is always being generated. The entropy of the universe is continuously increasing; in other words, it is becoming more disorganized and is approaching chaos.
• Entropy generation is due to the presence of irreversibilities. Therefore, the higher the entropy generation, the higher the irreversibilities and, accordingly, the lower the efficiency of a device, since a reversible system is the most efficient system.
• The above is another statement of the Second Law.

5 Second Law & Entropy Balance (YAC Ch. 6-4)
The Increase of Entropy Principle is another way of stating the Second Law of Thermodynamics.
Second Law: entropy can be created but NOT destroyed. (In contrast, the First Law states that energy is always conserved.) Note that this does not mean the entropy of a system cannot be reduced; it can. However, the total entropy of a system plus its surroundings cannot be reduced.
An entropy balance is used to determine the change in entropy of a system as follows:
entropy change = entropy transfer + entropy generation,
where
• Entropy change: ΔS = S2 − S1
• Entropy transfer: transfer due to heat (Q/T) plus entropy flow due to mass flow (mi·si − me·se)
• Entropy generation: S_gen ≥ 0
For a closed system: S2 − S1 = Σ Qk/Tk + S_gen. In rate form: dS/dt = Σ Q̇k/Tk + Ṡ_gen.
For an open system (control volume), the balance additionally carries the mass-flow entropy terms. As with the energy and mass conservation equations, the entropy balance can be simplified under appropriate conditions, e.g. steady state or adiabatic operation.
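
A minimal numerical sketch of the closed-system balance (the function and variable names here are illustrative, not from the text):

```python
# Closed-system entropy balance, S2 - S1 = sum(Q_k / T_k) + S_gen,
# rearranged to solve for the generation term.
def entropy_generated(s1, s2, heat_transfers):
    """heat_transfers: list of (Q_k in kJ, T_k in K) at each boundary location."""
    transfer = sum(q / t for q, t in heat_transfers)
    return (s2 - s1) - transfer

# Example: system entropy rises by 3 kJ/K while absorbing 1000 kJ at 400 K.
print(entropy_generated(0.0, 3.0, [(1000, 400)]))  # 0.5 kJ/K >= 0, feasible
```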

6 Entropy Generation Example
Show that heat cannot be transferred from the low-temperature sink (500 K) to the high-temperature source (800 K), based on the Increase of Entropy Principle. Take Q = 2000 kJ.
ΔS(source) = 2000/800 = 2.5 kJ/K
ΔS(sink) = −2000/500 = −4 kJ/K
S_gen = ΔS(source) + ΔS(sink) = −1.5 kJ/K < 0
This is impossible by the entropy increase principle, S_gen ≥ 0; therefore heat cannot transfer from low temperature to high temperature without external work input.
If the process is reversed, so that 2000 kJ of heat is transferred from the source to the sink, then S_gen = 1.5 kJ/K > 0 and the process can occur according to the Second Law.
What if the sink temperature is increased to 700 K?
ΔS(source) = −2000/800 = −2.5 kJ/K
ΔS(sink) = 2000/700 = 2.86 kJ/K
S_gen = ΔS(source) + ΔS(sink) = 0.36 kJ/K < 1.5 kJ/K
Entropy generation is less than when the sink was at 500 K: less irreversibility. Heat transfer between objects with a large temperature difference generates a higher degree of irreversibility.
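
The same bookkeeping in a short Python sketch, reproducing the slide's numbers (the function name is illustrative):

```python
# Entropy generation when q (kJ) leaves an isothermal reservoir at t_from (K)
# and enters an isothermal reservoir at t_to (K).
def entropy_generation(q, t_from, t_to):
    return -q / t_from + q / t_to

print(entropy_generation(2000, 500, 800))  # -1.5 kJ/K < 0: impossible
print(entropy_generation(2000, 800, 500))  # +1.5 kJ/K > 0: allowed
print(entropy_generation(2000, 800, 700))  # ~0.36 kJ/K: allowed, less irreversible
```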

7 A Brief Introduction to Information Theory
• Information theory is a branch of science that deals with the analysis of communications systems.
• We will study digital communications, using a file (or a network protocol) as the channel.
• Claude Shannon published a landmark paper in 1948 that was the beginning of the branch of information theory.
• We are interested in communicating information from a source to a destination.
[Figure: block diagram, Source of Message → Encoder → Channel (with NOISE) → Decoder → Destination of Message]

8 A Brief Introduction to Information Theory
• In our case, the messages will be sequences of binary digits. (Does anyone know the term for a binary digit?)
• One detail that makes communicating difficult is noise; noise introduces uncertainty.
• Suppose I wish to transmit one bit of information. What are all of the possibilities?
– tx 0, rx 0: good
– tx 0, rx 1: error
– tx 1, rx 0: error
– tx 1, rx 1: good
• Two of the cases above have errors; this is where probability fits into the picture.
• In the case of steganography, the "noise" may be due to attacks on the hiding algorithm.

9 A Brief Introduction to Information Theory
• Claude Shannon introduced the idea of self-information.
• Suppose we have an event X, where Xi represents a particular outcome of the event with probability Pi; the self-information of Xi is lg(1/Pi) = −lg Pi bits, where lg denotes the base-2 logarithm.
• Consider flipping a fair coin; there are two equiprobable outcomes: say X0 = heads, P0 = 1/2, and X1 = tails, P1 = 1/2.
• The amount of self-information for any single result is lg(1/(1/2)) = 1 bit.
• In other words, the number of bits required to communicate the result of the event is 1 bit.
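
This definition transcribes directly into Python (a minimal sketch):

```python
import math

# Self-information of an outcome with probability p, in bits.
def self_information(p):
    return -math.log2(p)

print(self_information(0.5))  # fair-coin outcome: 1.0 bit
```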

10 A Brief Introduction to Information Theory
• When outcomes are equally likely, there is a lot of information in the result.
• The higher the likelihood of a particular outcome, the less information that outcome conveys.
• For example, if the coin is biased such that it lands heads up 99% of the time, there is not much information conveyed when we flip the coin and it lands on heads.

11 A Brief Introduction to Information Theory
• Again suppose we have an event X, where Xi represents a particular outcome of the event.
• Consider flipping a coin, but now let's say there are 3 possible outcomes: heads (P = 0.49), tails (P = 0.49), and landing on its side (P = 0.02, likely MUCH higher than in reality). Note: the total probability MUST ALWAYS add up to one.
• The amount of self-information for either a head or a tail is lg(1/0.49) ≈ 1.03 bits.
• For landing on its side: lg(1/0.02) ≈ 5.6 bits.
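
These numbers can be checked with the self_information sketch from the previous slide:

```python
# Reusing self_information() from the earlier sketch.
print(round(self_information(0.49), 2))  # heads or tails: ~1.03 bits
print(round(self_information(0.02), 2))  # lands on its side: ~5.64 bits
```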

12 A Brief Introduction to Information Theory
• Entropy is the measure of the average uncertainty of information. (We will skip the proofs and background that lead to the formula for entropy; it was derived from a set of required properties. Also, keep in mind that this is a simplified explanation.)
• For a random variable X with a discrete set of possible outcomes (X0, X1, X2, …, Xn−1), where n is the total number of possibilities, the entropy is
H(X) = Σi Pi lg(1/Pi) = −Σi Pi lg Pi,
where H is the entropy and Pi is the probability of outcome Xi.
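
A direct transcription of the formula into Python (a sketch; the probabilities are assumed to sum to one):

```python
import math

# Shannon entropy H = -sum(p * lg p) in bits; zero-probability terms are skipped.
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
```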

13 A Brief Introduction to Information Theory
• Entropy is greatest when the probabilities of the outcomes are equal.
• Let's consider our fair-coin experiment again. The entropy is H = ½·lg 2 + ½·lg 2 = 1 bit. Since each outcome has self-information of 1 bit, the average over the 2 outcomes is (1 + 1)/2 = 1.
• Now consider a biased coin, P(H) = 0.98 and P(T) = 0.02:
H = 0.98·lg(1/0.98) + 0.02·lg(1/0.02) = 0.98·0.029 + 0.02·5.643 = 0.0285 + 0.1129 ≈ 0.1414 bits
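
The biased-coin arithmetic can be verified with the entropy() sketch from the previous slide:

```python
# Reusing entropy() from the earlier sketch.
print(round(entropy([0.98, 0.02]), 4))  # ~0.1414 bits
```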

14 A Brief Introduction to Information Theory
• In general, we must estimate the entropy. The estimate depends on our assumptions about the structure (read: pattern) of the source of information.
• Consider the following sequence: 1 2 3 2 3 4 5 4 5 6 7 8 9 8 9 10
• Obtaining the probabilities from the sequence: of the 16 digits, 1, 6, 7, and 10 each appear once (P = 1/16); the rest each appear twice (P = 1/8).
• The entropy is H = 3.25 bits per symbol.
• Since there are 16 symbols, we would theoretically need 16 × 3.25 = 52 bits to transmit the information.
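
The same estimate from empirical symbol frequencies, as a Python sketch (sequence_entropy is an illustrative name):

```python
from collections import Counter
import math

# Plug-in entropy estimate of a symbol sequence, in bits per symbol.
def sequence_entropy(symbols):
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in Counter(symbols).values())

seq = [1, 2, 3, 2, 3, 4, 5, 4, 5, 6, 7, 8, 9, 8, 9, 10]
h = sequence_entropy(seq)
print(h, len(seq) * h)  # 3.25 bits/symbol, 52.0 bits for the whole sequence
```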

15 A Brief Introduction to Information Theory
• Consider the following sequence: 1 2 1 2 4 4 1 2 4 4 4 4 4 4 1 2 4 4 4 4 4 4
• Obtaining the probabilities from the sequence: 1 and 2 each appear four times (P = 4/22); 4 appears fourteen times (P = 14/22).
• The entropy is H = 0.447 + 0.447 + 0.415 = 1.309 bits per symbol.
• Since there are 22 symbols, we would theoretically need 22 × 1.309 = 28.798 (29) bits to transmit the information.
• However, consider the pair-symbols 12 and 44: 12 appears with probability 4/11 and 44 with probability 7/11.
• Then H = 0.530 + 0.415 = 0.945 bits per pair, and 11 × 0.945 = 10.395 (11) bits suffice to transmit the information: only about 38% of the bits, a saving of roughly 62%!
• We might be able to find patterns with even less entropy.
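
And the pairing experiment, reusing sequence_entropy from the previous sketch:

```python
# Reusing sequence_entropy() from the earlier sketch.
seq = [1, 2, 1, 2, 4, 4, 1, 2, 4, 4, 4, 4, 4, 4, 1, 2, 4, 4, 4, 4, 4, 4]
pairs = [tuple(seq[i:i + 2]) for i in range(0, len(seq), 2)]  # (1, 2) or (4, 4)

h1, h2 = sequence_entropy(seq), sequence_entropy(pairs)
print(len(seq) * h1)    # ~28.8 bits coding single symbols
print(len(pairs) * h2)  # ~10.4 bits coding pairs
```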

