Entropy (YAC Ch. 6)

Presentation transcript:

Entropy (YAC Ch. 6)

In this chapter, we will:
– Introduce the thermodynamic property called entropy (S), defined using the Clausius inequality.
– Introduce the Increase of Entropy Principle, which states that the entropy of an isolated system (or a system plus its surroundings) always increases or, at best, remains the same; this is the Second Law stated in terms of entropy.
– Learn to use the entropy balance equation: entropy change = entropy transfer + entropy generation.
– Analyze entropy changes in thermodynamic processes and learn how to use the thermodynamic tables.
– Examine entropy relationships (the Tds relations) and entropy relations for ideal gases.
– Introduce property diagrams involving entropy (T-s and h-s diagrams).

Entropy – A Property

Entropy is a thermodynamic property; it can be viewed as a measure of disorder: the more disorganized a system, the higher its entropy. It is defined using the Clausius inequality,

∮ δQ/T ≤ 0,

where δQ is the differential heat transfer and T is the absolute temperature at the boundary where the heat transfer occurs. The Clausius inequality is valid for all cycles, reversible and irreversible. Consider a reversible Carnot cycle: since Q_L/Q_H = T_L/T_H, the cyclic integral gives ∮ δQ/T = Q_H/T_H − Q_L/T_L = 0. Since ∮ δQ/T = 0 for a reversible cycle, i.e., the quantity does not change if you return to the same state, it must be a property. By definition, then, we introduce the thermodynamic property entropy (S) such that

dS = (δQ/T)_rev,

which is true for a reversible process only.
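As a quick numerical check (a minimal Python sketch added here, not from the original slides; the function name is illustrative), the Carnot relation Q_L/Q_H = T_L/T_H makes the cyclic sum of Q/T vanish:

```python
# Minimal sketch: for a reversible Carnot cycle, Q_L/Q_H = T_L/T_H,
# so the sum of Q/T around the cycle (heat in positive, heat out negative) is zero.

def carnot_clausius_sum(q_high: float, t_high: float, t_low: float) -> float:
    """Sum of Q/T over a Carnot cycle operating between t_high and t_low (K)."""
    q_low = q_high * (t_low / t_high)       # reversible cycle: Q_L/Q_H = T_L/T_H
    return q_high / t_high - q_low / t_low  # = 0 for any reversible cycle

print(carnot_clausius_sum(q_high=2000.0, t_high=800.0, t_low=500.0))  # -> 0.0
```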

Entropy (cont'd)

Since entropy is a thermodynamic property, it has fixed values at fixed thermodynamic states. Hence, the change ΔS is determined by the initial and final states only, BUT the equality

ΔS = ∫ δQ/T

holds for a reversible process only. Consider a cycle between states 1 and 2 (shown on a T-S diagram in the original slide): process 2-1 is reversible, while process 1-2 may or may not be reversible. Applying the Clausius inequality around the cycle,

∫_1→2 δQ/T + ∫_2→1 (δQ/T)_rev ≤ 0, so S_2 − S_1 ≥ ∫_1→2 δQ/T,

with equality only when process 1-2 is reversible.

Increase of Entropy Principle (YAC Ch. 6-3)

The principle states that, for an isolated system (or a closed adiabatic system, or a system plus its surroundings), a process can take place only if

S_gen ≥ 0,

where S_gen = 0 for a reversible process only; S_gen can never be less than zero. In balance form: entropy change = entropy transfer (due to heat transfer) + entropy generation.

Implications:
– Entropy, unlike energy, is not conserved, since it is always being generated. The entropy of the universe is continuously increasing; in other words, the universe is becoming ever more disorganized, approaching a chaotic state.
– Entropy generation is due to the presence of irreversibilities. Therefore, the higher the entropy generation, the greater the irreversibilities and, accordingly, the lower the efficiency of a device, since a reversible system is the most efficient.
– The above is another statement of the Second Law.

Second Law & Entropy Balance (YAC Ch. 6-4)

The Increase of Entropy Principle is another way of stating the Second Law of Thermodynamics.

Second Law: Entropy can be created but NOT destroyed. (In contrast, the First Law states that energy is always conserved.) Note that this does not mean that the entropy of a system cannot be reduced; it can. However, the total entropy of a system plus its surroundings cannot be reduced.

The entropy balance is used to determine the change in entropy of a system as follows:

Entropy change = Entropy transfer + Entropy generation

where
– Entropy change: ΔS = S_2 − S_1
– Entropy transfer: transfer due to heat (Q/T) plus entropy flow due to mass flow (m_i s_i − m_e s_e)
– Entropy generation: S_gen ≥ 0

For a closed system: S_2 − S_1 = Σ (Q_k / T_k) + S_gen; in rate form: dS/dt = Σ (Q̇_k / T_k) + Ṡ_gen. For an open system (control volume), as with the energy and mass conservation equations, the entropy balance can be simplified under appropriate conditions (e.g., steady state, adiabatic).
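A minimal sketch of the closed-system balance above (added for illustration; the function name and the sample values are assumptions, not from the slides):

```python
# Closed-system entropy balance: S2 - S1 = sum(Q_k / T_k) + S_gen,
# so S_gen = (S2 - S1) - sum(Q_k / T_k); a real process requires S_gen >= 0.

def entropy_generated(delta_s: float, boundary_heat: list[tuple[float, float]]) -> float:
    """delta_s = S2 - S1 in kJ/K; boundary_heat = (Q_k in kJ, T_k in K) pairs."""
    transfer = sum(q / t for q, t in boundary_heat)  # entropy transfer by heat
    return delta_s - transfer

# Hypothetical system whose entropy rises by 1.0 kJ/K while absorbing 300 kJ at 400 K:
print(entropy_generated(1.0, [(300.0, 400.0)]))  # 0.25 kJ/K >= 0, so feasible
```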

Entropy Generation Example

Show that heat cannot be transferred from the low-temperature sink to the high-temperature source, based on the increase of entropy principle. Take a source at 800 K, a sink at 500 K, and Q = 2000 kJ.

Suppose 2000 kJ flows from the sink to the source:
ΔS(source) = 2000/800 = 2.5 kJ/K
ΔS(sink) = −2000/500 = −4 kJ/K
S_gen = ΔS(source) + ΔS(sink) = −1.5 kJ/K < 0

This is impossible by the increase of entropy principle (S_gen ≥ 0); therefore heat cannot be transferred from low temperature to high temperature without external work input. If the process is reversed, with 2000 kJ transferred from the source to the sink, S_gen = +1.5 kJ/K > 0, and the process can occur according to the Second Law.

If the sink temperature is increased to 700 K, what is the entropy generation?
ΔS(source) = −2000/800 = −2.5 kJ/K
ΔS(sink) = 2000/700 = 2.86 kJ/K
S_gen = ΔS(source) + ΔS(sink) = 0.36 kJ/K < 1.5 kJ/K

The entropy generation is less than when the sink temperature is 500 K, i.e., there is less irreversibility: heat transfer across a large temperature difference generates a higher degree of irreversibility.
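The same numbers can be reproduced with a short sketch (illustrative only):

```python
# Entropy generated when heat Q flows from a reservoir at t_from to one at t_to.
def reservoir_s_gen(q: float, t_from: float, t_to: float) -> float:
    """Q in kJ, temperatures in K; returns S_gen in kJ/K."""
    return -q / t_from + q / t_to  # losing reservoir: -Q/T; gaining reservoir: +Q/T

print(reservoir_s_gen(2000.0, t_from=500.0, t_to=800.0))  # -1.50 kJ/K: impossible
print(reservoir_s_gen(2000.0, t_from=800.0, t_to=500.0))  # +1.50 kJ/K: allowed
print(reservoir_s_gen(2000.0, t_from=800.0, t_to=700.0))  # +0.36 kJ/K: allowed
```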

A Brief Introduction to Information Theory

Information theory is a branch of science that deals with the analysis of a communications system. We will study digital communications, using a file (or network protocol) as the channel. Claude Shannon published a landmark paper in 1948 that was the beginning of the branch of information theory. We are interested in communicating information from a source to a destination.

[Block diagram from the slide: Source of Message → Encoder → Channel (with NOISE) → Decoder → Destination of Message]

A Brief Introduction to Information Theory

In our case, the messages will be a sequence of binary digits. (Does anyone know the term for a binary digit? A bit.) One detail that makes communicating difficult is noise: noise introduces uncertainty. Suppose I wish to transmit one bit of information; what are all of the possibilities?
– tx 0, rx 0 – good
– tx 0, rx 1 – error
– tx 1, rx 0 – error
– tx 1, rx 1 – good
Two of the cases above have errors; this is where probability fits into the picture. In the case of steganography, the "noise" may be due to attacks on the hiding algorithm.

A Brief Introduction to Information Theory

Claude Shannon introduced the idea of self-information. Suppose we have an event X, where X_i represents a particular outcome of the event. Consider flipping a fair coin; there are two equiprobable outcomes: say X_0 = heads with P_0 = 1/2, and X_1 = tails with P_1 = 1/2. The amount of self-information for any single result is lg(1/P_i) = lg 2 = 1 bit. In other words, the number of bits required to communicate the result of the event is 1 bit.

A Brief Introduction to Information Theory

When outcomes are equally likely, there is a lot of information in the result. The higher the likelihood of a particular outcome, the less information that outcome conveys. For instance, if the coin is biased such that it lands heads up 99% of the time, there is not much information conveyed when we flip the coin and it lands on heads.

A Brief Introduction to Information Theory

Suppose we have an event X, where X_i represents a particular outcome of the event. Consider flipping a coin, but now say there are 3 possible outcomes: heads (P = 0.49), tails (P = 0.49), and landing on its side (P = 0.02, likely MUCH higher than in reality). Note: the total probability MUST ALWAYS add up to one. The amount of self-information for either a head or a tail is lg(1/0.49) ≈ 1.03 bits; for landing on its side it is lg(1/0.02) ≈ 5.64 bits.
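These self-information values are easy to verify (a Python sketch added here, not part of the original slides):

```python
from math import log2

def self_information(p: float) -> float:
    """Bits conveyed by an outcome of probability p: i = lg(1/p)."""
    return log2(1.0 / p)

print(self_information(0.5))   # fair coin: 1.0 bit
print(self_information(0.49))  # heads or tails above: ~1.03 bits
print(self_information(0.02))  # landing on its side: ~5.64 bits
```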

A Brief Introduction to Information Theory

Entropy is the measurement of the average uncertainty of information. (We will skip the proofs and background that lead to the formula for entropy; it was derived from required properties. Also, keep in mind that this is a simplified explanation.)

H(X) = Σ_i P_i lg(1/P_i) = −Σ_i P_i lg P_i

where H is the entropy, P_i is the probability of outcome X_i, and X is a random variable with a discrete set of possible outcomes (X_0, X_1, X_2, …, X_{n−1}), n being the total number of possibilities.
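A direct Python transcription of the formula (a sketch; the helper name is ours, not from the slides):

```python
from math import log2

def entropy(probs: list[float]) -> float:
    """Shannon entropy in bits: H = sum(p * lg(1/p)); probs should sum to 1."""
    return sum(p * log2(1.0 / p) for p in probs if p > 0.0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
```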

A Brief Introduction to Information Theory

Entropy is greatest when the probabilities of the outcomes are equal. Let's consider our fair coin experiment again. The entropy is H = ½ lg 2 + ½ lg 2 = 1 bit; since each outcome has self-information of 1 bit, the average over the 2 outcomes is (1 + 1)/2 = 1 bit. Now consider a biased coin with P(H) = 0.98 and P(T) = 0.02:

H = 0.98 lg(1/0.98) + 0.02 lg(1/0.02) ≈ 0.98 × 0.0291 + 0.02 × 5.6439 ≈ 0.0286 + 0.1129 ≈ 0.1414 bits
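Checking the biased-coin arithmetic numerically (illustrative sketch):

```python
from math import log2

h = 0.98 * log2(1 / 0.98) + 0.02 * log2(1 / 0.02)
print(round(h, 4))  # 0.1414 bits, matching the hand calculation above
```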

A Brief Introduction to Information Theory

In general, we must estimate the entropy. The estimate depends on our assumptions about the structure (read: pattern) of the source of information. Consider a sequence of 16 digits (not shown) in which the symbols 1, 6, 7, and 10 each appear once and the remaining six symbols each appear twice. Obtaining the probabilities from the sequence,

H = 4 × (1/16) lg 16 + 6 × (2/16) lg 8 = 1 + 2.25 = 3.25 bits.

Since there are 16 symbols, we theoretically would need 16 × 3.25 = 52 bits to transmit the information.
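The same estimate can be computed from the symbol counts (sketch; counts as given above):

```python
from math import log2

# Four symbols appear once and six appear twice in the 16-digit sequence.
counts = [1, 1, 1, 1, 2, 2, 2, 2, 2, 2]
total = sum(counts)  # 16
h = sum((c / total) * log2(total / c) for c in counts)
print(h, total * h)  # 3.25 bits/symbol, 52.0 bits total
```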

A Brief Introduction to Information Theory

Consider a second sequence, this one of 22 symbols (not shown). Obtaining the probabilities from the sequence: 1 and 2 each appear four times (4/22 each), and 4 appears fourteen times (14/22). The entropy is

H = 2 × (4/22) lg(22/4) + (14/22) lg(22/14) ≈ 1.31 bits.

Since there are 22 symbols, we theoretically would need 22 × 1.31 ≈ 29 bits to transmit the information. However, check the two-symbol blocks "1 2" and "4 4": "1 2" appears in 4 of the 11 pairs and "4 4" in 7 of the 11, so

H = (4/11) lg(11/4) + (7/11) lg(11/7) ≈ 0.95 bits,

and 11 × 0.95 ≈ 11 bits suffice to transmit the information, about 38% of the original estimate (roughly 62% fewer bits). We might possibly be able to find patterns with even less entropy.
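Both models can be compared with the same kind of helper (sketch; counts as given above):

```python
from math import log2

def entropy_from_counts(counts: list[int]) -> float:
    total = sum(counts)
    return sum((c / total) * log2(total / c) for c in counts)

h_single = entropy_from_counts([4, 4, 14])  # symbols 1, 2, 4 over 22 symbols
h_pairs = entropy_from_counts([4, 7])       # pairs "1 2", "4 4" over 11 pairs
print(22 * h_single, 11 * h_pairs)          # ~28.8 bits vs ~10.4 bits
```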