Question 3 (Problem 12.5). For N distinguishable coins the thermodynamic probability is

    w = N! / (N1! (N - N1)!),

where N1 is the number of heads and N - N1 the number of tails.

a) Assume that N is large enough that Stirling's approximation, ln n! ≈ n ln n - n for n >> 1, is valid. Show that ln w is a maximum for N1 = N/2.

b) Show that w_max ≈ e^(N ln 2).

Solution outline: for (a), apply Stirling's formula to ln w and set the derivative with respect to N1 equal to zero; for (b), evaluate the number of microstates at that maximum, which gives ln w_max ≈ N ln 2.

A follow-up exercise: repeat parts (a)-(d) for the case of 100 coins and for 500 coins. What do you see?

Problem 12.8. Two distinguishable particles are to be … In this problem we show explicitly the microstates associated with each macrostate.
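As a numerical sanity check (my own sketch, not part of the original problem), the exact ln w can be computed with log-factorials and compared against the Stirling-approximation predictions for parts (a) and (b); the function and variable names below are my own:

```python
import math

def ln_w(N, N1):
    """Exact ln of the thermodynamic probability w = N! / (N1! (N - N1)!)."""
    return math.lgamma(N + 1) - math.lgamma(N1 + 1) - math.lgamma(N - N1 + 1)

N = 500

# (a) ln w should peak at N1 = N/2.
peak = max(range(N + 1), key=lambda n1: ln_w(N, n1))
print(peak)  # 250

# (b) At the peak, ln w_max ~ N ln 2, i.e. w_max ~ e^(N ln 2) = 2^N,
# up to subleading terms dropped by Stirling's approximation.
ratio = ln_w(N, N // 2) / (N * math.log(2))
print(ratio)  # close to 1 (about 0.99 for N = 500)
```

The ratio approaches 1 as N grows, because the neglected terms in Stirling's formula are only logarithmic in N.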
Some properties of the thermodynamic probability w:

1- It is an unnormalized probability: an integer greater than or equal to one, rather than a number between zero and one. (Dividing by the total number of microstates, 2^N, converts it to a true probability.)

2- Multiplicative property: the thermodynamic probability of a composite system is the product of the thermodynamic probabilities of its independent subsystems. For N coins the total number of microstates is 2^N.

3- As the number of coins increases, the maximum value of the thermodynamic probability, w_max = N! / ((N/2)!)^2, increases.
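These counting properties can be verified directly for small systems (a sketch of my own, using Python's standard `math.comb`):

```python
from math import comb

# Multiplicative property: the composite macrostate "one head in the first
# pair of coins AND one head in the second pair" has w = w(pair 1) * w(pair 2).
w_pair = comb(2, 1)          # 2 ways for a single pair to show one head
w_composite = w_pair * w_pair
print(w_composite)           # 4 composite microstates

# Summing w over all macrostates recovers the total number of microstates 2^N.
N = 10
total = sum(comb(N, n1) for n1 in range(N + 1))
print(total == 2 ** N)       # True

# The maximum value w_max = C(N, N/2) grows as coins are added.
maxima = [comb(n, n // 2) for n in (2, 4, 8, 16)]
print(maxima)                # [2, 6, 70, 12870]
```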
The thermodynamic probability (denoted by W) is equal to the number of microstates which realize a given macrostate, from which it follows that W ≥ 1. The thermodynamic probability is connected with one of the basic macroscopic characteristics of the system, the entropy S, by the Boltzmann relation S = k ln W, where k is Boltzmann's constant.
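As an illustration of the Boltzmann relation (my own sketch, using the exact SI value of k), the entropy of the most probable macrostate of 100 coins can be evaluated via log-factorials to avoid overflowing W itself:

```python
import math

k = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value)

# Most probable macrostate of N = 100 coins: W = C(100, 50).
# Work with ln W directly, since W itself is ~1e29.
ln_W = math.lgamma(101) - 2 * math.lgamma(51)

S = k * ln_W      # Boltzmann relation S = k ln W
print(S)          # roughly 9e-22 J/K
```

The tiny value reflects how few degrees of freedom 100 coins have compared with a mole of particles; replacing 100 coins with ~10^23 particles scales S up to ordinary thermodynamic magnitudes.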
Answer. Two such sequences might, for example, look like this: H H H T T T T T, or like this: H T H T H T T T. Assuming the coin is fair, so that a head and a tail are equally likely on each toss, we can use the classical approach to assign the probability.
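The classical assignment can be checked by brute-force enumeration (my own sketch): every specific 8-toss sequence is equally likely with probability (1/2)^8, and the number of sequences realizing a given macrostate matches the thermodynamic probability w:

```python
from itertools import product

# Enumerate all equally likely 8-toss sequences of a fair coin.
sequences = list(product("HT", repeat=8))
p_each = 1 / len(sequences)
print(len(sequences), p_each)   # 256 sequences, each with probability 1/256

# Both example sequences above contain 3 heads; the macrostate
# "3 heads in 8 tosses" is realized by w = C(8, 3) = 56 sequences.
n_three_heads = sum(1 for s in sequences if s.count("H") == 3)
print(n_three_heads, n_three_heads / 256)   # 56 -> probability 0.21875
```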