nanoHUB U Fundamentals of Nanoelectronics: Basic Concepts/Lecture 4.6: Entropy
========================================

[Slide 1] Welcome back to Unit 4 of our course. This is the sixth lecture.

[Slide 2] Now, in the last lecture we talked about the second law. And one of the points I made is that although most of this course has really focused on the channel, the second law is really a consequence of the contacts. It is a property of the contacts, and so you can essentially ignore the channel in this discussion. Specifically, it is related to a very basic property of contacts: if you think of a contact at a temperature T and electrochemical potential mu, and you want to take some energy E and some number of electrons N from this contact, then there is a reverse process that wants to do the opposite, and the ratio of the process you want to the reverse process that you don't want is given by this exponential function. If no electron exchange is involved, that is, if N is zero, then it says that if you want to take energy out, this ratio is the exponential of minus E over kT. And as I mentioned, that means when E is positive the ratio is very small, and when E is negative the ratio is large. You could put it in a one-liner: for any contact, it is much harder to take energy from it than to give energy to it. The question is, why is that?

[Slide 3] So, for this purpose, what we think about is a small system, something with two energy levels. It could be a hydrogen atom, for example. You have a lower level and an upper level, and what we know is that if you put an electron in the upper level, it will immediately radiate light, that is, give up energy, and come down to the lower level. On the other hand, if you put it in the lower level, it will just stay there. And this you are so used to that you probably don't question it anymore. I do occasionally get the question from undergraduates: why does it come down and not up? Graduate students seldom ask that question, because they have heard it so often, and they kind of realize that if you ask questions like that, people get annoyed. But if you think about it, it is not completely obvious why it is easier to go down than to go up. The way to think about it, and this is actually a very deep point, is to picture the system as being connected to some reservoir. That is the technical word people use; we could call it a contact. Our contacts are essentially reservoirs, except that we have usually used "contact" for the physical contact; the point is, it could be any abstract thing that takes energy or electrons from your system. Now, this reservoir is a big thing. On this axis you have its energy, and what you plot on the other side is something like its density of states: you consider a small energy range delta E and look at how many states the reservoir has in that range. That is what we call W. And a very important property of all contacts is that W is always an increasing function: as the energy goes up, the number of states, or density of states, also goes up. I will try to explain a little more about that shortly.
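To put the relation quoted from the last lecture in symbols (this is just a restatement in the notation of the slides; the room-temperature estimate at the end is my own illustration):

\[
\frac{P(\text{take } E,\,N \text{ from the contact})}{P(\text{give } E,\,N \text{ to the contact})}
  = \exp\!\left(-\,\frac{E-\mu N}{kT}\right),
\qquad
\left.\frac{P(+E)}{P(-E)}\right|_{N=0} = e^{-E/kT}.
\]

For instance, at room temperature kT is about 0.025 eV, so extracting just E = 0.1 eV from a contact is already suppressed by a factor of roughly e^{-4}, about 0.02, relative to dumping that same energy into it.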
But the way it works is this: when the electron is in state 2, the reservoir has some energy E0, but when the electron is in state 1, since the electron has less energy, the reservoir must have more energy. So you could write it like a chemical reaction. Electron in 2 plus contact with energy E0, those are your reactants, and the product is electron in 1 plus contact with energy E0 plus epsilon. Looking at that, you might say, well, if it is equally likely to go one way or the other, then of course it should be 50-50: left to itself, the electron should be 50 percent here and 50 percent there. But the point is that for all normal reservoirs, the density of states is an increasing function of energy. So it is as if here you have, let's say, ten states, and there, where the energy is higher, you have hundreds of states. When the electron comes down and the reservoir moves to the higher energy, there are a lot more states available, and the energy immediately kind of diffuses out among them. With chemical reactions, one way to make a reaction proceed to the right is to continuously remove the products, and here it is as if you have something like that, which tends to drive this reaction to the right. That is, from 2 the electron wants to keep coming down to 1, whereas the opposite direction is much less likely. So mathematically, you could write the probability of giving up an energy epsilon to the contact. When you go from 2 to 1, you give up an energy epsilon to the contact, and because of our sign convention that taking energy is positive, giving up energy corresponds to minus epsilon, epsilon being a positive number. And you consider the reverse process, so the top one is the rate at which you go left to right, and the lower one is the rate of going the other way. That ratio is the ratio of the Ws at these two energies. Now we introduce this very important concept of entropy, which is related to W: it is the logarithm of W.

[Slide 4] So that is what we have here: S equals k log W. You could invert this to write W as the exponential of S divided by k. So if you use this expression to replace the Ws, you get the exponential of S(E0 plus epsilon) over k, divided by the exponential of S(E0) over k. And as you know, a ratio of two exponentials can be written as the exponential of the difference of the exponents. So all we have done is take the ratio of the Ws, introduce the entropy as the logarithm of W, so that W is the exponential of S over k, replace the Ws, and write the ratio in terms of the Ss. The point to note is that we have a big reservoir, and epsilon is a small energy change for that reservoir. So we can expand this difference in a Taylor series: the energy has changed a little bit, so the difference of the two entropies can be written as dS/dE times epsilon. The ratio then becomes the exponential of 1 over k times dS/dE times epsilon. And this dS/dE is actually what is defined as the inverse temperature, 1 over T, so that the exponent becomes epsilon over kT. This is the standard thermodynamic definition of temperature, this one here.
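Written out, the chain of steps just described is (a compact restatement in symbols, following the notation on the slide):

\[
\frac{P_{2\to 1}}{P_{1\to 2}}
  = \frac{W(E_0+\varepsilon)}{W(E_0)}
  = \frac{e^{S(E_0+\varepsilon)/k}}{e^{S(E_0)/k}}
  = \exp\!\left[\frac{S(E_0+\varepsilon)-S(E_0)}{k}\right]
  \approx \exp\!\left[\frac{1}{k}\,\frac{dS}{dE}\,\varepsilon\right]
  = e^{\varepsilon/kT},
\qquad \text{using } \frac{dS}{dE} \equiv \frac{1}{T}.
\]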
So in this way, what you get is this definition of temperature, and what it shows is that the ratio of the two Ps, namely the process of coming down and the process of going up, is given by the exponential of epsilon over kT. So if epsilon is a positive number, the probability of going down is a lot higher than the reverse probability of going up. Note that this is entirely a result of the reservoir, or the contact, having a much higher density of states at higher energies than at lower energies.

[Slide 5] Now, you might remember that back in Unit 1 we looked at the density of states of different types of conductors, and that density of states was proportional to energy to the power d over 2 minus 1, where d is the number of dimensions. You could have a one-dimensional, two-dimensional or three-dimensional conductor, and we looked at how many states you have and came up with that expression. Now, if you plot that for one dimension, you actually get a decreasing density of states; for two dimensions it is constant; for three dimensions it is increasing. So you might wonder, how can I claim that the density of states is always an increasing function, when a 1-D conductor seems to give a decreasing one? The point to note is that here we are talking about a different density of states. What we talked about in Unit 1 is the one-electron density of states: for one electron, how many states are available. Here we are talking about an enormous system with lots of electrons, let's say N electrons, and the total number of states available to this entire collection of electrons. So if you have N electrons, it is almost as if your dimension is not one, two, or three, but rather N, 2N, or 3N. If you have a hundred particles in three dimensions, it is like a 300-dimensional problem: 300 degrees of freedom. So if you work that out, you get a density of states where, instead of d over 2, you have Nd over 2 in the exponent. And N being a huge number, what you have up there is a very big number, and that is why W is always an increasing function of energy.

What about entropy? Well, that is the logarithm: S over k is the logarithm of that, and as you know, when you take the logarithm of E to the power alpha, you get alpha times the logarithm of E. And what we are interested in, what we defined as the temperature, was this dS/dE. So if you take the derivative, dS/dE, then, as you know, the derivative of the logarithm gives you 1 over E, so when you put E equal to E0 you get the exponent divided by E0. Note that Nd over 2 is a very big number, so you can neglect the 1 in comparison. And since this is what we defined as the inverse of the temperature, if you neglect the 1, you can write E0 as Nd times kT over 2. So what is E0? It is the total energy in the reservoir, and it is equal to the number of degrees of freedom, that is, how many electrons you have times how many dimensions you have, times half kT. And that is one of the very important central results of equilibrium statistical mechanics.
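Here is that little calculation spelled out as a check (a sketch; the proportionality constant in W drops out because only the logarithmic derivative matters):

\[
W(E) \propto E^{\frac{Nd}{2}-1}
\;\Rightarrow\;
\frac{S}{k} = \left(\frac{Nd}{2}-1\right)\ln E + \text{const}
\;\Rightarrow\;
\frac{1}{T} \equiv \left.\frac{dS}{dE}\right|_{E=E_0}
  = \frac{k\left(\frac{Nd}{2}-1\right)}{E_0}
  \approx \frac{k\,Nd}{2E_0},
\]

so that E0 is approximately Nd times kT over 2: half kT per degree of freedom.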
But you know, there is this half kT of energy per degree of freedom. Okay? The point I am making here, then, is that you have these two ways of defining entropy.

[Slide 6] One is the thermodynamic definition: you consider a large contact and ask, if I add a little energy to it, how does the entropy change? That is dS/dE, and that is 1 over T. This is an expression that came from thermodynamics, just from experimental observations. Whereas in the 19th century, Boltzmann came up with this microscopic definition of entropy, relating it to the microscopic density of states. That is one of his crowning achievements, which is why this k is called the Boltzmann constant, and you see it everywhere. And in terms of this k, we have the relation that whenever you have a contact and you want to take energy from it, the ratio of the process of taking energy to the reverse process is E to the power minus E over kT. What we'll show later is how this leads to the law of equilibrium in general.

But right now I just wanted to mention that in terms of entropy you can write the second law in a slightly different way. If you remember, what I said for the second law was that when you take energies from different reservoirs, you have this basic inequality. Now, based on this definition of temperature, you can see that for any reservoir the change in entropy is related to the change in energy: if you add a certain amount of energy to it, more states become available, and so its entropy goes up, through that relation. So if you use that, you'll notice that this term is the change in entropy of that reservoir, this one is the change in entropy of that one, and that is the change in entropy of the third. Actually, because all the energies are counted as being taken from the contacts, the way I have defined the reference directions, each term is the negative of an entropy change. So the second law is saying that the negatives of the entropy changes of the three reservoirs must add up to a negative number, which means that the overall total change in entropy must be positive. That is the form in which the second law is often stated: the change in the entropy of the world must always be positive. And as I mentioned earlier, thermodynamic processes are, in that sense, entropy driven: the direction in which they want to proceed is the direction that continually increases the entropy, because in that direction a lot more states are available.

[Slide 7] So what I will try to do next is, using this basic property of contacts, the property that it is easier to give energy to a contact than to take energy from it, with the ratio always given by an exponential involving E minus mu over kT, we will come up with the law of equilibrium. That is, suppose we have a system with various energy levels, and it is exchanging energy and particles with a contact. Given that the contact has this basic property, the question is: how will the electrons be distributed in this system? What we will see is that they are distributed according to a universal law of equilibrium, and we'll show how that connects up to the Fermi function we have been using before.
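As a sketch of that restatement, assuming three reservoirs and the sign convention from the last lecture (energy E_i and particle number N_i counted positive when taken from contact i), the inequality and its entropy form would read:

\[
\sum_{i=1}^{3}\frac{E_i-\mu_i N_i}{T_i} \le 0
\quad\Longleftrightarrow\quad
\Delta S_{\text{total}} = \sum_{i=1}^{3}\Delta S_i \ge 0,
\qquad \text{with } \Delta S_i = -\,\frac{E_i-\mu_i N_i}{T_i}.
\]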
So that is what we will do then in the next lecture. Thank you.