To answer the question “does coffee get cold faster than tea” we need to measure how temperature changes over time. As an initial reference, I ran a simple experiment with just boiling water.
A first approach can be seen in the following time-lapse video:
We can watch the video frame by frame and write down temperature and time. An easier approach is to write a value down only when the temperature changes. The advantage of this approach is that we only need a cellphone, a thermometer, and a timer. In fact, we can even omit the timer if we know the frame rate of the camera; most cameras keep this timing very precisely. I included a timer just to have a reference, and I’m not even sure that it ticks exactly once per second.
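Since the camera’s frame rate is reliable, the elapsed time can be recovered from the frame number alone. A minimal sketch in Python (the 30 fps value is an assumption; use your camera’s actual frame rate):

```python
# Convert a video frame number into elapsed seconds, assuming a known,
# constant frame rate. The 30 fps here is a hypothetical example value.
def frame_to_seconds(frame_number, fps=30.0):
    return frame_number / fps

# Example: frame 5400 at 30 fps corresponds to 180 seconds (3 minutes).
print(frame_to_seconds(5400))  # 180.0
```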
From this first experiment we can learn a couple of things. First, a cup of water takes more than ten thousand seconds (2:46 hours) to cool down to room temperature: in the first frames the temperature was 93°C, and the last temperature was 27°C. Second, the temperature does not change very fast, except at the beginning.
This means that we need to record the temperature for several hours, but we only need a sample every few seconds, or even one per minute. The cooling time will also depend on the volume of water: when I use a small hot-water bottle, my feet get cold during the night, but with a big bottle I’m fine.
Another thing that we notice is that ambient temperature will have an effect on the cooling. So we need a second thermometer to control the air temperature.
The disadvantage of this approach is that it requires boring manual work to extract the data. We can do better. We have robots that work for us.
I prepared a second experiment with a data logger. The device registered water temperature, air temperature, air pressure and altitude, every 10 seconds (nominally). I prepared the experiment in the late evening and let it run all night, near a closed window.
The air pressure is probably unnecessary for the current experiment, but the device was already measuring it, so keeping these values will help us to prepare for when we measure the building height.
All the data is on Google Sheets and in a text file, which is easier to process with R. The analysis can be done in several ways: the easy parts are easier in the spreadsheet, and the advanced parts are easier in R. Choose your tools wisely; they will serve you all your life.
The first thing to do is to familiarize oneself with the data. It is a good idea to add a column with the row number, which we usually call row id or just id. Please notice that the starting value of seconds is arbitrary, it only reflects how long the machine was turned on before the first measurement.
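As a sketch of that first step, here it is in Python with pandas (the post’s own analysis uses R and Google Sheets; the column names and values below are hypothetical stand-ins for the logger file):

```python
import pandas as pd

# Hypothetical logger output: seconds since power-on and two temperatures.
# The starting value of "seconds" is arbitrary, as noted in the text.
df = pd.DataFrame({
    "seconds": [3120, 3130, 3140],
    "water_C": [93.0, 92.8, 92.6],
    "air_C":   [21.3, 21.3, 21.2],
})

# Add a row id column, starting at 1, as the first column.
df.insert(0, "id", range(1, len(df) + 1))
print(df.head())
```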
After looking at the first few lines, it is always good to plot the data and see how it looks. This is easy in Google Sheets or Excel, and also in R. For now let us focus only on temperature.
We see that the air temperature remains more or less constant throughout the measurements. It will be easier to analyze.
Well, it is not so constant. There is a sharp temperature reduction in the first seconds, and then it goes down more or less linearly. My guess is that the initial temperature was that of the room where I prepared the device, and then it cooled to the experiment room’s temperature. After that it cools as the night cools, until sunrise. The window faces east, so the room gets warmer in the morning. In retrospect, it would have been wise to let the device cool to room temperature before starting, and to record the real time from a real clock.
A sharp eye may also notice that there are times when the temperature rises. My guess is that these are the times when the fridge motor was working. The experiment room was my kitchen. Refrigerators keep their interior at low temperature by transferring heat to the exterior, so the room gets warmer.
We will use air temperature as our main variable. We can choose any point in time and take several values, say 𝑁. For each position we want to evaluate the average, the standard deviation, and the standard error. And we will do it for several values of 𝑁.
Here we evaluated 𝑁 equal to 3, 10, 20 and 30. You can try other values. The first rows of results look like the tables on the margin.
It seems that the standard error gets worse with bigger 𝑁, but that is just a transient phenomenon. Looking at the full picture, we observe the following graphics.
In the first seconds the standard error is large. This is due to the fast change in the value we are measuring. In this case there is a transient period before the temperature stabilizes. Things become clearer if we focus on the values after the transient.
We observe that the standard error is random, since we evaluate it from random data, but it follows some patterns. Being pessimistic, we can take the maximum value of each one. Then we can look up the Student’s t-distribution table to find the factor for 95% confidence. Finally, we calculate the uncertainty in each case.
| Max Std. Error | k (95%) | Interval width | Interval, 1 sig. fig. |
| --- | --- | --- | --- |
Therefore, we can have at most one decimal when we average 3 samples, two decimals when we average 10 samples, and three decimals with 20 or 30 samples. We can see this in the following figure.
There seems to be no significant difference between averaging 20 and 30 samples.
Can you replicate these results?
Can you repeat this analysis for pressure?
According to the original design, there should be one sample every 10 seconds. But if we look at the difference in seconds between consecutive rows, we find that over 34% of the time the timer advanced only 9 seconds.
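That figure comes from taking the difference of the “seconds” column between consecutive rows. A Python sketch (the handful of values below is a hypothetical excerpt; the real file has thousands of rows):

```python
import numpy as np

# Hypothetical "seconds" column from the logger. The nominal step is 10 s,
# but some steps come out as 9 s.
seconds = np.array([3120, 3130, 3139, 3149, 3159, 3168, 3178])

steps = np.diff(seconds)      # time between consecutive rows
frac_9 = np.mean(steps == 9)  # fraction of 9-second steps
mean_step = steps.mean()      # average real sampling interval
print(f"{frac_9:.0%} of steps are 9 s; mean step = {mean_step:.2f} s")
```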
So the measuring device was sampling faster than intended. How fast was it running?
We see that instead of “10 seconds every 10 seconds” we have a little less.
How much time really passes between samples?
I look forward to your comments.