Notebook
Now, suppose our camera is a bit slower (perhaps because we have to push the button by hand to take each image), but it still has the same infinitesimal shutter speed. We would then have fewer data points (the time series is downsampled), but still no averaging. What is the effect on the time series and histogram? Play with the downsample factor and find out.
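As a concrete illustration, here is a minimal sketch of what downsampling does. The signal model (Gaussian noise about a mean), the name `downsample_factor`, and the plotting layout are our own assumptions for the example, not the notebook's actual code; taking every Nth sample leaves each point's statistics unchanged, it just leaves fewer points.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
t = np.arange(10_000) * 1e-3            # "fast camera" time base (s) -- assumed
signal = 5 + rng.normal(0, 1, t.size)   # hypothetical noisy intensity

downsample_factor = 50                  # try different values
t_ds = t[::downsample_factor]           # keep every Nth point: no averaging
signal_ds = signal[::downsample_factor]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 3))
ax1.plot(t, signal, ".", ms=2, alpha=0.3, label="full")
ax1.plot(t_ds, signal_ds, ".", ms=5, label="downsampled")
ax1.set(xlabel="t (s)", ylabel="signal")
ax1.legend()
ax2.hist(signal, bins=50, density=True, alpha=0.5, label="full")
ax2.hist(signal_ds, bins=50, density=True, alpha=0.5, label="downsampled")
ax2.set(xlabel="signal", ylabel="probability density")
ax2.legend()
plt.tight_layout()
plt.show()
```

The downsampled histogram should look noisier (fewer counts per bin) but have the same shape and width as the full one.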
Now, imagine instead that we are changing the shutter speed of our camera. As with downsampling, a slower shutter means fewer data points, but now each data point is the *average* of the signal over that time period. What will happen to the time series and histogram? Play with the averaging time to find out. For comparison purposes, the histogram plot will continue to show the unaveraged histogram as well.
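For comparison with the downsampling sketch above, here is one way averaging could be implemented (again with an assumed signal model and our own names such as `avg_factor`): each slow-camera point is the mean of a block of consecutive fast samples, so for uncorrelated noise the histogram narrows by roughly $1/\sqrt{N}$ for $N$ samples per block.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
t = np.arange(10_000) * 1e-3            # same hypothetical fast time base
signal = 5 + rng.normal(0, 1, t.size)

avg_factor = 50                         # fast samples per shutter window
n_blocks = signal.size // avg_factor
# average within non-overlapping blocks: one point per shutter opening
signal_avg = signal[: n_blocks * avg_factor].reshape(n_blocks, avg_factor).mean(axis=1)
t_avg = t[: n_blocks * avg_factor].reshape(n_blocks, avg_factor).mean(axis=1)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 3))
ax1.plot(t, signal, ".", ms=2, alpha=0.3, label="full")
ax1.plot(t_avg, signal_avg, ".", ms=5, label="averaged")
ax1.set(xlabel="t (s)", ylabel="signal")
ax1.legend()
ax2.hist(signal, bins=50, density=True, alpha=0.5, label="unaveraged")
ax2.hist(signal_avg, bins=50, density=True, alpha=0.5, label="averaged")
ax2.set(xlabel="signal", ylabel="probability density")
ax2.legend()
plt.tight_layout()
plt.show()
```

Unlike downsampling, averaging changes the statistics of each point: the averaged histogram sits at the same mean but is visibly narrower than the unaveraged one.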