> I have difficulty interpreting the graph created with the parameters below
> (I omitted some lines, but the relevant data should be there).
> The graph shows
> - one line with high resolution data
> - one line with minimum values per day (step = 86400)
> - one line with maximum values per day (step = 86400)
> On the graph, I do not understand why the maximum value of the
> high-resolution data does not correspond with the minimum/maximum
> values of the compressed data.
Because the line is not showing all the data.
> On the first and second day, e.g., the max values are higher than any data
> point of the actual high-resolution data.
First off, you haven't provided all the information - like what timescale the graph is plotted over (it looks like a bit over 2 weeks) or the actual data.
At a step of 900s, that's 96 periods per day. Over 16 days, that's over 1500 periods. You are plotting a graph that's around a third of that in pixels, so around 3 points will be averaged per pixel.
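That arithmetic can be sketched as follows; the 500-pixel graph width is an assumption for illustration, since the actual `--width` wasn't given:

```shell
# Roughly how many data points get averaged into each pixel column.
# width=500 is an assumed graph width; substitute your real --width.
step=900      # seconds per data point
days=16       # timescale read off the graph
width=500     # assumed graph width in pixels
points=$(( days * 86400 / step ))
echo "data points: $points"                      # 1536
echo "points per pixel: $(( points / width ))"   # 3
```

A wider graph (or a shorter timescale) reduces the number of points folded into each pixel, which is why the plotted line can miss the true extremes.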
So you'd need to look carefully at the actual data to see what is "missing" from the graph you are plotting.
What I've done for a few of my graphs is plot a shaded area between min and max. Do a CDEF x_spread=x_max,x_min,- (where x_min is x:MINIMUM and x_max is x:MAXIMUM), plot a fully transparent area of x_min, then stack an area of (say) 50% transparency of x_spread on top of it. Then draw your line of x. I suspect that with your parameters you'll find the shaded area touches the min and max lines.
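As a minimal sketch of that approach: the file name `power.rrd`, DS name `watts`, colours, and dimensions below are all assumptions, not taken from the original parameters:

```shell
# Sketch: shaded min-max band with the high-resolution line on top.
# power.rrd and watts are placeholder names; substitute your own RRD and DS.
rrdtool graph spread.png \
    --start end-1month --width 800 --height 300 \
    DEF:x=power.rrd:watts:AVERAGE \
    DEF:x_min=power.rrd:watts:MIN:step=86400 \
    DEF:x_max=power.rrd:watts:MAX:step=86400 \
    CDEF:x_spread=x_max,x_min,- \
    AREA:x_min#FF000000 \
    AREA:x_spread#FF000080::STACK \
    LINE1:x#0000FF:"high-res data"
```

The 8-digit colours carry an alpha channel: `#FF000000` (alpha 00) makes the x_min area invisible, so the stacked x_spread band (alpha 80, ~50%) appears to float between the daily min and max.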
The timescale is 1 month (July 2014 = 31 days = 2976 data points).
I tried increasing the x-resolution to 3000 pixels, but that did not change the graph. I was hoping that, according to your statement, rrdtool would not have to average several data points into one pixel if the resolution were high enough.
The red area in my example graph is exactly what you are suggesting - but the blue line with the detailed data (roof) does not touch the upper edge of the red area (roof24harea).