Bro. Brown's statistics reference pages:
Principles of Statistics
Here they are in list form. (My students should memorize these principles word for word, and must eventually be able to identify their uses, explain them, construct examples of their uses, and use them in constructing statistical arguments.)

There is variation in measurement.

Statistics estimate parameters.

Close enough is good enough.

Statistics never prove anything, but they can give convincing evidence.

Individual outcomes of a random phenomenon are unpredictable, but outcomes follow a distribution in the long run.

It’s not whether they’re different, it’s whether they’re significantly different.

Here they are, in paragraph form:
We never really know the true values of population parameters because there is variation in measurement. We make measurements anyway, and summarize them with statistics, which estimate parameters. On average, we can ensure that our estimates are pretty close to the truth, and close enough is good enough for us. However, we recognize that statistics never prove anything, though we grant that they can give convincing evidence. The possibly convincing nature of a statistic arises from the fact that while individual outcomes of a random phenomenon are unpredictable, outcomes follow a distribution in the long run. We can use this distribution to test hypotheses, remembering that what matters is not whether our results are different from our hypothesis, but whether they are significantly different.
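A short simulation can make the long-run idea concrete. This is a minimal sketch (my own die-rolling example, not from the page itself): each roll is unpredictable, yet a statistic computed from many rolls settles near the parameter it estimates.

```python
import random
import statistics

random.seed(42)

# Population parameter we pretend not to know: a fair die's true mean is 3.5.
TRUE_MEAN = 3.5

# Individual outcomes of a random phenomenon are unpredictable...
rolls = [random.randint(1, 6) for _ in range(10_000)]

# ...but outcomes follow a distribution in the long run, so the sample mean
# (a statistic) estimates the true mean (a parameter).
sample_mean = statistics.mean(rolls)
print(f"sample mean after 10,000 rolls: {sample_mean:.3f} (true mean {TRUE_MEAN})")

# "Significantly different": a small sample's mean will differ from 3.5,
# but is the gap large relative to chance variation? A rough z-score tells us.
small = rolls[:100]
se = statistics.stdev(small) / len(small) ** 0.5
z = (statistics.mean(small) - TRUE_MEAN) / se
print(f"z-score for the first 100 rolls: {z:.2f}")
```

The estimate is close to 3.5 but not exactly 3.5, illustrating "close enough is good enough": a z-score within roughly plus or minus two is unsurprising chance variation, not a significant difference.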