Entropy is a measure of the “randomness” or “disorder” in a system or process. It quantifies how much information is missing to fully describe the system’s exact state: the higher the entropy, the more disordered the system. For example, a fair coin flip carries one bit of entropy, while a coin that always lands heads carries zero. It can be tricky to understand, but keep at it, you’ll get it!
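
To make this concrete in the information-theory sense, here’s a minimal Python sketch that computes Shannon entropy, H = -Σ p·log₂(p), from a list of observed outcomes (the function name `shannon_entropy` is my own, chosen for illustration):

```python
import math
from collections import Counter

def shannon_entropy(outcomes):
    """Shannon entropy (in bits) of the empirical distribution of outcomes."""
    counts = Counter(outcomes)
    total = len(outcomes)
    # Sum -p * log2(p) over each distinct outcome's empirical probability p.
    return -sum(
        (c / total) * math.log2(c / total)
        for c in counts.values()
    )

# A fair coin: maximum uncertainty, 1 bit per flip.
print(shannon_entropy(["H", "T", "H", "T"]))  # 1.0

# A coin that always lands heads: no uncertainty at all.
print(shannon_entropy(["H", "H", "H", "H"]))  # -0.0 (i.e. zero bits)
```

The intuition: the more evenly spread the probabilities, the more bits you need on average to pin down an outcome, and the higher the entropy.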