Since time is mathematically the same forwards and backwards (why it isn’t with regard to entropy is one of the Big Unknowns)
This is true of the continuous systems of classical mechanics. But if a system is instead treated as a collection of discrete particles, as Ludwig Boltzmann did at the end of the 19th century, entropy can be defined as
S = k log Ω (Arguably a more profound formula than E = mc².)
The ‘directional’ property of entropy with respect to time then becomes a natural consequence of the statistical properties of the system. Nothing prevents a system from significantly decreasing in entropy other than the sheer unlikelihood of its doing so.
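A minimal sketch of this statistical picture, assuming the Ehrenfest urn model as a stand-in for a gas (N particles that can sit in either half of a box, with a random particle hopping at each step, and Boltzmann’s constant k set to 1). Ω for a macrostate is the binomial count of microstates, so S = log Ω starts at zero for the all-left state and climbs toward its maximum near the even split:

```python
import math
import random

def entropy(n_left, n_total):
    # S = k log Ω, with k = 1 and Ω = C(n_total, n_left): the number of
    # microstates (which particular particles sit in the left half)
    # compatible with the macrostate "n_left particles on the left".
    return math.log(math.comb(n_total, n_left))

random.seed(0)
N = 100
n_left = N  # low-entropy initial macrostate: every particle on the left

history = [entropy(n_left, N)]
for _ in range(2000):
    # Pick a particle uniformly at random; it hops to the other half.
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
    history.append(entropy(n_left, N))

print(f"initial S = {history[0]:.2f}")
print(f"final   S = {history[-1]:.2f}")
print(f"max possible S = {entropy(N // 2, N):.2f}")
```

Run it and S rises toward the maximum and then fluctuates around it: small decreases happen constantly, while a large spontaneous drop back toward the initial state is not forbidden, merely astronomically unlikely.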