
Science


erronis

(17,399 posts)
Fri Dec 13, 2024, 11:16 AM

What Is Entropy? A Measure of Just How Little We Really Know.

https://www.quantamagazine.org/what-is-entropy-a-measure-of-just-how-little-we-really-know-20241213/
I've always had a hard time wrapping my mind around the concept of entropy. This article points out that my problem is not unique, and that thinking on the subject may finally be coalescing.


Exactly 200 years ago, a French engineer introduced an idea that would quantify the universe’s inexorable slide into decay. But entropy, as it’s currently understood, is less a fact about the world than a reflection of our growing ignorance. Embracing that truth is leading to a rethink of everything from rational decision-making to the limits of machines.

Life is an anthology of destruction. Everything you build eventually breaks. Everyone you love will die. Any sense of order or stability inevitably crumbles. The entire universe follows a dismal trek toward a dull state of ultimate turmoil.

To keep track of this cosmic decay, physicists employ a concept called entropy. Entropy is a measure of disorderliness, and the declaration that entropy is always on the rise — known as the second law of thermodynamics — is among nature’s most inescapable commandments.
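A quick quantitative aside, not part of the quoted article: in statistical mechanics, "disorderliness" is usually made precise by Boltzmann's formula, which counts the microscopic arrangements hidden behind a single macroscopic state.

```latex
% Boltzmann entropy: W counts the microstates consistent with what we can
% observe macroscopically; k_B is Boltzmann's constant (~1.38e-23 J/K).
S = k_B \ln W
% Worked example: let an ideal gas of N particles expand into twice the
% volume. Each particle has twice as many places to be, so W grows by a
% factor of 2^N and the entropy rises by
\Delta S = k_B \ln 2^{N} = N\, k_B \ln 2
```

The more indistinguishable microstates there are, the higher the entropy, which is exactly the "how little we really know" reading in the article's title.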

I have long felt haunted by the universal tendency toward messiness. Order is fragile. It takes months of careful planning and artistry to craft a vase but an instant to demolish it with a soccer ball. We spend our lives struggling to make sense of a chaotic and unpredictable world, where any attempt to establish control seems only to backfire. The second law demands that machines can never be perfectly efficient, which implies that whenever structure arises in the universe, it ultimately serves only to dissipate energy further — be it a star that eventually explodes or a living organism converting food into heat. We are, despite our best intentions, agents of entropy.
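The claim that machines can never be perfectly efficient also has a concrete textbook form, the Carnot bound. This sketch is standard thermodynamics rather than something taken from the article.

```latex
% Carnot bound: no heat engine running between a hot reservoir at T_h and a
% cold reservoir at T_c (absolute temperatures) can beat this efficiency.
\eta_{\max} = 1 - \frac{T_c}{T_h}
% Example: steam at 500 K exhausting to air at 300 K gives at most
% 1 - 300/500 = 0.4, so no more than 40% of the heat can become work.
```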

“Nothing in life is certain except death, taxes and the second law of thermodynamics,” wrote Seth Lloyd, a physicist at the Massachusetts Institute of Technology. There’s no sidestepping this directive. The growth of entropy is deeply entwined with our most basic experiences, accounting for why time runs forward and why the world appears deterministic rather than quantum mechanically uncertain.

But despite its fundamental importance, entropy is perhaps the most divisive concept in physics. “Entropy has always been a problem,” Lloyd told me. The confusion stems in part from the way the term gets tossed and twisted between disciplines — it has similar but distinct meanings in everything from physics to information theory to ecology. But it’s also because truly wrapping one’s head around entropy requires taking some deeply uncomfortable philosophical leaps.
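The information-theory meaning mentioned above is the easiest one to play with directly. Here is a minimal Python sketch, my addition rather than anything from the article, using the standard Shannon definition: entropy in bits as a measure of how much we do not yet know about a random outcome.

```python
# Minimal sketch (not from the article): Shannon entropy, the
# information-theory cousin of the thermodynamic quantity. It measures,
# in bits, how much we do not yet know about a random outcome.
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) over the outcomes with nonzero probability."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit of missing information.
print(shannon_entropy([0.5, 0.5]))      # 1.0
# A heavily biased coin leaves much less to learn from a flip.
print(shannon_entropy([0.99, 0.01]))    # ~0.08
```

Boltzmann's and Shannon's formulas share the same log-of-probabilities shape, which is why entropy can be read as a measure of ignorance in both physics and information theory.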