Challenging the Second Law of Thermodynamics

In every warm room there is a hidden ocean of energy: air molecules racing at hundreds of meters per second, water trembling with molecular motion, walls and wires and bodies saturated with heat. The energy is real, immense, and everywhere. Yet nearly all of it is forbidden to us by the second law of thermodynamics, the law that says heat runs downhill, disorder tends to grow, and no cyclic device can turn ambient heat completely back into useful work. Daniel P. Sheehan, a physicist at the University of San Diego, has spent decades asking whether that prohibition is absolute, or whether special physical systems — surfaces, membranes, junctions, plasmas, and catalysts — can organize thermal motion into work after all. This is the story of that question: not a claim that the second law has fallen, but an investigation of one scientist’s long campaign to test whether nature’s most famous “no” has an overlooked boundary.

Origins of the Second Law of Thermodynamics

The second law began not as a cosmic philosophy but as an engineering problem. Nineteenth-century Europe wanted better steam engines, and steam engines forced scientists to ask what heat could and could not do. Sadi Carnot, Rudolf Clausius, William Thomson, later Lord Kelvin, and Max Planck helped turn the practical study of engines into a universal language of heat, work, and entropy. What began with pistons became a statement about the direction of natural processes.

The Kelvin–Planck formulation is the version that matters most for Sheehan’s challenge. In plain terms, it says that no device operating in a cycle can take heat from a single reservoir and convert it completely into work with no other effect. Ordinary heat engines do convert heat into work, but only by operating between a hot reservoir and a cold reservoir. Some heat must be rejected. A perfect single-temperature heat recycler is forbidden.
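The two-reservoir requirement can be made quantitative with Carnot's limit, a standard textbook result rather than anything specific to Sheehan's work: an ideal cyclic engine operating between reservoirs at temperatures T_hot and T_cold can convert at most a fraction 1 − T_cold/T_hot of the absorbed heat into work. A minimal sketch:

```python
def carnot_efficiency(T_hot, T_cold):
    """Maximum fraction of absorbed heat convertible to work
    by a cyclic engine between two reservoirs (temperatures in kelvin)."""
    return 1.0 - T_cold / T_hot

# A steam-era engine between 500 K steam and a 300 K environment:
print(f"{carnot_efficiency(500.0, 300.0):.0%}")  # 40% at best
# As the two reservoirs merge into one, the extractable work vanishes,
# which is the single-reservoir case the Kelvin-Planck statement forbids:
print(f"{carnot_efficiency(300.0, 300.0):.0%}")  # 0%
```

The second call is the Kelvin–Planck prohibition in miniature: with only one temperature available, the ideal limit on useful work is exactly zero.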

The Clausius formulation approaches the same terrain from another direction. Heat does not spontaneously flow from cold to hot. A refrigerator can move heat from cold to hot, but only by consuming work. The entropy formulation is broader still: in a spontaneous process, the entropy of the universe does not decrease. These statements are often treated as equivalent, and in ordinary thermodynamic systems they point in the same direction. Sheehan’s work lives in the places where he thinks their equivalence may fray.

The law’s authority is partly empirical and partly cultural. It has survived countless tests and underwrites chemistry, engineering, biology, cosmology, and materials science. It also carries a moral weight rare in physics. Scientists quote Eddington’s warning that any theory opposing it must collapse in humiliation. Sheehan argues that this reverence can become a shield around the law, making legitimate boundary tests sound like heresy before they are even examined.

The Language of Heresy

The words around this subject are dangerous. “Perpetual motion” is not merely a technical phrase; it is a warning label, a professional insult, a graveyard of frauds, dreamers, and broken machines. The phrase collapses distinctions that matter. A perpetual-motion machine of the first kind would create energy from nothing and violate conservation of energy. Sheehan is not proposing that. His devices, if they worked as claimed, would be perpetual-motion machines of the second kind: devices that convert environmental heat into work in a forbidden cycle.

That distinction does not make the claim modest. A working perpetual-motion machine of the second kind would still be revolutionary. But it would not be magic. Its energy source would be thermal energy already present in air, water, solids, or internal device components. The first law would still balance the books. The violation, if real, would be in the second law’s prohibition on fully recyclable heat.

Sheehan’s vocabulary tries to escape the trap. He uses “heat recycler,” “second-law device,” “epicatalytic thermal diode,” “thermal battery,” and “Maxwell zombie.” Each phrase opens one door and closes another. “Heat recycler” sounds technological. “Second-law device” sounds formal. “Perpetual motion” sounds doomed. “Maxwell zombie” sounds playful enough to be memorable, and just playful enough to raise suspicion.

This language problem is part of the story. Frontier science often has to name itself before it can be judged, and bad names can kill a subject prematurely. Sheehan’s challenge is therefore linguistic as well as physical. He must persuade readers to separate impossible energy-from-nothing fantasies from the subtler and still highly controversial possibility of extracting work from ambient heat through unusual boundary physics.

Daniel Sheehan Enters the Story

Daniel P. Sheehan’s route into this borderland did not begin with backyard engines. It began with plasma physics, surfaces, fields, and non-equilibrium systems. His research record ranges across plasma behavior, thermodynamic paradoxes, catalytic surfaces, semiconductor proposals, infrared signature management, self-charging concentration cells, and speculative questions near the edges of time and causality. In the second-law work, the common thread is not a single device but a habit of looking where textbook simplifications are least secure.

That habit matters because many of Sheehan’s proposed challenges do not arise in ideal gases or classroom cylinders. They arise at surfaces, membranes, plasma boundaries, semiconductor junctions, blackbody cavities, and low-pressure gas systems. These are places where matter is not featureless and where a boundary can impose chemical, electrical, thermal, or statistical structure on the particles that encounter it.

Sheehan also became an organizer of the field. In 2002 he organized a conference on quantum limits to the second law at the University of San Diego. In 2005, with Vladislav Čápek, he coauthored the Springer monograph Challenges to the Second Law of Thermodynamics: Theory and Experiment. The book gathered historical challenges, quantum proposals, plasma paradoxes, chemical non-equilibrium systems, gravitational examples, and MEMS/NEMS device ideas into one controversial map.

His public talks reveal a recurring posture: provocative, but not careless. He repeatedly insists that the first law is not at issue. He presents the second law as contingent, falsifiable, and perhaps overdue for a revision in scope. The ambition is large, but the framing is specific. He is not saying nature gives energy for free. He is saying nature may contain special arrangements that make low-grade heat less unrecoverable than physicists have assumed.

The Stakes: Energy, Civilization, and the Reservoir Underfoot

Sheehan likes to begin with energy because energy gives the subject its moral force. “Energy is the currency of change” is one of his recurring lines. Nothing happens without energy. Chemistry, industry, transportation, agriculture, warfare, wealth, poverty, and climate all depend on it. The second law is not just a principle in a textbook; it is the tax collector that takes high-grade energy and leaves civilization with waste heat.

In his presentations, Sheehan often contrasts ordinary energy sources with the thermal energy around us. Fossil fuels are finite and polluting. Solar, wind, hydro, nuclear, geothermal, and tidal energy all have roles, but each faces constraints of density, scale, cost, storage, or public acceptance. Thermal energy, by contrast, is everywhere. The atmosphere, oceans, and upper crust contain vast stores of heat. The catch is that ordinary thermodynamics says this low-grade heat cannot be converted back into work without a temperature difference and a sink.

He makes the point vivid with domestic examples. A cubic meter of air contains a surprising amount of molecular kinetic energy. Water is denser still. A room is not empty of power; it is full of it, but randomized. The second law is the reason the energy in the room does not organize itself into light, propulsion, computation, or food. Thermal motion is plentiful, but it is not usefully directed.
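The scale of that hidden reservoir is easy to estimate from the ideal-gas relation for translational kinetic energy, E = (3/2)PV. The numbers below are a back-of-envelope illustration, not figures quoted from Sheehan's talks:

```python
# Translational kinetic energy of the molecules in one cubic meter of
# ambient air, via the ideal-gas relation E_trans = (3/2) * P * V.
P = 101_325.0  # atmospheric pressure, Pa
V = 1.0        # volume, m^3
E_trans = 1.5 * P * V
print(f"{E_trans / 1e3:.0f} kJ per cubic meter")  # ~152 kJ
```

Roughly 150 kilojoules per cubic meter: real energy, but randomized among some 10^25 molecules, which is exactly why the second law says it cannot be gathered back into work.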

The dream of a heat recycler is therefore extraordinary. It would not simply add another renewable source to the grid. It would change what “waste” means. It would turn the final degraded form of almost every energy process back into a usable input. That is why Sheehan links the subject to climate, geopolitics, interstellar travel, and survival. If heat could be recycled, the meaning of energy scarcity would change.

The First Law Against the Second

Sheehan’s favorite contrast is between the first and second laws. The first law conserves energy. It says the total energy of an isolated system is constant, even though energy changes form. Kinetic energy becomes heat. Chemical energy becomes electricity. Mass becomes radiation. If new forms of energy appear in physics, the accounting expands to include them. In Sheehan’s telling, the first law behaves like a ledger.

The second law is different. It is not a ledger but a prohibition. It does not merely say energy is conserved; it says some conversions cannot be done in a cycle. Work can be degraded completely into heat, but heat cannot be completely restored to work by a cyclic device drawing from a single reservoir. This asymmetry is the heart of the challenge.

Duncan’s paradox, which became central to Sheehan’s later experimental story, dramatizes the tension. If a device performs work, conservation of energy requires the device to lose energy and cool. If it is cooler than its surroundings, the Clausius form says heat should flow back into it from the warmer environment. But if that cycle’s sole result is converting environmental heat into work, the Kelvin–Planck form forbids it.

That is the paradoxical maneuver Sheehan finds so important. His proposed devices do not violate conservation of energy. They depend on it. They use cooling, heat flow, and reset. The alleged problem is that one accepted thermodynamic statement may help complete a cycle that another accepted statement says cannot occur. In that sense, the first law does not lose. The second law is the one being tested.

Maxwell’s Demon and the Birth of the Zombie

The oldest character in this story is Maxwell’s demon. James Clerk Maxwell imagined a tiny being controlling a door between two gas chambers. By opening and closing the door at the right moments, the demon could let fast molecules move one way and slow molecules the other, creating a temperature or pressure difference. That difference could then drive an engine. The demon seemed to sort disorder into order.

For more than a century, physicists wrestled with what the demon overlooked. The modern resolution is usually tied to information. A demon that observes molecules, records their states, makes decisions, and resets its memory must erase information to complete a cycle. Information erasure has a thermodynamic cost. Once the accounting is done, the demon does not defeat the second law after all.
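The erasure cost in that modern resolution is Landauer's bound, a standard result: erasing one bit of memory dissipates at least k·T·ln 2 of heat. Per bit the figure is minuscule, which is part of why the accounting took a century to settle:

```python
import math

def landauer_cost(T):
    """Minimum heat dissipated when erasing one bit: k * T * ln(2), in joules."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    return k * T * math.log(2)

E = landauer_cost(300.0)  # room temperature
print(f"{E:.2e} J per erased bit")  # ~2.87e-21 J
```

Summed over the demon's complete cycle, this cost is what restores the second law's books; Sheehan's "zombie" argument turns on whether a device with no memory to erase escapes this accounting.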

Sheehan argues that this old demon became a distraction. The problem with Maxwell’s creature is that it thinks. Sheehan’s “Maxwell zombie” does not. A zombie is a passive arrangement of physical materials that sorts, biases, or organizes particles without observing them, remembering them, or deciding anything. It has no memory register to erase. It is not clever. It is built.

That is why the zombie metaphor matters. A surface can dissociate molecules more strongly than another surface. A membrane can bind ions asymmetrically. A semiconductor junction can sustain a built-in field. None of these things thinks about molecules. They simply interact according to their material properties. If such passive arrangements could create a persistent, work-exploitable gradient from ambient heat, the old information-theoretic exorcism of Maxwell’s demon would not automatically apply.

The Second-Law Renaissance

Sheehan places his work inside what he calls a renaissance of second-law challenges. In his accounts, the modern phase began in the late twentieth century, after a long period when Maxwell’s demon dominated the imagination and most serious challenges were treated as curiosities. Researchers including Jack Denur, Lyndsay Gordon, Alexey Nikulov, Peter Keefe, Čápek, Sheehan, and others proposed challenges involving quantum systems, superconductors, plasmas, gravity, chemical surfaces, and solid-state devices.

The 2002 San Diego conference became a symbolic marker. It gathered researchers willing to discuss the second law not as an untouchable axiom but as a physical principle with possible boundary conditions. Whether one accepts the claims or not, the conference signaled that the topic had moved into published proceedings, technical debate, and organized scrutiny.

The 2005 Čápek–Sheehan monograph gave the field a book-length structure. It catalogued numerous formulations of the second law and entropy, reminding readers that the law’s simplicity is partly rhetorical. Thermodynamics works brilliantly in practice, but its verbal and mathematical forms differ by context. Sheehan uses that diversity to argue that a single counterexample may not destroy thermodynamics, but it could force a sharper statement of where each formulation applies.

Still, a renaissance is not a revolution. A list of challenges is not a working machine. The mainstream view remains that apparent violations usually hide an overlooked entropy cost, reservoir, non-equilibrium input, measurement artifact, or incomplete cycle. The renaissance matters because it supplies a cast of serious proposals. The verdict still depends on experiment.

The Ideal Gas Syndrome

One of Sheehan’s deepest criticisms is aimed not at the second law itself but at the mental habits surrounding it. He argues that physicists often build intuition from idealized models, especially the ideal gas. An ideal gas is a miracle of simplicity: point particles, no interactions except collisions, mathematically clean, and often close enough to reality to be useful. It is also a dangerous teacher if its simplicity is mistaken for universality.

Sheehan calls this tendency the “Ideal Gas Syndrome.” The ideal gas supports clear statistical reasoning and gives the second law an aura of inevitability. But real systems are rarely so featureless. They have surfaces, walls, defects, phase boundaries, membranes, fields, chemical affinities, and internal structure. They remember interactions for short times. They adsorb, desorb, polarize, dissociate, recombine, diffuse, and bind.

The systems Sheehan studies are precisely the systems ideal-gas reasoning tends to wash away. A low-pressure blackbody cavity with chemically distinct surfaces is not a featureless gas. A p-n junction with a built-in potential is not a bag of particles. An asymmetric membrane with different binding sites along microscopic channels is not a uniform reservoir. A plasma sheath is not a neutral ideal bulk.

This critique does not prove the second law false. But it explains Sheehan’s strategy. He is not looking for violations in the average center of a textbook gas. He is looking at structured physical regimes where boundaries do thermodynamic work that the thermodynamic limit tends to ignore. His wager is that the second law’s practical success may have made physicists too casual about the assumptions behind their certainty.

Beyond the Thermodynamic Limit: The Template

In his 2022 paper on going beyond the thermodynamic limit, Sheehan tries to identify a common template for the most promising second-law devices. The thermodynamic limit is a powerful approximation: the number of particles and the volume both go to infinity while density stays finite. It makes bulk behavior manageable. It also tends to make surfaces vanish in importance.

Sheehan’s template starts with physical or thermodynamic asymmetry at boundaries. A boundary may impose a discontinuity in chemical potential, pressure, temperature, species concentration, or electric field. The boundary is not passive decoration. It is an active region where particles encounter material rules that differ from those elsewhere in the system.

The second condition is that the asymmetry must collect ambient thermal energy into a macroscopic reservoir of free energy. A single molecule’s random thermal kick is useless. A pressure difference, temperature difference, concentration gradient, or electric field is different. It can be tapped. The boundary must act like a recruiter, organizing many microscopic motions into something large enough to do work.

The third and fourth conditions complete the cycle. Work must be extracted by a mechanism operationally independent of the process that created the reservoir, so the act of harvesting does not simply undo the sorting. Then the system must reset by absorbing heat from its environment. If that loop works, Sheehan argues, the device has done what the Kelvin–Planck statement forbids: it has converted single-reservoir heat into work in a cycle.

Life at the Boundary

Boundaries are easy to underestimate because they are thin. But much of the physical world happens at thin places. Semiconductor devices depend on junctions. Catalysis depends on surfaces. Plasmas form sheaths. Cells depend on membranes. Every organism is a boundary management system, maintaining gradients between inside and outside until death lets those gradients relax.

Sheehan uses this fact to fight the intuition that surfaces are secondary. In ordinary bulk thermodynamics, surfaces are often treated as negligible corrections. In small, structured, chemically active, or low-pressure systems, they can dominate. The boundary is where a molecule changes identity, where an ion is selected, where a field is sustained, where a gas remembers the last wall it struck.

Biology provides a useful analogy without proving any second-law violation. Living cells do not break the second law; they feed on free energy and export entropy. But their existence shows how much order and function can be maintained by membranes, channels, pumps, receptors, and concentration gradients. The surface is not a mathematical nuisance. It is an engine of organization.

Sheehan’s thermal battery work draws on the same intuition. A chemically asymmetric membrane can create different local environments for diffusing ions. Whether that ultimately counts as a second-law violation is the disputed question. But as a narrative bridge, biology helps readers see why Sheehan thinks the world’s most interesting thermodynamics may happen not in the bulk, but at the boundary.

The Plasma Years

Before tungsten and rhenium became the stars of Sheehan’s best-known experimental story, his second-law work was shaped by plasma physics. Plasmas are electrically active gases containing ions and electrons, and their boundaries can be strange. At a plasma edge, a sheath can form. Electric fields, particle energies, and local definitions of temperature may behave unlike textbook equilibrium systems.

Sheehan’s early plasma paradoxes involved surface ionization and blackbody-like environments. Different materials at the same temperature can have different work functions, meaning they may emit or ionize species differently. In a surface-ionized plasma, those differences can lead to ion populations and velocity distributions tied to the surfaces rather than to a single bulk equilibrium.

These plasma systems were scientifically suggestive but technologically unattractive. They tended to require high temperatures, exotic refractory metals, low pressures, vacuum apparatus, and tiny useful power densities. In his presentations, Sheehan is candid about this. The experiments could corroborate aspects of the underlying physics without offering a plausible commercial heat recycler.

The plasma years mattered because they trained his eye. They showed him that surfaces can impose strong thermodynamic structure and that blackbody conditions do not automatically erase every non-equilibrium feature. The lesson carried forward into the later chemical, solid-state, and membrane work: if a second-law challenge exists, it may live where a surface meets a gas, field, or diffusing species.

The Graveyard of Almost-Demons

A serious story about Sheehan’s work cannot make every proposed device sound like a breakthrough. In his APEC presentation, he says the University of San Diego group investigated roughly a dozen heat-recycler ideas over several decades. Many began as theoretical proposals. Some received laboratory substantiation of key physical processes. Few became plausible routes toward technology.

The failures and stalled paths matter. Some candidates operated only at punishing temperatures. Others required low pressures or specialized materials. Some produced effects so small that any possible work would be overwhelmed by the entropy generated in maintaining the apparatus. A device can be paradoxical in principle and useless in practice. Physics allows many curiosities that engineering cannot afford.

The plasma examples are part of this graveyard. So are high-temperature chemical systems whose power densities or operating conditions make them poor commercial candidates. Sheehan’s willingness to separate scientific interest from practical interest is important. It distinguishes a laboratory anomaly from an energy technology and keeps the story from turning into a miracle narrative.

The graveyard also gives the surviving candidates narrative weight. Epicatalytic thermal diodes, solid-state oscillators, supradegenerate devices, and thermal batteries are not simply random inventions in a speculative catalog. They are the ideas that remain after many others are acknowledged as too hot, too fragile, too weak, too inconvenient, or too noisy to matter outside the lab.

The Chemically Maintained Pressure Gradient

In 1998, Sheehan published a paper on dynamically maintained steady-state pressure gradients. The idea was deceptively simple. In a sealed blackbody cavity containing a low-density gas, chemically active surfaces might desorb different species at different rates. If gas-phase collisions are rare enough, the gas may not relax quickly to the same equilibrium everywhere. A pressure gradient could persist.

The standard view would expect pressure differences in a sealed cavity to fall into familiar categories: fluctuations, transients, or equilibrium gradients caused by external potentials such as gravity. Sheehan proposed a fourth possibility: a stationary pressure gradient maintained by surface-specific thermal desorption of chemical species. This was not yet a device. It was a claim about a possible non-equilibrium gas phase in a nominally closed thermal environment.

Todd Duncan’s 2000 comment transformed the claim into a paradox. Duncan argued that if Sheehan’s pressure gradients were attainable, they could be used to violate the second law. A radiometer with chemically different vane faces could experience unequal pressures and perform work. In conventional logic, that implication could be used as a reductio: since the second law forbids the engine, the pressure gradient must be impossible.

Sheehan took the paradox seriously in the opposite direction. If the pressure gradient is supported by theory and experiment, perhaps the second law’s prohibition is not the final answer. Duncan’s criticism sharpened the target. It turned a proposed gas-surface phenomenon into a testable challenge: can chemically distinct surfaces create permanent, work-exploitable gradients inside a single-temperature environment?

Epicatalysis: The Secret Mechanism

The key word Sheehan later gave to this mechanism is “epicatalysis.” Traditional catalysis changes reaction rates without changing the final equilibrium. A catalyst helps a reaction arrive faster at the same destination. Epicatalysis is Sheehan’s proposed extension: in low-pressure, long mean-free-path regimes, gas-surface interactions can dominate so strongly that the gas composition reflects the surface it last encountered rather than a universal gas-phase equilibrium.

The mean free path is crucial. At ordinary pressure, molecules collide constantly. Any special memory of the last surface tends to be erased almost immediately by gas-phase collisions. At low pressure, collisions are rarer. A molecule or atom leaving one surface can cross a cavity and encounter another surface before its history is randomized. The surfaces effectively exchange chemically distinct populations.
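The pressure dependence can be sketched with the standard kinetic-theory expression λ = kT / (√2·π·d²·P). The molecular diameter below is an illustrative round number, not a value taken from Sheehan's papers:

```python
import math

def mean_free_path(T, P, d):
    """Kinetic-theory mean free path: lambda = kT / (sqrt(2) * pi * d^2 * P).
    T in kelvin, P in pascal, molecular diameter d in meters."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    return k * T / (math.sqrt(2) * math.pi * d**2 * P)

d_H2 = 2.7e-10  # rough kinetic diameter of H2, m (illustrative value)
T = 1950.0      # near the high-temperature regime described in the text
for P in (101_325.0, 100.0, 1.0):  # atmospheric down to high vacuum, Pa
    print(f"P = {P:>9.0f} Pa  ->  lambda = {mean_free_path(T, P, d_H2):.2e} m")
```

At atmospheric pressure the mean free path is under a micron, so any surface imprint is scrambled almost instantly; near 1 Pa it stretches to centimeters, long enough for a molecule to cross a small cavity and deliver its imprint to the opposite wall.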

Sheehan explains this with a homely analogy. After coming home from a walk, you do not instantly become part of the living room’s equilibrium. You still carry cold hands, leaves, damp shoes, or outdoor air. Molecules leaving a surface can carry a similar short-lived imprint. Usually it vanishes quickly. Under epicatalytic conditions, it may persist long enough to matter.

If epicatalysis is real in the way Sheehan claims, it gives the Maxwell zombie a body. Different surfaces at the same temperature can create different species mixtures, and those mixtures can carry energy between surfaces. One surface dissociates a molecule and cools; another recombines atoms and warms. A sustained temperature difference emerges. That is the engine room of the challenge.

How to Measure the Forbidden

The experiment that matters most in Sheehan’s public case is not just an idea but an apparatus. To appreciate it, one has to slow down and ask how a forbidden temperature difference could be measured without fooling oneself. A claim against the second law is not allowed casual instrumentation. It demands controls, symmetry, calibration, and alternative explanations.

The 2014 Foundations of Physics paper reports two stages. First came gas-filament experiments comparing the power needed to hold tungsten and rhenium filaments at the same temperature in vacuum, helium, and hydrogen. Helium acted as an inert control. Hydrogen was the chemically active gas. The difference in required power was interpreted as a difference in hydrogen dissociation behavior on the two metals.

Then came the Duncan paradox experiments. Thin blackbody cavities were made from tungsten or rhenium. Inside them, physically similar thermocouples were coated with tungsten and rhenium and placed under highly symmetric conditions. The system was heated. In vacuum and helium, the thermocouples were expected to agree. In hydrogen, if the epicatalytic picture was right, they might diverge.

The visuals in the paper are part of the drama. The apparatus diagram shows the heated blackbody core and coated thermocouples. The data plots show the effect appearing in a particular pressure-temperature region, not everywhere. The controls are part of the plot, not mere housekeeping. Vacuum and helium tell the reader what the device does when chemistry is absent. Hydrogen tells the reader what happens when the alleged zombie wakes.

The Tungsten–Rhenium Experiment

The laboratory scene is severe rather than glamorous: a stainless-steel vacuum vessel, refractory metals, alumina discs, fine thermocouple wires, hydrogen gas, optical pyrometry, and temperatures approaching 2,000 kelvin. Tungsten and rhenium were chosen because they can survive extreme heat and interact differently with hydrogen. In this story, they are not just materials. They are characters with different chemical temperaments.

The gas-filament experiments suggested that rhenium dissociated hydrogen more strongly than tungsten under certain high-temperature, low-pressure conditions. Maintaining rhenium and tungsten at the same temperature in hydrogen required different power inputs. The effect grew at high temperature and depended on pressure in a way that fit the long mean-free-path picture: too much gas-phase collision, and the surface distinction fades.

In the blackbody cavity experiments, the reported result was striking. In vacuum and helium, the coated thermocouples agreed. In hydrogen, at elevated temperatures, they diverged. In a tungsten cavity, the rhenium-coated thermocouple cooled relative to the tungsten-coated one, with reported differences exceeding 120 kelvin in the most active range. The paper presents this as a realization of Duncan’s temperature paradox.

The claim is not that a household generator was built. It is not even that a complete external heat engine was attached and run. The claim is narrower and still profound: two surfaces inside a nearly closed, nearly isothermal blackbody environment reportedly maintained a stationary temperature difference because of surface-specific hydrogen reactions. Standard thermodynamics says an isolated system should relax to a single temperature. Sheehan says this one did not.

The Sign Reversal

One of the strongest narrative details in the tungsten–rhenium work is the sign reversal. A fixed instrumentation bias can produce a false temperature difference. A wiring error can produce a misleading offset. A simple asymmetry in the apparatus can fool a researcher. But such artifacts do not naturally reverse direction when the material environment is reversed.

In the tungsten cavity, rhenium reportedly cooled relative to tungsten. In the rhenium cavity, the complementary behavior appeared: tungsten heated relative to rhenium. The two configurations were meant to test the same chemistry from opposite sides. The cavity material dominated the gas environment, and the smaller thermocouple coating then responded to that environment.

The interpretation follows the epicatalytic story. Rhenium dissociates molecular hydrogen more effectively. In a tungsten-dominated cavity, the gas is relatively enriched in molecular hydrogen compared with what rhenium would produce, so the rhenium-coated thermocouple disproportionately dissociates hydrogen and cools. In a rhenium-dominated cavity, the gas is relatively enriched in atomic hydrogen, so the tungsten-coated thermocouple recombines atoms and heats.
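The energy carried per reaction event in this picture is set by the H–H bond energy, roughly 436 kJ/mol, a standard reference figure rather than one from Sheehan's paper. Each dissociation withdraws that energy from one surface; each recombination deposits it on another:

```python
# Energy bookkeeping per reaction event, using the standard H-H bond
# dissociation energy (not a value from the tungsten-rhenium paper).
AVOGADRO = 6.02214076e23           # molecules per mole
D_H2_MOLAR = 436e3                 # H-H bond energy, J/mol
per_event = D_H2_MOLAR / AVOGADRO  # joules moved per H2 split or recombined
print(f"{per_event:.2e} J (~{per_event / 1.602e-19:.1f} eV) per H2 molecule")
```

A few electron-volts per molecule is large by thermal standards, which is why a steady imbalance in dissociation and recombination rates could plausibly sustain a measurable temperature difference between the two coated thermocouples.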

This does not settle the debate. Sign reversal is not the same as a commercial device, and it does not replace independent replication. But it turns the experiment from a single surprising reading into a patterned result. The direction of the effect matters. It gives the reported anomaly a chemical logic, and it makes the reader ask whether the obvious sources of error are enough.

From Paradox to Device: The Epicatalytic Thermal Diode

The epicatalytic thermal diode is Sheehan’s proposed device-level descendant of Duncan’s temperature paradox. Instead of merely observing two temperatures inside a cavity, the diode would use chemically distinct surfaces and an active gas to create a sustained temperature gradient. One side would cool through dissociation. The other would warm through recombination. The gradient could then be used by ordinary thermoelectric or heat-engine methods.

The concept has a pleasing simplicity in diagram form. A molecule travels to surface one and splits, absorbing heat from that surface. The atoms travel to surface two and recombine, releasing heat. The molecule returns, and the cycle repeats. If the surfaces maintain different reaction propensities without an external work input, the device behaves like a thermal one-way valve.

Sheehan’s patent describes cells with two surfaces, a cavity, and a gas that interacts epicatalytically with the surfaces. The surfaces create and maintain a steady-state temperature differential. Cells could, in principle, be connected in series or parallel. Patents, however, are not proof. They describe inventions and claims. They do not establish that the invention can defeat every thermodynamic objection or survive practical engineering.

The diode is where the story becomes both most exciting and most fragile. It is exciting because it points toward practical heat recycling: cooling, heating, power generation, and thermal management. It is fragile because moving from high-temperature laboratory evidence to durable manufactured devices is a long road. The paradox may be real and still fail as technology. Or the technology may reveal the missing entropy cost that resolves the paradox.

The Room-Temperature Hunt

High-temperature hydrogen-metal experiments are dramatic, but they are not convenient. A device that requires refractory metals glowing near 2,000 kelvin is a poor candidate for everyday energy. For Sheehan’s program to matter technologically, the physics must move toward room temperature, ordinary materials, manufacturable geometries, and usable power densities.

In American Scientist, Sheehan reported room-temperature epicatalysis in systems involving hydrogen-bonded molecules such as formic acid and methanol interacting with polymer surfaces such as Teflon and Kapton. This is a very different world from tungsten and rhenium in hydrogen. It suggests that epicatalytic effects, if robust, may not be confined to extreme refractory-metal systems.

The practical vision follows quickly. Sheehan imagines thin epicatalytic thermal diode sheets, perhaps manufactured in rolls, sustaining temperature differences across their faces. Walls could heat or cool spaces depending on orientation. Small devices could draw in air, cool it slightly, and convert the thermal difference into electrical power. Cars, clothing, buildings, refrigeration, and decentralized power all enter the speculative horizon.

The crucial word is speculative. Room-temperature effects must be independently replicated, quantified, and turned into closed-cycle work extraction before they can carry the weight of those applications. The room-temperature hunt is not a triumphal chapter. It is the bridge between a paradox in a hot vacuum system and the possibility of something the world could actually use.

Solid-State Demons: Semiconductors, Capacitors, and MEMS

Sheehan’s solid-state proposals move the story from gas chemistry to silicon. A p-n junction is ordinary semiconductor physics. Put p-type and n-type silicon together, and charge carriers diffuse across the boundary until an internal electric field stops further diffusion. The depletion region and built-in potential are central to diodes, transistors, solar cells, and modern electronics.
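The built-in potential that halts this diffusion follows from a standard textbook formula. The sketch below evaluates it for illustrative silicon doping levels; the specific numbers (doping densities, intrinsic carrier concentration) are assumptions chosen for plausibility, not values from Sheehan's devices.

```python
import math

# Physical constants (SI units)
k = 1.380649e-23     # Boltzmann constant, J/K
q = 1.602176634e-19  # elementary charge, C

def built_in_potential(Na, Nd, ni, T=300.0):
    """Textbook built-in potential of a p-n junction:
    V_bi = (kT/q) * ln(Na * Nd / ni**2)."""
    return (k * T / q) * math.log(Na * Nd / ni**2)

# Illustrative silicon values (assumed, not from the text):
# acceptor and donor densities 1e17 cm^-3 each,
# intrinsic carrier density ~1e10 cm^-3 at 300 K.
Vbi = built_in_potential(Na=1e17, Nd=1e17, ni=1e10, T=300.0)
print(f"Built-in potential ~ {Vbi:.2f} V")  # roughly 0.8 V
```

The point of the arithmetic is that the junction sustains a potential of order one volt with no external supply, which is exactly the field Sheehan proposes to expose across a gap.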

Sheehan’s proposed twist is geometric. Bend a p-n junction into a horseshoe or open-gap structure, and the built-in potential may imply an electric field across a vacuum gap. That field resembles a charged capacitor. If the energy stored in the field is ultimately derived from thermal carrier motion and can be cyclically extracted, the system becomes a candidate solid-state Maxwell zombie.

The proposed hammer-and-anvil oscillator gives the idea mechanical form. A movable silicon cantilever is attracted toward an opposing plate by the field. It contacts or nearly contacts, discharges, springs back, and recharges. If the mechanical resonance and electrical recharge time are matched, the device could, in Sheehan’s model, oscillate continuously.
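The matching condition can be made concrete with a back-of-envelope comparison of the two time scales: the cantilever's mechanical period and the RC recharge time of the gap. All of the numbers below are illustrative MEMS-scale assumptions, not measured parameters of Sheehan's device.

```python
import math

def resonant_period(k_spring, mass):
    """Period of a simple harmonic oscillator: T = 2*pi*sqrt(m/k)."""
    return 2.0 * math.pi * math.sqrt(mass / k_spring)

def rc_time_constant(R, C):
    """Electrical recharge time scale of an RC circuit: tau = R*C."""
    return R * C

# Illustrative MEMS-scale numbers (assumed for this sketch):
k_spring = 1.0   # N/m, soft silicon cantilever
mass = 1e-11     # kg, ~10-nanogram proof mass
R = 1e6          # ohm, series resistance of the junction
C = 1e-13        # farad, ~0.1 pF gap capacitance

T_mech = resonant_period(k_spring, mass)   # ~2e-5 s
tau_elec = rc_time_constant(R, C)          # ~1e-7 s
print(f"mechanical period ~ {T_mech:.2e} s, recharge time ~ {tau_elec:.2e} s")
# For a repeating cycle, the gap must recharge well before the
# cantilever swings back: tau_elec must be shorter than T_mech.
```

With these assumed values the electrical recharge is hundreds of times faster than the mechanical swing, which is the kind of margin the cycle would need.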

This is a demanding proposal. It requires high-quality silicon springs, low damping, controlled nanoscale gaps, contact behavior that does not ruin the cycle, and avoidance of stiction. It also requires that the built-in field be accessible in the way the theory predicts. The beauty of the idea is that it uses familiar semiconductor physics. The difficulty is that every microscopic imperfection may matter.

The Silicon Workbench

The transcript of Sheehan’s “current state” presentation is valuable because it brings the solid-state demon down from equations to the workbench. He describes simulations, fabricated test devices, Kelvin probe measurements, two-dimensional structures, and plans for three-dimensional torsional oscillators. The story becomes less about a grand law and more about tiny pieces of silicon.

The experimental bottlenecks are concrete. The device must match an electrical time constant to a mechanical oscillation period. The mechanical losses must be smaller than the energy gained each cycle. The silicon springs must operate in a high-Q regime. The moving part must avoid irreversible sticking when surfaces touch or approach each other. In MEMS, stiction is not a nuisance; it is a killer.

Audience questions in the transcript sharpen the tension. One participant presses Sheehan on fatigue: if thin silicon strips flex again and again, do they fail? Sheehan answers by invoking linear-regime operation, high quality factors, and carefully chosen regimes where fatigue should not dominate. Another question concerns what happens at the hammer-anvil contact and whether a new diode simply forms there.

These exchanges make the story more credible because they show informed skepticism inside the room. The challenge is not only philosophical. It is mechanical, electrical, materials-based, and expensive. The second law will not be revised by an elegant sketch. Revision would require a device that moves, discharges, resets, and repeats under controlled conditions, with every loss accounted for.

Supradegeneracy and Photons Climbing Ladders

Not all of Sheehan’s challenges depend on surfaces. Supradegeneracy comes from statistical mechanics. In ordinary thermal populations, higher energy levels are less occupied because the Boltzmann factor suppresses them. But population depends not only on energy. It also depends on degeneracy: the number of states available at that energy.

Sheehan and Larry Schulman explored what happens if degeneracy increases rapidly enough with energy. In such a system, the multiplication of available high-energy states could outweigh the usual Boltzmann suppression. Particles might statistically climb a ladder of states and end up with suprathermal energy. It resembles population inversion, but in the proposed scenario it arises from equilibrium statistical structure rather than laser pumping.
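The competition between degeneracy growth and Boltzmann suppression can be shown in a toy calculation. With level n at energy n·Δ and degeneracy growing geometrically as gⁿ, the relative occupancy is gⁿ·exp(−nΔ/kT); occupancy climbs the ladder whenever g exceeds exp(Δ/kT). The numbers below are a toy model, not a real material.

```python
import math

def occupancy(n, spacing, growth, kT=1.0):
    """Relative thermal occupancy of level n with energy n*spacing
    and degeneracy growth**n:  P(n) ~ growth**n * exp(-n*spacing/kT)."""
    return growth**n * math.exp(-n * spacing / kT)

spacing = 1.0  # level spacing of one kT (toy value)
# Ordinary ladder: constant degeneracy, occupancy falls rung by rung.
normal = [occupancy(n, spacing, growth=1.0) for n in range(6)]
# Supradegenerate ladder: degeneracy x5 per rung beats the
# Boltzmann factor e^-1 ~ 0.37, so occupancy rises with energy.
supra = [occupancy(n, spacing, growth=5.0) for n in range(6)]

print([round(x, 3) for x in normal])  # monotonically decreasing
print([round(x, 3) for x in supra])   # monotonically increasing
```

The sketch makes the threshold explicit: nothing exotic happens until the degeneracy growth rate outpaces exp(Δ/kT), which is precisely the regime Sheehan and Schulman analyze.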

The technological dream is a supradegenerate thermophotovoltaic. In an ordinary photovoltaic, a photon with enough energy promotes an electron across a band gap. In Sheehan’s proposed design, dopant states inside the band gap would form a degeneracy ladder, allowing thermal photons to help an electron climb step by step until it behaves like a photovoltaic carrier.
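A quick estimate shows why a ladder is needed at all: a single room-temperature thermal photon falls far short of a semiconductor band gap. The step count below is a back-of-envelope illustration, not a device parameter.

```python
# Compare silicon's band gap to the characteristic energy of a
# room-temperature thermal photon.
kT_300K_eV = 0.0259                  # thermal energy at 300 K, in eV
peak_photon_eV = 2.82 * kT_300K_eV   # Wien peak of a 300 K blackbody, ~0.073 eV
band_gap_eV = 1.12                   # silicon band gap at room temperature

steps = band_gap_eV / peak_photon_eV
print(f"~{steps:.0f} peak thermal photons to span the gap")
```

Roughly fifteen sequential absorptions would be required, which is why the proposal needs a dense ladder of intermediate dopant states rather than a single lucky photon.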

This remains theoretical and speculative. It is useful in the story because it widens Sheehan’s challenge beyond epicatalysis. He is not only asking whether surfaces can sort molecules. He is asking whether statistical weights themselves can be engineered to make thermal energy climb into work-exploitable form. If epicatalysis is a boundary zombie, supradegeneracy is a statistical one.

The Thermal Battery

The thermal battery, or asymmetric membrane concentration cell, is one of Sheehan’s newer and more accessible proposals. It begins with a familiar kind of device: a concentration cell. If the same species exists at two different concentrations, that gradient can power an electrochemical cell. There is nothing controversial about a concentration cell by itself. The controversial element is how the concentration difference is created and restored.
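The uncontroversial part of the physics is captured by the Nernst equation, which sets the open-circuit voltage an ideal concentration cell can deliver. The sketch below uses an assumed tenfold concentration ratio for illustration; activities are approximated by concentrations.

```python
import math

R = 8.314462618  # gas constant, J/(mol K)
F = 96485.33212  # Faraday constant, C/mol

def concentration_cell_emf(c_high, c_low, n=1, T=298.15):
    """Open-circuit EMF of an ideal concentration cell (Nernst equation):
    E = (R*T / (n*F)) * ln(c_high / c_low).
    Activities are approximated by concentrations."""
    return (R * T / (n * F)) * math.log(c_high / c_low)

# Illustrative tenfold concentration difference (assumed numbers):
E = concentration_cell_emf(c_high=0.1, c_low=0.01)
print(f"EMF ~ {E * 1000:.0f} mV")  # ~59 mV per decade at 25 C
```

The familiar ~59 mV per decade is the entire conventional payoff; everything controversial in the proposal lies in how the decade of concentration difference is created and restored.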

Sheehan’s asymmetric membrane separator is designed to create a concentration gradient internally. In the 2022 template paper, he explains the idea with chocolate lovers in a corridor. People wander randomly, stick to walls where chocolate boxes are placed, spend longer where the chocolate is better, and thereby become distributed differently between the wall and the corridor interior. The analogy translates binding strength and binding-site density into concentration differences.

In the laboratory version described by Sheehan and collaborators, custom membranes asymmetrically partitioned hydrogen ions in hydrochloric acid, producing a corresponding chloride-ion concentration gradient. The concentration difference then powered a cell using standard electrochemistry. The claim is not that electrochemistry is new. The claim is that asymmetric diffusion and binding can self-generate and restore the gradient using ambient thermal motion.

Here again the story must be careful. A self-charging concentration cell would be remarkable if it survives complete accounting, but membranes are subtle. Hidden chemical changes, electrode mass changes, irreversible mixing, material history, and concentration bookkeeping all matter. Sheehan’s own template notes that electrodes may need to be flipped to maintain mass balance. The thermal battery is perhaps the most approachable device concept, but also one where mundane electrochemical accounting will be decisive.

The Critics and the Burden of Proof

The thermal battery brings Sheehan’s challenge into an unusually concrete form, and that concreteness sharpens the skeptical burden rather than softening it. The closer a proposal gets to a working device, the less persuasive broad philosophical language becomes. A membrane, a thermocouple, a gas cavity, or a silicon oscillator must answer ordinary questions: what is consumed, what is restored, what is measured, what is hidden, and where does every joule go?

The skeptics have strong arguments. The second law has survived because apparent exceptions almost always reveal a missing term. A device may draw on a hidden chemical reservoir. A gradient may be externally maintained. A measurement may be perturbed by the instrument. A material may absorb or release gas. A cycle may not fully reset. Entropy may be exported through a path no one noticed.

The public criticism of Sheehan’s American Scientist article captures several of these concerns. One letter asked whether hydrogen dissolving into tungsten or rhenium, changing thermocouple properties, or migrating through solids had been sufficiently considered. Another questioned whether epicatalysis was really a second-law violation rather than an ordinary nonequilibrium phenomenon beyond the scope of equilibrium thermodynamics.

Sheehan’s responses are also part of the record. He argued that hydrogen dissolution in the metals and thermocouples was negligible under the conditions, that Type C thermocouples were appropriate for reducing atmospheres, and that optical pyrometry corroborated related measurements. He also argued that the heart of epicatalysis is precisely the failure of the traditional catalyst-equilibrium assumption in low-pressure, long mean-free-path regimes. The burden of proof remains on the challenger: independent replication, complete energy accounting, closed-cycle work extraction, and persuasive exclusion of hidden reservoirs.

A Law, a Loophole, or a Mirror?

In the end, Sheehan’s story is not simply about whether the second law is wrong. It is about how science treats laws that have become psychologically untouchable. A law can be overwhelmingly reliable and still have a domain. An approximation can be so good that generations mistake it for metaphysical necessity. The question is whether the second law is one of those laws or whether every zombie will eventually be exorcised.

There are three possible endings. In the first, Sheehan’s reported effects are resolved by conventional physics: hidden reservoirs, material effects, incomplete cycles, or misapplied definitions. The second law survives, perhaps sharpened by the encounter. In the second, the effects remain real but technologically weak, becoming curiosities that teach surface chemistry, membrane physics, or non-equilibrium theory without changing civilization.

In the third ending, one of the zombies survives every test. A boundary, membrane, junction, or statistical ladder collects thermal motion into useful work and resets by absorbing ambient heat. The second law is not destroyed but demoted from absolute prohibition to extraordinarily broad approximation. Textbooks change. Energy technology changes. The final degraded form of energy becomes, at least partly, recyclable.

The honest story ends before the verdict. The room is still warm. The molecules are still racing. The law still says no. But in Sheehan’s laboratories, papers, talks, and patents, surfaces remember where molecules have been, membranes sort without thinking, junctions hold fields across gaps, and a physicist keeps asking whether nature’s most famous no may have an asterisk.

References