NIST researchers and their colleagues have demonstrated a method to distinguish century-old coins from fakes by imaging antique coins with beams of low-energy neutrons. Authenticating coins is critical because scientists rely on them to chronicle the economic, political, and scientific developments of nations. NIST researcher Daniel Hussey and his colleagues chose neutrons to examine two Korean coins—one minted in the 1800s, the other a replica—because these subatomic particles penetrate heavy metals, such as copper, iron, and lead, and interact strongly with hydrogen-bearing compounds that form as a byproduct of corrosion. The location and pattern of corrosion within the two coins, both composed of copper alloys, provided hallmarks for verifying their age. For instance, the neutron study revealed that in the authentic coin, corrosion had penetrated deep within the body, indicating that the degradation was a gradual process that occurred over many decades. In contrast, corrosion in the recently minted replica was mainly confined to the surface, consistent with rapid corrosion over a short time period. Neutron imaging methods can also assist conservation efforts by determining the amount and locations of corrosion in authentic coins, suggesting areas of the coins that need a protective coating, for example.
Have you ever wondered how Gaussian quantum chemistry programs represent orbitals? (Hint: they use Gaussians.) A common family of basis sets is STO-nG, which stands for Slater-type orbital, n Gaussians. A basic example: take a 1s Slater orbital, of the form e^{-zeta*r}, and fit it with a sum of n Gaussians of the form c_n*e^{-alpha_n*r^2}, where the c_n are the linear coefficients and the alpha_n are the exponents. Because this is a nonlinear least-squares problem, you need to be careful that the fit converges; it can get lost far more easily than a linear fit. I used the Levenberg–Marquardt algorithm from scipy to do the nonlinear least squares. There are two fundamental differences between the Gaussian and exponential functions: the Gaussian has zero slope at r = 0, whereas the exponential has a finite slope there (the cusp), and the Gaussian decays faster at large r. This means you need a few Gaussians to sharpen the peak near zero and a few more whose tails add together so the sum decays less quickly, which is why the fit improves as the number of Gaussians increases. Why perform such a fit? It turns out that integrals over products of Gaussians have closed-form solutions, so they can be evaluated much faster than integrals over exponentials; fitting once ahead of time saves a great deal of computation later. Below you can see how the fit converges for an increasing number of Gaussians.
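A minimal sketch of such a fit, under illustrative assumptions (zeta = 1, an even radial grid on [0, 8], and hand-picked starting guesses); `scipy.optimize.curve_fit` applies Levenberg–Marquardt when no bounds are given:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_sum(r, *params):
    """Sum of n Gaussians: sum_i c_i * exp(-alpha_i * r**2)."""
    n = len(params) // 2
    c, alpha = np.array(params[:n]), np.array(params[n:])
    return np.sum(c[:, None] * np.exp(-alpha[:, None] * r[None, :] ** 2), axis=0)

r = np.linspace(0.0, 8.0, 400)
slater = np.exp(-r)  # 1s Slater-type orbital with zeta = 1 (unnormalized)

resids = []
for n in (1, 2, 3):
    # Spread the initial exponents over a few decades so the optimizer
    # does not collapse all Gaussians onto a single length scale.
    p0 = np.concatenate([np.full(n, 1.0 / n), np.logspace(-1, 1, n)])
    popt, _ = curve_fit(gaussian_sum, r, slater, p0=p0, maxfev=20000)
    rms = np.sqrt(np.mean((gaussian_sum(r, *popt) - slater) ** 2))
    resids.append(rms)
    print(f"STO-{n}G rms residual: {rms:.5f}")
```

The residual shrinks as n grows, mirroring the convergence described above: extra Gaussians repair both the cusp and the tail.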
Comprehensive analysis demonstrates that wave-particle duality is not a fundamental property of light but rather emerges from the quantum field theoretic description:
1. The fundamental entity is the quantum electromagnetic field Â_μ(x).
2. Photons are field excitations with definite energy ℏω and momentum ℏk.
3. Wave-like behavior (interference, diffraction) arises from superposition of probability amplitudes.
4. Particle-like behavior (localized detection, antibunching) emerges in interactions.
5. Classical electromagnetic waves correspond to coherent states with large mean photon number.
The mathematical framework of QED completely and consistently describes all observable phenomena without paradox.
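The last two points can be made quantitative with photon-counting statistics: a coherent state has a Poissonian photon-number distribution and second-order coherence g²(0) = 1, while a single-photon Fock state gives g²(0) = 0 (antibunching). A small numeric check (the mean photon number and the 60-photon truncation are arbitrary choices for illustration):

```python
import numpy as np
from math import factorial

def g2_from_distribution(p):
    """Second-order coherence g2(0) = <n(n-1)> / <n>**2 from a photon-number distribution."""
    n = np.arange(len(p))
    mean_n = np.sum(n * p)
    return np.sum(n * (n - 1) * p) / mean_n ** 2

# Coherent state with |alpha|^2 = 5: Poissonian photon-number distribution.
mean = 5.0
p_coh = np.array([np.exp(-mean) * mean ** k / factorial(k) for k in range(60)])
# Single-photon Fock state: all probability on n = 1.
p_fock = np.zeros(60)
p_fock[1] = 1.0

g2_coh = g2_from_distribution(p_coh)    # -> 1 (classical-like, "wave")
g2_fock = g2_from_distribution(p_fock)  # -> 0 (antibunched, "particle")
print(g2_coh, g2_fock)
```

Both limits come out of one formalism, which is the point of the post: the field description covers wave-like and particle-like statistics without paradox.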
The CMB Is Not Scale-Invariant — It’s Coherence-Imprinted
ΛCDM assumes the CMB anisotropies originate from nearly scale-invariant quantum fluctuations amplified by inflation. This leads to a power spectrum close to C_ℓ ∝ ℓ^{−0.1}. D10Z-TTA makes a different, testable claim: CMB anisotropies carry an additional imprint from nodal coherence, producing a modified slope C_ℓ ∝ ℓ^{−0.23}. The difference is subtle at low multipoles but becomes systematic and detectable at intermediate-to-high ℓ, precisely where Planck and upcoming analyses (Simons Observatory, CMB-S4) have the highest sensitivity. This is not curve fitting. It is a structural consequence of how coherence propagates from the primordial state. If future high-precision CMB reconstructions confirm strict ΛCDM scale invariance, D10Z-TTA is falsified. If a consistent deviation with the predicted slope appears, inflation is no longer the only viable explanation. The CMB is not just a relic of expansion. It may be a fossil record of coherence.
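Taking the two quoted slopes at face value, their ratio is a pure power law, ℓ^{−0.13}, so the fractional deviation grows monotonically with multipole. A quick check at a few sample multipoles (the ℓ values are arbitrary illustration points, and the two spectra are normalized to agree at ℓ = 1):

```python
import numpy as np

# Ratio of the two claimed spectra: (ell**-0.23) / (ell**-0.1) = ell**-0.13.
ell = np.array([10.0, 100.0, 1000.0, 2500.0])
ratio = ell ** -0.13
deviation_pct = 100.0 * (1.0 - ratio)  # suppression of the steeper spectrum
for l, d in zip(ell, deviation_pct):
    print(f"ell = {l:6.0f}: deviation {d:5.1f}%")
```

This is only arithmetic on the stated exponents, but it shows why the discriminating power sits at intermediate-to-high ℓ: the gap between the two power laws widens steadily with multipole.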
Closure of Quantum Gravity Theories
The paper "Closure of Physical Theories from Suppression, Coherence, and Fixed-Point Selection" establishes necessary conditions for the persistence of observable records in any causal physical theory. The closure framework has been successfully applied to String Theory and Loop Quantum Gravity, providing a consistency filter that allows these theories to derive the vacuum energy and achieve robust predictivity.
The Triad of Constraints
The framework derives three coupled pillars that must be satisfied for a theory to support stable, readable degrees of freedom:
- Universal Suppression Bound: Influence on record-level observables must decay at least exponentially with "causal depth," defined as the minimal path length required for a bulk perturbation to influence a record.
- Bulk-Boundary Coherence: A necessary condition for scalable observability is that the rate of stable record formation must keep pace with the production of distinguishable bulk novelty after compression.
- Observability Fixed-Point Selection: Persistent universes require a stable equilibrium where suppression and bulk novelty are balanced under coarse-graining.
The Closure Theorem
The core result is the Closure Theorem: every degree of freedom is either constrained by these finite-depth observability conditions or becomes physically irrelevant via exponential suppression at unbounded depth. This implies that no physically meaningful unconstrained degrees of freedom persist in an observable universe.
Universal Failure Modes
The framework classifies three model-independent ways a theory might fail to be observable:
- Over-suppression: All nontrivial influences vanish at macroscopic scales, resulting in an "empty" theory.
- Coherence Violation (Overload): Bulk novelty exceeds the capacity of record formation, preventing persistent causal coherence.
- Absence of Admissible Fixed Points: Coarse-graining drives the system toward either overload or emptiness because no stable equilibrium exists.
Full Paper: https://lnkd.in/eJcXV7P6
#QuantumGravity #StringTheory #LoopQuantumGravity #TheoreticalPhysics #InformationTheory #Physics
What if quantum collapse isn’t just “environmental decoherence”? I went back and reanalyzed published collapse and decoherence experiments across very different physical systems: trapped ions, NMR spins, superconducting qubits, and Bose–Einstein condensates. One pattern showed up every time. The observed collapse rate followed: Γ(τ) = Γₑ + κ / τ And here’s the part that matters. τ is not an environmental variable. It’s the detector’s temporal resolution. How finely the system is interrogated in time. Across systems spanning more than ten orders of magnitude, collapse scaled linearly with 1/τ. Same scaling. Every platform. Standard decoherence theory does not predict this. It treats τ as a technical detail, not a physical constraint. The implication is blunt: Collapse is relational. It is jointly shaped by the system, its environment, and the temporal structure of measurement itself. Once τ is taken seriously as a physical variable, a larger picture follows naturally. Temporal coherence is not just something systems have. It is something they are constrained by. In later work, this same parameter reappears as a gravitational coherence window, pointing to a shared structure behind collapse, time dilation, and gravity. Everything downstream rests on this empirical result. Temporal resolution matters physically. Papers: https://lnkd.in/eTNQ9CXW https://lnkd.in/eauZWcVM #QuantumPhysics #FoundationsOfPhysics #QuantumMeasurement #Decoherence #QuantumFoundations #TimeInPhysics #TheoreticalPhysics #RelationalPhysics
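A sanity check of what linear-in-1/τ scaling means operationally (Γₑ and κ below are illustrative placeholders, not values from the papers): if Γ(τ) = Γₑ + κ/τ holds, then regressing the observed rate against 1/τ across ten orders of magnitude in τ recovers both parameters from a single straight-line fit.

```python
import numpy as np

# Illustrative parameters for the claimed law Gamma(tau) = Gamma_e + kappa / tau.
gamma_e, kappa = 1.0, 1.0
tau = np.logspace(-9, 1, 50)     # detector temporal resolution, spanning ten orders of magnitude
gamma = gamma_e + kappa / tau    # collapse rate predicted by the law

# If the law holds, Gamma is linear in x = 1/tau; ordinary least squares
# recovers the intercept (environmental rate) and slope (kappa).
A = np.column_stack([np.ones_like(tau), 1.0 / tau])
(gamma_e_fit, kappa_fit), *_ = np.linalg.lstsq(A, gamma, rcond=None)
print(gamma_e_fit, kappa_fit)
```

The same regression applied to real data from different platforms is what would make the claimed universality testable: each platform should yield its own Γₑ but a common linear dependence on 1/τ.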
A new framework for understanding the non-monotonic temperature dependence and sign reversal of the chirality-related anomalous Hall effect in highly conductive metals has been developed by scientists at Science Tokyo. This framework provides a clear...
◇ The Atom Is Not Just Matter. It Is a Memory Structure ◇
For over a century, physics has treated atoms as dynamical systems that respond to forces but retain no internal history. Stability, resonance, and structure have been modeled as outcomes of energy minimization, probabilistic orbitals, or field interactions — not as records of prior interaction. In my newest paper, White Paper LXXXVIII — The System of Atomic Memory and Reflection Conjugation, I present a different conclusion: atomic stability is the basic form of memory.
Building on my earlier work reconstructing atomic and nuclear structure from geometric conjugation (ORIGAMI / Potentum Physics), this paper shows that:
• The introversion–extroversion dual is not just an energy flow
• It is a triaxial phononic impulse sculptor
• It negotiates rotor closure into a cube–octahedral multihedral lattice
• That lattice functions as a persistent phononic memory field
From this, a new physical law emerges:
Law of Reflection Conjugation: An atomic system stabilizes if and only if the external impulse structure resonates congruently with its internal closure geometry.
In plain terms: if outside-in influence resonates with inside-out structure, the atom stabilizes. If it does not, coherence degrades and emission or destabilization follows.
This reframes several long-standing problems in physics:
• Why only specific frequencies stabilize matter
• Why isotopic and lattice environments alter spectral behavior
• Why hysteresis appears in coherent materials
• Why resonance behaves like a selection law, not just energy transfer
Most importantly, it establishes a general principle: all sensory influence is fundamentally analog. Matter filters interaction geometrically, not digitally. Atoms are not just reactive objects — they are standing records of resonant history.
This paper completes a major arc of my Atlas of Atomia program and opens a new one: matter as memory, resonance as inscription, and stability as closure.
White Paper LXXXVIII: The System of Atomic Memory and Reflection Conjugation (Available now). JPF Academy of Science and Arts JosephFirmage.com
Worth reading. What stands out here is not whether this material ultimately qualifies as a true quantum spin liquid, but how small structural distortions fundamentally change system behavior. It’s a useful reminder that, in complex systems, stability is often an environmental property rather than a component one. #NationalSecurity #AdvancedMaterials #QuantumResearch #SystemsEngineering #DefenseTechnology https://lnkd.in/gJ-B6SeD
You might think that a single spin degree of freedom does not have much to offer. However, when coupled to its environment, it can turn into a surprisingly complex playground to explore exotic phenomena of quantum criticality. This is the story of my recent publication, which I am happy to share was chosen as an Editors’ Suggestion: Tunable quantum criticality and pseudocriticality across the fixed-point annihilation in the anisotropic spin-boson model, Phys. Rev. B 112, 235153 (2025), DOI: https://lnkd.in/djEy6Qui In two-dimensional quantum magnets, the search for exotic quantum criticality from competing orders has been a topic of intense research for the last two decades. Although the proposed continuous transition beyond the Landau paradigm has been incredibly hard to find, the search for it has brought to light a variety of exciting concepts like pseudocriticality or symmetry enhancement even at first-order transitions. Some of the challenges stem from the fact that numerical simulations are quite demanding for two-dimensional quantum systems. In this work, I discover that a single quantum spin coupled to competing environments can experience much of the phenomenology that we are looking for in higher dimensions: (i) a non-Landau continuous transition, (ii) a symmetry-enhanced first-order transition, and (iii) pseudocriticality. All of these phenomena occur at the same order-to-order transition and can be gradually tuned into each other based on the exotic renormalization-group concept of fixed-point annihilation, which occurs within the critical manifold. Due to the low dimensionality of the spin-boson model, and with the help of my recently developed wormhole quantum Monte Carlo method, large-scale simulations can now access all aspects of quantum criticality and even the logarithmically slow drift of effective critical exponents in the pseudocritical regime. Moreover, this allows for detailed comparisons with analytical predictions.
For further details, see my recent publication.
When Atoms Remember: Engineering Macroscopic Reality from the Quantum Histories of Matter At the boundary where microscopic indeterminacy meets macroscopic determinism, we demonstrate a construct in which the phase histories of atomic and molecular constituents are not ephemeral artifacts but actionable information, capable of producing coherent, predictable outcomes at scales accessible to human engineering. Each particle’s wavefunction |φₖ⟩, its local misalignment δₖ = ⟨φₖ| D_Ψ |φₖ⟩, and its frequency-equivalent rest energy λₖ = mₖ c²/ħ form a complete record which, when coordinated through a lattice of superconducting phase-modulation nodes, generates macroscopic forces F_macro = Σₖ ħ λₖ (1 − |δₖ|²)/L_core derived entirely from intrinsic quantum structure.
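As a numeric illustration only (every value below is a hypothetical placeholder, since the construct itself defines D_Ψ and L_core), note that the post's own definitions imply ħλₖ = mₖc², so F_macro reduces to a sum of rest energies weighted by (1 − |δₖ|²) and divided by L_core:

```python
import numpy as np

hbar = 1.054571817e-34  # J s
c = 2.99792458e8        # m/s

# Purely illustrative inputs: four proton-mass particles, arbitrary
# misalignment values delta_k, and an arbitrary core length scale.
m = np.full(4, 1.67262192369e-27)         # masses m_k (kg)
delta = np.array([0.1, 0.25, 0.0, 0.4])   # local misalignments delta_k
L_core = 1e-3                             # core length scale (m)

lam = m * c ** 2 / hbar                   # lambda_k = m_k c^2 / hbar
F_macro = np.sum(hbar * lam * (1 - np.abs(delta) ** 2)) / L_core

# Algebraic simplification: hbar * lambda_k = m_k c^2, so the expression
# is a weighted sum of rest energies over L_core.
F_check = np.sum(m * c ** 2 * (1 - delta ** 2)) / L_core
print(F_macro)
```

The simplification makes the structure of the claimed formula explicit: the quantum-phase content enters only through the (1 − |δₖ|²) weights.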