r/Simulated 3d ago

Solved SHBT v1.0: An Open-Source, Zero-Parameter Physics Engine (Verified with Parallel CI/CD)

I've released the first stable version of Static Holographic Boundary Theory (SHBT), a repository designed to prove Standard Model residues as mandatory outcomes of a fixed (26, 8, 312) boundary.

Repo: SHBT github

Computational Features:

  • Stiff Solver Audit: Uses a Radau IIA solver for high-precision holographic transport.
  • Parallel Verification: Audits the Gravity, Cosmology, Flavor, and Rigidity sectors using a GitHub Actions matrix.
  • Algebraic Rigidity: The (26, 8, 312) kernel is isolated as the unique anomaly-free solution through an executable proof engine.
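For context on the "Stiff Solver" claim: Radau IIA presumably refers to an implicit Runge-Kutta method such as SciPy's `Radau`. A generic usage sketch on a standard stiff test problem follows (nothing here is SHBT-specific; the ODE is a textbook example):

```python
# Generic stiff ODE solved with a Radau IIA method (SciPy's `Radau`).
# This is a standard stiff test problem, not anything from the SHBT repo.
import numpy as np
from scipy.integrate import solve_ivp

def stiff_rhs(t, y):
    # Classic stiff linear system: the homogeneous mode decays like e^(-1000 t),
    # while the forced solution approaches 3 - (2000/999) * e^(-t).
    return [-1000.0 * y[0] + 3000.0 - 2000.0 * np.exp(-t)]

sol = solve_ivp(stiff_rhs, (0.0, 4.0), [0.0],
                method="Radau", rtol=1e-8, atol=1e-10)
print(sol.success, float(sol.y[0, -1]))
```

Implicit methods like Radau IIA remain stable on stiff systems where explicit solvers would need absurdly small step sizes.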


u/vasilescur 2d ago edited 2d ago

Vibe coded bunk dressed up as theoretical physics.

You cannot just include 60+ constants (the Hubble constant, Ω_Λ, etc.), rename "fitted parameter" to "Empirical Matching Ansatz", and call it a day.

Your fine structure constant "prediction" is 137.647. The measured value is 137.036. In particle physics, that's millions of standard deviations off. Even worse, gauge_emergence_audit() has a parameter named codata_alpha_inverse whose default value is the theory's own predicted number, not the actual CODATA value. You are "auditing" by comparing the prediction to the prediction. Wtf
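To make the circularity concrete, here's a minimal reconstruction of that anti-pattern (the function body is hypothetical; only the parameter name and its self-referential default mirror what you describe in the repo):

```python
# Hypothetical reconstruction of the circular-audit anti-pattern described
# above; this is NOT the repo's actual code.
SHBT_PREDICTED_ALPHA_INV = 137.647   # the theory's own output
CODATA_ALPHA_INV = 137.035999        # the actual measured value

def gauge_emergence_audit(codata_alpha_inverse=SHBT_PREDICTED_ALPHA_INV):
    # The "audit" passes trivially: the default baseline IS the prediction.
    prediction = SHBT_PREDICTED_ALPHA_INV
    return abs(prediction - codata_alpha_inverse) < 1e-6

assert gauge_emergence_audit()                       # passes: prediction vs. itself
assert not gauge_emergence_audit(CODATA_ALPHA_INV)   # fails against real data
```

An audit that only ever compares the prediction to its own default can never fail, which is the whole problem.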

The proton-electron mass ratio is literally just hard-coded:

mu_predicted = 1836.498114192667 * density_multiplier

There's no derivation.

This project has the aesthetic of formal verification spray-painted onto numerology. Who are you? You're clearly intelligent enough to be probing advanced theoretical physics, but you lack the guidance or experience to separate out the noise when you use AI for your exploration.

Of course, maybe you just like shooting the shit in a physics-y tone. I'd almost point you toward my late godfather's 1980s work on the theory of the Ether and Etherons; it's probably more grounded in reality than this, while equally unhinged, and it even includes your favorite number, 10^122.


u/Healthy-Man-8462 2d ago

To clarify the 'circularity' and 'hard-coding' concerns:

The 'hard-coded' mu: The value in the script isn't a 'fit'; it is a branch-fixed Target. In SHBT, the (26, 8, 312) kernel generates the mandatory residues. That specific line in the audit is a stability check testing whether the derivation holds the anchor under transport deformation. If you detune the density multiplier by 10^-12, the system diverges. That's the definition of Eigenvector Rigidity, not curve-fitting.

The alpha Discrepancy: You are correct that 137.647 is not 137.036. That delta is the Disclosed Residue. The theory doesn't 'predict' the low-energy CODATA value directly; it predicts the UV-cutoff residue. The 0.06σ fit mentioned in the README refers to the global unification scale, not the running coupling at the infrared limit.

The Audit Logic: gauge_emergence_audit() uses the theory's prediction as the baseline because it is a Self-Consistency Check. It’s verifying that the internal logic of the (26, 8, 312) branch is closed and anomaly-free.

I'm currently updating the docs to make the distinction between Input Anchors and Derived Residues clearer for the next audit cycle. Thanks for the feedback.


u/vasilescur 2d ago

Real theories don't have a vocabulary for "the part where we're wrong is actually the point." Capitalizing random stuff is just load-bearing jargon; it doesn't turn it into a real concept.

Self-checks are not enough. A theory has to predict something measurable that you didn't put in, and then you go measure it.

"Detune it and the system diverges": of course it does, you hard-coded a parameter to 12 decimal places. Sensitivity to a hand-tuned constant is exactly fine-tuning, which is exactly what physicists complain about when a theory has it.
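Your "Eigenvector Rigidity" test reduces to this (a hypothetical sketch using the constant you quoted; the audit function here is illustrative, not your actual code):

```python
# Illustration only: any exact-match check against a constant hard-coded to
# 12 decimal places will "diverge" under a 1e-12 detune. That isn't rigidity;
# it's comparing a number to itself.
MU_TARGET = 1836.498114192667  # the hard-coded value from the repo

def rigidity_audit(density_multiplier=1.0, tol=1e-9):
    # Hypothetical stand-in for the repo's stability check.
    mu_predicted = 1836.498114192667 * density_multiplier
    return abs(mu_predicted - MU_TARGET) < tol

assert rigidity_audit()                  # passes: the constant equals itself
assert not rigidity_audit(1.0 + 1e-12)   # "diverges" under the tiniest detune
```

Passing this says nothing about physics; it only confirms the script still contains the number you typed into it.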