[138] viXra:2601.0146 [pdf] submitted on 2026-01-30 05:37:40
Authors: Stephen J. Crothers
Comments: 25 Pages.
Temperature and the laws of thermodynamics are central to physics. They serve to guide all theory that involves thermodynamic relations. Temperature, irrespective of global or local equilibrium conditions, must always be intensive to satisfy the 0th and 2nd laws of thermodynamics. At the same time, if the laws of thermodynamics are to be followed, not only must the units balance on each side of a thermodynamic equation but so too must thermodynamic character. The theory of protostar formation by gravitational collapse is constructed from the kinetic theory of an ideal gas. In this instance, temperature is introduced in combination with gravitation via the virial theorem. Such an approach assumes that an uncontained cloud of gas in interstellar space will gravitationally collapse, or self-compress, when sufficiently massive. Yet, experiments demonstrate that uncontained gases, irrespective of bulk mass, always expand into their surroundings. The critical mass for initiation of self-compression of a gas is the Jeans mass, which depends on the gas temperature. Similarly, stellar accretion and accretion disk relations involve temperature. All these expressions assign temperature a nonintensive character, in violation of the laws of thermodynamics. Consequently, the relations and the theories from which they are derived are invalid.
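For context, the Jeans mass the abstract refers to is usually quoted in the standard kinetic-theory form; the expression below is supplied for reference and is not taken from the paper (μ is the mean molecular weight, m_H the hydrogen mass, ρ the gas mass density):

```latex
M_J \simeq \left(\frac{5 k_B T}{G \mu m_H}\right)^{3/2}
          \left(\frac{3}{4\pi\rho}\right)^{1/2}
```

It is this explicit dependence on the gas temperature T that the abstract's intensivity argument targets.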
Category: Astrophysics
[137] viXra:2601.0145 [pdf] submitted on 2026-01-30 16:47:13
Authors: Krishna Agarwal
Comments: 12 Pages.
The emergence of quantum computing represents one of the most significant paradigm shifts in the history of computation, with profound implications for offensive security and cyber warfare. This comprehensive research paper presents a systematic analysis of quantum-enabled offensive capabilities, novel attack vectors targeting quantum systems themselves, and the evolving threat landscape at the intersection of quantum computing and artificial intelligence. We introduce the Q-THREAT Framework, a novel temporal risk assessment model that quantifies the "Harvest Now, Decrypt Later" threat across different data confidentiality lifetimes and sectoral exposures. This research synthesizes and categorizes recently documented attack vectors including quantum Rowhammer exploits, timing-based side-channel vulnerabilities in cloud quantum services, and multi-tenant quantum system intrusions. Recent research demonstrates that current quantum cloud platforms from major providers including IBM, Rigetti, and others exhibit significant security vulnerabilities that could be exploited by adversaries to compromise computational integrity, extract sensitive circuit information, and disrupt quantum computations. Furthermore, we analyze the convergence of quantum computing and artificial intelligence as a force multiplier for offensive cyber operations. We present evidence that quantum machine learning algorithms can demonstrate competitive or superior performance in certain cybersecurity applications compared to classical approaches in controlled experimental settings, with recent studies reporting high accuracies in tasks such as malware detection and intrusion detection on benchmark datasets. Our analysis of nation-state quantum programs reveals an accelerating global quantum arms race with significant implications for national security. 
Based on our findings, we propose a comprehensive defensive framework incorporating post-quantum cryptographic standards, crypto-agility principles, and quantum-safe architectural patterns. This research contributes to the nascent field of quantum cybersecurity by establishing foundational threat models, identifying critical research gaps, and providing actionable recommendations for organizations preparing for the quantum era.
Category: Quantum Physics
[136] viXra:2601.0144 [pdf] submitted on 2026-01-31 03:39:31
Authors: Boubacar Diawara
Comments: 62 Pages. (Note by viXra Admin: Please cite listed scientific references and submit article written with AI assistance to ai.viXra.org) Copyright © 2026 by the author(s), all rights reserved.
This study investigates the deep analogies between quantum squeezing and entanglement phenomena and their classical counterparts realized in coupled LC oscillator circuits. By constructing comprehensive MATLAB and Python simulation frameworks, we model strongly coupled classical oscillators, compute covariance matrices, and quantify information transfer using correlation and variance-based metrics inspired by quantum information theory. Extensive parameter sweeps reveal that classical LC circuits can reproduce key mathematical structures associated with quantum squeezing, including variance reduction exceeding 15 dB and correlation coefficients above 0.99 in the strong-coupling regime. Beyond numerical modeling, experimental implementations of coupled LC circuits are performed and compared with simulations, showing excellent agreement with discrepancies below 5%. Motivated by advances in microwave quantum engineering, where artificial atoms, superconducting circuits, and resonators enable ultra-strong light-matter coupling, this work explores whether classical circuits can provide insight beyond formal mathematical analogies. While classical LC systems do not reproduce intrinsic quantum randomness, superposition, or nonlocality, they successfully capture structural and dynamical features central to quantum squeezing and correlated states. By bridging quantum information theory and classical circuit analysis, this research offers a scalable, low-cost experimental testbed for education, prototyping, and conceptual exploration of quantum phenomena. The findings highlight the potential of classical oscillator networks as meaningful simulators for macroscopic manifestations of quantum-inspired effects relevant to precision measurement, communication, and emerging cybersecurity technologies.
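The coupled-oscillator physics summarized above can be sketched in a few lines. The following is a minimal stand-in for the paper's MATLAB/Python frameworks (which are not reproduced here); ω0, the coupling g, and the initial conditions are chosen purely for illustration. It integrates two linearly coupled oscillators, the standard lumped-element analogue of two coupled LC circuits (charge ↔ position, current ↔ velocity), and checks that the energy initially stored in circuit 1 migrates to circuit 2 after half a beat period:

```python
import math

def simulate(omega0=1.0, g=0.2, dt=1e-3, t_end=None):
    """Velocity-Verlet integration of two linearly coupled oscillators,
    a classical analogue of coupled LC circuits. Illustrative sketch only;
    parameter values are assumptions, not the authors' settings."""
    wa = math.sqrt(omega0**2 + 2.0 * g)     # antisymmetric normal-mode frequency
    if t_end is None:
        t_end = math.pi / (wa - omega0)     # half a beat: full energy swap

    x1, v1, x2, v2 = 1.0, 0.0, 0.0, 0.0    # all energy starts in circuit 1

    def acc(x1, x2):
        # Restoring force of each circuit plus the linear coupling term.
        return (-omega0**2 * x1 - g * (x1 - x2),
                -omega0**2 * x2 - g * (x2 - x1))

    a1, a2 = acc(x1, x2)
    for _ in range(int(t_end / dt)):
        x1 += v1 * dt + 0.5 * a1 * dt * dt
        x2 += v2 * dt + 0.5 * a2 * dt * dt
        na1, na2 = acc(x1, x2)
        v1 += 0.5 * (a1 + na1) * dt
        v2 += 0.5 * (a2 + na2) * dt
        a1, a2 = na1, na2

    # Per-oscillator energies (coupling energy excluded).
    e1 = 0.5 * v1**2 + 0.5 * omega0**2 * x1**2
    e2 = 0.5 * v2**2 + 0.5 * omega0**2 * x2**2
    return e1, e2

e1, e2 = simulate()
print(f"fraction of energy in circuit 2 after half a beat: {e2 / (e1 + e2):.3f}")
```

With these illustrative parameters nearly all of the oscillator energy ends up in circuit 2 — the classical beat phenomenon underlying the correlation metrics the abstract discusses.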
Category: Quantum Physics
[135] viXra:2601.0143 [pdf] submitted on 2026-01-31 03:30:34
Authors: Viktar Yatskevich
Comments: 53 Pages. 22 figures, 49 references.
Contemporary physics relies extensively on mathematical formalisms to describe natural phenomena, often achieving remarkable predictive success. However, the relationship between mathematical description and physical explanation remains a subject of ongoing debate. In many cases, mathematical models are treated not only as tools for representation but also as implicit substitutes for the underlying physical mechanisms, which can obscure questions of causality and physical origin. This article examines this methodological issue using two fundamental physical phenomena, gravity and the tunneling effect, as representative examples. Both phenomena are commonly described by highly successful mathematical frameworks, namely general relativity and quantum mechanics, yet their physical interpretations remain incomplete or debated. The work argues that predictive accuracy alone does not necessarily constitute a full physical explanation and that mathematical consistency should be complemented by physically grounded mechanisms based on observable properties of matter and interaction. Alternative descriptions of gravity and the tunneling effect are proposed, grounded in experimental observations and established physical properties of matter at macroscopic and microscopic scales. These descriptions aim to clarify the physical processes underlying the phenomena while remaining compatible with empirical data. The proposed framework does not reject existing theories but seeks to supplement them by addressing conceptual gaps related to causality, physical mechanism, and interpretation. Such an approach may contribute to a more physically transparent understanding of fundamental phenomena and provide new directions for theoretical and applied research.
Category: Classical Physics
[134] viXra:2601.0142 [pdf] submitted on 2026-01-31 01:01:25
Authors: V. B. Verma
Comments: 9 Pages. AI was not used in the generation of any aspect of this work
Motivated by the suggestion made by R. H. Dicke in 1957 that the speed of light may be correlated with the gravitational potential of the entire universe, we develop a model for a Machian cosmology in which matter determines the speed of light through a scalar field which has a mathematical form similar to, but distinct from, the gravitational potential. We show that this leads naturally to a cosmology in which the speed of light was higher in the early universe and is decreasing in cosmological time, providing an explanation for the isotropy of the cosmic microwave background (CMB) without the need for inflation. This cosmology results in an apparent amplification of the baryonic mass density (dark matter) and predicts a critical acceleration reproducing that of modified Newtonian dynamics (MOND). We also fit the model to high-redshift supernova data from the Supernova Cosmology Project, showing that an excellent fit is obtained with only baryonic matter. Finally, we derive a geometric relationship between cosmological parameters and the fine structure constant of quantum electrodynamics.
Category: Relativity and Cosmology
[133] viXra:2601.0141 [pdf] submitted on 2026-01-30 01:34:18
Authors: Fawang Su
Comments: 5 Pages.
Based on the physical idea that "mass is energy with added dimensions", this paper explores how spacetime curvature contributes to the effective mass of photons from the perspective of combining general relativity and quantum mechanics by constructing a photon motion model in curved spacetime. It is proven that when a photon is confined within a characteristic scale in a strongly curved spacetime, its energy can manifest as an equivalent mass, which is directly related to the spacetime curvature tensor and the constraint scale. This paper provides a new theoretical perspective for understanding the acquisition of effective mass by photons.
Category: Classical Physics
[132] viXra:2601.0140 [pdf] submitted on 2026-01-30 01:29:10
Authors: P. N. Seetharaman
Comments: 10 Pages.
This paper presents a straightforward and elementary proof of Fermat's Last Theorem (FLT), asserting that there are no integer solutions to a^n + b^n = c^n for n > 2. Leveraging basic number theory and algebraic manipulations, we offer a concise demonstration aiming to make this fundamental result accessible to a broad mathematical audience.
Category: Number Theory
[131] viXra:2601.0138 [pdf] submitted on 2026-01-30 01:04:02
Authors: Edgar Valdebenito
Comments: 4 Pages.
This note is about a specific value of Lambert's W function.
Category: General Mathematics
[130] viXra:2601.0137 [pdf] submitted on 2026-01-29 00:27:19
Authors: Zuodong Sun
Comments: 23 Pages. (Note by viXra Admin: Further repetition will not be accepted)
Although the classic DNA double helix model proposed by Watson and Crick explains the static storage mechanism of genetic information, it fails to reasonably account for the physical driving force behind high-speed and high-fidelity DNA replication. Furthermore, the core features of Rosalind Franklin's X-ray diffraction Pattern 51 ("alternating black and white stripes with a slight tilt") have not been fully interpreted within the static framework. Based on the standard B-type DNA double helix as the basic unit, this hypothesis proposes an original DNA Origami Windmill Tetramer Model by drawing on MacKinnon's research on the tetrameric structure of potassium ion channels and the mechanical principles of the potassium ion channel origami windmill model. Four DNA double helices assemble into an inverted conical tetrameric functional unit at a non-90° oblique angle corresponding to the diffraction characteristics of Franklin's Pattern 51, forming an inverted conical ion channel at the center. Its dynamic drive relies on the electrostatic repulsion of intracellular cations such as K⁺ and Na⁺, without the need for ATP hydrolysis for energy supply. The core of the model follows the logic of whole-chain non-denaturing replication, realizing genetic transmission through pairing and recombination between double-helix units, thereby avoiding the mismatch risk caused by single-strand exposure⁷. This reasonably explains the replication phenomenon in minimalist systems such as archaea and φ29 bacteriophages that do not require helicases, and clarifies that the classic enzyme system is only an auxiliary regulatory factor in the complex chromatin environment. Combining the core laws of molecular theory and 2ⁿ exponential logic, this study corrects the definition deviation between traditional DNA structural units and genetic functional units, confirming that the tetramer composed of 4 double helices is the optimal functional unit for complete DNA inheritance.
Meanwhile, it is the first to reveal the direct correlation between the diffraction characteristics of Franklin's Pattern 51 and the folded stacking shape of the model's blades, breaking through the limitations of static cognition. This hypothesis provides a new and testable theoretical framework for dynamic DNA replication, whose predictions can be verified through five layers of decisive experiments. It is highly compatible with the classic double helix model and offers a testable theoretical perspective and experimental basis for research in related fields.
Category: Biochemistry
[129] viXra:2601.0136 [pdf] submitted on 2026-01-29 00:10:12
Authors: Marcello Colozzo
Comments: 25 Pages.
This monograph provides a systematic treatment of Kramers degeneracy, investigating its deep-rooted origins within the framework of space-time symmetries in quantum mechanics. The investigation focuses on the nature of the time-reversal operator, exploring the mathematical peculiarities associated with antiunitary transformations and their fundamental distinction from conventional unitary symmetries.
Category: Quantum Physics
[128] viXra:2601.0135 [pdf] replaced on 2026-04-07 14:57:07
Authors: Nicolas Poupart
Comments: 20 Pages.
We investigate whether stellar population information can be recovered from galactic dynamics. Starting from observed rotation curves, we construct synthetic stellar population mixtures constrained to reproduce the effective mass distribution. From these populations, we compute ultraviolet and optical colors and compare them to GALEX and SDSS observations. We find statistically significant correlations between predicted and observed color indices across multiple bands, including FUV − NUV, g − r, r − z, and NUV − r. Rank-rank correlations reach significances up to ≳ 5σ, indicating that the ordering of stellar populations is encoded in the dynamical information. These results show that rotation curves contain non-trivial information about stellar population structure beyond their standard interpretation as mass tracers. This behavior is consistent with a picture in which the effective dark mass component reflects gravitational binding energy, linking dynamics and stellar population properties.
Category: Astrophysics
[127] viXra:2601.0134 [pdf] submitted on 2026-01-28 23:04:12
Authors: Taeho Jo
Comments: 6 Pages.
In this research, we propose string vector-based KNN variants and apply them to keyword extraction. The initial KNN version was previously modified into a string vector-based version, and keyword extraction was mapped into a binary classification task in order to apply it. We consider three KNN variants, in their numerical vector-based versions: one where the selected nearest neighbors are discriminated by their similarities, one where the attributes are discriminated by their correlations with the categories, and one where the training examples are discriminated by their credits. In this research, the three KNN variants, as well as the initial KNN version, are modified into string vector-based versions as approaches to keyword extraction. The goal of this research is to improve keyword extraction performance through these modifications.
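Of the three variants listed, the first (neighbors discriminated by their similarities) is easy to sketch generically. The paper's actual string-vector encoding and similarity measure are not given here, so the Jaccard overlap over feature strings and the toy training set below are illustrative assumptions only:

```python
from collections import Counter

def sim(a, b):
    """Jaccard-style similarity between two 'string vectors'
    (modeled here simply as tuples of feature strings); a stand-in
    for whatever similarity the paper actually defines."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def knn_weighted(query, training, k=3):
    """Similarity-weighted KNN vote: the k nearest neighbors each
    contribute a vote proportional to their similarity to the query."""
    neighbors = sorted(training, key=lambda ex: sim(query, ex[0]),
                       reverse=True)[:k]
    votes = Counter()
    for features, label in neighbors:
        votes[label] += sim(query, features)
    return votes.most_common(1)[0][0]

# Hypothetical training examples: (feature strings, keyword/not keyword).
training = [
    (("noun", "title", "high_tf"), "keyword"),
    (("noun", "body", "high_tf"), "keyword"),
    (("verb", "body", "low_tf"), "not_keyword"),
    (("stopword", "body", "low_tf"), "not_keyword"),
]
print(knn_weighted(("noun", "title", "low_tf"), training))  # -> keyword
```

Mapping keyword extraction to binary classification, as the abstract describes, then amounts to running this classifier over every candidate word of a document.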
Category: Artificial Intelligence
[126] viXra:2601.0133 [pdf] replaced on 2026-03-10 23:58:09
Authors: Walter A. Kehowski
Comments: 11 Pages. Some improvements in the presentation as well as a new section with an impossibility result.
A power spectral number is a positive integer whose spectral basis consists of only primes and powers. If one searches for power spectral numbers whose spectral sum is also a power, then one finds only five examples. We call these numbers power spectral Pythagorean numbers. The first two examples involve the Pythagorean triples 3,4,5 and 8,15,17. It is shown in this note that these are the only two Pythagorean triples that are power spectral Pythagorean. The other three examples involve the Pell equation.
Category: Number Theory
[125] viXra:2601.0132 [pdf] submitted on 2026-01-27 13:17:28
Authors: Alberto Coe
Comments: 15 Pages.
This paper presents a purely illustrative exploration of some numerical hierarchies that emerge when comparing different energy and time scales in the present-day Universe. A dimensionless parameter is introduced, constructed from the ratio between the energy density associated with nonlinear gravitational structure and that of the cosmic microwave background, weighted by the ratio between the Planck time and the atomic unit of time. The resulting extremely small value reflects the enormous separation of scales between the physics of the early Universe, the relic background radiation, and late cosmological structure. This analysis does not aim to establish new dynamical relationships or fundamental magnitudes, but rather to offer a numerical curiosity in the spirit of traditional discussions of large numbers in cosmology.
Category: Relativity and Cosmology
[124] viXra:2601.0131 [pdf] submitted on 2026-01-27 20:50:31
Authors: Martin Kraus
Comments: 6 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
De Broglie showed that a hypothetical internal clock of electrons could cause the quantization of energy levels of closed electronic orbits in the Bohr model of hydrogen-like atoms. Inspired by this insight, this work presents a model for the quantization of orbital angular momentum of electronic Bohr-Sommerfeld orbits with apsidal precession. The predictive power of the presented model is very limited since a parameter of the model is calculated based on a previously published quantization of the electron's orbital angular momentum.
Category: Quantum Physics
[123] viXra:2601.0130 [pdf] submitted on 2026-01-27 20:40:00
Authors: Jarosław Grabiec
Comments: 14 Pages. (Note by viXra Admin: Please cite listed scientific references)
This article identifies separate energy channels, with a focus on the Coriolis force energy channel as their integrator. The author demonstrates that the total kinetic energy of a launched satellite in the unsteady state results from the geometric sum of momentum and angular momentum, as two vectorial and orthogonal kinetic energy carriers coupled together in spacetime by a dynamic phase. This phase results from energy transfer through the Coriolis effect energy channel, which mixes the components without introducing additional energy. The article identifies a scientific interpretation gap and proposes a new, enriched perspective in the field of cosmology. Two main energy components are distinguished: • ΩA (active), identified with linear momentum (translational energy opposing the gravitational force), integrated along the ascension radius and ultimately converted into the potential energy of the satellite of mass m; • ΩR (reactive), identified with angular momentum (rotational energy equivalent to the opposing Coriolis force), projected onto the tangent to the ascent spiral, then integrated along the spiral, and finally transformed into rotational orbital energy (encoded in the angular momentum of the mass m). Each of them is a power integral associated with the orthogonal motion components (radial and tangential). In standard orbital mechanics, angular momentum energy is implicitly included in the orbital kinetic energy along the ascent spiral, but it is not treated as an independent, orthogonal component in the energy balance of the dynamic ascent state.
Category: Astrophysics
[122] viXra:2601.0129 [pdf] submitted on 2026-01-27 20:31:20
Authors: Jerry Ray Betfis
Comments: 13 Pages.
Dark energy has not been explained other than to state that it may be the driving force behind the expansion of the universe. Each topic in the evolution of the universe has its own explanation: Temperature versus time is driven by radiation, then by matter, then by dark energy; Cosmic Microwave Background (CMB) isotropy is driven by Inflation; Matter production is explained by saying sub-atomic particles get together but do not say where they came from; Recently discovered acceleration of expansion of the physical universe (that collection of things we see all around us) is still awaiting a consistent explanation. A single theory of dark energy with no adjustable parameters answers all these concerns and more. The temperature in the singularity was too great for matter to exist, thus, dark energy is a massless form of energy. It produces matter by the Breit-Wheeler process. Dark energy is the remains of the total energy from the singularity after matter production ceased. It is a perfect fluid and expands adiabatically and homogeneously with large initial velocity and will be analyzed by Friedmann’s solution of Einstein’s Field Equations. It forms a homogeneous sphere that keeps temperature, pressure, and matter isotropic. The physical universe expands at a rate that is the difference between the expansion due to dark energy and the inward rate induced by gravity. This accounts for the increased acceleration of distant cosmological entities. The expansion slows but never stops. The CMB radiation is the thermal footprint of dark energy.
Category: Relativity and Cosmology
[121] viXra:2601.0128 [pdf] submitted on 2026-01-27 20:28:49
Authors: Viktar Yatskevich
Comments: 19 Pages. 10 figures and 30 references.
This paper proposes a phenomenological, physically motivated interpretation of gravitation aimed at addressing conceptual gaps related to the physical mechanism, causality, and microscopic origin of gravitational interaction. While contemporary theories of gravity, including general relativity, provide mathematically consistent and empirically successful descriptions, they do not explicitly specify the physical processes underlying gravitational interaction. The proposed framework is based on established electromagnetic and structural properties of matter on microscopic and macroscopic scales. Gravitational interaction is interpreted as a manifestation of collective electrodynamic processes occurring within matter, associated with time-dependent electric and magnetic field configurations generated by charged constituents. The approach is not intended to replace geometric descriptions of gravity, but to complement them by introducing an explicit physical interpretation consistent with known properties of matter and interactions. The work is conceptual in scope and focuses on physical interpretation rather than on the development of a new mathematical formalism. It aims to provide a physically transparent perspective on gravity that may serve as a foundation for further theoretical refinement and experimental investigation.
Category: Classical Physics
[120] viXra:2601.0127 [pdf] replaced on 2026-02-08 00:15:36
Authors: Carl Littmann
Comments: 8 Pages.
Einstein’s Relativity Theory emphasizes that "if a body radiates a given amount of Energy, that emitting body loses a Mass equal to that emitted Energy divided by the speed of light squared". But if that lost mass can’t be fully found by adding up all the resulting products, including negligible-mass high-energy Neutrinos; where did that mass go? My paper asserts that the lost (hidden) mass was ‘injected’ into the ‘aether’, increasing aether’s mass. As Einstein even said, in 1930, "Space is Eating-Up matter!" I use that "Einstein Statement" to estimate a minimum mass density of aether in Space, i.e., a key estimate but still likely much too low. And I also show that Neutrino propagation is likely an Ethereal Pulse or Stress, like a Twisting Spring Pulse (wave), instead of a forward or backward pulse. Thus, not likely a Particle mass flying through space, like a bullet or ‘baseball’. And I give more details, and address related questions.
Category: Statistics
[119] viXra:2601.0126 [pdf] replaced on 2026-03-23 02:16:35
Authors: Zhi Li, Hua Li
Comments: 6 Pages.
In mathematics, real numbers can be represented by points on a straight line called the number line, which includes a point called the origin, the direction of number growth, and a unit length. It is generally assumed that there is a one-to-one correspondence between real numbers and points on the number line, with the position of a point determining the size and order of the numbers. This essentially assumes that all real numbers have a definite position on the number line, and that there is a definite order between any two real numbers. This paper shows that there are real numbers with uncertain positions, and that all real numbers do not lie on the number line of the same dimension. The number line is composed of discrete points, which are "pure numbers"; that is, only pure numbers exist on the number line, while non-pure numbers exist in "empty space". Therefore, there is a logical contradiction between the continuity of real numbers and the real number line; the real number line is an incomplete and imperfect conception for representing real numbers. This paper gives the definition of a pure number and the relationship between its cardinality and the natural cardinality. These results verify the viewpoint of quantum theory in physics, namely that the straight line on the "macroscopic" number line is composed of "microscopic" discrete and discontinuous points.
Category: General Mathematics
[118] viXra:2601.0125 [pdf] submitted on 2026-01-27 00:35:05
Authors: Holger Döring
Comments: 5 Pages.
We introduce a discrete, algebraic polynomial regulator for the functional renormalization group (FRG) in quantum gravity. The regulator is based on projections of the Laplace-Beltrami operator and integrates all modes below a discrete scale exactly. The resulting discrete RG steps produce natural log-periodic oscillations and a fractal UV structure, in qualitative agreement with results from Causal Dynamical Triangulations (CDT). The procedure is applied first to the Einstein-Hilbert truncation and then to more elaborate truncations. Fixed points and critical exponents are analyzed. The iteration shows stable UV fixed points and log-periodic patterns in all couplings. The approach offers a diffeomorphism-compatible, exact and discrete alternative to standard regulators in FRG and opens new possibilities for the study of fractal spacetime structures.
Category: Quantum Gravity and String Theory
[117] viXra:2601.0124 [pdf] submitted on 2026-01-27 00:32:41
Authors: Michael Leventhal
Comments: 306 Pages. 118 figures
This work presents Automata Processing as an algorithmic paradigm in which data-processing problems are formulated as networks of non-deterministic finite automata (NFA) expressed in the Automata Network Markup Language (ANML) and its graphical counterpart ANML-G. The primary aim is to show how moving beyond conventional regular-expression usage enables automata networks—implemented as pattern-matching state-transition elements augmented with counters and related primitives—to serve as a flexible basis for solving a broad range of data-processing tasks, and to provide practical guidance for constructing such machines. The notes synthesize execution semantics and modeling techniques into a tutorial-style reference and a cookbook of worked machines, enabling readers to experiment with design patterns and compose larger solutions from reusable automata building blocks. A semiconductor implementation of a chip able to run ANML descriptions was unveiled by the Micron Corporation in 2013, providing a real-world demonstration of the practicality of the automata processing paradigm. Observations and guidance are given on the use of this semiconductor architecture for Automata Processing.
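The execution model behind automata processing, in which all active states advance in parallel on each input symbol, can be sketched with a standard subset simulation. The snippet below is generic Python, not ANML, and the example automaton is invented for illustration:

```python
def run_nfa(transitions, start, accepting, symbols):
    """Subset simulation of an NFA: the set of active states plays the
    role of ANML's parallel state-transition elements. Illustrative
    sketch only, not actual ANML semantics or syntax."""
    active = {start}
    for s in symbols:
        # Every active state fires on the symbol simultaneously.
        active = {nxt for state in active
                      for nxt in transitions.get((state, s), ())}
    return bool(active & accepting)

# Example NFA over {a, b} accepting strings that end in "ab".
T = {
    ("q0", "a"): {"q0", "q1"},
    ("q0", "b"): {"q0"},
    ("q1", "b"): {"q2"},
}
print(run_nfa(T, "q0", {"q2"}, "abab"))  # True: ends in "ab"
print(run_nfa(T, "q0", {"q2"}, "abba"))  # False
```

A hardware automata processor evaluates the set comprehension above in a single clock per symbol across thousands of elements, which is the source of the throughput advantage the notes describe.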
Category: Data Structures and Algorithms
[116] viXra:2601.0123 [pdf] submitted on 2026-01-26 12:00:19
Authors: Ivan Aurelian Dan
Comments: 10 Pages.
We present a contraction-based cosmological framework in which the global spacetime scale evolves dynamically as a solution of a variational principle within General Relativity. The exponential contraction law arises as a background solution on the contracting branch of the Friedmann equations, rather than being postulated. Operational definitions of cosmological observables lead to positive redshift, closed-form luminosity distances, distinctive BAO scaling, and a strictly negative Sandage-Loeb redshift drift, providing clear observational discriminants with respect to standard ΛCDM cosmology.
Category: Astrophysics
[115] viXra:2601.0122 [pdf] submitted on 2026-01-27 00:25:37
Authors: Hao Shen, Qu Zhang, Ruipeng Ma
Comments: 26 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
This paper constructs a unified theoretical framework based on first principles—i.e.,how the universe generates from its most fundamental state—connecting quantum vacuum genesis, the origin of fundamental physical constants, the Standard Model of particlephysics, and cosmic dynamics. The core breakthrough of the theory lies in identifying amathematical-physical structure termed the "core constant cluster" from the physical mechanism of cosmic genesis. This structure consists of four mutually nested and constraining constants: the golden ratio ϕ ≈ 1.618, the core functional dimension n = 5, the fractal dimension Df ≈ 2.736, and the quantum error correction threshold pk ≈ 0.189. This paper demonstrates that these constants are inevitable products of the intrinsic holographic fractal geometry of the vacuum after spontaneous supersymmetry breaking, rather than being introduced through reverse artificial fitting. On the basis of clarifying the origin and fundamental principles of this constant cluster, this paper forwardly deduces the fundamental physical constants—the speed of light c, Planck’s constant h, and the gravitational constant G—mediated via a Nambu-Goldstone scalar field derived from conformal symmetry breaking. Research indicates that these three constants possess scale-dependence compatible with Lorentz covariance, their evolution regulated by the role density ρR and the fractal dimension Df , while the universality of the fine-structure constant α is strictly maintained. Within this theoretical framework, gravity is redefined as a "holographic tension field" emerging from the collective synergistic effects of quantum "role basis vectors" (mathematically represented as |Li⟩ = Γi ⊗ ˆψi, where Γi is an irreducible representation of the SO(10) group). 
The observational phenomena of dark matter and dark energy are naturally explained as gravitational enhancement effects of this tension field at low role density and fractal synergistic repulsion effects at cosmological scales. This explanation does not require introducing any unknown particles or a cosmological constant; dark matter and dark energy are viewed as normal manifestationsof gravity as defined herein under different scales and conditions, rather than being causedby undiscovered matter or energy. This theory exhibits a high degree of consistency across multiple key observational tests: fitting the rotation curves of 153 galaxies in the SPARC sample yields a root mean square error of only 4.7km/s, with the Bayesian Information Criterion significantly outperforming mainstream models such as SIDM, FDM, and f (R) gravity; combined constraints from baryon acoustic oscillations and Type Ia supernova data yield a predicted Hubble constant H0 = 67.9 ± 0.4km/s/Mpc, alleviating the current Hubble tension to the 2.1σ confidence level; the relative error of the predicted peak positions of the cosmic microwavebackground radiation power spectrum is less than 0.3%. Through research on quantum origins, this paper compatibly embeds the core constantcluster with the Standard Model of particle physics. The theory provides natural explana-tions for long-standing unresolved problems such as the origin of three fermion generations,neutrino mass and mixing angles, and the Higgs mechanism. 
Finally, under the premise of strictly rigid calculation without modifying specific experimental conditions, this paper proposes five experimentally verifiable predictions covering microscopic, mesoscopic, strong-field, and cosmological scales, each with its own defined 3σ statistical rejection line. This framework is mathematically self-consistent and complete, is in principle falsifiable by experiment, and provides a novel approach, with a clear empirical path, for exploring unified theories beyond the current scope of the standard cosmological model and quantum field theory. At the very least, even if this paper only serves to provide a new line of thought for related research, this work holds its own value.
Category: Quantum Gravity and String Theory
[114] viXra:2601.0121 [pdf] submitted on 2026-01-27 00:18:23
Authors: Mario Eduardo Jacome Vargas
Comments: 27 Pages. [License:] CC BY 4.0 (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
Transient near-field phenomena are traditionally studied as isolated effects, each addressed within its own theoretical or experimental framework. In many electromagnetic measurement systems, including near-field imaging, such transient responses are routinely averaged out, filtered, or actively minimized in favor of stable, quasi-stationary field configurations. This paradigm draws together a wide range of transient near-field phenomena, organizing them by their qualitative characteristics into a coherent framework and introducing concepts such as "Nearfield Stickiness". The presented concepts are articulated to suggest novel ways in which near-field energy dynamics could be harnessed to infer local electromagnetic characteristics, and are subsequently put into perspective through various hypothetical scenarios and an original analogy: the formation of electromagnetic waves as soap-film bubbles. Treating transient near-field phenomena as a qualitatively coherent whole, rather than as a collection of isolated effects, opens the door to alternative interpretations and exploratory imaging strategies, as it suggests that the induced collapse of fields generated by metamaterial antennas, fed with incomplete, longer-wavelength excitations than those currently used for the respective applications, may enable meaningful access to qualitative information rooted in the local electromagnetic properties of areas interacting with the field at the moment of collapse, complementing existing systems in areas such as effective range and selective interaction beyond obstacles.
Category: Classical Physics
[113] viXra:2601.0120 [pdf] submitted on 2026-01-27 00:15:10
Authors: Dmitry Makaryev
Comments: 6 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
The Standard Model of particle physics provides no mechanism to predict the hierarchical masses of the charged leptons (e, μ, τ), treating them as arbitrary free parameters. Here, we present a unified topological theory in which lepton generations emerge as discrete phase states of a single soliton in a superfluid vacuum. By imposing a virial equilibrium condition (θ = π/4) and a quantized Berry phase (δ = 2/9), we derive the complete mass spectrum analytically. Using only the electron mass as a physical input, our model predicts the muon mass (105.659 MeV) and tau mass (1776.985 MeV) with a precision of < 0.01%. Furthermore, the model predicts a fourth mass eigenstate at ∼ 29.9 GeV. Analysis of CMS collider data confirms the absence of a stable particle in this region, consistent with our hypothesis that the fourth generation is unbound and tunnels into the vacuum continuum.
Category: Quantum Physics
[112] viXra:2601.0119 [pdf] replaced on 2026-02-16 17:06:53
Authors: Malcolm McCoard
Comments: 24 Pages.
A new theory provides cause-and-effect explanations for dark energy, dark matter, universal gravity and the nature of time, based upon two simple justified propositions. Firstly, that spacetime, being created from energy in the big bang, following the law of conservation of energy and demonstrated mathematical equivalences, is fundamentally energy. Secondly, that Einstein's equivalence of mass and energy can therefore be applied to Newton's laws of motion in relation to spacetime itself. In an expanding universe, this unconventional approach creates a force in proportion to the change in spacetime energy momentum, predicting universal gravity. The reciprocal forces created explain why mass bends proximate spacetime, potentially reconciling Newton's and Einstein's models of gravity. A consequential modification to the big bang requires that inflation also creates simultaneous exponential uniform compression, and therefore time dilation, in accordance with general relativity. The resultant continuous ongoing time-dilated release and expansion of infinite spatial energy 'moments' along the temporal plane creates the fourth dimension of Minkowski's spacetime, aligning this theory with the mathematical framework describing general relativity. Time Dilated Spacetime Energy Release, or TDSER, has a perfect fit with dark energy and with how we experience time, including special and general relativity. Unreleased spacetime, compressed invisibly along the temporal plane, is an excellent candidate for dark matter. This approach explains relativistic mass on a cause-and-effect basis, and why nucleons have a greater mass than the sum of their parts. Consistent conjecture is presented on the first moment of creation: a big bang starting from nothing, creating the familiar dimensions, all necessarily at 90 degrees. A contender for a single unifying force follows and is shown to match, and indeed create, the equations describing Newton's second law, the law of gravity and Planck's force.
Fundamental cause-and-effect mechanisms for quantum inertia, quantum gravity, force-carrying virtual particles and wave-particle duality are presented. Proof is challenging and limited for such extensive claims, as TDSER does not modify the behaviour of spacetime; rather, it is shown to align with and create existing frameworks. Hence this article, in attempting to describe an overall picture, falls between theoretical physics and the philosophy of science. It is a call to action to develop these ideas further; the potential is a consistent cause-and-effect basis for physics at all scales.
Category: Quantum Gravity and String Theory
[111] viXra:2601.0118 [pdf] submitted on 2026-01-26 22:53:29
Authors: Douglas Corrêa Cavasso
Comments: 13 Pages. Case report in English and Portuguese. Independent research documented at Hospital Erasto Gaertner.
Background: Conventional treatment of Immune Thrombocytopenic Purpura (ITP) with corticosteroids presents significant adverse effects. This study documents the first case of successful corticosteroid substitution by S. cerevisiae in ITP, achieving 127,000 platelets/mm3 (24.5% above the prednisone peak) through gut-associated lymphoid tissue (GALT) modulation. [Abstract in Portuguese, translated:] Report of the first documented case of substitution of prednisone by Saccharomyces cerevisiae in ITP, achieving 127,000 platelets/mm3 via GALT modulation, with leukocyte stability and absence of side effects.
Category: Biochemistry
[110] viXra:2601.0117 [pdf] submitted on 2026-01-27 00:34:30
Authors: Taeho Jo
Comments: 6 Pages.
In this research, we propose and apply table-based KNN variants to keyword extraction. The initial KNN version was previously modified into a table-based version and applied by mapping keyword extraction onto a binary classification. Here we consider three KNN variants of the numerical vector-based version: one where the selected nearest neighbors are discriminated by their similarities, one where the attributes are discriminated by their correlations with the categories, and one where the training examples are discriminated by their credits. In this research, the three variants are modified into table-based versions, as was the initial KNN version. The goal of this research is to improve keyword extraction performance by modifying them in this way.
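As an illustration of the first variant, in which the selected nearest neighbors are discriminated by their similarities, here is a minimal sketch; the inverse-distance similarity, the feature encoding, and the toy data are our own assumptions, not the paper's table-based representation:

```python
import math

def similarity(a, b):
    # Inverse-distance similarity between two feature vectors (assumed form).
    dist = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return 1.0 / (1.0 + dist)

def knn_similarity_weighted(train, query, k=3):
    """Classify `query` as keyword (1) or non-keyword (0): the k nearest
    neighbors vote, each weighted by its similarity to the query."""
    ranked = sorted(train, key=lambda ex: similarity(ex[0], query), reverse=True)
    votes = {0: 0.0, 1: 0.0}
    for vec, label in ranked[:k]:
        votes[label] += similarity(vec, query)
    return max(votes, key=votes.get)

# Hypothetical word features, e.g. (term frequency, position score):
train = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.1, 0.2], 0), ([0.2, 0.1], 0)]
print(knn_similarity_weighted(train, [0.85, 0.85]))  # -> 1 (keyword)
print(knn_similarity_weighted(train, [0.10, 0.10]))  # -> 0 (non-keyword)
```

The other two variants would differ only in where the weighting enters: per attribute (correlation with the category) or per training example (accumulated credit).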
Category: Artificial Intelligence
[109] viXra:2601.0116 [pdf] submitted on 2026-01-25 21:09:30
Authors: Sambuddha Majumder, Jayanta Majumder, Partha P. Chakrabarti
Comments: 22 Pages.
Greggle is a small query language and tool for performing regular path queries over labelled directed graphs. Gruggle is a companion Node.js utility for ingesting, merging, inspecting, and lightly manipulating graphs in the Graphviz dot format. Graphviz is a widely used system for graph visualization; its dot language is simple to author and makes it easy to view results with standard Graphviz tools. Together the two utilities provide a frictionless workflow: Gruggle builds, merges, filters, and styles graphs; Greggle answers expressive path queries with edge-level predicates; and Gruggle can consume Greggle's annotations (e.g., find-path) to visualize witnesses. This document presents both tools, explains why they are complementary, and shows how they can be used jointly in analysis and visualization tasks.
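To make the notion of a regular path query concrete, the sketch below evaluates its simplest special case, a fixed sequence of edge labels, over a labelled digraph; the encoding and toy graph are ours and do not reflect Greggle's actual query syntax or Gruggle's dot handling:

```python
def match_label_path(edges, start, labels):
    """Return all node sequences from `start` whose consecutive edge labels
    spell out `labels` (a fixed-word special case of a regular path query).
    `edges` maps node -> list of (label, target) pairs."""
    paths = [[start]]
    for lab in labels:
        paths = [p + [t]
                 for p in paths
                 for l, t in edges.get(p[-1], [])
                 if l == lab]
    return paths

# Toy labelled digraph, illustrative only:
g = {"a": [("calls", "b"), ("calls", "c")],
     "b": [("reads", "d")],
     "c": [("reads", "d")]}
print(match_label_path(g, "a", ["calls", "reads"]))
# -> [['a', 'b', 'd'], ['a', 'c', 'd']]
```

A full engine would compile the query's regular expression to an automaton and search the product of graph and automaton; witness paths like the two above are the kind of annotation a companion tool could then style for Graphviz rendering.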
Category: Data Structures and Algorithms
[108] viXra:2601.0115 [pdf] submitted on 2026-01-25 00:23:54
Authors: Stanley L. Robertson
Comments: 25 Pages. One figure
Since Schwarzschild's first solution of the Einstein field equations, the simple model of a single, point-mass gravitating source has encompassed an impressive array of phenomena that have provided confirmation for Einstein's theory of General Relativity. These include gravitational time dilation and spectral redshifts, gravitational refraction of light, perihelion precession of planetary orbits, innermost stable orbits of accretion disks and, recently, the shadows of the photon spheres of extremely compact masses. These phenomena are associated with the geodesic motions of material particles or photons in the immediate vicinity of large masses that can be regarded as point sources of gravity. The limited purposes of this article are to present the underlying physics of the exponential metric of Yilmaz and to demonstrate that it correctly encompasses the observed phenomena. As an isotropic metric, it may be the only one also in accord with the observed isotropy of inertia.
Category: Relativity and Cosmology
[107] viXra:2601.0114 [pdf] submitted on 2026-01-25 00:21:21
Authors: Andrew W. Beckwith
Comments: 16 Pages. Chapter in "Cosmology research, Addressing current problems in astrophysics", with editors M. Smith and A. Oztaz, 225
Our idea for black hole physics being used for GW generation is to use torsion to form a cosmological constant. Planck-sized black holes allow for a spin density term linked to torsion. We conclude with a black-hole-versus-white-hole model, the two linked by a wormhole, for creating relic GW frequencies. In doing so, we review its similarities to the GW frequency values obtained from a Tokamak simulation. The conclusion of this document brings up would-be values for an initial wave function of the Universe.
Category: Relativity and Cosmology
[106] viXra:2601.0113 [pdf] replaced on 2026-02-01 00:12:41
Authors: Bassam Abdul-Baki
Comments: 4 Pages.
In this paper, we improve the lower bounds for optimal Golomb rulers.
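For readers unfamiliar with the object under study: a Golomb ruler is a set of integer marks whose pairwise differences are all distinct, and an optimal ruler is a shortest one for its number of marks. A minimal sketch of the defining check (our illustration, not the paper's bounding argument):

```python
from itertools import combinations

def is_golomb(marks):
    """A ruler is Golomb iff all pairwise mark differences are distinct."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

print(is_golomb([0, 1, 4, 6]))  # optimal order-4 ruler -> True
print(is_golomb([0, 1, 2, 3]))  # difference 1 repeats  -> False
```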
Category: Combinatorics and Graph Theory
[105] viXra:2601.0112 [pdf] submitted on 2026-01-25 00:08:56
Authors: Abdur Rahim Choudhary
Comments: 8 Pages.
We add a mass term to Newton's equation for gravity. This mass term is negligible at small distances but dominates at asymptotic distances; it represents the missing mass. The theory predicts rising radial velocities in galaxies and derives Hubble's Law, together with an expression for the Hubble parameter. The theory thus reveals the inner unity of three overarching phenomena: rising rotation velocities, Hubble's law, and the value of the Hubble parameter.
Category: Astrophysics
[104] viXra:2601.0111 [pdf] submitted on 2026-01-23 20:57:08
Authors: Renato Vieira Dos Santos
Comments: 13 Pages.
This pedagogical paper presents a comprehensive framework for interpreting dispersion relations across fundamental physical systems. We adopt a novel approach that starts from the mathematical form $\omega(\mathbf{k})$ and systematically extracts its physical content, rather than deriving it from first principles. Through an in-depth case study of the massive Klein-Gordon dispersion relation $\omega^2 = \omega_0^2 + c^2k^2$, we demonstrate how this single equation encodes phase velocity, group velocity, density of states, effective mass, and impedance. The analysis reveals the universal nature of this dispersion form, which manifests in quantum fields, plasmas, superconductors, and photonic crystals with different physical interpretations of its parameters. We complement this with detailed examination of classical systems including mass-spring chains and hydrodynamic waves, providing tangible analogies that bridge conceptual understanding between quantum and classical wave phenomena. The paper includes eleven carefully designed figures that visualize key concepts and a comprehensive catalog of dispersion relations in the Appendix. Aimed at advanced undergraduates and instructors, this work emphasizes conceptual understanding through physical interpretation, offering a unified pedagogical framework for teaching wave propagation across physics curricula while maintaining mathematical rigor and depth.
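As a taste of the extraction the paper performs, the phase and group velocities follow directly from the quoted dispersion relation (standard results, reproduced here for orientation):

```latex
\[
  \omega^2 = \omega_0^2 + c^2 k^2
  \quad\Longrightarrow\quad
  v_p = \frac{\omega}{k} = c\,\sqrt{1 + \frac{\omega_0^2}{c^2 k^2}},
  \qquad
  v_g = \frac{d\omega}{dk} = \frac{c^2 k}{\omega},
\]
\[
  \text{so that } v_p\, v_g = c^2 ,
\]
```

with $v_p > c > v_g$ whenever $\omega_0 \neq 0$, the superluminal phase velocity carrying no information.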
Category: Condensed Matter
[103] viXra:2601.0110 [pdf] submitted on 2026-01-23 03:38:26
Authors: Sasan Ardalan
Comments: 26 Pages.
The algorithms for computing the point-target response in Synthetic Aperture Radar (SAR) will be presented. The target modeling and simulations will be performed following the procedure developed by McDonough et al. (1985) [1] for SEASAT. Simulation results will be provided through block-diagram modeling of the SAR system with Capsim® [2]. The SAR project has been part of the Capsim® distribution since 1990. The research on SAR was conducted by the author while a Professor at NC State University in 1987. The GitHub repository was created in November 2025.
Category: Digital Signal Processing
[102] viXra:2601.0109 [pdf] submitted on 2026-01-23 08:53:31
Authors: Timothy Jones
Comments: 1 Page.
Using tangent lines to the unit circle, we give an argument that shows pi is irrational.
Category: Number Theory
[101] viXra:2601.0108 [pdf] submitted on 2026-01-23 14:52:05
Authors: Arghya Ghosh
Comments: 3 Pages.
We present a quantum algorithm for estimating the expected loss of a credit portfolio driven by a latent risk factor. The method discretizes a continuous latent variable, encodes its probability distribution into quantum amplitudes, embeds the loss function via controlled rotations on an ancilla qubit, and applies Grover-style amplitude amplification. The resulting state admits a two-subspace decomposition enabling estimation of the expected loss using Quantum Amplitude Estimation (QAE) or a maximum-likelihood estimator (MLE) based on Grover power measurements. The approach achieves a quadratic speedup over classical Monte Carlo methods.
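The quantity QAE targets is the classical expectation that Monte Carlo would otherwise sample. The sketch below computes it directly for a discretized latent factor; the grid size and the logistic loss function are our illustrative choices, not the paper's:

```python
import math

def normal_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

# Discretize the latent factor Z on a grid; the amplitude-encoding step
# described in the abstract would load these probabilities into a register.
N = 64
zs = [-4 + 8 * i / (N - 1) for i in range(N)]
ps = [normal_pdf(z) for z in zs]
total = sum(ps)
ps = [p / total for p in ps]            # normalized discrete distribution

# Hypothetical normalized loss L(z) in (0, 1): losses grow as the systemic
# factor deteriorates (large negative z). Illustrative only.
def loss(z):
    return 1 / (1 + math.exp(2 * z))    # logistic

# Classical expectation; QAE estimates the same amplitude
# a = sum_i p_i * L(z_i), encoded on an ancilla, with O(1/eps) oracle
# queries versus O(1/eps^2) Monte Carlo samples.
expected_loss = sum(p * loss(z) for p, z in zip(ps, zs))
print(expected_loss)
```

Because this particular loss satisfies L(z) + L(-z) = 1 and the grid is symmetric about zero, the expectation is exactly 1/2, a convenient sanity check on the discretization.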
Category: Quantum Physics
[100] viXra:2601.0107 [pdf] submitted on 2026-01-23 15:47:43
Authors: Yuly Shipilevsky
Comments: 1 Page.
We introduce a paradox, which we have named the "Friendship Paradox": every lady chooses a less attractive lady as a friend. If so, why then, conversely, is the less attractive lady a friend of the more attractive lady? Contradiction.
Category: Set Theory and Logic
[99] viXra:2601.0106 [pdf] submitted on 2026-01-23 20:35:24
Authors: Lawrence S. Schulman
Comments: 1 Page.
It is proposed that acceleration be quantized.
Category: Relativity and Cosmology
[98] viXra:2601.0105 [pdf] submitted on 2026-01-23 19:27:37
Authors: Vladimir Skrebnev, Maria Polski
Comments: 11 Pages.
The paper examines and critiques the expression of entropy as the logarithm of the number of quantum states of a physical system. Boltzmann's method of expressing entropy as the logarithm of the number of states of a gas with a given total energy is analyzed. We demonstrate that entropy is the product of subquantum processes, and we show that entropy is expressed as the ratio of the logarithm of the maximum number of realizations, over the observation period, of a macroscopic system's states with a given total energy to the number of occurrences of its quantum states over this time.
Category: Thermodynamics and Energy
[97] viXra:2601.0104 [pdf] submitted on 2026-01-22 21:45:02
Authors: Andrew W. Beckwith
Comments: 10 Pages. [A] book [chapter to be published in] 2026
The author argues in this document that the initial vacuum state values possibly responsible for GW generation in relic conditions at the initial onset of inflation may have a temporarily unsqueezed, possibly even coherent, initial value, which in certain models would permit classical coherent initial gravitational wave states. Furthermore, several arguments pro and con as to whether initial relic GW should be high frequency are presented, with the reason given why earlier string models did NOT favor low-frequency relic GW from the big bang. What is observed is that large higher dimensions above our 4-dimensional spacetime, if recipients of matter-energy from collapse and rebirth of the universe, are enough to ensure low-frequency relic GW. The existence of higher dimensions in itself, if the additional dimensions are small and compact, will have no capacity to lower the frequency limit values of relic GW, as predicted by Giovannini et al. in 1995.
Category: Relativity and Cosmology
[96] viXra:2601.0103 [pdf] submitted on 2026-01-22 10:15:17
Authors: Rolando Zucchini
Comments: 17 Pages.
Since ancient Greece, the possibility of defining the natural numbers has been considered, but, unlike what happened for geometry in Euclid's Elements, all efforts were in vain. After 2000 years, it was the Italian mathematician Giuseppe Peano who earned the historical merit of having provided a rigorous definition of the natural numbers and their properties. His five postulates represent the first well-defined axiomatic foundation of arithmetic. Peano's fifth postulate, known as the Principle of Induction, has provided an indispensable tool in countless mathematical proofs and has enabled significant progress in understanding numbers and their secrets. This paper contains numerous solved exercises on the application of the Induction Principle.
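A representative exercise of the kind the paper collects, proving $\sum_{k=1}^{n} k = n(n+1)/2$ by the Principle of Induction (this particular example is ours, chosen for illustration):

```latex
\textbf{Base case} ($n=1$): $\sum_{k=1}^{1} k = 1 = \frac{1\cdot 2}{2}$.

\textbf{Inductive step}: assume $\sum_{k=1}^{n} k = \frac{n(n+1)}{2}$. Then
\[
  \sum_{k=1}^{n+1} k \;=\; \frac{n(n+1)}{2} + (n+1)
  \;=\; \frac{(n+1)(n+2)}{2},
\]
so the formula holds for $n+1$ and, by Peano's fifth postulate, for all $n \in \mathbb{N}$.
```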
Category: Number Theory
[95] viXra:2601.0102 [pdf] replaced on 2026-01-29 12:50:49
Authors: Antonio León Sánchez
Comments: 8 Pages.
The relativistic contraction of distances in the direction of relative motion is used here to formally deduce a potentially infinite number of violations of the Second Law of the Reflection of Light, violations that are impossible according to the first principle of special relativity. From this impossible, and therefore false, contraction of distances, the falsity of time dilations and the falsity of phase differences in synchronizations are formally deduced. Thus, special relativity is an inconsistent theory whose inconsistency must be a consequence of one of its two fundamental principles, the second principle being the only one that can be false, since the first establishes the universality of physical laws, without which the observed consistent evolution of the known universe would be impossible.
Category: Relativity and Cosmology
[94] viXra:2601.0101 [pdf] submitted on 2026-01-22 21:26:54
Authors: Juan Moreno Borrallo
Comments: 19 Pages. (Note by viXra Admin: For the last time, please submit article written with AI assistance to ai.viXra.org!)
The arithmetic of the integers is governed by two fundamental operations, addition and multiplication, whose interaction lies at the core of many deep problems in number theory. While multiplication preserves prime factorization in a rigid and conservative manner, addition typically destroys multiplicative structure and generates new prime content. In this work, we develop a unified structural framework that explains this asymmetry through spectral and operator-theoretic principles. By embedding the integers into a Hilbert space, we show that multiplication acts as a diagonal, layer-preserving operator in the prime spectral basis, whereas addition acts as a non-local, mixing operator driven by carry propagation. This spectral incompatibility leads to an arithmetic uncertainty principle, forbidding simultaneous localization in additive and multiplicative bases. Building on this structure, we introduce additive innovation as a quantitative measure of the new prime information created by a sum. We prove that the only obstruction to innovation arises from smoothness and $S$-unit phenomena in the coprime core. Using classical results on smooth numbers, we show that additive innovation is typically large, yielding unconditional abc-type inequalities in density. Finally, we develop an information-theoretic perspective, showing that addition produces entropy across prime scales while multiplication remains information-preserving. These results provide a structural explanation for the sum-product phenomenon and reframe classical problems as manifestations of the intrinsic incompatibility between additive and multiplicative spectral structures.
Category: General Mathematics
[93] viXra:2601.0100 [pdf] submitted on 2026-01-22 21:27:55
Authors: Juan Moreno Borrallo
Comments: 26 Pages. (Note by viXra Admin: For the last time, please submit article written with AI assistance to ai.viXra.org!)
We propose a foundational route from elementary mathematical operations to the structural form of physical law. The guiding thesis is that multiplication is the primitive operation that generates geometric extension (e.g., area via bilinear composition), integration is the continuous accumulation of such local extensions into global quantities, and differentiation (or functional variation) is the dual operation that extracts local constraints from global accumulations. From these principles, we show how any consistent description of "physical reality" must be formulated in terms of local densities defined over a continuous geometric support, whose global content is obtained by integration and whose dynamics follows from variational (action) stationarity. Within this operational framework, quadratic field terms arise naturally as the simplest scalar invariants built from local degrees of freedom, while source couplings appear as bilinear products between generalized currents and the underlying deformation variables. Furthermore, we show that quantum entanglement is not a dynamical anomaly but a structural inevitability: additive accumulation acting on states represented in a multiplicative (spectral) basis generically produces global correlations that resist local factorization. This reframes Bell-type violations as a failure of structural independence rather than a signal of superluminal causal influence, thereby preserving relativistic causality at the level of dynamical propagation. Crucially, beyond the contractive modes commonly associated with forces and curvature, the same logic compels expansive degrees of freedom: an entropic sector characterized by an intensive-extensive product structure (temperature-like $\times$ entropy-like) contributing intrinsically to the global action.
This viewpoint yields a general blueprint for interpreting electromagnetic, gravitational, and entropic responses as projected modes of a common underlying field structure, and it clarifies why concrete realizations of such a blueprint, including quantum-elastic and gravito-entropic field models, arise as minimal, structurally stable completions rather than independent hypotheses.
Category: Mathematical Physics
[92] viXra:2601.0099 [pdf] submitted on 2026-01-22 21:28:49
Authors: Juan Moreno Borrallo
Comments: 75 Pages. (Note by viXra Admin: For the last time, please submit article written with AI assistance to ai.viXra.org!)
We present the Quantum-Elastic Geometry (QEG) theory, a unified framework wherein spacetime is modeled as a fundamental, physical substrate with quantum, elastic, and dissipative properties. The state of this medium is described by a single, symmetric rank-2 tensor field, $\mathcal{G}_{\mu\nu}$, whose dynamics are governed by a generally covariant action. Known physical interactions are shown to emerge as distinct, irreducible deformation modes of this unified field: gravity, electromagnetism, and a new field, denominated the "thermo-entropic field", that gives rise to irreversible thermodynamics. Furthermore, the fundamental constants of nature are shown to be uniquely determined and interrelated by the substrate's properties. We derive the fundamental constants of nature through two distinct yet convergent approaches: (i) from the physical postulates of QEG, assuming the $\mathcal{G}_{\mu\nu}$ tensor, its properties leading to dimensional collapse ($[M]\equiv[L]\equiv[T]$), and parsimonious physical principles (e.g., reciprocity, damped equipartition, self-consistency), we deduce specific functional forms for the constants; and (ii) independently, assuming only foundational geometric principles for the substrate (homogeneity, isotropy, covariance, Lorentz invariance) and imposing self-consistency, formalized via a minimal set of geometric normalization conditions consistent with the QEG framework, we derive the substrate's emergent structure and properties, obtaining precisely the same functional forms for the constants.
The outcome is a robust, convergent two-way deductive framework, in which fundamental constants are geometrically enforced, emerging as predictable consequences of a stable and symmetrically constrained geometry. Finally, we show how the theory predicts, among other results, a scale-dependent gravitational coupling derived from a geometric duality in self-energy, which offers a parameter-free resolution to key cosmological tensions, including the Hubble crisis. In summary, QEG provides a coherent and consistent origin for both fields and constants, unifying them as rigorously derived emergent properties of a single, dynamic spacetime substrate.
Category: Relativity and Cosmology
[91] viXra:2601.0098 [pdf] submitted on 2026-01-22 21:29:20
Authors: Juan Moreno Borrallo
Comments: 43 Pages. (Note by viXra Admin: For the last time, please submit article written with AI assistance to ai.viXra.org!)
We present a minimal nonlinear extension of Quantum-Elastic Geometry (QEG), in which a single symmetric deformation tensor $G_{\mu\nu}$ and its modal projections underpin the effective long-range sectors of gravity, electromagnetism, and thermo-entropic dynamics. The extension accounts for two additional empirical structures, finite-range interactions and hadronic-scale confinement, without introducing new fundamental fields beyond $G_{\mu\nu}$. Finite range emerges when selected projected modes acquire geometric masses set by the local curvature of the substrate self-interaction potential, $m_X^2 \equiv V_X''(0)$, yielding Yukawa/Proca-type propagation. In the genuinely nonlinear regime, quartic (and higher) terms in $V(G)$ can energetically favor filamentary minima; under suitable variational constraints, this leads to flux-tube configurations with approximately constant tension and an effective linear energy-separation scaling (confinement-like behavior). Crucially, the framework yields an endogenous classification of particle-like excitations: particles are finite-energy, localized eigenmodes or topologically stabilized defects of the elastic vacuum $G_{\mu\nu}$, carrying quantized action. Under finite-action boundary conditions and a compact order-parameter sector, the Standard Model taxonomy is reorganized into sectors of the physical configuration space: fermions correspond to nontrivial spinorial or holonomy sectors, bosons to topologically trivial transport modes, leptons to elementary globally extendable defects, quarks to fractional defect configurations obstructed from isolated finite-action completion, and hadrons to closed composites in which the obstruction classes cancel.
The same construction yields a natural interpretation of generations as discrete radial excitation levels ($k = 0, 1, 2, \ldots$) around a fixed defect topology, e.g., $k=0 \to e$, $k=1 \to \mu$, $k=2 \to \tau$, thereby relating mass hierarchies to the spectral structure of a single underlying defect rather than to distinct fundamental species.
Category: Nuclear and Atomic Physics
[90] viXra:2601.0097 [pdf] submitted on 2026-01-22 21:24:24
Authors: Abdelmajid Ben Hadj Salem
Comments: 7 Pages.
In this paper, assuming that the conjecture $c < R^2$ is true, we give a proof that the explicit abc conjecture of Alan Baker is true, and that it implies the abc conjecture. We propose a mathematical expression for the constant $K(\epsilon)$. Some numerical examples are provided.
Category: Number Theory
[89] viXra:2601.0096 [pdf] submitted on 2026-01-22 21:20:43
Authors: Mattia Furlin
Comments: 5 Pages.
In this short article, we will discuss a card game, henceforth named Solitaire modulo 3. After having described how it works, we will arrive, through a probabilistic calculation, at the probability of victory. In particular, we will use rook polynomials, which will allow us to finally obtain a closed form for calculating the probability of winning at Solitaire modulo 3. Finally, we will study the case where the number of cards in play is much greater than the number of constraints present in the game format. Under this assumption, the Solitaire modulo 3 mechanism becomes asymptotically equivalent to a binomial distribution.
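The game's rules are not reproduced in the abstract, but the rook-polynomial machinery it invokes is easy to exhibit on the full m × n board, where the number of placements of k non-attacking rooks has a well-known closed form (shown here only to fix ideas, not the paper's board):

```python
from math import comb, factorial

def rook_polynomial(m, n):
    """Coefficients [r_0, r_1, ...] of the rook polynomial of a full
    m x n board: r_k = C(m, k) * C(n, k) * k! counts the placements
    of k non-attacking rooks."""
    return [comb(m, k) * comb(n, k) * factorial(k) for k in range(min(m, n) + 1)]

print(rook_polynomial(2, 2))  # -> [1, 4, 2]: empty board, 4 singles, 2 diagonal pairs
```

In the regime the abstract describes, where the number of cards far exceeds the number of constraints, such exact counts smooth out and the win probability approaches a binomial law.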
Category: General Mathematics
[88] viXra:2601.0095 [pdf] submitted on 2026-01-22 21:17:59
Authors: Renato Vieira Dos Santos
Comments: 12 Pages. (Note by viXra Admin: For the last time, please submit article written with AI assistance to ai.viXra.org!)
This paper develops a stochastic dynamical model to investigate the psychological impact of exposure to contradictory information, a prevalent feature of modern media ecosystems. We formalize "contradictory stimulation" as stochastic noise in a model of emotional state dynamics. Our analysis reveals two key regimes: high-intensity contradiction drives individuals towards stable apathy, while specific parameter combinations produce bimodal polarization, where psychological states oscillate randomly between euphoria and lethargy. These results provide a mathematical basis for sociological phenomena like anomie and offer a novel mechanism for emergent polarization from a uniform information stream. The study establishes a theoretical framework for generating testable hypotheses about the effects of information chaos on political engagement and psychological well-being.
Category: Physics of Biology
[87] viXra:2601.0094 [pdf] replaced on 2026-03-10 06:36:27
Authors: T. Nakashima
Comments: 4 Pages.
The Riemann Hypothesis has been an unsolved conjecture for 164 years. It is the last of the conjectures left without proof in "Über die Anzahl der Primzahlen unter einer gegebenen Grösse" (B. Riemann). The statement is that the real part of the non-trivial zero points of the Riemann zeta function is 1/2. Famous and difficult, this conjecture has resisted the efforts of many mathematicians for many years. In this paper, I conjecture about the independence (difficulty of proof) of propositions equivalent to the Riemann Hypothesis. My position is to discuss the difficulty of proof purely as an intuitive argument.
Category: Number Theory
[86] viXra:2601.0093 [pdf] submitted on 2026-01-22 21:47:54
Authors: Renato Vieira Dos Santos
Comments: 6 Pages.
Does gravity care about electric charge? Precision tests of the weak equivalence principle achieve remarkable sensitivity but deliberately minimize electric charge on test masses, leaving this fundamental question experimentally open. We present a minimalist framework coupling electromagnetism to linearized gravity through conservation of a complex charge-mass current, predicting charge-dependent violations $\Delta a/g = \kappa(q/m)$. Remarkably, this prediction occupies unexplored experimental territory precisely because precision gravity tests avoid charge variation. We identify this as a significant gap and propose a modified torsion balance experiment where $q/m$ is treated as a controlled variable. Such an experiment could test whether gravitational acceleration depends on electric charge, probing physics in genuinely new parameter space. This work exemplifies how theoretical minimalism can reveal overlooked opportunities in fundamental physics.
Category: Classical Physics
[85] viXra:2601.0091 [pdf] submitted on 2026-01-21 03:41:50
Authors: Yefim Bakman
Comments: 6 pages
If a non-material energy source were to exist, it could solve two problems in physics at once by identifying a common cause underlying the existence of dark matter and dark energy. Moreover, it could solve a third problem related to ordinary gravity, which general relativity still cannot explain. In fact, such a non-material source of energy was described by Nikola Tesla, who called it a "primary substance." However, this insight was not embraced by the physics community and has only recently been described in a series of articles. This article reveals the nature of the gravitational field, allowing us to understand the existence of "pure gravity" without the participation of mass at astronomical and cosmological scales. This explains the phenomena of dark matter and dark energy.
Category: Relativity and Cosmology
[84] viXra:2601.0090 [pdf] submitted on 2026-01-22 00:04:28
Authors: Victor Wang
Comments: 19 Pages. (Note by viXra Admin: This paper is written by a HS Student and may fall outside the scholarly scope/norm) )
Through a high school student’s lens, we investigate the classical problem of relating a polynomial’s roots to its coefficients and demonstrate how to derive the solution to the cubic and quartic using radicals, as well as the trigonometric solution to the real-rooted cubic. In the afterword, we discuss the history of the problem, as well as the history of how this work came to be.
Category: Algebra
[83] viXra:2601.0089 [pdf] replaced on 2026-01-27 04:37:39
Authors: Ryan Hackbarth
Comments: 11 Pages. This update includes a more usable primary function, and uses it to forecast the nontrivial zeroes of the Riemann Zeta Function.
Here I present a derivation of an equation whose solution sets are the trivial and nontrivial zeros of the Riemann Zeta Function. I demonstrate how the trivial solutions are directly encoded by integer inputs and how these can be mapped by a symmetry to the positive odd integers. I extend this insight to encode the even integers, and map these to the negative odd integers, which provides an explicit connection between particular values of the Riemann Zeta Function that are of historical and ongoing research interest. I then extend this symmetry to the nontrivial zeroes, and demonstrate how producing this symmetry depends on the critical line. Finally, I note that the distribution of the nontrivial zeroes has a correspondence with the distribution of the trivial zeroes, and provide a first-order approximation of this correspondence.
Category: Number Theory
[82] viXra:2601.0088 [pdf] submitted on 2026-01-21 23:49:06
Authors: Ajay Kumar
Comments: 5 Pages. 1 figure. Licensed under CC BY-SA 4.0 (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
We present a conservative infrared extension of General Relativity in which late-time cosmic acceleration emerges from a curvature-regulated modification of gravitational time dilation. The framework introduces no additional propagating degrees of freedom and remains fully covariant at the action level. Exponential suppression ensures agreement with all laboratory, solar-system, and strong-field tests of gravity. We provide a detailed mathematical formulation, analyze background and perturbative dynamics, compare with existing observational constraints, and study the theory across solar, galactic, and cosmological curvature scales. The model reproduces ΛCDM behavior at late times while yielding distinct, testable predictions in ultra-low curvature environments.
Category: Relativity and Cosmology
[81] viXra:2601.0087 [pdf] submitted on 2026-01-21 23:18:21
Authors: Nestor E. Ramos
Comments: 27 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
Argentina's political and economic trajectory since 1930 represents not a series of random misfortunes or isolated failures, but rather a deterministic yet chaotic descent into a self-reinforcing social attractor characterized by persistent decay, as illuminated by the principles of chaos theory. This paper employs a meticulously constructed justice-weighted Composite Social Stability Index (SSI), derived from six fundamental socioeconomic indicators (inflation rates, GDP growth, presidential instability, poverty levels, unemployment rates, and the Gini index), to quantitatively demonstrate the chaotic nature of Argentina's systemic evolution. By weighting poverty and inequality more heavily, our SSI functions as a moral compass, rejecting the neoliberal fallacy that 'stability' without justice is desirable. Our analysis reveals key characteristics of this chaotic system: a high Hurst exponent indicating strong persistence and long-term memory in instability patterns; a positive maximum Lyapunov exponent confirming sensitive dependence on initial conditions, the hallmark of chaos; and a fractal dimension suggestive of a strange attractor, where the system's behavior is bounded yet unpredictable and non-repeating.
Category: Social Science
[80] viXra:2601.0086 [pdf] submitted on 2026-01-21 23:13:21
Authors: Renato Vieira Dos Santos
Comments: 10 Pages. DOI: https://doi.org/10.1016/j.chaos.2025.117554 (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
Political moderation, a key attractor in democratic systems, proves highly fragile under realistic information conditions. We develop a stochastic model of opinion dynamics to analyze how noise and differential susceptibility reshape the political spectrum. Extending Marvel et al.'s deterministic framework, we incorporate stochastic media influence $\zeta(t)$ and neuropolitically-grounded sensitivity differences ($\sigma_y > \sigma_x$). Analysis reveals that the moderate population, stable in deterministic models, undergoes catastrophic collapse under stochastic forcing. This occurs through an effective deradicalization asymmetry ($u_{B}^{\text{eff}} = u + \sigma_y^2/2 > u_{A}^{\text{eff}}$) that drives conservatives to extinction, eliminating the cross-cutting interactions that sustain moderates. The system exhibits a phase transition from multi-stable coexistence to liberal dominance, demonstrating how information flow architecture, independent of content, systematically dismantles the political center. Our findings reveal moderation as an emergent property highly vulnerable to stochastic perturbations in complex social systems.
Category: General Science and Philosophy
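The deradicalization shift u + σ_y²/2 quoted in the abstract above has the form of the standard Itô drift correction for multiplicative noise. A minimal sketch with a toy linear SDE dX = −uX dt + σX dW (hypothetical parameters of my choosing, not the paper's model) shows a typical trajectory decaying at the effective rate u + σ²/2 rather than u:

```python
import numpy as np

rng = np.random.default_rng(1)
u, sigma = 0.5, 1.0          # drift rate and noise strength (toy values)
T, n_steps, n_paths = 2.0, 2000, 20000
dt = T / n_steps

# Euler-Maruyama simulation of dX = -u*X dt + sigma*X dW, X(0) = 1.
X = np.ones(n_paths)
for _ in range(n_steps):
    X *= 1.0 - u * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

# The median (typical) path decays like exp(-(u + sigma^2/2) t):
rate = -np.median(np.log(np.abs(X))) / T
print(rate)  # close to u + sigma**2/2 = 1.0
```

The mean E[X_t] still decays at rate u; it is the typical path that feels the extra σ²/2, which is the kind of asymmetry the abstract exploits.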
[79] viXra:2601.0085 [pdf] submitted on 2026-01-21 23:21:49
Authors: Harish Chandra Rajpoot
Comments: 14 Pages. (Note by viXra Admin: Please don't name a theorem/formula/equation etc after the author's name)
In this paper, a theorem is formulated and proved that yields generalized closed-form expressions for the dihedral angle between any two arbitrary lateral faces of a regular n-gonal right pyramid. The dihedral angles are expressed in terms of the apex angle, defined as the angle between two adjacent lateral edges meeting at the apex. The proposed formulation establishes a direct analytical relationship between the edge geometry at the apex and the corresponding dihedral angles of the pyramid. Due to its generality, the theorem applies to all regular and uniform polyhedra whose vertex configuration coincides with that of a right pyramid, as well as to regular n-gonal right prisms with an arbitrary number of sides. The resulting formulas are useful for geometric modeling, construction of physical models, and the development of computational algorithms for the analysis of polyhedral structures and equally inclined sets of concurrent vectors in three-dimensional space.
Category: Geometry
[78] viXra:2601.0084 [pdf] submitted on 2026-01-22 00:24:56
Authors: Taeho Jo
Comments: 6 Pages. (Note by viXra Admin: Further repetition will not be accepted and please submit article written with AI assistance to ai.viXra.org)
In this research, we propose and apply graph-based AHC variants to word clustering. The initial AHC version, which clusters graphs, was previously proposed as an approach to word clustering. Here we consider three AHC variants: one where the data clustering proceeds in the bottom-up direction with a similarity threshold, one which allows merging more than two clusters at a time, and one where clusters are merged based on the radius. We modify these three AHC variants, as well as the initial AHC version, into graph-based versions. The goal of this research is to improve clustering performance through these modifications.
Category: Artificial Intelligence
[77] viXra:2601.0083 [pdf] submitted on 2026-01-22 00:43:42
Authors: Andrew W. Beckwith
Comments: 6 Pages. [a] part of a book Pathways to Quantum cosmology [to be released] in late March 2026
We consider whether a generalized HUP, set greater than or equal to Planck's constant divided by the square of a scale factor, together with an inflation field, yields the result that Delta E times Delta t is embedded in a five-dimensional field within a deterministic structure. Our proof ends with Delta t at the Planck time yielding an enormous potential energy. If that potential energy is induced by a repeating-universe structure, we obtain for free a value of Delta E Delta t that is almost infinite, which supports a prior conclusion.
Category: Quantum Gravity and String Theory
[76] viXra:2601.0082 [pdf] submitted on 2026-01-20 21:56:27
Authors: A. J. Emili
Comments: 2 Pages. (Note by viXra Admin: Please cite all listed scientific reference and submit article written with AI assistance to ai.viXra.org)
The persistent discrepancy between local measurements of the Hubble constant (H_0 ~ 73 km/s/Mpc) and values derived from the Cosmic Microwave Background (H_0 ~ 67 km/s/Mpc) suggests a fundamental incompleteness in the LambdaCDM model. We propose a solution based on dissipative wave mechanics within a viscous continuum. By introducing a non-vanishing kinematic viscosity nu to the vacuum substrate, we demonstrate that cosmological redshift is a non-linear function of distance, induced by Taylor-Couette-like dissipation rather than metric expansion. Numerical fitting against 2026 data from Cosmic Chronometers and JWST-JADES reveals that a single viscous parameter resolves the tension. Furthermore, we derive a falsifiable prediction: a Vacuum Dispersion (CVD), implying that redshift is frequency-dependent (dz/dω > 0). This effect is testable with current lensed supernova observations. Keywords: Hubble Tension, Vacuum Viscosity, Dissipative Cosmology, Dark Energy, Alternative, Chromatic Dispersion, Hydrodynamic Spacetime
Category: Relativity and Cosmology
[75] viXra:2601.0081 [pdf] submitted on 2026-01-20 13:12:38
Authors: Cornel Badea
Comments: 7 Pages.
Recent advancements in Hierarchical Reasoning Models (HRM) have demonstrated strong capabilities in complex algorithmic and abstract reasoning tasks by mimicking multi-timescale cognitive processes. In this work, we extend this architecture to medical image captioning, introducing specific ImageHRM variants. Furthermore, we explore a radical simplification of this paradigm: the Tiny Recursive Model (TRM). Challenging the necessity of complex dual-loop biological hierarchies, TRM employs a single "tiny" network (7M parameters) that recurses deeply to achieve superior generalization. We introduce ImageTRM, which adapts this "Less is More" philosophy to vision-language tasks. Our experiments on ROCOv2 show that while the Triple-Loop FuseLIP ImageHRM achieves state-of-the-art results, the tiny ImageTRM with a Swin backbone surprisingly outperforms it, demonstrating that deep recursive reasoning with high-quality visual features can surpass larger, more complex architectures.
Category: Artificial Intelligence
[74] viXra:2601.0080 [pdf] replaced on 2026-02-17 00:55:16
Authors: Bin Wang
Comments: 15 Pages.
We show that on a complex projective manifold $X$, for $\mathbb{G}=\mathbb{R}$ or $\mathbb{Q}$, a class in $H^{p,p}(X;\mathbb{Z})\otimes\mathbb{G}$ is represented by a convergent infinite series of integration currents over algebraic cycles with real coefficients. This implies that a Hodge class is represented by an algebraic cycle with rational coefficients.
Category: Geometry
[73] viXra:2601.0079 [pdf] submitted on 2026-01-20 20:04:22
Authors: Andrzej Gecow, Laszlo Barna Iantovics
Comments: 17 Pages.
Half-chaos was detected using simulations of complex, autonomous, fixed-size Kauffman networks, but it also occurs in open, growing networks. Mathematicians dealing with deterministic chaos expect a description of it in mathematical terms. This article (read at least to ch. 3.5) can significantly help to create such a description. Kauffman, studying the statistical properties of a set of completely random systems, identifies two states of a system: either it is ordered, where a small disturbance practically always dies out, or it is chaotic, where a small disturbance practically always causes significant damage (a change in functioning). In between there is a narrow (in the system parameters) phase transition. The assumption of a small attractor causes the system to cease to be fully random, and a third state is revealed: a half-chaotic one, in which small and large damage have a similar share even though the system parameters are chaotic. If such a system were fully random, it would almost always exhibit strong chaos. A mechanism for increasing system stability following a permanent disturbance has been identified; it involves limiting secondary initiations by shortening the attractor. Keeping in a half-chaotic system only those changes that give small damage does not lead out of half-chaos. In the distribution of damage size there is a large gap between small and large damage, which naturally defines 'small changes'. Keeping even one change that causes large damage leads to a practically irreversible transition to ordinary chaos. The unique stability of half-chaos extends the range of allowed parameters for models of human-created and living systems, previously limited to the edge of chaos by the famous Kauffman hypothesis. Half-chaos explains the essence of the life process.
Category: Combinatorics and Graph Theory
[72] viXra:2601.0078 [pdf] replaced on 2026-01-26 14:42:11
Authors: Ervin Goldfain
Comments: 31 Pages.
We develop a geometric framework in which classical gravity emerges from primordial spacetime having continuous effective dimensions. Spacetime is modeled as an evolving multifractal structure, analogous to the construction of Cantor Dust (CD), where the Hausdorff measure replaces ordinary volume. Two fundamental findings are uncovered, namely: 1) CD is directly tied to Dark Matter phenomenology; 2) the Einstein-Hilbert formulation of General Relativity emerges as an effective action of CD.
Category: Relativity and Cosmology
[71] viXra:2601.0077 [pdf] submitted on 2026-01-20 22:38:23
Authors: Mahdi Rezapour
Comments: 8 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
This study examines user engagement with online video content using a multi-task learning approach. We combine viewing histories, basic user attributes, and content datasets from several public sources to predict both the proportion of a video watched and whether a user skips a video. The two tasks are learned jointly, using a shared representation with separate outputs for regression and classification. Several common multi-task architectures are evaluated and compared under the same experimental setup, including Multi-Gate Mixture-of-Experts (MMoE), Progressive Layered Extraction (PLE), and cross-stitch networks. Results on a held-out test set show that watch ratio can be predicted with reasonable accuracy, while skip prediction remains challenging and only marginally better than random guessing. Differences between model architectures are small, suggesting that data size and label definition might have a stronger influence on performance than model choice. These findings highlight the difficulty of modeling discrete engagement outcomes from noisy behavioral data and point to the importance of careful label construction in future work. In particular, this study highlights the challenge of skip prediction, likely because the skip threshold is set subjectively.
Category: Artificial Intelligence
[70] viXra:2601.0076 [pdf] submitted on 2026-01-19 21:07:20
Authors: Thuy Thu Nguyen
Comments: 45 Pages. (Note by viXra Admin: Author name is required in the article; please submit article written with AI assistance to ai.viXra.org)
The reliability and performance of machine learning (ML) systems in production depend critically on data engineering decisions made throughout the pipeline lifecycle. This comprehensive technical review synthesizes findings from 434 peer-reviewed publications spanning 2018-2026 to quantify how upstream data collection, mid-stream preprocessing and feature engineering, and downstream versioning and monitoring decisions impact ML outcomes. We examine production systems across cybersecurity, healthcare, finance, and cloud-native platforms, analyzing technical frameworks including Apache Kafka, Kubeflow, MLflow, and emerging feature stores. Our analysis reveals that data quality issues account for 60-80% of ML system failures in production, with data engineering decisions influencing model accuracy by up to 40 percentage points. We identify critical decision points across the pipeline, quantify their impacts through empirical evidence, and provide actionable frameworks for practitioners. Key findings include: (1) streaming architectures reduce latency by 10-100x while maintaining accuracy within 2-5% of batch systems; (2) automated data validation catches 70-90% of quality issues before model training; (3) feature stores reduce feature engineering time by 50-70% while improving consistency; and (4) comprehensive lineage tracking enables 3-5x faster debugging of production failures. This review establishes data-centric AI as essential for reliable ML systems and identifies critical gaps in cost-benefit analysis, cross-domain generalization, and standardized impact metrics.
Category: Artificial Intelligence
[69] viXra:2601.0075 [pdf] submitted on 2026-01-19 21:02:01
Authors: Vasil Korniienko
Comments: 8 Pages.
The origin of these threats, as well as methods of detecting such objects, is unknown to science. But we experimentally confirmed A. Einstein's assertion, according to which there is nothing in the world except energies, whose interaction forms a quantum electromagnetic field (QEF) in matter. Therefore, loads on matter cause perturbation of its QEF in the form of waves of quantum electromagnetic energy (C-radiation). In the spectrum of these energies, in the summer of 2003 we observed an object of dark energies in the form of an Energy Information Field (EIF) that occupied the Earth and the Sun. On Earth, the EIF sucks quantum energies from technology and sends them to the EIF occupying the Sun, which enhances the suction of solar C-radiation from it, and with it the deep solar heat, causing massive forest fires and accelerating global climate warming. The results of these studies allowed us in May 2025 to discover a large planet of dark matter near the Sun, which generated a powerful stream of cosmic cold, and whose gravity created giant cracks in the Earth, foreshadowing its destruction. However, we have tested a spiritual practice that allows us to eliminate such threats from space. A method of neutralizing the effect of the EIF on accelerating global climate warming is also proposed.
Category: Astrophysics
[68] viXra:2601.0074 [pdf] submitted on 2026-01-19 20:59:02
Authors: Holger Döring
Comments: 17 Pages.
Discussed is the possibility of overlapping several temporal structures of events separated by a time difference on a timelike worldline, or on bundles of familiar worldlines, through a form of resonance process which identifies the events on an eigenvalue scale with one another. This overlapping effect of different timelike incidents on the same worldline or world-bundle would cause a form of "time travel" with several consequences, which are discussed. It may be that by this process a form of information transport can occur over timelike distance intervals on the same global timelike worldline, through a sort of identification folding or approximation of eigenvalue alignments of incidents, without energy transport. Mathematically, the description uses a Super-Schroedinger equation whose solutions are Schroedinger equations for single and multiple resonance states.
Category: Quantum Gravity and String Theory
[67] viXra:2601.0073 [pdf] submitted on 2026-01-19 20:51:04
Authors: Viktor Victorovich Oleksenko
Comments: 9 Pages. In Russian
This paper presents the final decoding of the physical essence of electric charge, current, and electromagnetic vacuum constants. Based on the "Nolekson Gas Model" and the physical number πvphys introduced by the author in 2010 [1], the redundancy of SI dimensions is proved. It is established that the foundations, including the term "Nolekson", the derivation of πvphys, and the mass calculation of the Nolekson particle, originated in the 2010 work [1]. A key step is the transition to the kinematic dimension of mass as area [L^2] [3]. It is proven that electrodynamics is a special case of quantum gas dynamics, where charge corresponds to volumetric flow [L^3/T] and current to volumetric acceleration or mechanical force [L^3/T^2].
Category: Quantum Physics
[66] viXra:2601.0072 [pdf] submitted on 2026-01-17 22:21:06
Authors: Cesar Roberto Arellanes Gonzalez
Comments: 9 Pages.
This paper examines the hypothesis that measurement events function as generative operations rather than passive observational processes in the formation of observable states. Preliminary theoretical analysis suggests measurement interactions may constitute the fundamental mechanism by which potential states transition to actualized configurations across quantum and relativistic regimes. Initial exploration indicates similar generative dynamics may operate in information processing systems, thermodynamic state transitions, chemical reaction pathways, neural signal propagation, developmental gene expression, evolutionary selection events, market transaction execution, material phase boundaries, computational proof verification, and distributed consensus protocols. The commonality appears to lie in the discrete, event-based character of state actualization rather than continuous revelation of pre-existing conditions. The present hypothesis is intended as a unifying statement regarding the ontological role of discrete interaction events in state realization, independent of domain-specific implementations. This work presents the foundational hypothesis without detailed mathematical formalism. The author proposes that action, understood operationally through measurement interaction, serves as a cross-domain generative principle. Specific mechanistic treatments and quantitative predictions will be addressed in subsequent publications.
Category: Quantum Physics
[65] viXra:2601.0071 [pdf] submitted on 2026-01-17 19:36:30
Authors: Kuan Peng
Comments: 21 Pages.
Faraday’s law is empirically derived and, as such, may be subject to limitations. Notably, it appears to violate the law of conservation of energy in certain contexts. To establish a more robust formulation, it is necessary to derive the law from first principles. In this article, we theoretically derive Faraday’s law using only Coulomb’s law and special relativity. We present the first stage of this derivation: the construction of the 'Progressing Electric Field Model.' This model determines the curl of the electric field produced by moving charges and calculates the electric potential induced in a wire loop within that field.
Category: Classical Physics
[64] viXra:2601.0070 [pdf] submitted on 2026-01-16 04:56:38
Authors: Chan Bock Lee
Comments: 11 Pages.
The Ampere-Maxwell equation can predict the speed of light by assuming light is a traveling wave. However, it cannot predict such characteristics as light generation, motion in a specific direction as a particle, or the dependence of wavelength on energy. It therefore needs to be noted that the Ampere-Maxwell equation is very limited in predicting the characteristics of light. While a physical law should apply in all inertial frames, covariance, that is, the requirement that the mathematical form of the equation describing the law be the same in all inertial frames, is not essential for a physical law. Therefore, a Lorentz transformation that yields both the covariance of the Ampere-Maxwell equation for light and a constant velocity of light in all inertial frames is not an essential requirement. Considering light as a discrete particle with internal wave characteristics, analysis of the double-slit experiment indicates that the travel direction of each photon after the slit would be decided at the slit by the interaction of the incident photon with the slit, including its materials and geometry, resulting in the interference pattern at the screen. There is no direct evidence that superposition of the photon occurs in the slit and that interference of the photon at the screen produces the interference pattern. If there is no superposition of the photon, quantum mechanics and photon entanglement, which are related to photon superposition, need to be updated.
Category: Quantum Physics
[63] viXra:2601.0068 [pdf] submitted on 2026-01-16 21:32:20
Authors: Ryan Hackbarth
Comments: 5 Pages. (Note by viXra Admin: Please cite and list scientific references!)
Here I present an equation for the Zeros of the Riemann Zeta Function which connects the distribution of the trivial zeroes with integer inputs to the distribution of the nontrivial zeroes. I demonstrate that this relationship explicitly depends on the critical line where a = ½. I do so in plain language and with replicable calculations, as when I try to write like a mathematician it comes across as inauthentic and bad. Finally, I provide an appendix of calculated solutions.
Category: Number Theory
[62] viXra:2601.0066 [pdf] submitted on 2026-01-15 06:55:09
Authors: J. Kuzmanis
Comments: 9 Pages.
A mathematically simple odd semiprime factorization method is presented.
Category: Number Theory
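The abstract above does not state its method; for context, one classical and mathematically simple route for factoring an odd semiprime n = pq is Fermat's difference-of-squares method (shown here purely as a generic illustration, not the paper's algorithm):

```python
import math

def fermat_factor(n):
    """Fermat factorization for odd composite n:
    search for a, b with n = a^2 - b^2 = (a - b)(a + b)."""
    a = math.isqrt(n)
    if a * a < n:
        a += 1  # start at ceil(sqrt(n))
    while True:
        b2 = a * a - n
        b = math.isqrt(b2)
        if b * b == b2:          # b2 is a perfect square
            return a - b, a + b  # the two factors
        a += 1

print(fermat_factor(5959))  # (59, 101)
```

The method is fast when the two prime factors are close together, since the search starts near sqrt(n).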
[61] viXra:2601.0065 [pdf] replaced on 2026-02-24 22:06:15
Authors: L. Martino, L. Scaffidi, S. Mangano
Comments: 30 Pages.
Likelihood-approximation methods and contrastive learning (CL) are two prominent approaches for inference in models with unknown partition function. In this work, we provide a detailed comparison between the likelihood approximation by Geyer's approach (GA) and CL. Rather than increasing the complexity of Geyer's method to enable comparison, as proposed in [1], we adopt the opposite strategy by simplifying CL. We introduce a class of IS-within-CL schemes that estimate the partition function via importance sampling (IS) and reduce the optimization problem to the original parameter space. This perspective motivates the development of novel variants, whose theoretical properties are analyzed and empirically compared in a replicable experimental study. The described IS-within-CL schemes yield an entire approximation of the partition function, so enabling a possible efficient Bayesian inference. An optimal independent proposal density for IS-within-CL methods and the GA is also introduced. Overall, this work contributes to a clearer unification of likelihood-approximation and CL approaches, offering both theoretical understanding and practical tools for inference in energy-based and non-normalized models. Related MATLAB and R codes are also made freely available to help the reproducibility of the results.
Category: Statistics
[60] viXra:2601.0064 [pdf] submitted on 2026-01-16 03:07:25
Authors: Jaba Tkemaladze
Comments: 18 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
The centrosome, a key microtubule-organizing center, has long been implicated in fundamental cellular processes such as division, polarity, and ciliogenesis. Observations linking centrosomal aberrations to specific cellular states, including senescence and aging, raise a pivotal question: is the centrosome merely a structural casualty of these states, or does it actively encode and transmit information dictating cellular phenotype? This article explores the conceptual framework, methodology, and profound implications of centrosome transplantation, an experimental paradigm designed to answer this question by directly testing causality. We review the historical context of organelle transplantation, detail a comprehensive technical protocol encompassing centrosome isolation, microinjection, and phenotypic analysis, and discuss the significant challenges and alternative approaches. A meta-analysis of pioneering and preliminary data highlights the potential of this method to demonstrate the transfer of age-related traits. We argue that a successful transplantation experiment would constitute a revolution in cell biology, providing definitive proof of non-genetic, organelle-based inheritance of cellular age. This would position the centriole as a strategic custodian of cellular time and open transformative therapeutic avenues focused on rejuvenating the centrosomal machinery in stem cells, thereby offering a novel paradigm for intervening in the aging process.
Category: General Science and Philosophy
[59] viXra:2601.0063 [pdf] submitted on 2026-01-16 02:59:56
Authors: Erkan Gürkan
Comments: 6 Pages.
This study positions the ellipse not merely as a static definition, but as the embodiment of a dialogue between perception and mathematics, a living expression of balance. Through a poetic-philosophical narrative, the foci (F1 and F2) are revealed as symbols of complementary opposites, approaching and separating in a continuous act of creation. The research demonstrates that the ellipse functions not only as a static locus of points but as a self-regulating, dynamic structure governed by an "Internal Law of Balance" and a "Four-Quarter Mathematical Repetition Program." This structure manifests as a continuous unit value exchange between the axes, analytically detailing how the ellipse is cyclically regenerated across four symmetrical quarters. This approach expands the current understanding of the ellipse, positioning it not merely as a defined curve, but as a structure that reveals an intrinsic and continuous mathematical process that necessitates radical revisions in the field of geometry.
Category: Geometry
[58] viXra:2601.0062 [pdf] submitted on 2026-01-14 21:44:00
Authors: Victor Victorovich Oleksenko
Comments: 4 Pages. In Russian
This work presents a new physical paradigm based on the recognition of the substance 'Nolekson'— an inert gas occupying the zeroth position in the zeroth period of D.I. Mendeleev's periodic table of chemical elements. It is mathematically proven that physical constants are derivatives of the geometrical parameters of this medium. The concept of a 'physical π' (πvphys) is introduced, which accounts for the discreteness of the quantum vacuum. A new dimensionality for particle mass is proposed — as a cross-sectional area (L²). Previously, values for the gravitational constant (G) and vacuum energy density consistent with experimental data were derived.
Category: Astrophysics
[57] viXra:2601.0061 [pdf] submitted on 2026-01-14 21:42:13
Authors: Yvan-Claude Raverdy
Comments: 3 Pages.
Recent observations made by the James Webb Space Telescope suggest a reformulation of what is called the "Big Bang." Here, we offer a detailed interpretation that incorporates the hypothesis of the universe's rebound as envisioned by Roger Penrose. This hypothesis presents three major advantages: the conservation of energy, the elimination of infinities, and the non-necessity of the inflation hypothesis. We provide numerical data to support this new conception.
Category: Relativity and Cosmology
[56] viXra:2601.0060 [pdf] submitted on 2026-01-14 12:47:26
Authors: Raul Fattore
Comments: 32 Pages.
A new atomic model is introduced, based on the electron morphology theory derived from extensive experimental research initiated by Compton and further refined by Bostick. This model, which is validated by experimental results, presents a finite-sized atom with defined dimensions and energy, in contrast with the traditional "point particle" concept of infinite energy. The proposed atomic model accounts for all currently known subatomic particles and predicts the existence of potential new ones based on well-established electrodynamic laws. This atomic model was developed without invoking the randomness and non-causality of quantum theory, which cannot adequately explain the physical world. The model provides robust explanations for various physical properties of elements and particles, and derives discrete energy levels from the finite size of a real atom rather than the "magical" energy jumps of quantum models. It further demonstrates the origins of discrete energies, as well as of Planck's and Rydberg's constants. Experimental validation confirms that the total energy equation accurately predicts known atomic spectral lines and forecasts new ones yet to be observed. The derivation of a real-valued atomic wave function challenges Schrödinger's imaginary wave function, asserting that a finite-sized particle in the real physical world must possess a real-valued wave function rather than an imaginary one. The proposed modern atomic model offers a superior framework for understanding the physical properties of particles and elements, surpassing other models by providing true physical insight supported by experimental data and the universal electrodynamic laws.
Category: Nuclear and Atomic Physics
[55] viXra:2601.0059 [pdf] submitted on 2026-01-14 17:01:00
Authors: Dmitriy S. Tipikin
Comments: 3 Pages.
The famous Fibonacci sequence forms a simple cycle when the plus sign is replaced with a minus. A simple proof for arbitrary starting numbers is outlined.
Category: Number Theory
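The cycle claimed in the abstract above is easy to check: replacing the plus with a minus gives the recurrence t(k+1) = t(k) − t(k−1), which has period 6 for any starting pair (a, b): a, b, b−a, −a, −b, a−b, then a, b again. A minimal verification:

```python
def minus_fib(a, b, n):
    """Generate n terms of the sign-flipped Fibonacci recurrence
    t[k+1] = t[k] - t[k-1], starting from the pair (a, b)."""
    seq = [a, b]
    for _ in range(n - 2):
        seq.append(seq[-1] - seq[-2])
    return seq

seq = minus_fib(1, 1, 14)
# The sequence repeats with period 6 regardless of the starting pair.
assert seq[:6] == seq[6:12]
print(seq[:7])  # [1, 1, 0, -1, -1, 0, 1]
```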
[54] viXra:2601.0058 [pdf] submitted on 2026-01-14 17:03:13
Authors: Jeremy Dunning-Davies
Comments: 11 Pages.
The whole idea of entropy has caused, and continues to cause, problems of real understanding for everyone, but especially for students. Here it is hoped to highlight some, though inevitably not all, of those problems and to provoke thought among interested parties towards producing clear and accurate solutions.
Category: Thermodynamics and Energy
[53] viXra:2601.0057 [pdf] replaced on 2026-01-23 20:36:40
Authors: Ervin Goldfain
Comments: 11 Pages.
We have recently conjectured that Dark Matter (DM) emerges from a statistically homogeneous and isotropic Cantor Dust (CD) mass distribution described by a singular multifractal measure [21-23]. The goal of this book is to show that leading DM paradigms (self-interacting, fuzzy, axion, and superfluid DM) emerge as effective descriptions of primordial CD. From this perspective, the multifractal representation of CD provides an ultraviolet completion of DM phenomenology, unifying galactic dynamics, lensing, and large-scale structure while remaining consistent with cluster-scale constraints and experimental observations.
Category: Relativity and Cosmology
[52] viXra:2601.0056 [pdf] replaced on 2026-04-04 00:24:39
Authors: Christopher C. Mbakwe
Comments: 32 Pages.
This paper presents a novel proof of the non-existence of odd perfect numbers using the framework of algebraic circuit theory and spectral graph theory. We construct a specialized resistive network, Γm(n), whose topology is uniquely determined by the divisor structure of an integer n. By embedding the arithmetic properties of the sum-of-divisors function σ(n) into the Kirchhoff Laplacian L(Γ), we demonstrate that the potential distribution of the network satisfies a discrete harmonic extension if and only if n satisfies specific divisor identities. We then generalize the result to odd k-perfect numbers for k > 1.
Category: Combinatorics and Graph Theory
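The abstract above does not specify the construction of Γm(n); as a rough, hypothetical illustration of tying divisor structure to a Kirchhoff Laplacian, one might build a graph on the divisors of n (the edge rule below is an assumption, not the paper's) and check the perfect-number condition σ(n) = 2n separately:

```python
import numpy as np

def divisors(n):
    """All positive divisors of n."""
    return [d for d in range(1, n + 1) if n % d == 0]

def divisor_laplacian(n):
    """Kirchhoff Laplacian L = D - A of a hypothetical divisor graph:
    nodes are the divisors of n, with an edge between d1 and d2 whenever
    one properly divides the other.  Illustrative only -- the paper's
    network Gamma_m(n) is not specified in the abstract."""
    divs = divisors(n)
    k = len(divs)
    A = np.zeros((k, k))
    for i in range(k):
        for j in range(i + 1, k):
            if divs[j] % divs[i] == 0:
                A[i, j] = A[j, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

sigma = lambda n: sum(divisors(n))
# 6 and 28 are perfect: sigma(n) == 2n.
assert sigma(6) == 12 and sigma(28) == 56
L = divisor_laplacian(28)
# Row sums of any Kirchhoff Laplacian vanish (it annihilates constants).
assert np.allclose(L.sum(axis=1), 0)
```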
[51] viXra:2601.0054 [pdf] submitted on 2026-01-13 20:24:01
Authors: Michael Gunning
Comments: 7 Pages.
In this paper I put forward some ideas for a physical explanation of the origins of the Lorentz factor, γ, and how it explains relativistic time dilation, length contraction, and inertial (mass) increase.
Category: Relativity and Cosmology
[50] viXra:2601.0053 [pdf] submitted on 2026-01-13 21:20:39
Authors: Taeho Jo
Comments: 6 Pages.
In this research, we propose table-based KNN variants as an approach to word categorization. The initial KNN version, which receives a table as its input data, was previously proposed as a tool for this task. We consider three KNN variants: one where the selected nearest neighbors are discriminated by their similarities with a novice example, one where the attributes are discriminated by their correlations with the target outputs, and one where the training examples are discriminated by their credits. We modify the three KNN variants as well as the initial version of the KNN algorithm. The goal of this research is to improve the classification performance by modifying the KNN variants in this way.
Category: Artificial Intelligence
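A minimal sketch of the first variant named in the abstract above, under the assumption that "discriminated by their similarities" means similarity-weighted voting of the nearest neighbors (the paper may define it differently):

```python
import numpy as np

def knn_weighted(X_train, y_train, x, k=3):
    """Assumed form of the first KNN variant: the k nearest neighbors vote
    with weights given by their cosine similarity to the novice example x,
    rather than voting uniformly."""
    sims = X_train @ x / (np.linalg.norm(X_train, axis=1) * np.linalg.norm(x) + 1e-12)
    top = np.argsort(-sims)[:k]
    votes = {}
    for i in top:
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + sims[i]
    return max(votes, key=votes.get)

# Toy word-vector table: two "sports" rows, two "politics" rows.
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
y = ["sports", "sports", "politics", "politics"]
print(knn_weighted(X, y, np.array([0.8, 0.2])))  # sports
```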
[49] viXra:2601.0052 [pdf] submitted on 2026-01-13 22:50:17
Authors: Youssef Ayyad
Comments: 24 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
Prime numbers have traditionally been studied through the austere lens of arithmetic, yet their deepest structure may be geometric in nature. This work presents a paradigm shift: we construct a toroidal manifold T² where integers are mapped via the phase embedding Φ(n) = √n · e^{i√(nπ)}, transforming discrete divisibility into continuous phase orthogonality. The geometric dust, the area remainder R(n) = πn² − (1/2)n³ sin(2π/n), accumulates into a quantum Hamiltonian H = −Δ + V on T². We prove H is self-adjoint and its spectrum {λ_j} exhibits Gaussian Unitary Ensemble (GUE) statistics, as verified numerically. Crucially, we propose a geometric formulation of the Riemann Hypothesis: we show that, under the assumption of RH, the eigenvalues of H are real, bounded below by 1/4, and satisfy the spectral correspondence λ_j^(calibrated) = 1/4 + t_j², where 1/2 + it_j are the non-trivial zeros of ζ(s). Numerical verification shows agreement within 0.1% for the first 50 zeros. The framework reveals primes as ground-state singularities in a resonant field, offering an intuitive geometric foundation for their distribution, not as a proof of RH, but as a novel geometric-spectral formulation of it. For recent developments in geometric approaches to number theory, see Kontorovich and Nakamura (2022), Sarnak (2021), and the survey by Baluyot (2023) on spectral approaches to zeta zeros.
Category: Number Theory
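Two of the formulas in the abstract above can be exercised numerically: the phase embedding Φ(n) = √n·e^{i√(nπ)} and the calibrated correspondence λ_j = 1/4 + t_j², the latter using standard tabulated values for the imaginary parts of the first zeta zeros:

```python
import cmath
import math

def phi(n):
    """Phase embedding from the abstract: Phi(n) = sqrt(n) * exp(i*sqrt(n*pi))."""
    return math.sqrt(n) * cmath.exp(1j * math.sqrt(n * math.pi))

# |Phi(n)| = sqrt(n): the embedding places n on a spiral of radius sqrt(n).
assert abs(abs(phi(4)) - 2.0) < 1e-12

# Calibrated spectral correspondence lambda_j = 1/4 + t_j^2, with the first
# three tabulated imaginary parts of the nontrivial zeta zeros.
t = [14.134725, 21.022040, 25.010858]
lam = [0.25 + tj * tj for tj in t]
# Consistent with the stated lower bound of 1/4 on the eigenvalues.
assert all(l > 0.25 for l in lam)
```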
[48] viXra:2601.0050 [pdf] submitted on 2026-01-13 01:28:26
Authors: Debasis Biswas
Comments: 03 Pages.
In this paper the Pólya equivalent of the Riemann Hypothesis is proved from a complex-analytic expression for the Riemann Xi function.
Category: Number Theory
[47] viXra:2601.0049 [pdf] submitted on 2026-01-13 01:21:15
Authors: Constantin Sandu
Comments: 8 Pages.
In two earlier studies, we demonstrated that due to the enormous accelerations arising during the perpendicular reflection of a photon by a mirror, the photon's energy distribution behaves as a quadrupole, thereby generating a graviton (or gravitational wavelet) at the same frequency and direction as the reflected photon. The results showed that the energy Eg of a graviton is Eg = χ*ν^3 (where χ* is a universal constant) and not Eg = kν^1 as is hypothesized today in the quantum theories of gravity. For simplicity, in the earlier studies, only the contribution of the quadrupole component Q_xx at the reflection of a photon by a mirror was considered. Here, we extend the analysis to include all quadrupole components associated with perpendicular photon reflection in the case of a resonant cavity. By applying the standard Einstein quadrupole radiation formula, we demonstrate again, in a more accurate manner, that the energy of the emitted graviton scales as ν^3, revealing a direct coupling between electromagnetism and gravitation. This finding challenges the long-standing but unverified assumption that graviton energy depends linearly on frequency (ν^1). Our results establish that quantum theories of gravity must instead incorporate a cubic frequency dependence. The proposed framework provides a new bridge between general relativity and quantum approaches, suggesting that confined electromagnetic radiation can act as a direct source of high-frequency gravitational wavelets.
Keywords: gravitation, general relativity, quantum gravity, gravitational wavelets, Nordström-Einstein paradox, quadrupole radiation, graviton generation, resonant cavity.
Category: Quantum Gravity and String Theory
[46] viXra:2601.0048 [pdf] submitted on 2026-01-12 13:52:08
Authors: Sarah Makarem
Comments: 7 Pages.
PictoLens is a novel gaze-based interaction technique for exploring layered data visualizations through progressive disclosure. The system uses real-time gaze data to implement a point-and-click interaction model. Through intuitive gestures such as 'Gaze and Fixate' and 'Gaze and Lean In,' users can seamlessly interact with three representations of the data: an AI-generated pictograph, a scatter-plot visualization, and an annotated scatter-plot visualization. This hands-free and voice-free interaction technique addresses key challenges of traditional data exploration, such as long dwell times and the Midas Touch problem. PictoLens uses intuitive metaphors from everyday gestures: the gaze serves as a pointer, moving the visualization lens. Fixating the gaze at a point on the pictograph unlocks a finer data representation, while leaning forward reveals the most granular, detailed visualization layer with annotations. We present PictoLens' design and implementation to demonstrate its potential as an immersive analytics tool and interaction technique.
Category: Artificial Intelligence
[45] viXra:2601.0047 [pdf] submitted on 2026-01-12 14:20:58
Authors: Anatoly V. Belyakov
Comments: 4 Pages.
Based on Wheeler's geometrodynamics, a mechanism for the transition of an electron neutrino to a sterile one is proposed. The neutrino deficit (gallium anomaly) is explained, and its magnitude is calculated, consistent with the experimental values recently obtained in the Baksan Experiment on Sterile Transitions (BEST experiment).
Category: Astrophysics
[44] viXra:2601.0046 [pdf] submitted on 2026-01-12 20:50:18
Authors: Youssouf Ouédraogo
Comments: 21 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org) Creative Commons Attribution 4.0 International
This paper proposes a new structural approach to the study of consecutive prime numbers based on a quadratic relation linking three successive primes. A stability ratio is introduced and shown to converge asymptotically to unity using explicit bounds for the k-th prime number. This convergence induces a constraint on the local variation of prime gaps, leading to an asymptotic smoothness law for their relative fluctuations. The analysis is fully deterministic and avoids heuristic arguments based on asymptotic averages. Numerical validations using verified large prime datasets confirm the theoretical predictions and illustrate the progressive regularization of local gap variations as the prime index increases.
Category: Number Theory
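The abstract above does not state its stability ratio explicitly; one plausible quadratic relation on three successive primes with the claimed convergence to unity is r_k = p_{k+1}²/(p_k·p_{k+2}), sketched here purely as an illustration of shrinking relative gap fluctuations:

```python
def primes_upto(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for i in range(2, int(n**0.5) + 1):
        if sieve[i]:
            sieve[i*i::i] = [False] * len(sieve[i*i::i])
    return [i for i, is_p in enumerate(sieve) if is_p]

# Hypothetical instance of a quadratic relation on three successive primes
# (the paper's actual ratio is not given in the abstract).
p = primes_upto(200000)
r = [p[k+1]**2 / (p[k] * p[k+2]) for k in range(len(p) - 2)]

# Deviations from unity shrink as the prime index grows: the worst deviation
# among the last 100 triples is far smaller than among the first 100.
assert max(abs(x - 1) for x in r[-100:]) < max(abs(x - 1) for x in r[:100])
```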
[43] viXra:2601.0045 [pdf] submitted on 2026-01-12 02:01:23
Authors: Ekaghni Mukherjee
Comments: 21 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
Large language models have demonstrated remarkable capabilities across diverse natural language tasks, yet controlling their output characteristics remains challenging. We present HelpSteer Transformer, an attribute-conditioned language model architecture designed for training on the HelpSteer dataset. The model incorporates modern architectural innovations including Rotary Position Embeddings (RoPE), SwiGLU activation functions, and RMSNorm, enabling fine-grained control over five response attributes: helpfulness, correctness, coherence, complexity, and verbosity. The model contains approximately 60 million parameters across eight transformer layers and is designed for efficient scaling while maintaining high-quality text generation. An explicit attribute conditioning mechanism integrates user preferences directly into the generation process, enabling dynamic control of outputs without requiring separate fine-tuning for different attribute combinations. Architectural analysis and preliminary experiments indicate competitive performance relative to larger baseline models, while maintaining lower computational cost. This work highlights the effectiveness of architectural conditioning for controllable and efficient language model design.
Category: Artificial Intelligence
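Of the architectural components listed in the abstract above, RMSNorm is the simplest to illustrate; a minimal NumPy sketch of the standard formulation (not the paper's own implementation):

```python
import numpy as np

def rms_norm(x, gain, eps=1e-6):
    """RMSNorm as used in modern transformer blocks: rescale activations by
    their root mean square along the feature axis, then apply a learned
    per-feature gain.  Unlike LayerNorm there is no mean subtraction."""
    rms = np.sqrt(np.mean(x**2, axis=-1, keepdims=True) + eps)
    return gain * x / rms

x = np.array([3.0, -4.0, 12.0])
y = rms_norm(x, gain=np.ones(3))
# After RMSNorm (with unit gain) the output's root mean square is 1.
assert abs(np.sqrt(np.mean(y**2)) - 1.0) < 1e-3
```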
[42] viXra:2601.0044 [pdf] submitted on 2026-01-12 01:56:25
Authors: Bernard Lavenda
Comments: 27 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
We present a stochastic geometric framework for gravity, starting from the Gravitational Balance Equations (GBE) [Lavenda], which arise from varying the Einstein-Hilbert action with respect to sectoral scale factors in a doubly-warped spacetime. The extrinsic curvature is promoted to a random field, and a moment hierarchy is derived from the GBE. A geometric projector closure maps second moments to an effective fluctuation curvature, yielding closed mean equations without ad-hoc stress tensors. The fluctuation energy obeys a generalized Bochner formula, linking geometric dissipation to the mean extrinsic curvature and the intrinsic curvature of the leaves. This approach provides a self-consistent probabilistic description of gravitational fluctuations, revealing that classical general relativity is not a fundamental deterministic theory but rather the first-moment truncation of a more complete stochastic geometric description. In particular, the so-called "exact" vacuum solutions of Einstein's equations, such as Schwarzschild, are not exact; they are mean-field approximations that neglect the essential nonlinear term K_AB K^AB and all higher fluctuations. This neglect becomes manifest in regimes beyond the photon sphere (G < 3M), where the classical hierarchy of terms breaks down and the mean-field description yields unphysical results.
Category: Relativity and Cosmology
[41] viXra:2601.0043 [pdf] submitted on 2026-01-11 20:40:51
Authors: Mikhail Batanov-Gaukhman
Comments: 8 Pages.
This report is devoted to the main additions (i.e., modifications) to Einstein's general theory of relativity, which led to the creation of the "Hierarchical Cosmological Model" based on a fully geometrized vacuum physics from the standpoint of the Algebra of Signature [3-17]. This project is aimed at implementing the Clifford-Einstein-Wheeler program for the complete geometrization of physics.
Category: Classical Physics
[40] viXra:2601.0042 [pdf] submitted on 2026-01-12 00:39:19
Authors: Taeho Jo
Comments: 6 Pages.
In this research, we propose three KNN variants which consider the feature similarity, as approaches to word categorization. The initial version of the KNN algorithm which does so was previously proposed as a tool for the task. We consider three KNN variants: one which discriminates its selected nearest neighbors by their distances, another which discriminates the attributes by their correlations with the target outputs, and a third which discriminates the training examples by their credits. The feature similarity is applied to the three KNN variants as well as to the initial version. The classification performance is improved by applying the feature similarity to the KNN variants, yielding the improved KNN versions.
Category: Artificial Intelligence
[39] viXra:2601.0041 [pdf] submitted on 2026-01-10 17:34:38
Authors: Erik G. Bergren
Comments: 5 Pages. (Note by viXra Admin: Author name and an abstract are required in the article; please cite and list scientific references)
An explanation of the relativistic space-time scales defined by the IAU, called "TCB", "TCG", "TT", and "TDB", and the resulting effects on reported times, velocities, positions, and masses in those measurement frames, for example in the JPL Ephemerides. The constants of those measurement systems are calculated.
Category: Relativity and Cosmology
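The rates relating the time scales named in the abstract above are fixed by IAU defining constants: d(TT)/d(TCG) = 1 − L_G and d(TDB)/d(TCB) = 1 − L_B. A small sketch of the drift they imply:

```python
# IAU defining rate constants for the relativistic time scales.
L_G = 6.969290134e-10   # TT vs TCG, IAU 2000 Resolution B1.9
L_B = 1.550519768e-8    # TDB vs TCB, IAU 2006 Resolution B3

def tt_minus_tcg_drift(seconds):
    """Accumulated amount by which TT falls behind TCG over an interval
    of the given length in seconds."""
    return L_G * seconds

# Over one Julian year (365.25 days) TT falls about 22 milliseconds behind TCG.
year = 365.25 * 86400.0
drift = tt_minus_tcg_drift(year)
assert 2.1e-2 < drift < 2.3e-2
```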
[38] viXra:2601.0040 [pdf] submitted on 2026-01-10 22:21:25
Authors: Jaba Tkemaladze
Comments: 28 Pages.
Consciousness is fundamentally a process of selection, a continuous "collapse" from a manifold of potential states into a singular, coherent narrative. This article introduces the Ze formalism, a theoretical framework that models this process through a cognitive localization parameter, Γ_Ze. We posit that the critical distinction between wakefulness and sleep is not the presence of consciousness, but the suspension of this localization mechanism. During wakefulness (Γ_Ze ≫ 1), the cognitive system enforces rapid, frequent collapse, yielding a stable, logical stream of thought. Sleep (Γ_Ze → 0), conversely, is a physiologically controlled state of suspended localization, where the brain acts as a "quantum eraser" for cognitive "which-path" information. This allows for the maintenance of coherent superpositions of memory and meaning, with dream phenomenology arising from the resulting interference patterns. The model provides a unifying lens for altered states: it frames psychedelics as conscious Γ_Ze reduction, general anesthesia as its artificial nullification, and psychiatric conditions like schizophrenia as its pathological dysregulation. We argue that sleep's primary function is the offline recalibration of cognitive probability amplitudes c_i, facilitating memory integration, emotional regulation, and creative insight. The Ze formalism thus redefines sleep from a passive state of rest to an active, essential operation for maintaining cognitive flexibility and the integrity of waking consciousness.
Category: Quantum Physics
[37] viXra:2601.0039 [pdf] submitted on 2026-01-11 01:30:39
Authors: Ryan Hackbarth
Comments: 2 Pages.
In this paper, I present a formula for the zeroes of the Riemann Zeta Function and highlight their dependence on a rational integer ratio. I connect these ratios with a hyperbola reminiscent of Pell's equation which approximates pi, and provide a table of calculated ratios and their corresponding zeros. Finally, I demonstrate the requirement of the critical line at ½ in producing these integer approximations.
Category: Number Theory
[36] viXra:2601.0038 [pdf] submitted on 2026-01-10 02:04:57
Authors: Friedrich Sösemann
Comments: 5 Pages. In German
From the minimal ontology of relational hierarchies, information, knowledge, and intelligence, as well as their measures, are derived. The following conclusions are drawn:
1. Identical perception of subjects is not necessary for the truth of knowledge.
2. Abstraction can lead to subjective randomness and isolated elements of knowledge.
3. Knowledge networks are more effective, and therefore more intelligent, than sets of knowledge.
Category: Artificial Intelligence
[35] viXra:2601.0037 [pdf] submitted on 2026-01-09 11:36:26
Authors: Andreas Ball
Comments: 14 Pages.
In this report, very exact formulas for the proton radius and for the gravitational constant are presented, in which the fine-tuning term of the Hans de Vries formula and also the Julian Schwinger term are applied. Some of the formulas are constructed quite simply, and their connections with one another are presented. Many of the resulting values are very accurate relative to their tolerance ranges (i.e., exact in this context) and therefore lie astonishingly close together.
Category: Mathematical Physics
[34] viXra:2601.0036 [pdf] submitted on 2026-01-09 00:41:53
Authors: Ajay Sharma
Comments: 23 Pages. (Note by viXra Admin: Further repetition may not be accepted)
Newton’s third law is examined within the Newtonian framework under realistic interaction conditions, extending its applicability to real-world systems relevant to contemporary theoretical and experimental investigations. The law asserts the equality and simultaneity of action-reaction force pairs. Newton primarily applied the law qualitatively in the Principia, illustrating it through three examples involving macroscopic interactions. Simple rebound experiments show that spherical bodies can retrace their original line of fall and rebound to comparable heights under suitable conditions, whereas asymmetrical or flat bodies exhibit reduced rebound heights and oblique rebound trajectories. The original formulation neglects several interaction-dependent factors, including material properties of bodies, rotation, spin, orientation, contact geometry, and deformation during interaction. Consequently, the law is treated as independent of these factors and is therefore held universally. In horizontal motion the characteristics of the surface are also significant. Motivated by the above qualitative experimental trends and supported by historical and conceptual analysis, a generalized form of Newton’s third law is proposed in which the reaction force is modified or extended by dimensionless coefficients accounting for shape, composition, target surface, and other interaction parameters, and expressed as
Reaction (F_BA) = − [K_shape × K_composition × K_target × K_other] Action (F_AB)
The generalized form reduces to the original form under suitable conditions on the parameters and provides an experimentally testable framework for quantitative confirmation at the macroscopic level. Over time, applications of Newton’s third law have been extended to diverse systems, including aerodynamics and aerospace propulsion, each of which requires separate quantitative analysis.
Category: Classical Physics
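The generalized form quoted in the abstract above is a plain multiplicative rescaling of the classical reaction; a sketch with purely hypothetical coefficient values (the paper leaves the coefficients to be determined experimentally):

```python
def generalized_reaction(action, k_shape=1.0, k_composition=1.0,
                         k_target=1.0, k_other=1.0):
    """Generalized third-law form from the abstract:
    F_BA = -(K_shape * K_composition * K_target * K_other) * F_AB.
    All coefficient values used below are illustrative placeholders."""
    return -(k_shape * k_composition * k_target * k_other) * action

# With all coefficients equal to 1, the classical law F_BA = -F_AB is recovered.
assert generalized_reaction(10.0) == -10.0
# A flat body with reduced rebound might have, say, K_shape = 0.6 (hypothetical).
assert generalized_reaction(10.0, k_shape=0.6) == -6.0
```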
[33] viXra:2601.0035 [pdf] submitted on 2026-01-09 00:37:32
Authors: Le Zhang
Comments: 48 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org) DOI: 10.5281/zenodo.18144335
This paper proposes a framework for a unified Axiomatic Field Theory, establishing the logical closure of the geometric information system based on Information Geometry. By postulating the axiom of Maximum Information Efficiency, we derive the Ideal Planck Constant and demonstrate that physical reality emerges from Saturated Excitation within a constrained Phase Space Topology. Applying the Shannon Entropy Limit and Channel Capacity, we prove that the Fine Structure Constant is a geometric projection of the Vacuum Polarization Background. The framework utilizes the Paley-Wiener Theorem and Orthogonal Decomposition to identify the Deviation Field, manifesting as an evanescent wave and radiating as Topological Radiation. We derive the Gravitational Constant from the Residue caused by the decay of Geometric Fidelity, explicitly defining gravity as a Recoil Force. Furthermore, the model introduces Field-Cavity Duality and Vacuum Breathing modes. Through Geometric Screening rooted in Measure Theory, we explain Momentum Asymmetry. The system's structural closure is secured via Quantum Phase Locking and Generalized Rabi Oscillation, confirming that the G Efficiency Structure aligns closely with the CODATA 1986/1998 historical baseline, while discussing potential theoretical implications for the deviation observed in recent high-precision measurements. Furthermore, the theory identifies a synchronized ~0.025% vacuum polarization shift across both G and α, suggesting a distinction between derived 'Geometric Naked Values' and experimentally screened effective values.
Category: Quantum Gravity and String Theory
[32] viXra:2601.0034 [pdf] replaced on 2026-01-30 05:42:29
Authors: Satish Gajawada
Comments: 12 Pages.
This article is a collection of five Excellent Artificial Intelligence (EAI) articles. The first article defines the new field of Excellent Artificial Intelligence (EAI). Artificial Intelligence Researcher Algorithm version 1 (AIRAv1) is version 1 of the new algorithm designed in the first article. A new algorithm titled Teacher Brother Sister Father Mother Friend Artificial Intelligence Algorithm (TBSFMFAIA) is proposed in the second article. Kindness Love Satisfaction Peace Excellence Money Happiness Respect Intelligence Health Artificial Intelligence Algorithm (KLSPEMHRIHAIA) is the novel and unique algorithm invented in the third article. A unique algorithm titled Prabhakar Gajawada Bhagyamma Gajawada Satish Gajawada Artificial Intelligence Algorithm (PGBGSGAIA) is proposed in the fourth article. Cricket Match Runs Algorithm (CMRA), Rice Bags Sales Algorithm (RBSA), English Language Sentence Algorithm (ELSA), and Object Swarm Optimization Algorithm (OSOA) are four novel Swarm Intelligence algorithms designed in the fifth article.
Category: Artificial Intelligence
[31] viXra:2601.0033 [pdf] replaced on 2026-02-13 21:13:08
Authors: Tomasz Kobierzycki
Comments: 21 Pages.
In this work I present extensions of the Einstein field equations [1] into four-index equations. These extensions give as a natural result an energy tensor for the vacuum, and thus for the gravity field. It is all constructed in the spirit of the two-index field equations and in truth does not need any additional assumptions about the field equations. From this it follows that it is a natural completion of the two-index equations, not a true extension, as it fully defines the curvature tensor, not only the Ricci part of the curvature as happens in the two-index equations. Quantum effects are divided into two parts: one concerns a wave-function-like object and measurement, the other concerns spin as the orientation of the manifold. The wave-function-like object is constructed from a normalized curvature invariant. It plays the role of the "probability" of finding an object in a given volume of spacetime at a given interval of time. I did not present direct solutions to these equations or concrete examples where they differ from General Relativity [1].
Category: Relativity and Cosmology
[30] viXra:2601.0032 [pdf] submitted on 2026-01-08 19:13:15
Authors: Pavlo Danylchenko
Comments: 11 Pages.
The feasibility of using in physics relativistically invariant Newtonians of the free inertial rest energy of matter and Keplerians of the ordinary rest energy of matter, respectively, instead of relativistically non-invariant Hamiltonians and Lagrangians, has been shown. This is in good agreement not only with relativistically invariant thermodynamics, but also with the equations of the dynamic gravitational field of both the Solar System and flat galaxies. Newton's law of gravity is obtained directly from the condition of no change in the flow of the proper time of matter during its inertial motion in a gravitational field. Thus the complete compensation of the gravitational dilation of the time of matter by its inertial motion is proved. True relativistic transformations of the increments of spatial coordinates and time are obtained. These transformations are based on the Keplerian (which is an alternative to the Lagrangian) and differ from the Lorentz transformations in only one parameter b. Based on the analysis of the motion of the planets, the compensation by the centrifugal pseudo-force of inertia not only of the gravitational pseudo-force, but also of the pseudo-force of evolutionary self-contraction of matter toward the center of gravity is confirmed.
Category: Relativity and Cosmology
[29] viXra:2601.0031 [pdf] submitted on 2026-01-07 04:13:43
Authors: Taiwei Song
Comments: 10 Pages. This paper is re uploaded. The previously published version has been withdrawn due to errors. Belonging to the author's original theory.
In this paper, the author redefines the concept of phase transition in a more general sense, provides an accurate characterization method, and derives the phase transition equation and phase transition temperature formula. On this basis, the essential properties of the superconducting phase transition are analyzed, and the general superconducting phase transition equation is derived. The author argues that the superconducting phase transition of high-temperature superconducting materials, including cuprates, iron-based, nickel-based superconductors, and high-pressure hydrogen-rich superconductors, still follows the conductive electron pairing mechanism. The key to the high superconducting temperature lies in the stronger space-time correlation between conductive electrons in low-dimensional structures. Based on the geometry of space-time structures, the author defines a more general space-time correlation formula between particles, reveals the logical relationship that the correlation between conductive electrons decreases geometrically with increasing dimension from the most general conditions, provides relevant formulas, further analyzes the intrinsic mechanism of the high superconducting phase transition temperature, and proposes a process for developing new high-temperature superconductors.
Category: Quantum Physics
[28] viXra:2601.0030 [pdf] submitted on 2026-01-07 11:29:59
Authors: Dmitriy S. Tipikin
Comments: 48 Pages.
In this book the statistical approach to tired light is used to explain the redshift and the absence of blurring for close galaxies. The blurring is actually present and observed for far supernovae and very far galaxies, and allows one to evaluate the particles on which light is scattered. This creates the possibility of returning to the old idea of an infinite and eternal Universe. Solutions of the Olbers paradox and the heat death are discussed, together with new ideas for dark matter and how it solves the cusp problem for spiral galaxies.
Category: Astrophysics
[27] viXra:2601.0029 [pdf] submitted on 2026-01-07 11:54:30
Authors: Marek Suder
Comments: 8 Pages.
This paper presents a geometric and wave interpretation of energy quantization in the hydrogen atom, based on the de Broglie closure condition of the electron wave in a circular orbit. In this concept, energy quantization is a secondary phenomenon resulting from the fact that the electron wave in each orbit consists of exactly n full periods, and the transition to level n+1 corresponds to the addition of one full period. Combining the wave condition with the classical equilibrium of the Coulomb and centripetal forces leads to values of the orbital radii and a discrete energy spectrum consistent with solutions of the Schrödinger equation for the hydrogen atom. It is shown that energy quantization can be interpreted as a consequence of the resonant character of the electron's wave nature and the condition of phase uniqueness after a complete orbit around the nucleus. The model under consideration is semiclassical in nature and serves as an intuitive representation of known results from quantum mechanics. It is assumed that the allowable states correspond to configurations in which the de Broglie wave forms a standing wave containing an integer number of full periods around the orbital circumference. This condition leads directly to the quantization of angular momentum according to the Bohr model. The waveband model provides a one-dimensional analogy of the full quantum description and can serve as a teaching tool to facilitate understanding of the geometric aspects of energy quantization in the hydrogen atom.
It demonstrates that energy quantization is a natural consequence of the standing-wave geometry and the addition of successive full periods along the orbit as the system transitions to the next energy eigenstate after activation. The electron is a stable eigenstate of a quantum field whose behavior in bound systems can be geometrically interpreted as the self-resonance of a de Broglie wave satisfying the condition of single-valued phase, rather than as a local particle with a classical trajectory. Self-resonance of a wave means that the condition of a single-valued phase of the wave function after a complete orbit around the nucleus is satisfied, i.e., the requirement that the phase change be an integer multiple of 2π; this is equivalent to the Bohr-de Broglie condition for the closure of the de Broglie wave.
Category: Quantum Physics
[26] viXra:2601.0028 [pdf] submitted on 2026-01-06 09:55:32
Authors: Haihong Xie
Comments: 32 Pages.
This study proposes a new ontological perspective for quantum mechanics: phenomena such as quantum wave-like behavior, randomness, and electron clouds may not be intrinsic properties of particles but originate from the structure of space itself. This perspective is founded on a re-examination of the formation mechanisms of electron orbitals and clouds. We particularly elucidate the origin of the "point-like randomness" exhibited by quantum entities, revealing that it is not an inherent property but stems from the dualistic structure of space and the inherent limitations of current observational technology. This dualistic structure of space manifests as "Open Dimensions" and "Closed Dimensions." Present technology can only detect phenomena on the "Open Dimensions." When a quantum entity possesses low kinetic energy, it primarily moves within the "Closed Dimensions," becoming instantaneously detectable only when passing through an intersection point between the two dimensions; the statistical distribution of a vast number of such instantaneous events manifests as the electron cloud. Based on this dualistic structure, the paper further deduces the fundamental unit of space—the "spatial cell"—and thereby constructs the "Spatial Cell Theory." The theory posits that space is woven from discrete "spatial cells." Each cell contains two functionally distinct dimensions: the "Open Dimensions" serve as connecting channels, linking cells to transmit matter, energy, light, and more; the "Closed Dimensions" do not participate in intercellular connections and serve as the primary arena for the motion of quantum entities with low kinetic energy. Within this framework, a wide range of physical phenomena—including the superluminal correlations of quantum entanglement, dark energy, dark matter, double-slit interference, quasars, and the microscopic dynamics of gravity—can be explained within a unified mechanistic framework.
Category: Quantum Physics
[25] viXra:2601.0027 [pdf] submitted on 2026-01-06 11:54:01
Authors: Holger Döring
Comments: 27 Pages.
An interaction of two coupling fields of fourth order leads to a local energy overthrow, which can generate a form of Big Bang through a phase transition. The interaction of two topological, skyrmion-like objects can cause this sort of Big Bang in one form of description. The assumption that the first cause is thereby generated from a pointlike singularity is avoided by substituting a kink and its antikink of a skyrmion-like structure. The theory description is an approximation of spacetime for one/two spacelike dimensions and one timelike dimension. Both the early and the later phases of the universe are similar in description to classical theories of inflation and later phases for a universe caused by a single singularity.
Category: Relativity and Cosmology
[24] viXra:2601.0026 [pdf] submitted on 2026-01-06 12:08:16
Authors: Vincenzo Peluso
Comments: 28 Pages.
In his Parmenides, Plato subjects two kinds of One to dialectical examination: the absolute One, without parts, which is neither in space nor in time, nor does it have being, and the One that is being, and therefore is the whole that has parts. These are two totally different Ones, two mutually transcendent worlds. Each of the two, considered independently of the other, ultimately proves to be aporetic within the narrow horizon of the act, within whose limits the thought of the Platonic dialogue is exhausted. However, by extending the ontological horizon to the sphere of potentiality, both, united, constitute the structure of Intention, which binds an "I" to its other. In Intention, the "I" does not exist without being, thanks to which it has a soul and a consciousness, and being makes no sense without the "I". The purpose of this article is to clarify this relationship between the absolute one, the "I", and the one of being, the whole, whose synthesis is the person, and to show that Intention, a true theory of everything, integrates both physical reality and aspects inherent to consciousness and the constitution of the I within a single explanatory framework. The distance of separation is reflected in the time of waiting in the mirror that is the three-dimensional space of Intention, and thus of the universe, as well as of every whole that is part of it. A mirror whose substance is desire and in which the Other is revealed.
Category: History and Philosophy of Physics
[23] viXra:2601.0025 [pdf] submitted on 2026-01-06 22:47:11
Authors: Abdelmajid Benahmed
Comments: 13 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
This article examines the development of operator splitting methods in Soviet numerical analysis during 1955–1975, with particular focus on N.N. Yanenko's formalization of the Method of Fractional Steps at the Siberian Branch of the USSR Academy of Sciences. While similar techniques were independently developed in the West (Peaceman-Rachford 1955, Douglas-Rachford 1956), the Soviet school pursued a distinct trajectory shaped by acute hardware constraints and deep epistemological commitments to operator theory. Through analysis of technical publications, archival materials, and comparative historiography, this study argues that material scarcity catalyzed a systematic research program emphasizing computational economy, while a pre-existing mathematical culture valorizing theoretical elegance reinforced this trajectory. The case illuminates how geopolitical constraints and intellectual traditions jointly shaped algorithmic innovation, contributing to methods that ironically became foundational for modern massively parallel computing. Significant archival gaps limit definitive claims about industrial applications, highlighting the need for further primary source research.
Category: General Science and Philosophy
[22] viXra:2601.0024 [pdf] submitted on 2026-01-06 14:29:32
Authors: Brian Scannell
Comments: 25 Pages.
To support intuitive understanding of Fermat's Last Theorem, this paper presents a simple visualisation based on a defined normalised Fermat plot and shows that rational directions arising from a succession of Pythagorean triples (with a fixed hypotenuse gap) become automatically irrational beyond a finite point, explaining why no Fermat-type integer solutions can occur along these directions.
Category: Number Theory
[21] viXra:2601.0022 [pdf] submitted on 2026-01-05 20:41:57
Authors: Christ Abnoosian
Comments: 13 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
Climate change exacerbates financial uncertainties in emerging markets, where economies are particularly vulnerable to environmental disruptions like droughts, floods, and extreme weather events. Traditional models for pricing climate-linked derivatives, such as catastrophe (CAT) bonds and weather-indexed insurance, often fail to capture the non-linear, high-dimensional nature of climate risks. This paper proposes a quantum-enhanced machine learning (QEML) framework integrating Quantum Amplitude Estimation (QAE), Variational Quantum Eigensolvers (VQE), and Quantum Support Vector Machines (QSVM) with classical techniques like Gaussian Process Regression and deep neural networks. Evaluations on datasets from Brazil, India, and South Africa show up to 35% improved pricing accuracy and 60% faster computation versus classical methods. This approach advances sustainable finance in climate-vulnerable regions.
Category: Quantum Physics
[20] viXra:2601.0021 [pdf] submitted on 2026-01-05 20:34:11
Authors: Gabriel H. Eisenkraemer, Fernando G. Moraesy, Leonardo L. de Oliveira, Everton Carara
Comments: 75 Pages.
We describe a lightweight RISC-V ISA extension for the AES and SM4 block ciphers. Sixteen instructions (and a subkey load) are required to implement an AES round with the extension, instead of 80 without. An SM4 step (quarter-round) takes 6.5 arithmetic instructions, a similar reduction. Perhaps even more importantly, the ISA extension helps to eliminate slow, secret-dependent table lookups and to protect against cache-timing side-channel attacks. Having only one S-box, the extension has a minimal hardware size and is well suited for ultra-low-power applications. AES and SM4 implementations using the ISA extension also have a much-reduced software footprint. The AES and SM4 instances can share the same datapaths but are independent in the sense that a chip designer can implement SM4 without AES and vice versa. Full AES and SM4 assembler listings, HDL source code for the instructions' combinatorial logic, and C code for emulation are provided to the community under a permissive open-source license. The implementation contains depth- and size-optimized joint AES and SM4 S-box logic based on the Boyar-Peralta construction with a shared non-linear middle layer, demonstrating additional avenues for logic optimization. The instruction logic has been experimentally integrated into the single-cycle execution path of the "Pluto" RV32 core and has been tested on an FPGA system.
Category: Artificial Intelligence
[19] viXra:2601.0020 [pdf] submitted on 2026-01-05 20:31:18
Authors: Silvio Gabbianelli
Comments: 7 Pages. (Note by viXra Admin: Please cite and list scientific references)
By observing the relative positions of odd composite numbers in the set of odd natural numbers up to a given n, the positions of the prime numbers can be logically derived by subtraction. Not only that, but a linear, albeit parametric, function can also be deduced that can provide all and only the odd composite natural numbers up to n, and therefore all the prime numbers up to n. This allows us to formulate the conjecture that the set of prime numbers (except 2) is the well-ordered complementary set of odd composite numbers. This ordering can also be seen using the Cartesian line y = 2x + 1. Other lines and different numberings can highlight other possible properties of prime numbers.
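The complement idea can be made concrete; a minimal sketch (my illustration, with an assumed parametric form, not necessarily the paper's function): every odd composite is (2i+1)(2j+1) = 2(2ij+i+j)+1, so marking the positions k = 2ij+i+j among the odd numbers 2k+1 leaves exactly the odd primes. This coincides with the classical sieve of Sundaram.

```python
# Illustrative sketch (assumed form, not the paper's exact function): every odd
# composite is (2i+1)(2j+1) = 2*(2ij + i + j) + 1 with i, j >= 1, so removing
# the positions k = 2ij + i + j from the odd numbers 2k + 1 leaves the odd primes.
def odd_primes_by_complement(n):
    """All odd primes up to n, found by removing odd-composite positions."""
    kmax = (n - 1) // 2
    composite_pos = set()
    i = 1
    while (2 * i + 1) ** 2 <= n:
        j = i
        while (2 * i + 1) * (2 * j + 1) <= n:
            composite_pos.add(2 * i * j + i + j)   # position of composite 2k+1
            j += 1
        i += 1
    return [2 * k + 1 for k in range(1, kmax + 1) if k not in composite_pos]

print(odd_primes_by_complement(50))
# [3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]
```

The line y = 2x + 1 mentioned in the abstract is exactly the map from position k to the odd number 2k + 1 used here.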
Category: Number Theory
[18] viXra:2601.0019 [pdf] submitted on 2026-01-05 20:28:21
Authors: Urs Frauenfelder, Joa Weber
Comments: 18 Pages. 1 figure
On a merry-go-round, fictitious forces such as the centrifugal force and the Coriolis force are acting. Like the Lorentz force, the Coriolis force is velocity-dependent and, following Arnold, can be modeled by twisting the symplectic form. If the merry-go-round is accelerated, an additional fictitious force shows up: the Euler force. In this article we explain how one deals symplectically with the Euler force by considering time-dependent symplectic forms. It turns out that to treat the Euler force one also needs time-dependent primitives of the time-dependent symplectic forms.
Category: Mathematical Physics
[17] viXra:2601.0018 [pdf] submitted on 2026-01-04 22:41:48
Authors: Guofeng Chang
Comments: 8 Pages.
Inspired by Mach’s philosophical standpoint, Einstein constructed the theory of special relativity, which has been shown to be reliable both theoretically and experimentally. However, the negative conclusion regarding the absolute equivalence of relatively moving inertial frames, as suggested by the twin paradox thought experiment, has not been explicitly reflected at the level of physical theory. The present work attempts to address this issue and includes the following investigations: (1) a reconsideration of the ether problem; (2) derivations of the mass–energy relation and centrifugal acceleration based on an ether contraction framework; (3) a heuristic interpretation of the invariance of the speed of light and inertial forces. It is hoped that this work may offer some conceptual insight to readers interested in this problem.
Category: Classical Physics
[16] viXra:2601.0017 [pdf] submitted on 2026-01-04 22:34:40
Authors: Aung Kyaw Sunn
Comments: 13 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
Recent high-precision cosmological observations have revealed statistically significant tensions between early-universe inferences and late-time measurements, most notably in the Hubble constant H0 and the clustering amplitude parameter S8. These discrepancies may indicate limitations of the standard ΛCDM framework when extrapolated across cosmic epochs. In this work, we develop a thermodynamically motivated cosmological model in which the dark energy component is not introduced as a fundamental constant, but instead emerges dynamically from the thermodynamics of the apparent horizon. By applying Hayward’s unified first law in conjunction with the Clausius relation to the cosmological apparent horizon, we derive a self-consistent evolution equation for the Hubble parameter H(z). Numerical integration of the resulting evolution law yields a present-day expansion rate H0 ≃ 71.0 km s−1 Mpc−1, which lies between cosmic microwave background inferences and local distance ladder measurements. The model further predicts a present-day matter density Ωm,0 = 0.2677 and a clustering parameter S8 = 0.781, both of which are consistent with recent weak lensing constraints. These results suggest that horizon thermodynamics may provide a viable mechanism for generating an effective dark energy component, and that the observed cosmological tensions could reflect an incomplete thermodynamic description of the cosmic expansion history rather than the need for new fundamental fields.
Category: Relativity and Cosmology
[15] viXra:2601.0016 [pdf] submitted on 2026-01-04 14:36:58
Authors: Giovanni Di Savino
Comments: 5 Pages.
Thales measured the height of the inaccessible pyramid and the distance of the unreachable ship from the harbor, demonstrating that anything that can be plotted on a plane can be measured; Euclid, with the product of known prime numbers, continually generates new primes and demonstrated that the prime numbers are infinite; Peano, with the second of his five axioms, affirmed that for every natural number there exists a successor number +1. We will never be able to claim to have developed Euclid's inaccessible primes or Peano's unattainable number, but twin primes are two of the infinite primes, one of which is the successor +2 of the other prime, and the sum of the two primes is always a number 6n; by representing even numbers in the form 6n or 6n±2 and odd numbers in the form 6n±1 or 6n±2±1, we can demonstrate that Euclid's inaccessible primes and Peano's unattainable successor number exist. All prime numbers; all twin primes; all Mersenne primes, which are the sum of numbers in double proportion and generate the even perfect numbers; all odd numbers 3n of the Collatz algorithm whose successor +1 is an even power 2^n, which when halved is 2^(n-1) and ends at 2^0 = 1; and all even and odd numbers which are the sum of 2 or 3 primes — all exist, and even if they will never be known, the final digit of the prime numbers and of the successor number, which can be a prime or composite number, is known.
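The 6n claim for twin-prime sums is easy to check directly; a minimal sketch (my illustration, not the paper's method), noting that the pair (3, 5) with sum 8 is the lone exception: for p > 3, twins sit at 6k − 1 and 6k + 1, so their sum is 12k, which is divisible by 6.

```python
# Quick numerical check (illustrative, not from the paper): for twin primes
# (p, p+2) with p > 3, p = 6k - 1 and p + 2 = 6k + 1, so the sum is 12k,
# a multiple of 6. The pair (3, 5), with sum 8, is the one exception.
def is_prime(m):
    if m < 2:
        return False
    d = 2
    while d * d <= m:
        if m % d == 0:
            return False
        d += 1
    return True

twins = [(p, p + 2) for p in range(3, 200) if is_prime(p) and is_prime(p + 2)]
for p, q in twins:
    if p > 3:
        assert (p + q) % 6 == 0 and p % 6 == 5 and q % 6 == 1
print(twins[:5])  # [(3, 5), (5, 7), (11, 13), (17, 19), (29, 31)]
```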
Category: Number Theory
[14] viXra:2601.0015 [pdf] replaced on 2026-01-16 01:35:03
Authors: Ervin Goldfain
Comments: 33 Pages.
As sequel to [1-2], this work explores the gravitational consequences of Cantor Dust formation in the primordial Universe. We find that the multifractal structure of Cantor Dust (CD) can account for a wide range of galactic and cosmological phenomena, commonly attributed to either particle Dark Matter (DM) or modified Newtonian gravity (MOND). Asymptotically flat rotation curves are recovered without invoking modified force laws. Baryonic cooling and dissipation fix the extent of luminous structures at a universal acceleration scale, which leads naturally to the baryonic Tully—Fisher relation (BTFR). We survey weak lensing, dynamical friction, and cluster constraints, and outline testable observational signatures distinguishing this framework from standard cold DM scenarios. In summary, our results suggest that CD provides a unified geometric explanation of DM phenomenology across multiple scales.
Category: Relativity and Cosmology
[13] viXra:2601.0014 [pdf] submitted on 2026-01-04 00:56:13
Authors: Andrew W. Beckwith
Comments: 11 Pages. Essay written for the Gravity Research Foundation 2026 Awards for Essays on Gravitation
We consider whether a generalized HUP, set greater than or equal to Planck's constant divided by the square of a scale factor as well as an inflation field, yields the result that Delta E times Delta t is embedded in a 5-dimensional field which is within a deterministic structure. Our proof ends with Delta t of the order of Planck time yielding an enormous potential energy. Second, we apply this energy to black hole physics and the early universe; i.e., our idea for black hole physics being used for GW generation is to use Torsion to form a cosmological constant. Planck-sized black holes allow for a spin density term linked to Torsion.
Category: Quantum Gravity and String Theory
[12] viXra:2601.0013 [pdf] submitted on 2026-01-04 00:50:35
Authors: Ludovic Ngabang
Comments: 2 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
The integration of Large Language Models (LLMs) into the software development lifecycle represents a shift from constructive programming to curated programming. While current metrics focus on productivity gains and syntactical correctness, this paper argues that these metrics are insufficient to capture the long-term systemic risks introduced by AI. We propose the concept of Epistemic Debt: the divergence between the complexity of a software system and the developer’s cognitive model of that system. Unlike traditional Technical Debt, which is often a conscious trade-off, Epistemic Debt is an invisible accumulation of "unearned" code that functions correctly but lacks a human owner who understands its causality. This paper provides a theoretical framework for this phenomenon, classifies the specific architectural erosions caused by stochastic code generation, and proposes a "Cognitive Ratchet" methodology to mitigate the collapse of maintainability.
Category: Data Structures and Algorithms
[11] viXra:2601.0012 [pdf] submitted on 2026-01-03 23:44:20
Authors: Oliver Couto
Comments: 5 Pages.
This paper presents solutions for equations of degrees 1, 2, 3, and 4. The author has also shown a solution for an equation of degrees 1 and 7. He has also provided an identity in which a fourth power is represented by first, second, and third powers.
Category: Algebra
[10] viXra:2601.0011 [pdf] submitted on 2026-01-03 23:42:34
Authors: Andrew W. Beckwith
Comments: 16 Pages.
This study compares a multiverse generalization of CCC Penrose cosmology with my work on Klauder enhanced quantization and the cosmological constant problem. While it is not linked to tokamaks, it is still interesting to contemplate.
Category: Relativity and Cosmology
[9] viXra:2601.0010 [pdf] submitted on 2026-01-02 21:31:57
Authors: Chan Bock Lee
Comments: 13 Pages.
The special theory of relativity proposed by Albert Einstein in 1905 postulates that the velocity of light is independent of the movement of its light source. This is different from conventional Newtonian mechanics for objects. Based upon advancements in the understanding of light since 1905, the dependence of light velocity upon the movement of its light source was investigated. Analysis of characteristics of light such as generation, traveling, and interaction with matter showed that light can be regarded as a discrete entity, a particle or a photon. Through analysis of the results of the Michelson-Morley experiment and the working mechanism of radar, it was found that the velocity of light depends upon the velocity of the light source. Therefore, Newtonian mechanics can be applied to light just as to all other objects. The evidence which has been used to support the special theory of relativity, such as light from binary stars and time dilation measured in satellites using atomic clocks, was discussed to show that it cannot be solid or sufficient verification. The subjects which would need to be updated if the special theory of relativity is not valid were discussed. With Newtonian mechanics applied to light, physics and the universe could be more easily conceivable and accessible by using the frame of three-dimensional space and time.
Category: Relativity and Cosmology
[8] viXra:2601.0008 [pdf] submitted on 2026-01-02 15:30:57
Authors: Abdelmajid Ben Hadj Salem
Comments: 184 Pages. In French. Comments welcome.
This is my second book. It includes 70 exercises and problems with solutions in astronomy, geodesy, celestial mechanics and least squares theory for geomatics students.
Category: Geophysics
[7] viXra:2601.0007 [pdf] submitted on 2026-01-02 15:53:37
Authors: Miloš Čojanović
Comments: 4 Pages.
It is generally accepted that Georg Cantor proved that the set of the real numbers in the interval (0,1) is not countable. Actually, instead of real numbers, Cantor considered a set of infinite sequences composed of the two characters 'm' and 'n'. We will prove that the countability of the rational numbers in the interval (0,1) is crucial for Cantor's Diagonal Argument on the uncountability of the real numbers in the interval (0,1), and that Cantor's proof cannot be directly applied to the set of real numbers, since some of the rational numbers in binary form can be expressed in two different ways.
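The dual-representation fact the abstract invokes is easy to exhibit; a minimal sketch (my illustration, not Čojanović's construction): the dyadic rational 1/2 has both the terminating expansion 0.1000... and the repeating expansion 0.0111..., since the tail sums to Σ_{k≥2} 2⁻ᵏ = 1/2.

```python
# Illustration (mine, not the paper's construction) that dyadic rationals have
# two binary expansions: 0.1000... and 0.0111... both denote 1/2, because the
# truncated tail 0.0111...1 approaches 1/2 as ones are appended.
from fractions import Fraction

def value(bits):
    """Exact value of the finite binary expansion 0.b1 b2 b3 ..."""
    return sum(Fraction(b, 2**k) for k, b in enumerate(bits, start=1))

terminating = [1]              # 0.1000...
partial = [0] + [1] * 40       # 0.0111...1, a 40-digit truncation of 0.0111...

print(value(terminating))      # 1/2
print(value(terminating) - value(partial))   # gap is exactly 2**-41
```

Any diagonal argument carried out on digit strings therefore has to account for the fact that distinct strings can name the same real number, which is the loophole the paper focuses on.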
Category: Set Theory and Logic
[6] viXra:2601.0006 [pdf] submitted on 2026-01-01 03:50:05
Authors: Moshe Segal
Comments: 23 Pages.
Newton's Universal Gravitational Law (1) provided the magnitude of the Force of attraction between massive bodies. However, the cause of this attraction remained a mystery until the introduction of Einstein's General Relativity Theory (GRT) (2). GRT explained the attraction between massive bodies, but Physics does not yet provide a tested, verifiable answer to the question of why Electric Charges attract and repel each other, despite the fact that the Coulomb's Law Force (3) provided the magnitude of the Force of the attraction or the repulsion between Electric Charges. Current mainstream Physics does provide several theories which attempt to explain why Electric Charges attract and repel each other, but all these theories are still at the stage of research and investigation, and none proposes a feasible test or experiment to provide additional validity to its claims. This paper proposes tentative additional explanations of why Electric Charges attract and repel each other, along with a feasible proposed experiment. Since GRT replaced the Energy embedded in Newton's Gravitation Field with Einstein's four-dimensional interwoven spacetime Field, and Newton's Universal Gravitational Force with the geometry attributed to that spacetime, this paper suggests that, similarly to GRT, it might be reasonable to present several models which replace the Coulomb Electric Force with additional geometries, attributed to the Energies embedded in the Electric Fields, such that these models might provide additional explanations of why Electric Charges attract and repel each other. As already mentioned above, this paper also proposes a feasible experiment which might either provide additional validity to the models presented here or disprove them.
An explanation of why Electric Charges attract and repel each other is just one issue in a larger attempt to unify Gravity and Electromagnetism. Thus, the existing theories that try to explain why Electric Charges attract and repel each other also propose a tentative Unification of Gravity and Electromagnetism. Hence, if the experiment proposed in this paper is implemented and its results are successful, this might also provide a lead toward a tentative Unification between Gravity and Electromagnetism, a significant issue which is still an open subject.
Category: Relativity and Cosmology
[5] viXra:2601.0005 [pdf] submitted on 2026-01-01 03:47:28
Authors: Bijon Kumar Sen, Subha Sen
Comments: 17 Pages. 2 Figures; 2 Tables
The idea that the universe is constituted of a few fundamental substances has haunted human history from the earliest times. The philosophical description of these as earth, water, fire, air and space could not satisfy the urge of scientists who tried to find a scientific description of these essential components. At the dawn of the twentieth century, some such particles were recognized, namely the electron, proton and neutron. At this time Dirac combined the special theory of relativity with quantum mechanics and derived the notion of anti-particles. High energy physics extended the number of constituent particles and their interactions, as per the Standard Model, to the weak interaction, quantum electrodynamics (QED), quantum chromodynamics (QCD) and the strong interaction. It is particle physics which definitely showed the role of the quantum force as the attractive force between unsymmetrical objects. The Big Bang theory of the evolution of the universe was found to be inadequate for explaining the binding force of the universe. An alternative theory based on rotation and revolution of energy plasma was found suitable for the formation and binding of the universe. Here it is shown that gravitation and gravity are the effect of quantum forces in different degrees which bind matter by push-in forces from outside, and gravity is the faintest reminiscence of gravitation. This idea finds support from particle physics to resolve the misconception of gravity as attraction between matter and the failure of Einstein’s search for a unified field theory. Finally, it is established that high energy physics and particle physics cannot reach so high an energy as prevailed in the early stage of the formation of the universe. However, the problem of finding the ultimate particles is not solved to this day.
Category: High Energy Particle Physics
[4] viXra:2601.0004 [pdf] submitted on 2026-01-01 03:33:47
Authors: Fabio Savoca
Comments: 5 Pages.
This paper investigates the logical-mathematical foundations of physical reality, proposing a model based on the persistence of symmetry breaking from the real to the complex domain. We postulate the existence of two fundamental structures: the Internal Structure S(O), defined in Hilbert Space, and the External Structure S(O^-1), defined in the complex field. The theoretical core of the work lies in identifying two mutually exclusive regimes of access to reality: the state of Observation (Potential Infinity) and the state of Understanding (Actual Infinity). We demonstrate that phenomenal reality and logical reality are not static, but the result of a continuous high-frequency exchange between cardinality increment and complex rotation. Furthermore, we hypothesize that such rotation is governed by a metric compatible with the Riemann Hypothesis, linking the distribution of quantum weights to the nature of prime numbers.
Category: History and Philosophy of Physics
[3] viXra:2601.0003 [pdf] submitted on 2026-01-01 03:20:53
Authors: Yi-Chia, Tsen
Comments: 4 Pages.
We propose a theoretical cosmology framework in which the classical spacetime manifold is reinterpreted as an emergent superfluid vacuum, described by a Bose–Einstein condensate governed by a nonlinear logarithmic Schrödinger equation (LogSE). In this two-phase picture, the homogeneous ground state of the condensate (Phase A) gives rise to cosmic acceleration (dark energy) through its negative pressure and exhibits a small bulk viscosity that can reconcile disparate measurements of the Hubble constant. Meanwhile, excited states of the condensate (Phase B) form quantized vortices and density solitons that behave as dark matter halos in galaxies. We derive the effective fluid dynamics of this superfluid vacuum, showing that it naturally yields a cosmic equation of state w ≈ −1 on large scales and MOND-like phenomenology on galactic scales, without requiring unknown particle species. We demonstrate that quantum pressure from the LogSE resolves the core-cusp problem by stabilizing galactic cores, and that the logarithmic self-interaction allows halo core sizes to be decoupled from the particle mass, avoiding the "Catch-22" that plagues fuzzy dark matter. The framework is confronted with observations: it passes current cosmological tests and galactic rotation curve data, while making distinct, falsifiable predictions. In particular, Lorentz invariance emerges only as a low-energy symmetry of the superfluid vacuum, implying an energy-dependent vacuum refractive index at high energies. We discuss how precision multimessenger timing (e.g., GW170817) and ultra-high-energy gamma-ray observations (e.g., LHAASO detection of GRB 221009A) place stringent constraints on any such Lorentz-violating dispersion. Upcoming astronomical surveys and particle experiments will further test this unified "dark" sector framework.
Category: Relativity and Cosmology
[2] viXra:2601.0002 [pdf] submitted on 2026-01-01 02:07:28
Authors: KyeongDo Kwak
Comments: 39 Pages.
The second principle of relativity, stating that the speed of light is constant regardless of the source’s velocity, remains incompletely understood. Moreover, the speed of light is incompatible with length contraction. Beyond this, relativity still contains many thought experiments that are difficult to comprehend. These include Bell’s spaceship paradox, the muon paradox, the Supplee submarine paradox, and the Ehrenfest paradox. The commonality among these problems is that logical contradictions arise during the application of length contraction. Since these problems stem from length contraction, approaching them with length expansion logically resolves all issues. This article examines whether length expansion resolves this series of problems.
Category: Relativity and Cosmology
[1] viXra:2601.0001 [pdf] replaced on 2026-01-25 00:27:26
Authors: Keshava Prasad Halemane
Comments: 16 Pages. 1 Table
This research report presents the Collatz-Hasse-Syracuse-Ulam-Kakutani (CHSUK) Theorem, which asserts the convergence of the Collatz Sequence to the trivial cycle, thus proving the Collatz Conjecture, which has been a long-standing unsolved problem. The proof is based on the bijective isomorphism established between the set of positive integers and a carefully designed system with a hierarchy (arborescence) of binary-exponential-ladders defined on the set of positive odd numbers.
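The CHSUK proof itself cannot be reproduced from the abstract, but the conjecture it targets is easy to state and spot-check numerically; a minimal sketch (illustrative only, not the paper's argument): every tested n reaches the trivial cycle 4 → 2 → 1 under the Collatz map.

```python
# Numerical spot-check of the Collatz Conjecture (illustrative only; this is
# not the CHSUK proof): iterate n -> n/2 for even n, n -> 3n+1 for odd n,
# and count steps until the trivial cycle's entry point 1 is reached.
def collatz_steps(n):
    """Number of Collatz steps for n to reach 1."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

# Every n up to 10,000 terminates (the loop would hang on a counterexample).
assert all(collatz_steps(n) >= 0 for n in range(1, 10_000))
print(collatz_steps(27))  # 111
```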
Category: Number Theory