[142] viXra:2510.0154 [pdf] submitted on 2025-10-31 16:30:15
Authors: Fian Qnoz
Comments: 12 Pages. 37 figures.
The exploration of incomplete magic squares of squares, that is, fully working magic squares with fewer than 9 square entries, leads to the extensive use of the Brahmagupta-Fibonacci identity. By taking account only of primitive (irreducible) entries, the umbrella term Brahmagupta's Abacial Slice of Irreducible Calamari (BASIC) magical marine square was adopted. Interesting varieties, ranging from congruent elliptic curves up to the affine variety A6 along with the K3 surface of degree 8, were encountered when considering the birational model of Magicmare.
Category: Number Theory
[141] viXra:2510.0153 [pdf] submitted on 2025-10-31 16:29:16
Authors: Juan Truchado Martin
Comments: 16 Pages. (Note by ai.viXra.org Admin: Please be scholarly - Cite and list scientific references)
By folding a sheet of paper, we prove that length contraction (special relativity) and spacetime curvature (general relativity) are the same geometric effect seen from inverted observers. In a dynamic universe, light travels the same proper distance for all observers. The radius of the observable universe (~8.8 × 10²⁶ m) coincides with the event horizon scale of the central galactic black hole (2GM ≈ 1.09 × 10²⁷ m), with a maximum Lorentz factor of 1/c⁴ (~1.23 × 10⁻³⁴), matching the order of the Planck constant h (~6.626 × 10⁻³⁴ J·s). One coincidence could be chance. Two, across 60 orders of magnitude, are not. Mass is spherical self-interaction; time is work done by self-observation. Constants (π, G, c, e) are determined per ds with intrinsic uncertainty. Key equations treat radii and masses as critical quantized values, absent from the literature. No citations. Only geometry and motion.
Category: Relativity and Cosmology
[140] viXra:2510.0152 [pdf] submitted on 2025-10-31 18:33:29
Authors: Jaime Vladimir Torres-Heredia Julca
Comments: 16 pages, 6 figures
This paper is a continuation of viXra:2508.0176, in which we saw that we can avoid the concepts of negative number and complex number thanks to the study of the underlying vector nature of some arithmetic and polynomial problems. With the solutions of the polynomial equations, which were actually geometrical, in the Euclidean vector space, we will construct several operations analogous to what we have seen until now with "complex numbers". We will also show the representations of functions whose arguments are vectors. We will see the basic elements needed in order to rebuild all that has been constructed in complex analysis. We will also show that we can construct the Mandelbrot set in the Euclidean vector space.
Category: General Mathematics
[139] viXra:2510.0151 [pdf] submitted on 2025-10-31 22:11:53
Authors: Teo Banica
Comments: 400 Pages.
This is an introduction to plane geometry, angles and trigonometry, starting from zero or almost. We first discuss basic plane geometry, with the main results regarding the triangles explained. Then we get into trigonometry, with the basic properties of the sine, cosine and tangent discussed. We then go on a more advanced discussion, using affine and polar coordinates, then complex numbers, and with a look into trilinear coordinates too. Finally, we get into calculus methods, with an even more advanced study of the trigonometric functions, and with some applications discussed too.
Category: Geometry
[138] viXra:2510.0150 [pdf] submitted on 2025-10-30 12:15:52
Authors: Anatoly V. Belyakov
Comments: 4 Pages.
The mass values of the Majorana and sterile neutrinos and the period of their oscillations, calculated on the basis of a physical model based on J. Wheeler’s geometrodynamics, coincide with the results obtained in the Neutrino-4 Experiment.
Category: Astrophysics
[137] viXra:2510.0149 [pdf] submitted on 2025-10-30 20:23:19
Authors: Daniel Nehmé
Comments: 15 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
We introduce Scale-Relative Set Theory (SRST), a revolutionary framework where mathematical concepts are inherently relative to observational scales $\kappa \in \mathbb{E}$. This theory naturally resolves classical paradoxes (Russell, Cantor, Burali-Forti), unifies arithmetic with geometry through a fundamental number-point isomorphism, and provides new foundations for topology, analysis, and modern physics. We develop complete scale-relative constructions of numerical sets, set operations, distance metrics, and neighborhood systems. The framework demonstrates profound connections to gauge theories, parallel transport, renormalization group flow, and quantum gravity. SRST explains how mathematical structures emerge across scales from quantum to cosmological, offering a unified understanding of reality that transforms both mathematical foundations and physical theory.
Category: Set Theory and Logic
[136] viXra:2510.0148 [pdf] submitted on 2025-10-29 21:14:32
Authors: Srinivas Nampalli, Tanav Khambapati, Saathvik Gampa
Comments: 12 Pages. (Note by viXra Admin: Author names are required in the article)
This paper presents a comprehensive predictive maintenance system for industrial robot motors utilizing multi-sensor fusion and machine learning techniques. The proposed system analyzes 84,942 real-time sensor measurements from six motors across eight test sessions, integrating temperature, voltage, and position data to detect operational anomalies. We implement and compare three machine learning approaches: Random Forest (RF), XGBoost, and Long Short-Term Memory (LSTM) networks. Using proper session-based data splitting to prevent leakage, RF achieves an AUC score of 0.871 with a corresponding precision-recall AUC of 0.824 and F1-score of 0.813. The system processes a dataset with 26.12% anomaly prevalence (IQR-rule labels), with position sensors providing the strongest predictive signal. Our feature engineering pipeline incorporates rolling statistics and temporal patterns, improving prediction accuracy by 15% over baseline models. The developed web API enables real-time deployment with 42 ms single-prediction latency, making it suitable for industrial IoT applications. Experimental results could reduce unplanned downtime by 30-45% under typical PdM adoption scenarios (assumptions detailed in §V-D). This work contributes to the field by providing a scalable, production-ready framework for multi-sensor anomaly detection in robotic systems.
Category: Data Structures and Algorithms
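The session-based splitting mentioned in the abstract above is the key leakage guard; a minimal sketch of the idea, assuming records tagged with a session id (field names and fractions here are illustrative, not taken from the paper):

```python
import random

def session_split(records, test_sessions_frac=0.25, seed=0):
    """Split records by session id so that no session spans both train
    and test, preventing temporal leakage between the two sets."""
    sessions = sorted({r["session"] for r in records})
    rng = random.Random(seed)
    rng.shuffle(sessions)
    n_test = max(1, int(len(sessions) * test_sessions_frac))
    test_ids = set(sessions[:n_test])
    train = [r for r in records if r["session"] not in test_ids]
    test = [r for r in records if r["session"] in test_ids]
    return train, test

# Toy data: eight sessions, three readings each.
records = [{"session": s, "temp": 20 + s, "anomaly": s % 3 == 0}
           for s in range(8) for _ in range(3)]
train, test = session_split(records)
train_ids = {r["session"] for r in train}
test_ids = {r["session"] for r in test}
assert train_ids.isdisjoint(test_ids)  # no session leaks across the split
```

Grouping by session rather than shuffling individual rows ensures that temporally correlated readings from one test run never appear on both sides of the split.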
[135] viXra:2510.0147 [pdf] replaced on 2025-12-09 01:10:11
Authors: Patrice Uguet
Comments: 32 Pages.
This article proposes a realistic approach to entanglement by formulating the hypothesis that the photon is not a point-like entity, but a linear, vibrating particle endowed with energy, polarization, and spin: the photon-fibre. This material link between entangled particles ensures a continuity of state along its entire length. Thus, instantaneous entanglement correlations occur without invoking non-locality or the EPR paradox (Einstein, Podolsky and Rosen, 1935). Applied to the Casimir effect, this representation assigns an intrinsic tension to the photon-fibre, capable of accounting for the attraction measured between plates (Casimir, 1948). The model distinguishes two states: a mobile and luminous state before electronic docking, and an established, immobile and dark state after docking. The presence of established fibres throughout the universe provides a mechanism and an interpretation for vacuum energy and dark matter (Zwicky, 1933; Rubin and Ford, 1970). At the atomic scale and beyond, this network of stretched fibres connecting electrons from distinct atoms generates collective tractions likely to constitute quantum gravity. We thus observe that quantum gravity may be understood as the set of universal entanglements. Dark matter would correspond to immobilized fibres, and dark energy to the expansive tension of fibres that remain mobile. We further suggest that this model can be extended to other bosons, opening the way toward a conceptual unification of the fundamental forces. Cosmological evolution would then result from the gradual conversion of mobile fibres into the established state: free light slowly transforms into gravitational cohesion. This conversion governs the dynamics of cosmic expansion and the balance between gravity and dark energy. Validation of these hypotheses must involve empirical experiments investigating the specific signature of a linear particle, the docking phenomenon in the Casimir effect, and any possible attractive force between two entangled elements. The Fibre Theory thus presents itself as a unified program for reframing quantum interactions and certain cosmological observations on the basis of a single physical object: the fibre.
Keywords: Fiber-photon, force actuator, state continuity, photonic force, mobile status, established status, dark photon, docking, Universal Global Network.
Category: Quantum Gravity and String Theory
[134] viXra:2510.0146 [pdf] submitted on 2025-10-29 20:57:53
Authors: Qiuyu Shan
Comments: 18 Pages.
This paper identifies a long-overlooked issue in foundational physics and proposes a tentative solution: Einstein's clock synchronization scheme cannot be realized at the quantum scale, yet Quantum Field Theory (QFT) employs this unexamined scheme and relies on continuous spacetime coordinates as its foundation. This paper presents a new scheme for defining simultaneity at the quantum scale. Within this scheme, interesting conclusions such as discrete spacetime can be derived, illustrating how it can provide a foundation for the ultraviolet cutoff in renormalization methods, and also offer a possible operational basis for the background spacetime of QFT. It further discusses how to recover traditional Lorentz transformations and the Schrödinger equation by stipulating good clocks, and presents some currently testable corollaries.
Category: Quantum Physics
[133] viXra:2510.0145 [pdf] submitted on 2025-10-29 15:24:47
Authors: Oliver Couto
Comments: 4 Pages.
In volume 2 of L. Dickson's book (History of the Theory of Numbers), the system of equations below has been considered. Ajai Choudry, in his latest paper (ref. 1), has considered three simultaneous equations, one of them having a maximum degree of three. This paper considers a system of four equations with a maximum degree of two. The author provides two methods to arrive at a parameterization.
Category: Algebra
[132] viXra:2510.0144 [pdf] replaced on 2025-11-02 14:27:21
Authors: Ervin Goldfain
Comments: 40 Pages.
The complex Ginzburg-Landau equation (CGLE) is a universal amplitude equation governing the dynamics of phenomena unfolding in far-from-equilibrium conditions. It was recently argued that the CGLE emerges from primordial dimensional fluctuations acting in the far ultraviolet sector of field theory and primordial cosmology. Here we show that classical Maxwell, Dirac and non-Abelian field theories can be derived directly from a generalized version of the CGLE without invoking a Lagrangian or variational principle. Demanding that the CGLE preserves local coherence under continuous internal transformations, we introduce a natural covariant derivative whose connection acts as a gauge field. The commutator of these covariant derivatives defines a curvature tensor that reproduces the familiar structure of Maxwell and Yang-Mills field strengths, while a first-order, spinor generalization of the CGLE yields Dirac-type dynamics. In a nutshell, classical field theories naturally emerge from demanding local coherence invariance of the generalized CGLE.
Category: Mathematical Physics
[131] viXra:2510.0143 [pdf] submitted on 2025-10-28 20:40:15
Authors: Nagi Nangirky Ogata, Norichika Ogata
Comments: 4 Pages. 1 figure, 10 references
While the normal distribution is often termed "normal," it is uncommon to encounter precise bell curves in biological measurements. In this study, we discovered an accurate normal distribution in the intraspecific body mass distribution of the scarab beetle Anomala albopilosa. We captured thousands of beetles and measured their weight, and we also conducted a mark-release-recapture study.
Category: Quantitative Biology
[130] viXra:2510.0142 [pdf] submitted on 2025-10-28 09:37:22
Authors: Holger Döring
Comments: 21 Pages.
In Thom's catastrophe theory there can exist control variables, which change in a continuous manner, while the system variables of a physical system change their physical states, and thus the equilibrium of the whole system, spontaneously and abruptly. This is a "catastrophe" after Thom's definition. The system variables can be defined over a class of germ functions, whose minimal unfoldings are the added functions of the controlling variables. Most descriptions are time- and space-dependent. But in this modelling, a description of the universe itself can be given, with the Big Bang as the germ of an elliptic umbilic point and the expansion of the universe as its unfolding. If the assumption is made that the three spacelike dimensions are generated in a different energy spectrum, then inertia could be explained through a supposed anisotropy of spacetime at the microscale of the Planck length.
Category: Quantum Gravity and String Theory
[129] viXra:2510.0141 [pdf] submitted on 2025-10-28 11:41:01
Authors: Richard J. Mathar, Artur Jasinski
Comments: 21 pages including a Mathematica program
The manuscript contains one corrigendum to the evaluation of the generalized hypergeometric function 3F2(-1) in chapter 7.4.5 in volume 3 of the Prudnikov-Brychkov-Marichev book series, four corrigenda to the evaluation of 4F3(1) in chapter 7.5.3, sixteen corrigenda to the evaluation of 4F3(-1) in chapter 7.5.4, and six corrigenda to the evaluation of 5F4(1) in chapter 7.7.2.
Category: Functions and Analysis
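Tabulated evaluations of this kind can be spot-checked numerically or, for terminating series, exactly. A sketch using the classical Pfaff-Saalschütz summation for a terminating, balanced 3F2(1) (a standard identity chosen for illustration; it is not one of the book entries corrected in the manuscript):

```python
from fractions import Fraction
from math import factorial

def poch(x, n):
    """Rising factorial (Pochhammer symbol) (x)_n = x(x+1)...(x+n-1)."""
    p = Fraction(1)
    for k in range(n):
        p *= x + k
    return p

def hyp3f2_terminating(a, b, n, c, d):
    """Terminating 3F2(a, b, -n; c, d; 1) as an exact rational sum."""
    return sum(poch(a, k) * poch(b, k) * poch(-n, k)
               / (poch(c, k) * poch(d, k) * factorial(k))
               for k in range(n + 1))

# Pfaff-Saalschutz: 3F2(a, b, -n; c, 1+a+b-c-n; 1)
#   = (c-a)_n (c-b)_n / ((c)_n (c-a-b)_n)   for the balanced case.
a, b, c, n = Fraction(2), Fraction(3), Fraction(7), 4
d = 1 + a + b - c - n
lhs = hyp3f2_terminating(a, b, n, c, d)
rhs = poch(c - a, n) * poch(c - b, n) / (poch(c, n) * poch(c - a - b, n))
assert lhs == rhs  # both sides equal 7/3 for these parameters
```

Exact rational arithmetic makes such a check conclusive for terminating series; for non-terminating cases such as 3F2(-1), a high-precision numerical comparison plays the same role.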
[128] viXra:2510.0140 [pdf] submitted on 2025-10-28 20:35:34
Authors: Bhushan Poojary
Comments: 9 Pages.
We propose that gravity arises not from curvature induced by mass-energy, but from weak nonlocal entanglement between microscopic spacetime fabrics associated with each particle. These fabrics exchange information through a decaying entanglement field that scales as 1/2, producing time dilation and curvature as emergent synchronization effects. We derive modified Einstein field equations incorporating an entanglement scalar field, show how the resulting potential yields asymptotically flat galactic rotation curves without invoking dark matter, and demonstrate consistency with relativistic lensing constraints. This framework, termed the General Theory of Relative Fabrics (GTRF), unifies general relativity and quantum entanglement under a single geometric-informational principle.
Category: Relativity and Cosmology
[127] viXra:2510.0139 [pdf] submitted on 2025-10-28 13:54:18
Authors: Ritvik Chappidi, Aditya Jupally
Comments: 2 Pages.
Scaling laws describe how model performance improves with dataset size, model width, and compute. While such laws are well documented for large-scale language models, their behavior in small networks remains less understood. This paper presents a concise empirical study of loss scaling behavior in simple feedforward neural networks trained on synthetic regression tasks. Results show that even very small networks follow an approximate power-law relationship between dataset size and test loss, with a fitted exponent of about 0.076. These findings suggest that scaling regularities emerge even at small scales, implying that the underlying principles of efficiency and generalization extend beyond large-scale models.
Category: Artificial Intelligence
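The fitted exponent reported above comes from a straight-line fit in log-log space; a minimal sketch of that procedure on synthetic data (the generating constant and exponent here are illustrative, not the paper's measurements):

```python
import math

def fit_power_law(ns, losses):
    """Least-squares slope of log(loss) vs log(n), assuming the model
    loss ~ C * n**(-alpha). Returns the fitted exponent alpha."""
    xs = [math.log(n) for n in ns]
    ys = [math.log(L) for L in losses]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope  # alpha in loss ~ n^(-alpha)

# Synthetic check: data generated with alpha = 0.076 is recovered.
ns = [100, 1_000, 10_000, 100_000]
losses = [2.0 * n ** -0.076 for n in ns]
alpha = fit_power_law(ns, losses)
assert abs(alpha - 0.076) < 1e-9
```

On real training runs the points scatter around the line, so the fitted exponent carries an error bar; the sketch only shows the fitting machinery itself.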
[126] viXra:2510.0138 [pdf] submitted on 2025-10-28 15:46:12
Authors: Martin Kraus
Comments: 6 Pages.
Mie's electrodynamics shares several features with a specific modified Born-Infeld field theory, including nonlinearity, lack of gauge invariance, and particle-like solutions of negative energy and mass. Unfortunately, Mie never identified suitable field equations to complete his theory. The present remarks discuss potential reasons that might have prevented Mie and other physicists of his time from researching the kind of field equations underlying the mentioned modified Born-Infeld field theory.
Category: History and Philosophy of Physics
[125] viXra:2510.0137 [pdf] replaced on 2025-11-24 22:51:06
Authors: Yerkebulan Bolat
Comments: 40 Pages.
This paper is devoted to an in-depth study of the concept of the power of a point and its applications to the solution of olympiad-level geometry problems. The discussion encompasses the classical definitions of the power of a point, the radical axis, and the radical center, as well as their various generalizations, including the interpretation of a point as a circle of zero radius, the notion of coaxial circles, and the linearity property of power differences. Detailed examples drawn from both national and international mathematical olympiads are presented to showcase the effectiveness of these methods in addressing both classical and modern geometric problems. Furthermore, the paper considers potential extensions and applications within a broader framework of elementary geometry, with particular emphasis on their value as practical tools for olympiad training and problem solving.
Category: Geometry
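For readers unfamiliar with the term, the central quantity in the abstract above can be stated compactly (standard textbook definitions, not quoted from the paper):

```latex
\operatorname{pow}_{\omega}(P) \;=\; |PO|^2 - r^2 \;=\; \overline{PA}\cdot\overline{PB},
```

where $O$ and $r$ are the center and radius of the circle $\omega$, and $A$, $B$ are the intersections of any line through $P$ with $\omega$ (signed lengths). The radical axis of two circles is then the locus $\operatorname{pow}_{\omega_1}(P) = \operatorname{pow}_{\omega_2}(P)$, a line perpendicular to the line of centers.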
[124] viXra:2510.0136 [pdf] replaced on 2025-11-14 13:32:33
Authors: Yaroslav D. Kivenko-Emetov
Comments: 7 Pages.
This article examines the philosophical and methodological legacy of Dmytro Tarasovych Krivenko (1941-1994) on the occasion of the 84th anniversary of his birth. As one of the most original Ukrainian thinkers of the late Soviet period, Krivenko developed a hierarchical model of cognition and proposed hierarchical logic as a complement, rather than a negation, of dialectical logic. His works represent an attempt to reinterpret the epistemological foundations of science beyond both positivism and dogmatic materialism. Reconsidering the crisis of Western metaphysics from antiquity to the twentieth century, the study places Krivenko's ideas within the broader context of the global discourse on materialism as a methodological principle rather than as a worldview.
Category: History and Philosophy of Physics
[123] viXra:2510.0135 [pdf] replaced on 2025-12-12 01:08:37
Authors: Zihang Chen
Comments: 10 Pages.
This paper starts from a derivation of the Euler-Maclaurin formula with singularities, compensates for the divergence of the series fitting the zeta function by adding a compensation factor ε, performs analytic continuation on the expanded series, analyzes the distribution of its trivial zeros through the laws of the Bernoulli numbers, and then constructs functions, combined with the properties of the gamma function, to solve for the distribution law of the non-trivial zeros.
Category: Number Theory
[122] viXra:2510.0134 [pdf] submitted on 2025-10-27 06:01:48
Authors: Chunhua Jin, Yifu Wang
Comments: 38 Pages.
This paper is concerned with a predator-prey model in $N$-dimensional spaces ($N=1, 2, 3$), given by
\begin{align*}
\left\{
\begin{aligned}
&\frac{\partial u}{\partial t}=\Delta u-\chi\nabla\cdot(u\nabla v),\\
&\frac{\partial v}{\partial t}=\Delta v+\xi\nabla\cdot(v\nabla u),
\end{aligned}
\right.
\end{align*}
which describes the random movement of both predator and prey species, as well as the spatial dynamics of predators pursuing prey and prey attempting to evade predators. It is shown that any global strong solution of the corresponding Cauchy problem converges to zero in the $L^p$-norm for any $1<p\le \infty$, and also converges to the heat kernel with respect to the $L^p$-norm for any $1\le p\le \infty$. In particular, the decay rate is optimal in the sense that it is consistent with that of the heat equation in $\mathbb{R}^N$ ($N=2, 3$). Undoubtedly, the global existence of solutions appears to be among the most challenging topics in the analysis of this model; indeed, even in the one-dimensional setting, only global weak solutions in a bounded domain have been successfully constructed so far. Nevertheless, to provide a comprehensive understanding of the main results, we append conclusions on the global existence and asymptotic behavior of strong solutions, although certain smallness conditions on the initial data are required.
Category: Functions and Analysis
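For context, the heat-equation decay rate that such optimality statements refer to is the standard $L^1$-$L^p$ smoothing estimate for the heat semigroup (a textbook fact, stated here for reference rather than taken from the paper):

```latex
\|e^{t\Delta}u_0\|_{L^p(\mathbb{R}^N)} \;\le\; C\, t^{-\frac{N}{2}\left(1-\frac{1}{p}\right)} \|u_0\|_{L^1(\mathbb{R}^N)}, \qquad 1\le p\le\infty,\; t>0,
```

so "optimal decay" means the solutions decay no slower than, and in fact match, this rate.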
[121] viXra:2510.0133 [pdf] submitted on 2025-10-27 23:48:35
Authors: Taiki Takahashi
Comments: 10 Pages.
Recent advances in cultural psychology have elucidated a number of cultural differences in diverse psychological characteristics and behaviors, from perception and economic decisions to religiosity. Quantum models of cognition and decision making have also been developed to mathematically characterize perception and human judgement and decision making. This study proposes cultural quantum modelling approaches to cultural psychology and neuroscience, utilizing the mathematical models of quantum cognition and decision from psychology, economics, and decision science. This approach may enable quantitatively more rigorous understandings of cultural differences between Westerners and Easterners, Catholics and Protestants, and other cross-cultural variations in psychological and behavioral characteristics and in normative principles of rationality.
Category: Social Science
[120] viXra:2510.0132 [pdf] submitted on 2025-10-27 17:21:34
Authors: Ryan O'Rourke
Comments: 6 Pages.
In this paper I propose a function that would, if proven, allow one to find the pancake number, or diameter, of any arbitrarily large pancake graph. This solution derives from an observation suggesting an underlying structure to the sequence of pancake numbers, tying them closely to the Fibonacci sequence. The structure arises in sets of adjacent values of n having equal differences between n and its corresponding pancake number P_n.
Category: Combinatorics and Graph Theory
[119] viXra:2510.0131 [pdf] replaced on 2025-11-15 19:30:17
Authors: Pavlo Danylchenko
Comments: 139 Pages. updated formulas
The cardinal difference between relativistic gravithermodynamics (RGTD) and general relativity (GR) is that in RGTD the extranuclear thermodynamic characteristics of matter are used in the energy-momentum tensor to describe only its quasi-equilibrium motion. For the description of inertial motion, RGTD uses only the hypothetical intranuclear gravithermodynamic characteristics of matter. Exactly this fact makes it possible to avoid, in principle, the necessity of non-baryonic dark matter in the Universe. Evolutionary self-contraction of micro-objects in the lower layers of gravithermodynamically bonded matter outpaces the similar self-contraction of its upper layers. This is the precise reason for the curvature of the intrinsic space of matter. That is why the gravitational field itself should primarily be considered as the field of spatial inhomogeneity of the evolutionary decrease of the size of matter micro-objects in the background Euclidean space of the expanding Universe. Correspondingly, the gravitational field is the field of spatial inhomogeneity of the gravithermodynamic state of the dense matter of compact astronomical objects, as well as of the strongly rarefied gas-dust matter of the space vacuum. Therefore, the gravitational field fundamentally cannot exist without matter, and it is not an independent form of matter. It is shown that the gravitational field equations of GR should be considered as equations of the spatially inhomogeneous gravithermodynamic state of only utterly cooled-down matter. Such matter can only be hypothetical substances such as an ideal gas, an ideal liquid, or the matter of an absolutely solid body. Real matter will inevitably keep cooling down for an infinite time and will never reach the state described by the gravitational field equations of GR.
The identity of the inertial mass of moving matter to its gravitational mass is justified only conditionally: only by a gravity-quantum clock located at the point from which the matter started its inertial motion, and only with the use of a corrected value of the gravitational constant in its pseudo-centric intrinsic frame of reference of spatial coordinates and time. This is related to the equivalence of the inertial mass of matter to the Hamiltonian of its inert free energy, while the gravitational mass of matter is equivalent to the Lagrangian of its ordinary rest energy. The identity of the multiplicative component of the Gibbs free energy to the ordinary rest energy of matter, which is equivalent to its gravitational mass, is substantiated.
Category: Relativity and Cosmology
[118] viXra:2510.0130 [pdf] submitted on 2025-10-26 00:58:08
Authors: Xianzhong Cheng
Comments: 6 Pages.
Objective: To critique and reconstruct the theoretical foundation of "light as electromagnetic waves", proposing an alternative theoretical framework based on a real "ontology".
Method: Through logical analysis, it points out the conceptual confusion between "light has wave-like properties" and "light is an electromagnetic wave"; it reinterprets the LC oscillation circuit as a nuclear magnetic resonance process; it demonstrates that the wavelength measured by diffraction is the wavelength of the "momentum wave" rather than the wavelength of the electromagnetic transverse wave.
Conclusion: Light is a "light soliton" with rest mass [?] and its waviness is an intrinsic "momentum wave". The vacuum electromagnetic properties (ε₀, μ₀) are the ontological properties of the "Qi-state crystalline ether". The geometric view of gravity in general relativity is a mathematical fiction based on false presuppositions[?]. This work provides a new ontological basis for a unified understanding of wave-particle duality.
Category: Classical Physics
[117] viXra:2510.0129 [pdf] submitted on 2025-10-26 23:04:55
Authors: Pravin Kumar Mishra
Comments: 10 Pages. (Note by viXra Admin: Please cite listed scientific reference)
This paper presents a series of theorems and corollaries in two sections. The 2nd section outlines a method for verifying the existence of prime numbers within specific intervals: Postulates 1 and 2 establish a methodology to verify the existence of primes in the intervals shown in Theorem 1, its corollary, and Theorem 2. The 3rd section outlines a prime indicator function that generates all primes sequentially. The construction of this indicator begins from Theorem 3; Theorem 4 provides insights on the sum of all odd composite numbers, and its corollary produces a prime indicator, supported by an illustration of the first few numbers. This work provides new insights into how to use elementary principles and methods.
Category: Number Theory
[116] viXra:2510.0128 [pdf] submitted on 2025-10-26 22:57:23
Authors: Jianming Wang
Comments: 6 Pages.
How can the masses of the up quarks and the down quark in the proton be extracted? In this paper, a new extraction method is found. Although the exact masses of the up quark and the down quark in the proton cannot be obtained, the accurate ratio of the up quark mass to the down quark mass can be extracted, and the mass relationship among the three valence quarks can be found. Based on the experimental results of SeaQuest carried out by Fermilab in 2021, the accurate ratio of the up quark to the down quark mass (mu/md = 0.707) is obtained through analysis, and a mass triangle is established. It is deduced that the sum of the squares of the up quark masses in the proton is equal to the square of the down quark mass; the mathematical expression is mu² + mu² = md². From the observed decay laws of other baryons, a quark law in baryons is obtained. The basic content of the quark law: except in the proton, the heavy quarks in other baryons will decay into up or down quarks, and the sum of the squares of the up quark masses should be equal to the square of the down quark mass.
Category: High Energy Particle Physics
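The quoted ratio and the stated mass relation in the abstract above are consistent with each other; spelled out (simple algebra, not an additional claim of the paper):

```latex
m_u^2 + m_u^2 = m_d^2 \;\Longrightarrow\; \frac{m_u}{m_d} = \frac{1}{\sqrt{2}} \approx 0.707,
```

which matches the extracted ratio $m_u/m_d = 0.707$ for the two up quarks and one down quark ($uud$) of the proton.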
[115] viXra:2510.0127 [pdf] submitted on 2025-10-26 22:51:54
Authors: Jean-Yves Boulay
Comments: 47 Pages.
Grounded in a novel mathematical framework, this study partitions the set of whole numbers (ℕ₀) into four distinct hierarchical classes. A key innovation is the definition of Ultimate Numbers, the union of the prime numbers with zero and one, which resolves classic conceptual limitations. Three further subsets, representing increasing degrees of numerical complexity, are subsequently defined by the initial distinction between ultimate and non-ultimate numbers within ℕ₀. The structural interaction among these four classes yields unique arithmetic arrangements in their initial distribution, most notably revealing an exact and recurring 3:2 ratio.
Category: Number Theory
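Only the first distinction in the hierarchy is fully specified by the abstract above; a minimal sketch of that class (the three finer subsets are not defined there and are therefore not modeled):

```python
def is_ultimate(n):
    """'Ultimate' numbers per the abstract: the primes together with 0 and 1."""
    if n in (0, 1):
        return True
    if n < 2:
        return False
    # Trial division up to sqrt(n) suffices for a primality check here.
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

ultimates = [n for n in range(20) if is_ultimate(n)]
assert ultimates == [0, 1, 2, 3, 5, 7, 11, 13, 17, 19]
```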
[114] viXra:2510.0126 [pdf] submitted on 2025-10-26 22:48:15
Authors: Amin Bagheri
Comments: 6 Pages. (Note by viXra Admin: Please cite listed scientific reference and submit article written with AI assistance to ai.viXra.org)
The number e, Euler's number, plays an important role in the fields of analysis, differential equations, and number theory. In 1873, Charles Hermite proved the transcendence of e, which was a fundamental achievement in the theory of transcendental numbers. In this note, we focus on the historical and mathematical context surrounding Hermite's original proof, followed by an outline of a modern, elegant, and straightforward approach to proving the transcendence of e. This paper presents a clear, succinct, and self-contained treatment aimed at undergraduate students and early-career researchers, shifting the focus from heavy technical generality to clarity of explanation and conceptual understanding.
Category: Functions and Analysis
[113] viXra:2510.0125 [pdf] submitted on 2025-10-26 19:16:25
Authors: Richard F. Cronin
Comments: 23 Pages. Astrophysical Phenomena are the Principal Drivers of Earth's Weather & Climate
This lexicon explains terms and supplements other papers by Richard F. Cronin
Category: Astrophysics
[112] viXra:2510.0124 [pdf] submitted on 2025-10-26 20:46:48
Authors: Jaba Tkemaladze
Comments: 24 Pages.
The intensifying frequency of climate disasters, geopolitical conflicts, and pandemics exposes critical vulnerabilities in globalized, input-intensive food systems. Traditional protein sources—terrestrial livestock, crops, and marine fisheries—are highly susceptible to collapse under such catastrophic scenarios due to their dependencies on complex supply chains, external inputs, and stable climatic conditions. This article posits that lake-based aquaculture represents a strategically undervalued yet indispensable component of a resilient food security framework. We argue that the inherent characteristics of lacustrine systems—including superior feed conversion ratios, the utilization of natural trophic pathways, and a static "live storage" production model—confer a unique capacity to function autonomously during prolonged infrastructural and logistical breakdowns. The analysis delineates criteria for selecting resilient fish species, advocates for extensive polyculture management models, and outlines strategies for mitigating risks related to disease, genetic resource security, and ecological degradation. Furthermore, a strategic roadmap is proposed for integrating this approach into national policy, emphasizing legislative action, targeted research, economic incentives, and specialized education. The conclusion asserts that proactive investment in developing lake aquaculture as a decentralized protein reserve is a critical imperative for enhancing national food sovereignty and long-term survivability in an era of escalating systemic risks.
Category: Economics and Finance
[111] viXra:2510.0123 [pdf] submitted on 2025-10-26 23:11:37
Authors: E. P. J. de Haas
Comments: 13 Pages. (Note by viXra Admin: For the last time, please submit article written with AI assistance to ai.viXra.org; please also cite other scholars' work)
This paper develops a new way to connect quantum theory and gravitation by placing geometry inside the structure of the Dirac equation itself. A single mathematical object, the gravitational rotor field Qg, replaces the fixed time direction of flat spacetime with a locally curved one. When this rotor is introduced into the Dirac adjoint, the theory automatically reproduces three familiar regimes: ordinary flat space, the weak-field limit that gives gravitational redshift, and the strong-field domain where curvature and its gradients affect the motion of matter. In this formulation, gravity acts by changing the local measure of time and therefore the effective mass of particles. Massive Dirac particles experience this as a renormalization of their rest mass, while massless Weyl particles can acquire a small, curvature-induced mass. The framework thus unifies flat, weak, and strong gravitational behavior inside a single operator, without introducing an external metric or new postulates. It preserves the tested limits of general relativity yet extends them into a directly quantum setting, offering a new language in which mass, curvature, and the flow of time emerge as different aspects of one spinorial geometry.
Category: Quantum Gravity and String Theory
[110] viXra:2510.0122 [pdf] replaced on 2025-11-13 22:23:20
Authors: Bassam Abdul-Baki
Comments: 7 Pages.
In this paper, we discuss some bounds for optimal Golomb rulers.
Category: Combinatorics and Graph Theory
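The object the abstract bounds can be stated in a few lines: a ruler is Golomb when all pairwise differences between its marks are distinct. A minimal check of that property (our own illustration, not the paper's bounds):

```python
from itertools import combinations

def is_golomb(marks):
    """A Golomb ruler: all pairwise differences between marks are distinct."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

assert is_golomb([0, 1, 4, 9, 11])   # a known optimal 5-mark ruler, length 11
assert not is_golomb([0, 1, 2, 4])   # fails: 1-0 == 2-1 and 2-0 == 4-2
```

An optimal ruler of a given order is one of minimal length; exhaustive search over candidates with a predicate like this is the brute-force baseline that bounds aim to shrink.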
[109] viXra:2510.0120 [pdf] submitted on 2025-10-24 21:10:25
Authors: Yu. E. Zevatskiy
Comments: 9 Pages.
The process of neutron beta decay was considered within the framework of a thermodynamic heterogeneous system model formed by gas consisting of neutrino particles moving at the speed of light and massive bodies. The calculation results are in satisfactory agreement with the experimental results. Further development of the model could lead to a theory that describes both weak and gravitational interactions.
Category: Nuclear and Atomic Physics
[108] viXra:2510.0119 [pdf] submitted on 2025-10-24 21:05:27
Authors: Yu. E. Zevatskiy
Comments: 5 Pages.
Are there tools for exploring time in a way similar to how we explore space?
Category: General Science and Philosophy
[107] viXra:2510.0118 [pdf] submitted on 2025-10-24 21:04:21
Authors: Yu. E. Zevatskiy
Comments: 7 Pages. (Note by viXra Admin: Author name is required in the article)
In the framework of the relativistic model representing Euclidean space with three real spatial axes and one axis corresponding to local time, the dynamics of displacements of massless particles in the absence of fields is investigated.
Category: Quantum Physics
[106] viXra:2510.0117 [pdf] submitted on 2025-10-24 21:00:39
Authors: Huanyin Chen, Marjan Sheibani
Comments: 13 Pages.
We establish the first necessary and sufficient conditions for a $2\times 2$ matrix over a local ring to be decomposable into the sum of a tripotent and an invertible matrix. Building on this decomposition, we derive a novel characterization of the generalized Drazin inverse for such matrices. Our approach hinges on a key relationship between this decomposition and the associated tripotent-quasinilpotent decomposition, thereby offering significant new insights into the spectral theory of matrices over local rings.
Category: Algebra
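The decomposition in question, a matrix written as the sum of a tripotent ($E^3 = E$) and an invertible matrix, can be illustrated with a toy numeric instance. This sketch works over the integers and does not implement the paper's local-ring criterion; the matrices are our own examples:

```python
def matmul(A, B):
    """2x2 matrix product over the integers."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def is_tripotent(E):
    """E is tripotent when E^3 == E."""
    return matmul(matmul(E, E), E) == E

def det(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

# Illustrative decomposition A = E + U with E tripotent and U invertible.
E = [[1, 0], [0, -1]]
U = [[1, 1], [0, 1]]
A = [[E[i][j] + U[i][j] for j in range(2)] for i in range(2)]

assert is_tripotent(E) and det(U) != 0
```

Over a general local ring, invertibility of U is membership in the unit group rather than a nonzero determinant over a field; the paper's conditions characterize exactly when such an E and U exist.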
[105] viXra:2510.0116 [pdf] replaced on 2026-04-20 04:16:16
Authors: Johan Aspegren
Comments: 13 Pages.
We give a proof of a sharp projection analogue of the slicing problem. Moreover, we give a geometric proof of the slicing problem.
Category: Functions and Analysis
[104] viXra:2510.0115 [pdf] submitted on 2025-10-24 21:28:23
Authors: Daniel Suh, Akhil Patel, Sriyan Daggubati
Comments: 5 Pages.
This study examined the effects of 12 biochemical agents, including chemotherapies, a dilution series of Etoposide, copper (II) ions, and other bioactive compounds, on tumor cell migration using a fibroblast scratch assay. Agents were applied to adult mouse fibroblasts, and migration was quantified via microscopy and cell counts. Chemotherapies such as 5-Fluorouracil, Cytarabine, Etoposide, and Cycloheximide showed varied effects based on mechanisms like DNA synthesis inhibition and apoptosis induction. Copper (II) ions and Vanadate promoted mitochondrial damage and tumor suppressor activation, while Retinoic Acid inhibited proliferation and metastasis-related signaling. Resveratrol demonstrated the strongest inhibition of cell migration, with logistic growth modeling suggesting lasting effects on tumor proliferation. These results highlight how Resveratrol can act as a dual-action therapeutic and support further investigation into its use in cancer treatment.
Category: Quantitative Biology
[103] viXra:2510.0114 [pdf] submitted on 2025-10-23 09:31:07
Authors: Taha Sochi
Comments: 67 Pages.
This is the fifth article in our series "The Dark Sides of Modern Science" and is about the curse of science (and knowledge in general). The remarks that we stated in the Introduction of the first article of this series (i.e. "Knowledge Production and Authoring") generally apply to this article, and hence we do not need to repeat them here.
Category: General Science and Philosophy
[102] viXra:2510.0113 [pdf] submitted on 2025-10-23 21:07:20
Authors: Rik Gielen
Comments: 41 Pages.
The Cosmic Microwave Background (CMB) is powerful evidence for the Big Bang theory and is accepted as such by the vast majority of the scientific community. There are always those who doubt the CMB and its interpretation. I am one of those! A sneak peek: "The CMB started 380,000 years after the big bang, when the universe became transparent. The entire universe was filled with mainly neutral hydrogen, and flashes of light (photons) were shooting everywhere and in various directions through the cosmos. The cosmos continued to expand, but no new flashes of light were generated, which means that the photons did not expand with the cosmos either. The hydrogen-filled cosmos expanded, but the photons did not expand with it. They continued to follow their flash directions undisturbed. 300 million years after the big bang, we are interested in some of the CMB radiation near the galaxy JADES-GS-z14-0. We want to see how the two types of radiation behave. The redshift of the CMB 300 million years after the big bang is about 75 according to the scientific literature. From that moment on, the CMB and the light from JADES-GS-z14-0 travel towards us together. I cannot accept that the CMB arrives at z = 1,100 (from about 75 to 1,100) while JADES has a redshift of only 14.32 (from 0 to 14.32). To explain the large redshift of the CMB, a fourth redshift was introduced: the cosmological redshift (as space expands, the wavelength of the radiation increases proportionally). When the CMB undergoes this cosmological redshift, so should the light from JADES. But it doesn't! Therefore I would dare to conclude: the CMB redshift cannot be 1,100." In this paper I try to prove that there is no CMB. What, then, are the radiations measured by Penzias and Wilson in 1964 and subsequently measured with increasing accuracy by the COBE, WMAP and Planck satellites? The measured spectrum of the radiations is a property of the dark, or rather, cosmic energy.
The radiation of this cosmic energy caused the unexpected noise in the radio receivers of Penzias and Wilson. The cosmic energy is full of quantum energy and of particles and antiparticles that appear for a very short time and then disappear again. They give off strange signals, and these signals are called quantum fluctuations. These quantum fluctuations can be picked up as electromagnetic background noise, and these are the signals that researchers call the cosmic background radiation. I would rather call that radiation the Cosmic Foreground Radiation: Cosmic Microwave Foreground (CMF) instead of Cosmic Microwave Background (CMB).
Category: Relativity and Cosmology
[101] viXra:2510.0112 [pdf] submitted on 2025-10-23 21:04:19
Authors: Rik Gielen
Comments: 22 Pages.
As we all know, the universe originated 13.8 billion years ago from a single point in nothingness through a massive burst of energy known today as the big bang. The extremely dense and hot point began to expand and cool down, causing some of the energy to concentrate into particles. According to modern physical theories, this should have created an equal number of matter particles, each with a sort of twin with opposite electric charge: antimatter. In the second after their formation, matter and antimatter particles should have annihilated each other, leaving behind an empty, radiation-filled space with no matter, no gases, no galaxies, and no planets. The problem is that we live in a universe full of matter and very little antimatter. So the question is: what happened to the other half of the universe, the antimatter particles? The disappearance of antimatter is one of the biggest mysteries and most open questions in physics. In this article we will show in detail that it is highly unlikely that baryogenesis is a valid explanation for the asymmetry between matter and antimatter. After that we will attempt to propose a solution that preserves symmetry in the universe. The result: an anti-universe in its own three spatial dimensions.
Category: Relativity and Cosmology
[100] viXra:2510.0111 [pdf] submitted on 2025-10-23 15:28:36
Authors: Jerry Chen
Comments: 2 Pages.
In this paper, we calculate the spin of the gravitational field theoretically, casting doubt on the correctness of the popular spin-2 postulate.
Category: Quantum Gravity and String Theory
[99] viXra:2510.0110 [pdf] submitted on 2025-10-23 21:00:25
Authors: Urs Frauenfelder, Joa Weber
Comments: 45 Pages. 3 figures
In this article we consider the Hessian of the area functional in a non-Darboux chart. This does not seem to have been considered before and leads to an interesting new mathematical structure, which we introduce in this article and refer to as an almost extendable weak Hessian field. Our main result is a Fredholm theorem for Robbin-Salamon operators associated to non-continuous Hessians, which we prove by taking advantage of this new structure.
Category: Functions and Analysis
[98] viXra:2510.0109 [pdf] submitted on 2025-10-23 20:58:40
Authors: Gene H. Barbee
Comments: 31 Pages.
Artificial intelligence technology mimics some aspects of brain function with layers of probability calculations. DNA base pairs consist of AI-like layers and an amazing toolbox. They represent an entire organism, but how they store the parent image and position cells accurately in offspring is obscure. I developed a model of the neutron, proton and electron that represents probabilities that may be carrying out functions needed by DNA to perform its tasks. The proton model describes energy-based features that would allow the DNA layers to position particles, exert forces to separate cells after cell division, and store and recall images. Recalling long-term visual images is interesting: where the images are stored and how they are recalled over long periods of time needs to be better understood. The proton model provides a method of "time stamping" information for later recall. The continued motivation of this work is to understand the role of information in our universe. The author believes there are two levels to nature: 1) an information level and 2) a correlated energy level. This belief is based on data that leads to an information-based proton mass model. This model helps us understand nature. Duplication of protons creates the huge universe around us. Information from the proton model underlies a cosmological expansion model based on an accurate definition of time and space.
Category: Physics of Biology
[97] viXra:2510.0108 [pdf] submitted on 2025-10-23 18:07:55
Authors: Fernando Reyes
Comments: 8 Pages.
Modern social technologies often paradoxically inhibit direct, meaningful human connection. We address this gap by proposing the Curated Interaction Model, a novel client-side framework designed to orchestrate engaging, face-to-face conversations among small groups of young adults (ages 21-29) in a controlled social setting. The model employs a lightweight, three-item psychometric classifier to assign participants to one of six empirically derived conversational profiles. Based on the specific compositional dynamics of a five- to six-person group, the framework's orchestration algorithm selects and blends curated, versioned collections of discussion prompts ("decks"). This process adaptively guides the group toward balanced, high-quality dialogue. Critically, all orchestration logic executes on-device (edge-first), a design choice that minimizes cloud dependency, reduces interactional latency, and inherently preserves user privacy. The model's architecture is grounded in established social-psychological principles, including optimal dissonance, conversational flow, and social identity theory, to foster emergent communion. In this paper, we formalize the orchestration algorithm, provide a rigorous terminological framework, and present extended system visualizations, including client-server data flow and a session state transition diagram. We offer implementation sketches in Swift/Objective-C and delineate a comprehensive evaluation plan, complete with statistical power considerations. We conclude by proposing empirical validation pathways (simulation and pilot studies) and discussing the model's potential for generalization beyond the initial demographic cohort.
Category: Data Structures and Algorithms
[96] viXra:2510.0107 [pdf] submitted on 2025-10-22 07:31:19
Authors: Zhi Li, Hua Li
Comments: 15 Pages.
Finding the roots of polynomial equations is a fundamental problem in mathematics. This paper discovers that general polynomial equations can be simplified into a canonical or standard form through Tschirnhaus transformations. A power series representation consisting of coefficients in the canonical or standard form is a universal representation of the roots of polynomial equations. If the series converges, a root of the equation is obtained. If the series does not converge, it can be further transformed through one or more Tschirnhaus transformations to obtain a convergent series representation. This method is applicable to higher-degree polynomial equations with real and complex coefficients, avoiding the complex determination of whether they are solvable in radicals, and has universal significance. This advance returns the problem of finding polynomial roots to the realm of pure algebra, using only polynomial transformations and multivariable power series.
Category: General Mathematics
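The simplest Tschirnhaus transformation of the kind the abstract invokes is the linear shift x = y - a/3, which removes the quadratic term of a cubic. A minimal sketch of that one step (our own illustration, not the paper's general reduction or its power-series machinery):

```python
def depress_cubic(a, b, c):
    """Tschirnhaus shift x = y - a/3 turns x^3 + a x^2 + b x + c
    into the depressed form y^3 + p y + q (quadratic term removed)."""
    p = b - a * a / 3.0
    q = 2.0 * a**3 / 27.0 - a * b / 3.0 + c
    return p, q

# x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3); the shift x = y + 2
# maps the roots 1, 2, 3 to y = -1, 0, 1.
p, q = depress_cubic(-6.0, 11.0, -6.0)
for y in (-1.0, 0.0, 1.0):
    assert abs(y**3 + p * y + q) < 1e-9
```

Higher-order Tschirnhaus transformations substitute a polynomial in x rather than a linear shift, eliminating further coefficients to reach the canonical forms the paper expands in power series.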
[95] viXra:2510.0105 [pdf] submitted on 2025-10-21 20:46:23
Authors: Ulrich Schreier
Comments: 12 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
This paper introduces Anton Bopp's (1900-1971) comprehensive vibratory-hydrodynamic model to the international scientific community. Bopp's unpublished manuscripts present a unified theory spanning cosmic to atomic and subatomic scales, using the hydrogen atom as a universal archetype. The model demonstrates a remarkable capacity to derive fundamental physical constants through mathematical relationships involving the foundational set $\{\pi, e, 1, 2, 3\}$. While Bopp's primary focus was operational (developing practical tools for physical applications), his work reveals, as a natural byproduct, deep connections between mathematical principles and physical reality. This paper highlights selected findings of particular significance, situating Bopp's contributions within the historical continuum from Pythagorean harmony to Wigner's "unreasonable effectiveness of mathematics," suggesting that physical laws emerge from mathematical substrates rather than merely being described by them.
Category: Mathematical Physics
[94] viXra:2510.0104 [pdf] replaced on 2025-10-27 18:53:32
Authors: Tariq Khan
Comments: 14 Pages.
An informal essay proposing engineering designs or options that could help the United States intercept a single-missile rogue nuclear attack like that presented in the 2025 Netflix motion picture "A House of Dynamite." Theoretical modern technology designs are presented that the United States Department of Defense and advanced military armament and space organizations can consider for possible near-term and future platforms, including "Golden Dome" proposals.
Category: History and Philosophy of Physics
[93] viXra:2510.0103 [pdf] submitted on 2025-10-21 04:54:57
Authors: Xiaodong Liu, Qichang Liang, Yu Liang
Comments: 2 Pages.
In this work, a gas discharge tube with an isolated power connection was studied. Since the power connection is isolated, there is no current through the tube and thus no power consumption. The gas inside the tube emits light, which can be absorbed by solar cells and converted to power output. This device can be used as a power generator.
Category: Classical Physics
[92] viXra:2510.0102 [pdf] submitted on 2025-10-21 20:42:19
Authors: Taiki Takahashi
Comments: 10 Pages.
Recent advances in behavioral economics and in quantum cognition and decision have elucidated a number of deviations of actual human decisions and choices from the mathematical principles of normative decision theory, which are referred to as "anomalies". One of the prominent anomalies is the violation of Savage's sure-thing principle, which is a fundamental axiom of the rational theory of decision under uncertainty. It states that if prospect x is preferred to y knowing that Event A occurred, and if x is preferred to y knowing that A did not occur, then x should be preferred to y even when it is not known whether A occurred. I explicitly derive an equality for testing the violations of Savage's principle in behavioral experiments on decision under uncertainty. Future applications for behavioral and neuroeconomics and for quantum cognition and decision theory are discussed.
Category: Economics and Finance
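Under classical expected utility the sure-thing principle cannot fail, which is why its empirical violation counts as an anomaly. A minimal numeric sketch of the classical side (the utilities are illustrative numbers of our own choosing, not the paper's equality):

```python
def total_utility(u_given_A, u_given_not_A, p_A):
    """Law of total expectation: E[u] = P(A) E[u|A] + P(not A) E[u|not A]."""
    return p_A * u_given_A + (1 - p_A) * u_given_not_A

# Prospect x is preferred to y both when A occurs and when it does not
# (illustrative conditional utilities):
u_x_A, u_x_not_A = 10.0, 4.0
u_y_A, u_y_not_A = 8.0, 3.0

# ...so classically x must be preferred to y whatever P(A) is,
# since a convex combination of strictly larger values stays larger.
for i in range(11):
    p = i / 10
    assert total_utility(u_x_A, u_x_not_A, p) > total_utility(u_y_A, u_y_not_A, p)
```

Observed preference reversals therefore require either non-classical probability weighting (as in quantum cognition models) or a departure from the law of total expectation itself.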
[91] viXra:2510.0101 [pdf] submitted on 2025-10-21 20:35:19
Authors: Samuel A. Prescott
Comments: 10 Pages. License: CC BY 4.0 (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
The Λ-Ω Advanced Scalar Field Law provides a unified field-theoretic description of gravitation and cosmic expansion, demonstrating that inertia and gravity arise from local coupling between matter and the pervasive Λ-field rather than spacetime curvature. This coupling, modulated through frequency and phase interactions, yields modulated inertia and drives cosmic acceleration without invoking dark matter or dark energy. Numerical simulations of field resonances yield stable normalized densities ⟨ρ⟩ = 0.998 ± 0.002 and fractal evolution from Df(z = 5) ≈ 2.5 to Df(z = 0) ≈ 1.85, addressing key cosmological tensions and predicting measurable deviations in ΔH/H and Δχ consistent with DESI and LISA forecasts.
Category: Relativity and Cosmology
[90] viXra:2510.0100 [pdf] submitted on 2025-10-21 20:30:58
Authors: Miroslav Pardy
Comments: 7 Pages. Original article
The Schwinger proper-time method, which avoids the direct use of the wave function, is used to calculate the vacuum polarization in the presence of a homogeneous magnetic field. The derivation gives the same results as obtained by Adler (1971) and Minguzi (1956).
Category: High Energy Particle Physics
[89] viXra:2510.0099 [pdf] submitted on 2025-10-20 16:10:26
Authors: A. J. Owen
Comments: 6 Pages.
In Einstein's theory of general relativity (GR), gravitation is considered a consequence of space and time curvature, whereas Newton's law of gravity applies strictly in a Euclidean or flat space. Logically, then, Newtonian gravity must relate solely to the time-curvature contribution in GR, and instances where Newton's law does not describe phenomena correctly, such as the perihelion rotation of the planet Mercury and the bending of starlight, must be attributable to an effect of spatial curvature. In this paper, the GR solution for a static point mass is calculated on this basis for correspondence with Newton's law, and found to be crucially different from the usual interpretation in the current scientific literature. It is shown here that gravitational attraction does not diverge to infinity as masses approach each other, but tails off to zero. This means there is no singularity at the origin of coordinates where physical laws would break down, and, furthermore, speeds of free-falling objects do not exceed the speed of light. There is also no event horizon obscuring a black hole at the origin of coordinates, since the spacetime behaves perfectly regularly.
Category: Relativity and Cosmology
[88] viXra:2510.0098 [pdf] replaced on 2025-12-26 18:42:33
Authors: Richard L. Hudson
Comments: 6 Pages.
The game show problem originated when Steve Selvin submitted his paper "A Problem in Probability" to The American Statistician magazine in February 1975. [1] After receiving some negative reviews from readers, he submitted a second letter to the editor in August 1975. [1] There was little public awareness of his game show analysis until 1990, when Craig Whitaker posed a question about a winning strategy for a similar game show to Marilyn vos Savant, who wrote articles for Parade magazine. The debate over the probability of success, 2/3 vs. 1/2, has continued to this day. This paper reveals a common error in both the Selvin and vos Savant interpretations.
Category: Set Theory and Logic
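The 2/3-vs-1/2 debate the abstract revisits can be checked empirically with a short Monte Carlo simulation of the standard game (a generic sketch under the usual rules, not the paper's analysis):

```python
import random

def monty_hall(trials=100_000, switch=True, seed=0):
    """Simulate the standard game: 3 doors, 1 car; the host always opens
    a goat door the contestant did not pick, then offers a switch."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # Host opens a door that hides a goat and was not picked.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == car
    return wins / trials

print(monty_hall(switch=True))   # close to 2/3
print(monty_hall(switch=False))  # close to 1/3
```

Switching wins exactly when the first pick was a goat (probability 2/3), which the simulation reproduces; the 1/2 answer corresponds to a different game in which the host opens a random door.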
[87] viXra:2510.0097 [pdf] replaced on 2025-11-10 01:42:52
Authors: Xianzhong Cheng
Comments: 6 Pages.
This paper proposes a novel neutron composite model, describing the neutron as a two-level bound state composed of a proton (p), an electron (e⁻), and a sterile antineutrino (ν̄) through electromagnetic and magnetic moment interactions. The core mechanism lies in the orbital instability caused by magnetic moment perturbations within the W⁻ bound state (e⁻-ν̄), which drives β⁻ decay. By introducing quantization conditions for orbital angular momentum, a closed self-consistent system of equations is constructed and solved. For the first time, this theory directly derives key internal parameters of the neutron from first principles, including its radius (~1.28 fm) and the orbital velocity of the W⁻ bound state (~0.115c). It also predicts the magnetic moment of the sterile antineutrino (3.64×10⁻¹⁰) and its relativistic velocity within the bound state (0.54c)[1]. Based on this framework, the neutron lifetime is naturally predicted as 878.4 seconds, with a deviation of only -0.20% compared to the experimental value (880.2±1.0 seconds)[2]. Furthermore, the probabilistic nature of decay (half-life) is attributed to quantum tunneling effects at the critical point of orbital instability, unifying the dynamical process of decay with quantum statistical outcomes within a single theoretical framework.
Category: Nuclear and Atomic Physics
[86] viXra:2510.0095 [pdf] replaced on 2025-11-06 02:51:23
Authors: Vadim Khoruzhenko
Comments: 55 Pages. (Note by viXra Admin: Please cite listed scientific references and submit article written with AI assistance to ai.viXra.org)
This article proposes a radical simplification of physics foundations through the introduction of the concept of protomatter - the imaginary density of space, representing an additional non-geometric dimension. It is shown that this concept allows for a unified description of phenomena that previously required separate entities: charges become density clusters, the electric field becomes its gradient, the magnetic field becomes the gradient of flow, and quantum states become its resonant modes. Within the formalism, Coulomb's law and the Biot-Savart law are derived from first principles, Bohr's postulate is justified, and the finiteness of the self-energy of a charge is demonstrated. The model does not contradict experimental data but reinterprets them, indicating the derivative nature of the magnetic field and the existence of absolute (gravitationally-bound) reference frames. Dark matter and dark energy are identified with the very fabric of protomatter, eliminating the need for hypothetical particles.
Category: Classical Physics
[85] viXra:2510.0094 [pdf] submitted on 2025-10-18 14:25:18
Authors: Herbert Weidner
Comments: 17 Pages.
Long-term atmospheric pressure measurements across Central Europe reveal coherent oscillations at ultra-low frequencies (4--7 $\mu$Hz), persisting over two decades. Using a demodulation technique adapted from signal processing, we identify reproducible spectral lines with complex phase modulation patterns. These signals are synchronized across geographically distant sensors and exhibit modulation periods ranging from months to decades. A prominent annual modulation suggests a correlation with Earth's orbital motion, implying a non-local origin. The inferred wave propagation speed ($\sim 2.4\times 10^6$ m/s) and wavelength ($\sim 5\times 10^{11}$ m) exceed known atmospheric dynamics, challenging current geophysical models. While the source remains unidentified, the findings suggest the presence of a previously unrecognized class of long-range atmospheric oscillations with potential astrophysical relevance.
Category: Geophysics
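A complex-demodulation step of the general kind the abstract describes can be sketched as mixing the pressure series down to DC at the target frequency and averaging. All parameters below are illustrative synthetic values, not the paper's data or code:

```python
import cmath
import math

def complex_demod(samples, f0, dt):
    """Complex demodulation: mix the series down to DC at frequency f0
    and average, which acts as a crude low-pass filter."""
    z = sum(x * cmath.exp(-2j * math.pi * f0 * k * dt)
            for k, x in enumerate(samples)) / len(samples)
    return 2 * abs(z), cmath.phase(z)   # amplitude and phase of the line

# Synthetic 20-year hourly barometer record carrying one 5 uHz line.
dt = 3600.0                       # one sample per hour, in seconds
f0, phi = 5e-6, 0.7               # 5 uHz line with a known phase
x = [math.cos(2 * math.pi * f0 * k * dt + phi) for k in range(24 * 365 * 20)]

amp, phase = complex_demod(x, f0, dt)
assert abs(amp - 1.0) < 1e-2 and abs(phase - phi) < 1e-2
```

Tracking the recovered phase in sliding windows, rather than over the whole record, is what exposes the slow phase-modulation patterns a plain Fourier spectrum smears out.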
[84] viXra:2510.0091 [pdf] submitted on 2025-10-17 03:21:55
Authors: Taiwei Song
Comments: 10 Pages.
Based on the Geometry of Space-Time Structures founded by the author himself, this paper demonstrates the physical nature of the integration of time, space and matter at the extremely small spatial scale, and the natural essence of strong interactions between particles; proves that the proton p+ and the electron e- are the most fundamental particles of matter in the natural world; reveals the material structure and creative logic of neutrons and atomic nuclei; establishes precise mathematical relationships between neutron and atomic-nucleus structures and discrete-space fractional dimensions; and reveals various new quantum state structures of pep quantum pairs formed by adjacent protons in atomic nuclei and their relationships with natural number sets, etc.
Category: High Energy Particle Physics
[83] viXra:2510.0090 [pdf] submitted on 2025-10-17 06:27:48
Authors: Eric Louis Beaubien
Comments: 4 Pages.
This article quantifies the known measured magnetic moment of the neutron corrected from a previous article. My original neutron viXra article (2507.0122) gave the exact numbers for the neutron mass and its Compton wavelength derived from NIST constants (codata 2022), but the magnetic moment it generated was incorrect. Herein that measure is corrected through a re-imagining of the electron’s magnetic field. This solution yields the correct magnetic moment while retaining the previous neutron mass and Compton wavelength calculations.
Category: Nuclear and Atomic Physics
[82] viXra:2510.0089 [pdf] replaced on 2025-11-05 04:44:56
Authors: Ervin Goldfain
Comments: 30 Pages.
We develop a theoretical framework in which the equations of General Relativity (GR) emerge from dimensional fluctuations of the early Universe. The derivation points out that primordial fluctuations in the effective dimensionality of spacetime are governed by the complex Ginzburg-Landau equation (CGLE), which is a coarse-grained description of complex dynamics near the Planck scale. Elaborating on the behavior of the CGLE as generic hydrodynamic flow, our paper offers an intriguing path from the fractal dimensionality of primordial cosmology to the onset of gravitational physics in the late Universe.
Category: Relativity and Cosmology
[81] viXra:2510.0088 [pdf] replaced on 2025-11-09 05:41:20
Authors: Warren D. Smith
Comments: 57 Pages. v3 adds new section discussing experiments that cast doubt on Hypothesis I - perhaps fatally.
I. HYPOTHESIS: Aqueous RNA becomes stable against hydrolysis under high hydrostatic pressure, 1-4000 atm. Weaker also-sufficient hypothesis: it works for some exponentially-large subset of RNA-like molecules; also, salinity has most of the same beneficial effects as pressure, and both can work together. This and related hypotheses can resolve the gaping holes in L. E. Orgel's "RNA world" paradigm about life's origin. It deserves direct experimental tests, but even without them we present 4 independent lines of evidence for it, apparently yielding confidence > 99.9999%.
II. I explain how to measure the "true vital information content" of a lifeform. It is hugely wrong to claim an N-base pair DNA genome has information content 2N bits. The simplest estimate instead should be
Together, ideas I (if correct) and II overcome the two top obstacles blocking understanding of how genesis could have happened, and suggest many experiments. Unfortunately, version 3 adds a new section discussing experiments that cast doubt on Hypothesis I, perhaps fatally.
Category: Biochemistry
[80] viXra:2510.0087 [pdf] submitted on 2025-10-16 05:45:49
Authors: Taiwei Song
Comments: 8 Pages.
Based on the Geometry of Space-Time Structures founded by the author himself, this paper uses an accurate mathematical model to explain the space warp problem of the giant world, deduces the curvature formula of the optical path curve of the natural space-time space, and explains its physical significance; demonstrates the space-time transformation relationship between the geometric scene of natural space-time space and the visual image of the observer, and gives the space-time transformation equation; discusses the natural properties of the cosmic redshift, derives the redshift differential formula; proposes the experimental methods to verify the new space-time relationships of photons, etc.
Category: Mathematical Physics
[79] viXra:2510.0086 [pdf] submitted on 2025-10-16 07:16:59
Authors: Jorma Jormakka
Comments: 17 Pages.
The result of this article is that the gravitational field is a field in flat space, not the geometry of space-time. This result is reached from eight considerations of General Relativity and a scalar gravitation field. The reasons why gravitation is not space-time geometry include the following. The Einstein equations do not have valid solutions that can describe the gravitational field of the Sun and the Earth, and therefore they are wrong. The geodesic metric of General Relativity does not work. General Relativity cannot be quantised. Einstein's equations are not derived from anywhere. But the strongest arguments are from considering when a mass can be reduced into a point mass. This is possible only if the ball of the geometry grows as r^2, implying flat 3-dimensional spatial space.
Category: Relativity and Cosmology
[78] viXra:2510.0085 [pdf] replaced on 2025-11-20 00:52:59
Authors: Ding Jian
Comments: 10 Pages. In Chinese
There is continuity between truth and the relevant objective thing in reality, and its intrinsic mechanism is inertia. Inertia is an inherent characteristic of objective things, and truth resides there. Inertia generates continuity, which is a necessary condition for reasoning. It can go from the quantitative change of real space all the way deep into the qualitative change of the ideal realm, and expand the philosophical view of materialism to the category of metaphysics. In the fundamental part of Eastern and Western philosophy, this absence has lasted more than two thousand years. From this it can be seen that the law of the unity of opposites should have been a Trialism; in fact we use it every day, just without deliberately reflecting upon it. Once a consensus is reached, it means the unification of Eastern and Western philosophies. Based on this, norms for identifying truth are given, and the true nature of metaphysics is restored through norms and definitions. In addition, a definition of philosophy is discussed and given. The Trialism on things' limits is founded, which resolves the dilemma that truth has no place in Dualism, and thereby this theory is used to perfect materialism as well as the unity of opposites of all knowledge.
Category: General Science and Philosophy
[77] viXra:2510.0084 [pdf] submitted on 2025-10-16 22:19:23
Authors: Mikhail Batanov-Gaukhman
Comments: 34 Pages.
This article is the thirteenth part of the scientific project under the general title "Geometrized Vacuum Physics Based on the Algebra of Signature" [1,2,3,4,5,6,7,8,9,10,11,12]. This article is aimed at substantiating the assertion that there is no difference in the mathematical description of the behavior of objects in the macrocosm and the microcosm. The hierarchical cosmological model proposed in the previous articles of this project assumes that the metric-dynamic models of all "corpuscles", regardless of their size (for example, "elementary particles", naked "planets" and "stars", as well as naked "galaxies") are structured almost identically. The main differences between them are associated primarily with the distinguishability of small details. The larger the "corpuscle", the more subtly its infrastructure is manifested. However, the similarity of "corpuscles" of different sizes is not limited to the coincidence of their shape. Their random movements (i.e., chaotic deviations of the core of the "corpuscles" from their mean positions) also obey the same laws. The article presents the derivation of the stochastic Schrödinger equations and self-diffusion equation, suitable for describing the averaged (including quantized) states of stochastic systems of any scale. It is shown that, for example, the chaotically shifting core of a planet (or star) can have a quantum set of possible averaged states, similar to the excited states of an electron in an atom. It is suggested that when the core of a planet (or star) transitions from one quantum state to another, the interior of this celestial body can absorb or emit gravitational waves. This hypothesis may form the basis of stellar-planetary gravitational spectroscopy.
Category: Classical Physics
[76] viXra:2510.0083 [pdf] submitted on 2025-10-16 22:10:12
Authors: Juan Moreno Borrallo
Comments: 32 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
This work presents a framework for the derivation of key physical constants and their interrelations, building the basis for the geometric structure underlying the Quantum-Elastic Geometry (QEG) model, a unified field theory in which spacetime is described as an elastic substrate. From the QEG Lagrangian and a generally covariant action, we assume the foundational principles of homogeneity, isotropy, covariance, and Lorentz invariance for a physical substrate, showing that imposing self-consistency (formalized through a minimal set of geometric normalization conditions compatible with the QEG basis) uniquely determines the substrate's emergent structure and properties. The outcome is a deductive framework in which fundamental constants are geometrically enforced, emerging as predictable consequences of a stable and symmetrically constrained geometry.
Category: Relativity and Cosmology
[75] viXra:2510.0081 [pdf] replaced on 2026-05-03 15:26:28
Authors: Eric Zhuang
Comments: 26 Pages.
Lychee (Litchi chinensis Sonn.) is an economically important tropical fruit threatened by litchi downy blight, caused by the oomycete Peronophythora litchii. To prioritize genes and regulatory features associated with early host responses, we reanalyzed public leaf RNA-seq data from Guiwei and Yurong cultivars (GSE201243; mock and inoculated samples, three biological replicates per group). Differential-expression analysis identified 17 significant DEGs in Guiwei and 117 in Yurong at adjusted P < 0.05, suggesting a broader transcriptional response in Yurong. Eighteen highly responsive genes were selected for structural, motif, promoter, and candidate interaction analyses. These candidates included annotations associated with defense or stress responses, including jacalin-like lectin, pathogenesis-related protein, heat-shock protein, thaumatin/osmotin-like proteins, cytochrome P450, lipid-transfer proteins, and calcium-binding proteins. A selected-gene interaction contrast prioritized LITCHI019519, LITCHI001510, LITCHI028401, LITCHI017676, LITCHI028104, and LITCHI019183 as potential cultivar-dependent response genes. Promoter analysis of 2-kb upstream regions detected stress- and hormone-associated cis-elements, and a random-promoter background test supported enrichment of ARE, MeJA-responsive motifs, TCA-elements, and TC-rich repeats. These results provide testable candidates for future functional validation of lychee resistance to P. litchii.
Category: Quantitative Biology
[74] viXra:2510.0080 [pdf] replaced on 2025-11-12 23:21:50
Authors: James Johan Sebastian Allen
Comments: 66 pages. English. Updated operator formulation and equilibrium proof. Includes revised references and appendices.
This paper presents a functional proof strategy for the Riemann Hypothesis within the Pattern Field Theory framework, using the Allen Orbital Lattice (AOL) as the geometric and arithmetic substrate. The approach establishes an equivalence between two independently constructed systems: the continuous Equilibrion Hamiltonian, which describes recursive curvature balance along the critical line, and the discrete AOL operator, whose prime-anchored spectrum exhibits Gaussian Unitary Ensemble statistics after unfolding. The analysis shows that both systems produce self-adjoint spectra aligned to Re(s)=1/2, with nontrivial zeros interpreted as equilibrium nodes of the prime field. The operator includes prime-weighted potentials and duplex curvature phases, producing spectral behavior consistent with Riemann-class dynamics. Numerical diagnostics across 30-50 lattice shells confirm Wigner-Dyson spacing, with Kolmogorov-Smirnov and Cramér-von Mises distances stable under randomized phase ensembles. Control experiments removing the prime anchors fail to reproduce this universality, isolating prime weighting as a necessary structural condition. The paper incorporates the updated operator formulation, extended curvature analysis, and cross-references to related Pattern Field Theory results, including the conduction constant tau = 71.2 ± 3.9 ms measured during Pattern Alignment Lock formation. These results support the claim that the prime-indexed curvature harmonics on the AOL constitute a physical equilibrium field and that the Riemann Hypothesis corresponds to its stationary manifold. The work integrates mathematical, geometric, and empirical components into a unified framework and provides a stable model linking prime recursion to field equilibrium. This version includes updated references, consistency corrections, and the completed analytic-geometric correspondence across the continuous and discrete representations.
Category: Mathematical Physics
[73] viXra:2510.0079 [pdf] submitted on 2025-10-15 20:39:39
Authors: Jaba Tkemaladze
Comments: 23 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
This article presents the Ze artificial life system, a novel bio-inspired architecture for predictive processing in infinite data streams under severe memory constraints. The system implements Bayesian probability updating through a mechanism of dynamic chronotropic frequency analysis, demonstrating remarkable computational efficiency and biological plausibility. Unlike traditional approaches such as LSTM networks and Markov models, Ze processes information through parallel beginning and inverse processors, enabling complementary pattern discovery while maintaining sublinear memory complexity. The core algorithm exhibits distinctive probability dynamics characterized by an initial match probability of 0.5 with exponential decay to 0.00001 as counter diversity increases, achieving 78-92% prediction accuracy for stable data flows. Experimental results using synthetic datasets (1,048,576 binary sequences) confirm 37-42% operational savings compared to conventional methods, rapid adaptation to changing stream characteristics within 2-3 seconds, and robust noise tolerance up to 15% input distortion. The Go implementation processes 1.2 million operations per second with 850 nanosecond latency while maintaining memory usage of 12.8 bytes per counter. The system's architecture shows strong neurobiological correlations with predictive coding principles and synaptic plasticity mechanisms, providing both a practical solution for resource-constrained environments and a computational model of Bayesian inference in neural systems. Future development pathways include extension to non-binary data streams, integration with hierarchical Bayesian models, and hardware acceleration through memristor-based implementations.
Category: Artificial Intelligence
[72] viXra:2510.0078 [pdf] replaced on 2025-12-22 20:14:32
Authors: Han de Bruijn
Comments: 16 Pages.
Starting with the Hydrogen atom as an example, a dimensionless formulation of the Schrödinger equation for many electron atoms is presented. It is shown that the energy levels E of such a system are only dependent on the fine structure constant α, the atomic number Z, the speed of light c in vacuum and elementary particle (rest) masses m of electron and nucleus. If α and c may be assumed to be constants of nature, then E is only dependent on mass m. Introducing the Variable Mass hypothesis (VM) leads to formulas for the so-called intrinsic redshift. Ticks of Atomic clocks Δt are shown to be inversely proportional to mass m.
Atomic clocks may seem the most accurate nowadays. Common timing in life, however, is still based upon seconds, minutes and hours, as parts of daylength, and days as parts of a year. This may be called the Orbital timeframe. By considering orbits of planets such as the earth, it is argued that ticks of Orbital clocks ΔT are inversely proportional to the square root of mass m.
If it is accepted that there are two clocks instead of one, then there is a discrepancy between the Atomic ticks Δt and the Orbital ticks ΔT, due to Variable Mass. A simple differential equation is derived that relates the two timeframes. For mass in Atomic time t, a formula can be derived that shows exponential growth. With help of the differential equation, a formula can be derived for mass in Orbital time T as well, showing quadratic growth and a beginning at T = -A, meaning that there is an age A.
Empirical evidence for both the quadratic and exponential growth formulas is found in the Shrinking Kilogram. Quite unexpectedly, our version of the VM theory is consistent with the infamous C-decay. On the other hand, Variable Mass offers an explanation for the anomalous Earth's Rotation Retardation. Finally, it is proved that Leap Seconds cannot be avoided when Atomic time and Orbital time are combined with the Variable Mass hypothesis. And there is a very simple formula for calculating them.
Category: Astrophysics
[71] viXra:2510.0077 [pdf] replaced on 2025-12-20 02:02:16
Authors: Keshava Prasad Halemane
Comments: 11 Pages. 2 Tables
This research report presents the statement of the Monty-Hall Theorem and provides a constructive proof by solving the classical Monty-Hall Problem. It establishes the fact that the probability of winning the prize is indeed unaffected by a switched choice, very much unlike the most prevalent and widely accepted position held by the leading subject-matter experts.
Category: Statistics
[70] viXra:2510.0076 [pdf] submitted on 2025-10-14 08:46:26
Authors: Marciano L. Legarde
Comments: 7 Pages.
This study explores the Antiderivative Power Rule Sequence, demonstrating how its infinite series leads to the polylogarithm. By iteratively applying the power rule for antiderivatives to successive powers of x, we derive the sequence, which, when expressed as an infinite series, converges to -ln(1-x). Differentiating the resulting series recovers the geometric series, highlighting a profound inverse relationship between 1/(1-x) and -ln(1-x). Furthermore, this formulation establishes a natural connection to the polylogarithm function, generalizing the relationship for higher orders of integration. This work provides both pedagogical and theoretical insights, reconstructing a transcendental function from elementary calculus operations.
Category: General Mathematics
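The inverse relationship described in the abstract above can be checked numerically. The sketch below is my own illustration, not code from the paper: it sums the antiderivative series, its term-by-term derivative (the geometric series), and the generalizing polylogarithm.

```python
import math

def neg_log_series(x, terms=200):
    # partial sum of x + x^2/2 + x^3/3 + ..., which converges to -ln(1-x) for |x| < 1
    return sum(x**n / n for n in range(1, terms + 1))

def geometric_series(x, terms=200):
    # term-by-term derivative of the series above: 1 + x + x^2 + ... = 1/(1-x)
    return sum(x**n for n in range(terms + 1))

def polylog(s, x, terms=200):
    # Li_s(x) = sum_{n>=1} x^n / n^s; the case s = 1 recovers -ln(1-x)
    return sum(x**n / n**s for n in range(1, terms + 1))

x = 0.5
print(neg_log_series(x), -math.log(1 - x))  # both ≈ 0.693147
print(geometric_series(x), 1 / (1 - x))     # both ≈ 2.0
```

For |x| < 1 the partial sums converge rapidly, so 200 terms already agree with the closed forms to machine precision at x = 0.5.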
[69] viXra:2510.0075 [pdf] replaced on 2025-10-26 05:48:00
Authors: Jorma Jormakka
Comments: 18 Pages. typo corrections, there was a typo in the title, need to correct it
Section 1 of the article shows that the Schwarzschild metric and cosmological models with similar metrics are invalid because the spatial part of the metric is not a valid Riemannian metric in local Cartesian coordinates, as it should be. Theorem 1 proves that a metric for the spatial part given in the spherical coordinates of R3 with only dr^2, dθ^2 and dϕ^2 terms defines a valid metric in local Cartesian coordinates only if the spatial part of the metric is a scalar metric, i.e., a metric induced by a scalar field. Section 2 has some solutions for a scalar metric in the situation of a point mass in an otherwise empty space. Sections 3 and 4 look at Friedmann's cosmological model from Chapter 5 of Einstein's book compiled from his lectures in Princeton. The findings are that each of Einstein's equations can be solved for a model that only depends on t and r, but the Einstein equations do not have a solution that solves them all and gives a valid metric. Additionally, the Friedmann model does not give the cosmological solution that Einstein's book claims.
Category: Relativity and Cosmology
[68] viXra:2510.0074 [pdf] submitted on 2025-10-14 20:46:43
Authors: E.P.J. de Haas
Comments: 6 Pages. (Note by viXra Admin: Please cite and list scientific references)
The Dirac adjoint is traditionally defined to restore Lorentz covariance of bilinear spinor quantities. In this paper we reinterpret the adjoint within the biquaternion Weyl-Dirac formalism. Starting from the Weyl-level algebra, where the metric is represented by the Pauli biquaternion basis $(T,K)$, we extend to the Dirac level through a parity-time doubling that introduces the basis $\beta_\mu=(\beta_0,\boldsymbol{\beta})$. We demonstrate that the adjoint $\overline{\Psi}=\Psi^\dagger\beta_0$ acts as an algebraic inclusion of the local time-lapse field (the $g_{00}$ metric component), and that its generalization $\overline{\Psi}=\Psi^\dagger \frac{1}{N}(\beta_0-N^i\beta_i)$ incorporates also the spatial shift vector $N^i$. This step algebraically embeds both static and dynamic gravitational effects into the Dirac formalism. We analyze the Weyl-Hilbert and Dirac-Hilbert spaces, showing that only the latter couples to gravity through the adjoint. Finally we assess the stage reached in the ongoing unification of quantum mechanics, special relativity, and general relativity within the biquaternion approach.
Category: Quantum Gravity and String Theory
[67] viXra:2510.0073 [pdf] replaced on 2026-02-11 20:56:44
Authors: Khalid Ibrahim Al-ibraheem
Comments: 18 Pages.
This paper presents a comprehensive and decisive proof of the Collatz conjecture by partitioning the odd integers into three disjoint sets B, C, and D and analyzing their behavior. It then proves the non-existence of non-trivial cycles, and finally applies the conjecture to odd integers starting from the set V = 5 + 12n, demonstrating that all such integers converge to 1. Please refer to v4, as it is clearer and free of typographical errors.
Category: Astrophysics
[66] viXra:2510.0071 [pdf] replaced on 2025-10-17 17:41:39
Authors: Clark M. Thomas
Comments: 7 Pages.
Telepathy among humans is a tantalizing phenomenon. Experimentalists generally say it doesn’t verifiably exist among our known senses. Others insist it is real, but only when we look for it with the proper experimental designs. This essay presents a new phenomenological model.
Category: Mind Science
[65] viXra:2510.0070 [pdf] submitted on 2025-10-14 19:47:52
Authors: Taiki Takahashi
Comments: 10 Pages.
Recent advances in behavioral economics elucidated a number of deviations of actual human decisions and choices from mathematical principles of normative decision theory in neoclassical economics. This study demonstrates, by utilizing the mathematical model of probability discounting theory in behavioral psychology, that normative principles of decision making under risk (von Neumann and Morgenstern’s expected utility theory) and over time (dynamic consistency, i.e., exponential discounting) are incompatible. Possible future applications of this finding in behavioral economics and quantum epistemics are discussed.
Category: Economics and Finance
[64] viXra:2510.0069 [pdf] submitted on 2025-10-13 20:38:07
Authors: Buylin Igor Aleksandrovich
Comments: 15 Pages. In Russian (Note by viXra Admin: All entries on the Submission Form should be in English)
Based on a simple vortex model of solids, as well as the well-known principles of hydrodynamics of an ideal incompressible fluid, namely d'Alembert's paradox, the equilibrium condition of an incompressible fluid and the principle of added masses for potential flows, it was possible to find a simple explanation for the mysterious concepts of classical mechanics and theoretical physics: inertial and non-inertial reference frames, inertial forces, and the motion of bodies by inertia. Based on the resulting model, it was possible to give a new interpretation of Newton's three laws.
Category: Classical Physics
[63] viXra:2510.0068 [pdf] replaced on 2026-04-30 03:10:05
Authors: Quinton R. D. Tharp
Comments: 47 Pages.
This paper presents an early, cycle-normalized formulation of Planck-scale structure within a deterministic lattice framework (informally referred to as the "Titus" model). In this version, the Planck energy is defined through a loop-based action condition, $E_P t_P = h$, giving $E_P = h/t_P$. This representation is mathematically consistent but corresponds to a cycle-normalized ($2\pi$-based) description of phase-action transport. In the modern Quantum Lattice Model (QLM), this formulation is reduced to a per-radian primitive using the relation $h = 2\pi\hbar$, yielding the canonical identity $E_P = \hbar/t_P$. The transition from $h$ to $\hbar$ removes the implicit $2\pi$ redundancy and establishes the minimal primitive set $\{\hbar, \ell_P, t_P\}$, which underlies the fully developed QLM framework. Within the earlier formulation presented here, symbolic derivations, numerical evaluations, and CODATA-validated unit checks are provided for Planck-scale quantities, including electromagnetic and atomic-scale relations. These results should be interpreted as a non-minimal precursor to the reduced-action QLM canon.
Category: Quantum Physics
[62] viXra:2510.0067 [pdf] submitted on 2025-10-13 00:56:21
Authors: Udo E. Steinemann
Comments: 16 Pages.
Movement patterns of geodesics are compared: first as conceived, in a thought experiment, for masses moving through space-time inside the Earth, and second as observed for spice grains in kneaded dough.
Category: Mathematical Physics
[61] viXra:2510.0066 [pdf] submitted on 2025-10-13 20:29:01
Authors: Aldrich K. Wooden Sr
Comments: 8 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
The semiconductor industry stands at an inflection point where physics, economics, and geopolitics converge to reshape global technology. This analysis deconstructs the entire value chain from atomic-scale fabrication to trillion-dollar market implications, revealing why three companies control humanity's computational future and what this means for AI development through 2030. Using first principles reasoning, starting from transistor physics, lithography wavelength limits, fabrication process complexity, and capital intensity economics, this paper demonstrates why the industry's oligopolistic trajectory is inevitable, not coincidental. The analysis covers quantum-mechanical transistor operation, extreme ultraviolet lithography physics, advanced packaging bottlenecks, high-bandwidth memory constraints, and market dynamics across foundries, equipment manufacturers, and AI accelerator producers.
Category: Economics and Finance
[60] viXra:2510.0065 [pdf] submitted on 2025-10-13 20:28:31
Authors: Aldrich K. Wooden Sr
Comments: 21 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
This research examines space-based data centers through first principles analysis, deconstructing fundamental physics constraints and reconstructing complete market ecosystem implications across direct and indirect sectors. Analysis reveals orbital data centers could achieve 60-70% cost advantages over terrestrial facilities by 2035 if launch costs decline below $100/kg, but face immutable thermal management constraints from the Stefan-Boltzmann radiation law that limit power density to 10-20 kW per rack versus terrestrial facilities' 30-100+ kW. The market is projected to reach $39 billion by 2035 at 67% CAGR, with first commercial operations launching in 2025. Fundamental physics analysis demonstrates space data centers solve satellite data processing problems rather than general computing problems, with optimal applications in Earth observation processing, AI training requiring unlimited clean power, and national security applications valuing physical isolation. The study maps complete infrastructure ecosystem requirements including launch systems, thermal management, radiation hardening, communications networks, and servicing capabilities, while analyzing market impacts across 15+ sectors from aerospace to insurance. Critical success factors include Starship achieving target economics by 2028-2030, technology maturation to TRL 8-9, regulatory clarity on data sovereignty, and anchor customer commitments enabling commercial viability.
Category: Economics and Finance
[59] viXra:2510.0064 [pdf] submitted on 2025-10-13 10:02:07
Authors: Steven Kenneth Kauffmann
Comments: 7 Pages.
Nonrelativistic quantum mechanics is obtained from nonrelativistic classical Hamiltonian mechanics by replacing the particle's spatial location by a complex-valued wave-function location distribution, and by replacing the particle's deterministic mechanics in universal time by an operator analog which acts on its wave function. But in relativistic classical mechanics, time isn't universal. One important time is given by a clock at rest with respect to the observer; in conjunction with the particle's observed spatial location it comprises the particle's observed space-time location. The Lorentz-invariant time of the particle's dynamics is its proper time, given by a clock at rest with respect to it. The wave function of relativistic quantum mechanics depends on the particle's observed space-time location, and time derivatives of the quantum operators are with respect to particle proper time. In the particle's zero-mass limit, however, the free-particle Schrödinger equation is independent of the particle's proper time, is second-order in observer time, and is real-valued. The relativistic correction to the hydrogen atom's Hamiltonian without spin is a very weak, short-range complement to the Coulomb potential.
Category: Quantum Physics
[58] viXra:2510.0063 [pdf] submitted on 2025-10-13 11:18:21
Authors: Jaba Tkemaladze
Comments: 16 Pages.
The rapid advancement of biogerontology, particularly through interventions like senolytic therapies and stem cell rejuvenation, is transforming the prospect of radical life extension and even biological immortality from science fiction into a plausible future scenario. This impending paradigm shift necessitates a profound re-examination of existential psychology, which has traditionally posited awareness of mortality as the fundamental source of life's meaning, motivation, and authenticity. This article argues that the psychological phenomenon of death awareness would not simply vanish with the elimination of biological finitude. Instead, it would undergo a critical functional transformation. It proposes the concept of an "existential alarm clock," a reconstituted internal mechanism that shifts from serving as a chronological limit to acting as a qualitative regulator of existence. In a state of immortality, this alarm would awaken the individual from the unique perils of an endless lifespan: existential apathy, identity stagnation, and the "bad infinity" of undifferentiated time. The article analyzes the mechanisms of this alarm—triggering identity crises, combating profound boredom, and stimulating self-transcendence—and explores its vast social, cultural, and ethical implications. It concludes that, paradoxically, a continued dialogue with the concept of death remains a crucial condition for a meaningful and authentic life, even in the context of biological immortality.
Category: General Science and Philosophy
[57] viXra:2510.0061 [pdf] submitted on 2025-10-13 20:19:05
Authors: Chandhru Srinivasan
Comments: 11 Pages. (Note by viXra Admin: Author name is required in the article; please submit article written with AI assistance to ai.viXra.org)
I report an empirically derived and theoretically motivated analysis of modular patterns in composite integers, with a focus on semiprimes. For any odd semiprime N (possibly all N ∈ ℤ), the results indicate the existence of N-1 congruences of the form p + q ≡ r (mod m), where p and q are factors of N and m ∈ {2, ..., N-1} (the moduli 1 and N are trivial, since always N ≡ 0 (mod 1) and N ≡ 0 (mod N)), and each residue r belongs to a restricted, well-structured subset R_m. Empirical experiments suggest that these residue constraints are non-random and deterministic, and encode all the necessary information about the factor pair (p, q). I formalize this observation as a conjecture and provide preliminary reasoning for its generality. These results point toward a potentially deterministic structure in the modular representation of factor sums, potentially speeding up the factorisation of any N and offering a new perspective on the arithmetic properties of semiprimes and composite numbers. I invite further mathematical verification and formalization.
Category: Number Theory
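One way to see why residues of p + q could encode the factor pair, as the abstract above claims, is that knowing p + q modulo enough coprime moduli determines p + q itself, and p + q together with N = pq recovers p and q as the roots of x² - (p+q)x + N. The sketch below is my own illustration with example numbers, not code or notation from the paper.

```python
from math import isqrt, prod

def crt(residues, moduli):
    # Chinese Remainder Theorem for pairwise-coprime moduli
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # modular inverse (Python 3.8+)
    return x % M

p, q = 101, 113          # hypothetical example factors
N = p * q                # 11413
moduli = [5, 7, 9, 11]   # pairwise coprime, product 3465 > p + q
residues = [(p + q) % m for m in moduli]  # congruences of the kind the paper studies

s = crt(residues, moduli)            # recovers p + q = 214
disc = isqrt(s * s - 4 * N)          # discriminant of x^2 - s*x + N
print((s + disc) // 2, (s - disc) // 2)  # prints 113 101
```

This only shows that the residues of p + q carry enough information in principle; the paper's conjecture concerns the structure of the admissible residue sets R_m, which this sketch does not address.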
[56] viXra:2510.0060 [pdf] submitted on 2025-10-12 20:16:22
Authors: Marcello Colozzo
Comments: 7 Pages. (Note by ai.viXra.org Admin: Please list each cited reference in a complete and standard reference stype such as APS style)
We discuss the Nobel Prize awarded to John Clarke, Michel H. Devoret, and John M. Martinis "for the discovery of macroscopic quantum tunneling and of the quantization of energy in an electric circuit."
Category: Quantum Physics
[55] viXra:2510.0059 [pdf] submitted on 2025-10-12 10:04:53
Authors: L. Martino
Comments: 13 Pages.
In many fields, including computational statistics, ecology, economics, and physics, normalized weights define a discrete probability mass function over a set of entities/samples. The effective sample size (ESS) quantifies the concentration of these weights, providing a measure of sample representativeness. In this work, we show that, among various ESS formulations, the Berger-Parker index uniquely preserves the relative proportions of the weights, acting as a true particle counter. Other commonly used ESS expressions tend to overestimate the effective sample size when only normalized weights are considered. Several examples and a formal demonstration are provided.
Category: Statistics
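The contrast drawn in the abstract above can be illustrated with two standard formulas: the classical ESS used in importance sampling, 1/Σwᵢ², and the inverse Berger-Parker index, 1/max(wᵢ). The sketch below is my own illustration, not code from the paper.

```python
import numpy as np

def ess_inverse_sum_squares(w):
    # classical ESS formula common in importance sampling / particle filtering
    return 1.0 / np.sum(w ** 2)

def ess_berger_parker(w):
    # inverse Berger-Parker index: reciprocal of the largest normalized weight
    return 1.0 / np.max(w)

# One weight carries half the mass among 3 samples
w = np.array([0.5, 0.25, 0.25])
print(ess_inverse_sum_squares(w))  # ≈ 2.67
print(ess_berger_parker(w))        # 2.0
```

With one weight at 0.5, the Berger-Parker version reports 2 effective particles (the dominant weight equals two of the others combined), while the sum-of-squares formula reports the larger value 8/3, consistent with the overestimation the abstract describes.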
[54] viXra:2510.0058 [pdf] replaced on 2026-01-14 22:16:14
Authors: Felix M. Lev
Comments: 14 Pages. Relationship between the foundation of mathematics and quantum theory is discussed in more details.
A common situation in physics involves two theories, ${\cal A}$ and ${\cal B}$, where ${\cal A}$ contains a nonzero parameter, and ${\cal B}$ arises as a limit of ${\cal A}$ as this parameter approaches zero or infinity. In such cases, ${\cal A}$ is more general and ${\cal B}$ is a degenerate case of ${\cal A}$. Well-known examples include relativistic theory being more general than non-relativistic theory and quantum theory being more general than classical theory. In this short review we argue that an analogous situation holds in mathematics. Classical mathematics (CM) is based on the infinite ring of integers $Z$, whereas finite mathematics (FM) is based on the finite ring $R_p=\{0,1,2,\ldots,p-1\}$ of residues modulo $p$. CM has foundational difficulties (as highlighted by Gödel's incompleteness theorems) while FM does not. All attempts to construct a quantum theory of gravity within CM encounter unavoidable divergences. The existence of elementary particles also suggests that infinitesimals do not exist in nature. Despite this, CM is usually regarded as fundamental theory, while FM merely as a tool useful only in some models. We argue instead that FM is the more general theory, with CM appearing as its degenerate limit as $p\to\infty$. The key points are: $R_p\to Z$ as $p\to\infty$, and this can be proved using only potential (not actual) infinity; quantum theory based on FM is more general than quantum theory based on CM.
Category: General Mathematics
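The limit claimed in the abstract above can be made concrete in an elementary way: arithmetic in the residue ring $R_p$ reproduces ordinary integer arithmetic whenever the numbers involved are small compared with $p$, so larger moduli agree with $Z$ on ever larger ranges. A minimal sketch of my own, with an example modulus:

```python
# Arithmetic mod p matches integer arithmetic while results stay below p,
# illustrating the degenerate-limit relationship between R_p and Z.
p = 2**31 - 1        # a large prime modulus (example choice)
a, b = 12345, 67890

assert (a + b) % p == a + b   # sum well below p: mod p is invisible
assert (a * b) % p == a * b   # product 838102050 < p: still invisible

# With a small modulus the correspondence breaks down:
assert (a * b) % 97 != a * b
```

This is only an arithmetical illustration of the limit, not a rendering of the paper's argument about potential versus actual infinity.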
[53] viXra:2510.0057 [pdf] replaced on 2025-10-16 22:49:56
Authors: Juchi Ye
Comments: 11 Pages. License: CC BY-NC-ND
The ability to measure the exact one-way speed of light has often been thought to be impossible [4, 6]. Under the most simplified theoretical conditions, there appears to be no intuitive method of measuring the strict one-way speed of light, the main problem being the synchronization of clocks at points A and B [4, 6]. The significance of proving or disproving the illusion of invariant light speed is that it may shed light on the incompleteness and possible improvements of special relativity [5], while leading to new discoveries and verifications of hypotheses and theories. However, experiments designed to measure the one-way speed of light remain highly limited, and special cases may give false positives [3]. The experiment designed in this paper can provide a precise measurement; when performed under ideal conditions it will produce no false positives, while taking time dilation into account. Two spacecraft launched together are sent into a stable solar orbit between the Earth and Mars, spaced a significant distance apart (>10 light minutes). Both spacecraft are synchronized by constantly observing a pulsar and counting pulses, which serve as the start signal. Upon reaching a specified number of pulses, the probes send signals to each other while starting their timers, ending their timers when they receive the signal from the other side. The experiment is performed once when the probes form an isosceles triangle, with atomic clocks started as a record of their orbital period. The observed pulse-rate differences should overlap at certain points in orbit where light hits the two probes simultaneously, where the experiment can be performed again to provide a measurement.
Category: Relativity and Cosmology
[52] viXra:2510.0056 [pdf] submitted on 2025-10-11 23:11:58
Authors: Haroon Khan
Comments: 3 Pages. (Note by viXra Admin: Please cite and list scientific references)
This paper proposes a unified hypothesis connecting cosmic expansion, human perception, and dimensional overlap, incorporating spiritual insight into the understanding of reality. Observations suggest that reality may consist of layered frequencies, which occasionally interact with human consciousness, resulting in perceptual shifts, memory anomalies, and other phenomena often dismissed as coincidence. Earth, composed of materials from multiple cosmic sources, functions as a node where these frequencies intersect. Historical and philosophical traditions suggest that conscious beings are naturally limited by a "veil," which aligns with the proposed frequency and dimensional interactions. This framework integrates physical, biological, and metaphysical considerations, inviting independent investigation.
Category: Quantum Physics
[51] viXra:2510.0055 [pdf] submitted on 2025-10-11 23:05:53
Authors: Osvaldo Duilio Rossi
Comments: 2 Pages. (Note by viXra Admin: Please cite and list scientific references)
The quantum nature of reality implies a formulation of conscious perception which in turn implies the many-worlds theory: every instance (E) of the Totality (S) of particles (s) preserves quantum effects, distributing them among Objects (e) within specific Environments (E), where Objects exist only (as random collapses of wave functions) from the perspective of a Consciousness-Object (c) observing them, while remaining incoherent to other Consciousnesses in different Environments. The Totality allows multiple Punctuations (P) of particles, each P_i defining distinct local Objects, reflecting how Consciousness perceives reality through coherent patterns.
Category: General Science and Philosophy
[50] viXra:2510.0054 [pdf] submitted on 2025-10-10 19:47:54
Authors: Nobuyuki Tanamura
Comments: 40 Pages. (Note by viXra Admin: Please cite listed scientific references)
This paper posits that physical phenomena arise not from equations but from the 'mechanisms' created by matter. In other words, it seeks to examine whether the main physical phenomena can be explained through these mechanisms. Firstly, the cause of the Doppler effect in the electric field is considered. This occurs because 'something' that generates an electric field from a charge continues to move at the speed of light. Here, this 'something' is referred to as a 'spatial element.' Furthermore, the reason spatial elements are not instantly exhausted from a charge is that the electric field strength and spatial density are proportional. Additionally, the constancy of the speed of light arises because the amount of spatial elements passing through is proportional to the passage of time. Elementary particles are considered as the points from which spatial elements are ejected. Thus, time, space, and the electric field are three-dimensional entities with direction. Moreover, all physical phenomena arise from the overlap of spatial elements, including gravity, the strong force, the weak force, and electromagnetism. An attempt is made to explain these through the 'mechanisms' formed by spatial elements. There also exist mysterious phenomena that cannot be explained by physics; the mechanisms by which these phenomena occur are explored. In conclusion, physical phenomena are not defined by equations but result from the overlapping of electric fields, which creates the mechanisms through which they arise.
Category: Classical Physics
[49] viXra:2510.0052 [pdf] replaced on 2025-11-15 03:41:26
Authors: Hashem Sazegar
Comments: 7 Pages.
Oppermann’s conjecture states that for every positive integer n, there exists at least one prime number between n² and n² + n. Prior to this, Legendre had conjectured that there is always at least one prime number between n² and (n + 1)². In this paper, we not only claim to prove Oppermann’s conjecture but also propose a smaller interval, asserting that there exists at least one prime between n² and n² + n/2.
Category: Number Theory
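As a quick numerical sanity check of Oppermann's statement in the preceding abstract (this is not the paper's proof, only a small-range verification sketch):

```python
def is_prime(m):
    """Trial-division primality test, sufficient for small m."""
    if m < 2:
        return False
    if m % 2 == 0:
        return m == 2
    f = 3
    while f * f <= m:
        if m % f == 0:
            return False
        f += 2
    return True

def oppermann_holds(n):
    """True if some prime lies strictly between n^2 and n^2 + n."""
    return any(is_prime(p) for p in range(n * n + 1, n * n + n))

# Verified here only for small n; the conjecture itself is for all n > 1.
checked = all(oppermann_holds(n) for n in range(2, 1000))
```

For example, n = 3 gives the interval (9, 12), which contains the prime 11.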
[48] viXra:2510.0051 [pdf] submitted on 2025-10-09 20:57:18
Authors: Theophilus Agama
Comments: 10 Pages. (Note by viXra Admin: Frequent/incessant submissions of highly speculative/abstract articles will not be accepted)
We prove an extension of the lower bound due to Schönhage on addition chains.
Category: Number Theory
[47] viXra:2510.0050 [pdf] submitted on 2025-10-08 21:34:22
Authors: Vincent B. Moneymaker
Comments: 64 Pages. https://doi.org/10.3389/fnsys.2025.1649748
This paper proposes that learning in animals occurs through sleep and is fundamentally driven by dynamic information valuation processes. These take the form of either pain and pleasure sensations or the more nuanced emotions that evolved from them. Acting as value identifiers, these sensations and emotions enable animals, from the simplest to the most complex, to mark valuable experiences for both retention and later recall. In this way, the paper argues, learning itself is made possible. The remainder of the paper explores the cognitive, neurological and behavioural implications of this framework, including several novel, testable hypotheses derived from it.
Category: Mind Science
[46] viXra:2510.0049 [pdf] submitted on 2025-10-09 20:52:23
Authors: Ekam Chatterjee
Comments: 19 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
Epileptic seizure detection from electroencephalogram (EEG) signals represents a fundamental challenge in computational neuroscience, with traditional approaches limited by their inability to capture complex topological transformations in brain connectivity during ictal events. While topological data analysis has demonstrated promise for EEG analysis, existing methodologies primarily employ persistent homology features with conventional classifiers, failing to leverage the geometric structure inherent in neural computation. To the best of our knowledge, this is the first work that applies topological neural networks (TNNs), message-passing architectures on simplicial complexes, to EEG seizure detection, integrating persistent homology features across multiple distance functions with temporal modeling and building upon Hajij et al.'s foundational work on topological deep learning architectures. The proposed approach introduces a novel 3-layer TNN framework that integrates multi-scale persistent homology with theoretically grounded topological message-passing mechanisms. This research establishes mathematical foundations for seizure detection through topological invariants and provides convergence guarantees for the neural architecture. The model constructs four complementary distance matrices (correlation, Euclidean, phase-lag, and coherence-based) from multi-channel EEG recordings, applying Vietoris-Rips filtrations to extract multi-dimensional topological features across scales. The core innovation lies in the rigorous implementation of the four-step topological message-passing framework: message computation, within-neighborhood aggregation, between-neighborhood aggregation, and feature update, combined with bidirectional LSTM networks for temporal modeling. Evaluation on the CHB-MIT dataset across 10 patients using event-based metrics demonstrates an F1-score of 74.36%, establishing the first successful integration of topological neural architectures with neurological signal processing. Theoretical analysis reveals that seizure events exhibit characteristic changes in topological entropy and Betti numbers, providing interpretable biomarkers for clinical translation.
Category: Artificial Intelligence
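One of the four complementary distance matrices named in the preceding abstract, the correlation-based one, can be sketched in a few lines. This is an illustrative assumption about its form (1 - |Pearson correlation|), not the paper's code; the channel data are made up:

```python
import math

def corr_distance_matrix(channels):
    """Pairwise correlation distance d_ij = 1 - |corr(x_i, x_j)|
    between channels -- one plausible form of a 'correlation-based'
    distance matrix that could feed a Vietoris-Rips filtration."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)
    k = len(channels)
    return [[1.0 - abs(pearson(channels[i], channels[j])) for j in range(k)]
            for i in range(k)]

# Three toy "EEG channels" (illustrative data only)
sig = [[0, 1, 2, 3, 2, 1], [0, 2, 4, 6, 4, 2], [3, 1, 0, 1, 3, 5]]
D = corr_distance_matrix(sig)
```

Perfectly correlated channels get distance 0, uncorrelated ones get distance 1, so the resulting matrix is a natural input for a filtration over connectivity scales.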
[45] viXra:2510.0048 [pdf] submitted on 2025-10-09 20:49:17
Authors: Aldrich K. Wooden Sr.
Comments: 8 Pages.
This paper presents PodX, the first Mobile Distributed Data Center (MDDC) engineered to achieve a perfect 100% Weighted Composite Benchmark Index (WCBI) score across all seven XdoP (eXtreme Distributed Operations Platform) domains. Through strategic integration of 14 USPTO patents spanning MDDC, aerospace, automotive, and environmental monitoring technologies, PodX delivers unprecedented capabilities for mission-critical deployments. The system achieves >24-hour DDIL (Disconnected, Disrupted, Intermittent, Limited) autonomy, 99.99% availability, 100% renewable off-grid operation, and MIL-STD-810H environmental compliance while reducing carbon footprint by 51% compared to traditional data centers. We detail the complete system architecture, patent integration strategy, domain-by-domain optimization, and present a five-year roadmap aligned with XdoP standardization milestones, projecting $500B+ market impact by 2045.
Category: Data Structures and Algorithms
[44] viXra:2510.0047 [pdf] submitted on 2025-10-09 20:43:54
Authors: P. K. Meher
Comments: 22 Pages.
This paper introduces the Energy Hole Model (EHM), a framework that interprets gravitational interaction as the manifestation of a persistent energy deficit, termed an "energy hole", formed concurrently with the synthesis of mass. It is based on a core hypothesis stating that the synthesis of a mass $M$ requires the confinement of energy $E = Mc^2$, extracted from the surrounding spacetime. This process leaves a corresponding energy deficit (hole) of $-Mc^2$, which acts as the source of the gravitational field. From this premise, we derive the energy hole density profile for point masses, generalize it for stabilized extended objects, and propose a modified Poisson equation. It is demonstrated that the modified Poisson equation recovers Newtonian gravity as a limiting case, and that the classical tests, including light bending, Shapiro delay, and gravitational redshift, are in full agreement with observations, establishing empirical consistency. Beyond reproducing the established tests of general relativity, the EHM provides conceptual resolutions to cosmological puzzles: (i) the cosmological constant problem, via the corollary of the core EHM hypothesis that energy confinement is a unique physical process, observed in no phenomenon other than mass formation; (ii) dark matter, as the additional energy deficit over that of baryonic mass; (iii) dark energy, as the residual, positive energy of the spacetime vacuum; and (iv) the gravitational behaviour of compact objects and black holes, including the black hole singularity and the hard horizon problem. The gravitation by negative energy is shown explicitly in the Friedmann equations. The EHM thus offers a unified and physically intuitive description of gravity and cosmic structure, fundamentally linking the concepts of energy synthesis, binding, and deficit formation.
Category: Relativity and Cosmology
[43] viXra:2510.0046 [pdf] submitted on 2025-10-09 20:38:51
Authors: Peter M. Enders
Comments: 7 Pages.
I present an axiomatic foundation of non-integrable phases of quantum wave functions, such as the Aharonov-Bohm phase, and show the gauge invariance of the phase difference in the Aharonov-Bohm setup in a much simpler manner than in the article by Kholmetskii et al.
Category: Quantum Physics
[42] viXra:2510.0043 [pdf] submitted on 2025-10-08 18:41:01
Authors: Tatsuyuki Sugawa
Comments: 12 Pages.
In this short report we investigate the wave function of the Universe near the cosmological singularity in pure Einstein gravity. The spacetime is considered in three dimensions for simplicity. We use the minisuperspace model and canonical quantization. We extend the Hamiltonian constraint, concretely, from H = 0 to H ≈ 0, so we obtain the Schrödinger equation instead of the Wheeler-DeWitt equation. The resulting wave functions and energy levels are those of the harmonic oscillator. Our models treat dS spacetime (k = 1, Λ > 0), which describes the closed expanding universe. However, in the neighborhood of the Big Bang singularity, within the Planck scale, spacetime has to be treated by quantum gravity, so the wave function of the universe near the singularity is of greatest interest. The identity of the Hamiltonian constraint remains an open question in quantum gravity. Originally, quantum cosmology was considered a candidate for quantum gravity, represented by a wave function instead of a metric structure.
Category: Quantum Gravity and String Theory
[41] viXra:2510.0042 [pdf] submitted on 2025-10-08 06:08:54
Authors: Yilin Li
Comments: 10 Pages.
Mathematical and logical reasoning is an important component of human intelligence. Thus, a common metric for evaluating Large Language Models (LLMs) is their ability to solve mathematical problems. Recently, LLMs have shown remarkable performance in completing various tasks such as text generation, text understanding and image analysis. Their mathematical and reasoning ability has also advanced rapidly, allowing them to solve complex algebra problems. However, LLMs still exhibit limitations in describing and reasoning about geometric and spatial concepts, failing to accurately identify and understand the logic within geometric figures. In order to address this gap in understanding, numerous diverse datasets of geometric figures and metadata are needed to continue training their geometric reasoning capabilities. In this research paper, we introduce an innovative algorithm to create synthetic polygon geometric shape datasets, and define methods to integrate synthetic geometric images and metadata into major LLMs for training, validation, and evaluation of their geometric reasoning abilities.
Category: Artificial Intelligence
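A minimal sketch of the kind of synthetic shape generator the preceding abstract describes. The paper's actual algorithm and metadata schema are not specified here, so the function and metadata fields below are illustrative assumptions:

```python
import math
import random

def random_polygon(n_sides, radius=1.0, jitter=0.3, seed=None):
    """Sample a polygon by jittering points on a circle and return the
    vertex list plus simple metadata an LLM could be trained against.
    (Illustrative sketch; not the paper's algorithm.)"""
    rng = random.Random(seed)
    angles = sorted(rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_sides))
    pts = []
    for a in angles:
        r = radius * (1.0 + rng.uniform(-jitter, jitter))
        pts.append((r * math.cos(a), r * math.sin(a)))
    perimeter = sum(math.dist(pts[i], pts[(i + 1) % n_sides])
                    for i in range(n_sides))
    return pts, {"n_sides": n_sides, "perimeter": round(perimeter, 6)}

pts, meta = random_polygon(5, seed=42)
```

Pairing each rendered figure with ground-truth metadata like `n_sides` and `perimeter` is what makes such a dataset usable for training and evaluating geometric reasoning.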
[40] viXra:2510.0040 [pdf] submitted on 2025-10-08 18:31:01
Authors: Geraldine Geoffroy
Comments: 12 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
This paper proposes a novel architecture for distributed, traceable, and event-driven execution of LLM-related tasks by combining W3C Linked Data Notifications (LDN) with remote Python scripts executed via uv run. This architecture enables any AI task, especially inference, to be executed locally with no software installation, traced via interoperable notifications, and archived with full provenance metadata (e.g., models, parameters, etc.). To achieve this, the system leverages LDN as a semantic pub-sub orchestration layer, combined with uv-based scripts as reproducible, stateless microservices. We demonstrate the value of this architecture for building transparent, auditable, and distributed Large Language Model (LLM) inference workflows with three working proofs-of-concept: (1) a basic semantically-notified inference where notifications populate a register of evidence for transparency, (2) a Retrieval-Augmented Generation (RAG) pipeline triggered by Create events and executed through script-based stages, and (3) a distributed inference setup where task-specific SLM agents independently process jobs and respond via Announce messages. Each stage archives full provenance metadata (model version, script SHA, parameters, runtime) using PROV-O, supporting reproducibility and auditability. This architecture lays the groundwork for a lightweight, decentralized, and FAIR-aligned standard for orchestrating LLM tasks.
Category: Artificial Intelligence
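A hypothetical LDN Announce payload for the distributed-inference scenario in the preceding abstract might look like the following. The Activity Streams and PROV-O terms are real vocabularies; all URLs, model names, and extension fields are illustrative assumptions, not taken from the paper:

```json
{
  "@context": ["https://www.w3.org/ns/activitystreams",
               "http://www.w3.org/ns/prov#"],
  "type": "Announce",
  "actor": "https://agents.example.org/slm-summarizer",
  "object": {
    "type": "prov:Entity",
    "prov:wasGeneratedBy": {
      "type": "prov:Activity",
      "prov:used": "https://scripts.example.org/infer.py",
      "model": "example-slm-1b",
      "scriptSha": "abc123...",
      "parameters": {"temperature": 0.2}
    }
  }
}
```

Posting such a message to an LDN inbox is what lets each inference result carry its own provenance record.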
[39] viXra:2510.0039 [pdf] submitted on 2025-10-08 18:28:36
Authors: Aldrich K. Wooden Sr.
Comments: 4 Pages.
The convergence of rural healthcare and environmental monitoring demands an integrated, AI-native architecture spanning Critical Access Hospitals (CAHs) and rural water utilities. Despite near-universal basic EHR adoption in hospitals, only a minority of CAHs fully exchange data, while a large share of rural water utilities face critical cybersecurity deficiencies and many rural areas lack the minimum broadband capacity required for modern operations [R1-R3]. This white paper synthesizes technical requirements and a reference architecture (ARIS-2025) across connectivity, edge computing, data interoperability, and compliance, mapping vendor ecosystems, cost benchmarks, and phased implementation to achieve resilient, privacy-preserving cross-sector analytics.
Category: Artificial Intelligence
[38] viXra:2510.0038 [pdf] submitted on 2025-10-08 18:26:18
Authors: Kenesov Nursat
Comments: 48 Pages. (Note by viXra Admin: An abstract in the article is required)
In this paper, I explore the idea that time might not just be a parameter we measure events by, but something deeper: a physical quantity that has its own energy and structure. I start with the problem that modern physics, despite all its achievements, still doesn't really explain what time is. In classical mechanics, it's just a background variable; in relativity, it becomes flexible but still undefined. So I propose that time should be seen as a measure of energy density: the higher the energy, the slower the flow of time. Using Einstein's energy equation and the Schwarzschild metric, I show that changes in the rate of time flow (the time dilation factor) can directly describe a body's total energy. This means that rest energy, kinetic energy, and even gravitational energy are all connected through how time itself "flows" around an object. In short, energy and time are the same thing seen from two sides. Then I extend this idea to antimatter and symmetry. If time can flow forward, it can also flow backward, and that reversed flow corresponds to negative energy, antimatter, and even antigravity. I rewrite Maxwell's equations for such an inverted world and show that they still work, only with reversed signs: like charges attract, and electromagnetic energy moves "backward." This creates a consistent mirror-world picture that fits CPT symmetry. Finally, I discuss the discreteness of time: whether time moves continuously or in small jumps ("chronons"). Overall, the paper suggests that the flow of time itself could be the key to understanding why our universe is dominated by matter, not antimatter, because time doesn't just move; it expands in one direction, creating asymmetry at the most fundamental level.
Category: Relativity and Cosmology
[37] viXra:2510.0037 [pdf] submitted on 2025-10-07 18:37:53
Authors: David Park
Comments: 13 Pages.
The Van der Pol oscillator is a nonlinear system known for its self-sustaining oscillations. This paper analyzes how the system evolves as the damping parameter μ changes, focusing on equilibrium points, phase-plane trajectories, and limit cycles. Throughout the paper, we highlight how the equation relates to physical systems, such as electrical circuits and biological rhythms, showing the significance and relevance of the Van der Pol oscillator in modeling real-world nonlinear behavior.
Category: General Mathematics
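The dependence on μ discussed in the preceding abstract is easy to explore numerically. A minimal RK4 sketch of x'' - μ(1 - x²)x' + x = 0 written as a first-order system (the step size and initial condition below are arbitrary choices, not from the paper):

```python
def van_der_pol_step(x, y, mu, dt):
    """One RK4 step of x' = y, y' = mu*(1 - x^2)*y - x."""
    def f(x, y):
        return y, mu * (1.0 - x * x) * y - x
    k1 = f(x, y)
    k2 = f(x + 0.5 * dt * k1[0], y + 0.5 * dt * k1[1])
    k3 = f(x + 0.5 * dt * k2[0], y + 0.5 * dt * k2[1])
    k4 = f(x + dt * k3[0], y + dt * k3[1])
    return (x + dt / 6.0 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            y + dt / 6.0 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

def simulate(mu=1.0, x0=0.1, y0=0.0, dt=0.01, steps=10_000):
    """Integrate the Van der Pol system and return the final state."""
    x, y = x0, y0
    for _ in range(steps):
        x, y = van_der_pol_step(x, y, mu, dt)
    return x, y
```

For μ = 1 a trajectory started near the origin spirals out to the limit cycle (amplitude ≈ 2 in x), while μ = 0 recovers the undamped harmonic oscillator.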
[36] viXra:2510.0036 [pdf] submitted on 2025-10-07 18:31:45
Authors: Jorma Jormakka
Comments: 15 Pages.
This article is not a historical look at a rejected theory. It aims to recover a working scalar theory of gravitation from Nordström's old theory, because the General Relativity Theory has very serious flaws that cannot be fixed. The scalar gravitational theory does not have the faults that Einstein claimed it had. I hope the article shows that the scalar gravitational theory is a quite good, simple classical field theory for gravitation without any serious flaws. It can be quantized easily: the scalar gravitational theory gives the free-field Lagrangian of a (real) scalar field for a quantum gravitation theory and can be connected with spontaneous symmetry breaking in the Higgs mechanism, and the Higgs mechanism gives mass to elementary particles in the Unified Field Theory. As the generating functional $W[J]$ for the free field can be exactly solved, a scalar gravitational theory allows investigating gravitation in the very early universe, when all mass still formed a quantum system.
Category: Quantum Gravity and String Theory
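The "exactly solvable" free-field generating functional mentioned in the preceding abstract is the textbook Gaussian result for a real scalar field (quoted here from standard QFT, up to sign and normalization conventions, not from the paper itself):

```latex
Z_0[J] \;=\; Z_0[0]\,
\exp\!\Big(-\tfrac{i}{2}\int d^4x\, d^4y\; J(x)\,\Delta_F(x-y)\,J(y)\Big),
\qquad W[J] \;=\; -\,i\,\ln Z_0[J],
```

where $\Delta_F$ is the Feynman propagator of the free scalar field; the Gaussian form is what makes the free theory exactly solvable.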
[35] viXra:2510.0035 [pdf] submitted on 2025-10-07 18:29:09
Authors: Constantin Sandu
Comments: 8 Pages.
In two earlier studies, we demonstrated that due to the enormous accelerations arising during the perpendicular reflection of a photon by a mirror, the photon’s energy distribution behaves as a quadrupole, thereby generating a graviton (or gravitational wavelet) at the same frequency and direction as the reflected photon. For simplicity, only the contribution of the quadrupole component Qxx was previously considered. Here, we extend the analysis to include all quadrupole components associated with perpendicular photon reflection. By applying the standard Einstein quadrupole radiation formula, we show that the energy of the emitted graviton scales as ν³, revealing a direct coupling between electromagnetism and gravitation. This finding is of high importance because it challenges the long-standing but unverified assumption that graviton energy depends linearly on frequency (ν¹). Our results establish that quantum gravity theories must instead incorporate cubic frequency dependence. Another remarkable feature of the proposed framework is that it offers a theoretical connection between general relativity and quantum approaches, suggesting that confined electromagnetic radiation can serve as a direct source of high-frequency gravitational wavelets.
Category: Quantum Gravity and String Theory
[34] viXra:2510.0034 [pdf] submitted on 2025-10-06 20:23:24
Authors: Jaba Tkemaladze, Gabro Gakely
Comments: 19 Pages.
The centriole is a fundamental organelle templating cilia formation and ensuring genomic stability. While most cells assemble centrioles using a pre-existing mother as a template, the de novo pathway allows for assembly in their absence. However, the physiological role and regulation of de novo biogenesis in vivo remain poorly understood. The planarian Schmidtea mediterranea, with its abundant somatic stem cells (neoblasts) and dependence on a massive ciliated epithelium for locomotion, presents a unique model to address this gap. We demonstrate that quiescent neoblasts are acentriolar, lacking the templates for canonical duplication. Upon tissue injury, neoblasts are activated and initiate a programmed de novo centriole biogenesis pathway. Super-resolution microscopy and transmission electron microscopy reveal the formation of cytoplasmic procentriolar foci and mature centrioles, independent of any parental structure. Crucially, genetic ablation of Sas-6 or pharmacological inhibition of PLK4—interventions that effectively block the canonical pathway—fail to prevent the formation of new centrioles and functional basal bodies in the regenerating ciliated epithelium. This work provides the first in vivo evidence in a whole organism for an induced de novo centriole biogenesis pathway in adult somatic stem cells. We propose this pathway is a key evolutionary adaptation, enabling rapid, large-scale ciliogenesis essential for planarian regeneration, and represents a distinct, genetically regulated program separable from canonical duplication.
Category: Biochemistry
[33] viXra:2510.0033 [pdf] submitted on 2025-10-06 20:21:50
Authors: Arturo Tozzi
Comments: 2 Pages. (Note by viXra Admin: Please cite and list scientific references)
For years, I have published across diverse academic journals and disciplines, including mathematics, physics, biology, neuroscience, medicine, philosophy, literature. Now, having no further need to expand my scientific output or advance my academic standing, I have chosen to shift my approach. Instead of writing full-length articles for peer review, I now focus on recording and sharing original ideas, i.e., conceptual insights and hypotheses that I hope might inspire experimental work by researchers more capable than myself. I refer to these short pieces as nugae, a Latin word meaning "trifles", "nuts" or "playful thoughts". I invite you to use these ideas as you wish, in any way you find helpful. I ask only that you kindly cite my writings, which are accompanied by a DOI for proper referencing.
Category: Mind Science
[32] viXra:2510.0032 [pdf] submitted on 2025-10-06 20:17:45
Authors: Leonardo Rubino
Comments: 50 Pages. In English and Italian (Note by viXra Admin: Please cite and list scientific references)
The need we have (with the Schrödinger equation) to run into the imaginary unit (i) is a simple way nature uses to inform us that a particle, with its wave function Ψ, is half of a reality, and there exists a complex-conjugate (-i) wave function (antiparticle) Ψ* that, together with Ψ, leads us, so to say, to a real "trigonometric" wave (with real values).
Category: Quantum Physics
[31] viXra:2510.0031 [pdf] submitted on 2025-10-06 14:42:59
Authors: Julio Jaramillo
Comments: 13 Pages.
This work introduces a unified operator framework for quantum field theory through the generalized relativistic wave differential operator $D^{mn\lambda}_{\alpha\beta}$. By appropriate parameter choices, this operator encompasses the Klein-Gordon, Dirac, and non-monogenic operators, revealing their fundamental connections. We propose a quantum field tensor $\Psi_{\sigma\rho\tau}$ with binary activation indices for scalar, spinor, and gauge fields, generating all possible interactions while preserving Lorentz covariance. The framework incorporates gauge invariance through minimal coupling and yields both the Proca and Maxwell equations as special cases. This approach provides a powerful operator-based unification of quantum field theory with applications to higher-spin theories.
Category: Mathematical Physics
[30] viXra:2510.0030 [pdf] submitted on 2025-10-06 20:11:38
Authors: Neil D. Duffy
Comments: 4 Pages.
The cause of the movement of L rods used by a dowser is investigated. It is observed that the adrenal glands act as an antenna for radiation that causes the dowsing effect and that current from this antenna flowing through the shoulders and arm muscles causes muscle movement that affects the rods. The use of polyethylene screening material and conductive pads to replace the effect of the adrenal glands allows current flowing from the pads to the shoulders to be investigated by observation of the effect of resistors and capacitors in the circuit. Observations suggest that the current is not electromagnetic.
Category: General Science and Philosophy
[29] viXra:2510.0029 [pdf] submitted on 2025-10-05 23:59:57
Authors: Mikheili Mindiashvili
Comments: 21 Pages. 1 Figure (Note by viXra Admin: Further repetition may not be accepted)
We present "φ-geometry," a visual re-parameterization of familiar relations by a single angle on a unit (or scaled) circle: β = sinφ = v/c, γ⁻¹ = cosφ, and γβ = tanφ. In this language, key results of special relativity (time dilation, length contraction, the energy-momentum invariant) and collinear velocity addition reduce to elementary trigonometric identities. We also give compact sin/cos forms for basic electromagnetic transformations and the de Broglie pair, introduce an atomic-scale φ(Z,n) from v/c ≈ Zα/n, and read the string momentum-winding plane as (cosφ, sinφ). The aim is clarity and pedagogy: one diagram organizes diverse formulas and helps trace transitions across domains. A practical benefit is numerical robustness: switching to the complementary branch (sin ↔ cos) mitigates division-by-zero and precision loss near limiting angles. We do not propose new dynamics. Rather, we present a re-parameterization aimed at clarity and pedagogy. The physical content of SR/GR remains unchanged; what we provide is an alternative geometric language. The accuracy for specific processes and regimes requires further study, analysis, and refinement, which we leave to future work.
Category: Classical Physics
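The circle parameterization in the preceding abstract is easy to verify numerically; a minimal sketch checking β = sin φ, γ⁻¹ = cos φ, and γβ = tan φ at β = 0.6:

```python
import math

def phi_of_beta(beta):
    """Map beta = v/c to the circle angle phi defined by beta = sin(phi)."""
    return math.asin(beta)

beta = 0.6
phi = phi_of_beta(beta)
gamma = 1.0 / math.sqrt(1.0 - beta ** 2)  # Lorentz factor

inv_gamma = math.cos(phi)   # should equal 1/gamma (length contraction factor)
gamma_beta = math.tan(phi)  # should equal gamma * beta (proper velocity / c)
```

Near φ → π/2 (v → c), cos φ → 0; switching to the complementary branch, as the abstract suggests, avoids dividing by the vanishing term.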
[28] viXra:2510.0028 [pdf] replaced on 2025-10-09 21:20:17
Authors: James A. Smith
Comments: 5 Pages.
Professor Philippe Eenens, who has taught GA-based courses to freshman engineering students, asserts that "For GA to become mainstream, we must convince high-school teachers of its advantages for the teaching of basic topics of mathematics and science." One of those basic topics, according to prize-winning educator Edward Redish, is the "dimensionality" of the variables in science equations. Unlike the variables in most equations that students have worked with in math classes, the variables in science equations represent measurements that have fundamental units like time, length, and mass. Acting upon the advice of professors Eenens and Redish, we show here how the dimensionality of GA’s geometric product might be explored at the high-school level in a way that would be of lasting benefit to students.
Category: Classical Physics
[27] viXra:2510.0027 [pdf] submitted on 2025-10-05 23:54:41
Authors: Andrey V. Voron
Comments: 4 Pages.
The geometric parameters of the pyramidion of Amenemhat III have been calculated based on the numerical values of three proposed size variants. Analysis of the numerical data from these three geometric models yielded results that closely align with the Golden Ratio constant and whole numbers within the metric system. According to our formulated hypothesis, there is a possibility that the metric system was used as a 'key' to interpreting the symbolic information of the Giza pyramid complex, and as a kind of social trigger for contemporary human society.
Category: Archaeology
[26] viXra:2510.0026 [pdf] submitted on 2025-10-05 06:58:33
Authors: Dominique MAREAU
Comments: 7 Pages.
The standard model limits itself to calling the origin of the BIG BANG "singularity". The theory of everything, DUO5, justifies the existence of a Permanent Stochastic Universe (PSU) state by the unavoidable inertial paradox [1]. The PSU is the origin of the Provisional Observable Universe (OPSU) state. This permanent state is represented by a quasi-infinity of preons (also called BODYS or tachyons), in the form of oscillating stochastic dipolar 1D strings. The PSU has a quasi-infinite informational entropy. Such a state (without mass-space-time continuum) has a non-zero probability of synchronizing a part of the preons, in a BEC (Bose-Einstein-Condensate). This BEC is at the origin of the BIG BANG, by a phenomenon of [saturation→fusion→inflation→causal separation] described in [1]. There is a constant between generalized informational entropy and partial negentropy in the cycle [entropy↔negentropy]. Informational negentropy, in the form of synchronization, generates the fundamental preon which, according to DUO5 and John Wheeler [2], is the common elementary particle at the origin of matter and space-time.
Category: Astrophysics
[25] viXra:2510.0025 [pdf] submitted on 2025-10-05 08:21:13
Authors: Jorma Jormakka
Comments: 14 Pages.
If a gravitational theory derived from a metric has the metric induced by a scalar field, it is called Nordström's gravitational theory. Einstein claimed that light does not bend in Nordström's theory. The article shows that light does bend in the scalar gravitational theory if light travels along geodesics of the space-time with a scalar metric. If the scalar theory is understood as describing a field's geometry, not space-time geometry, then light does not travel along geodesics of the gravitational geometry. Light does bend, e.g. when passing close to the Sun, but this is caused by matter in the space around the Sun. The article concludes that there are no good reasons to think that light should travel on geodesics of the gravitational field. The article also refutes the geodesic metric for a small test mass and therefore also Einstein's correction to the precession speed of Mercury's perihelion.
Category: Relativity and Cosmology
[24] viXra:2510.0024 [pdf] replaced on 2026-01-10 20:59:02
Authors: Teo Banica
Comments: 400 Pages.
This is an introduction to mathematics, with emphasis on geometric aspects. We first discuss numbers, counting, fractions and percentages, and their basic applications. Then we get into plane geometry, with a study of triangles and trigonometry, followed by coordinates and complex numbers. We then go into functions and analysis, with a detailed discussion of the polynomials, the basics of continuity explained, and with the derivatives and integrals discussed too. Finally, we provide an introduction to vector calculus, space geometry, linear algebra and basic mechanics.
Category: General Mathematics
[23] viXra:2510.0023 [pdf] submitted on 2025-10-05 23:49:44
Authors: Marciano L. Legarde
Comments: 2 Pages. (Note by viXra Admin: An abstract in the article is required and please cite listed scientific references)
I present two results known as the Leaf Theorems, initially noticed via numerical experiment and subsequently proved analytically. Each theorem illustrates that the interplay between rapidly oscillating and slowly growing functions, and between rapidly diminishing functions and vanishing power functions, respectively, results in constant, interpretable, finite values when integrated over the unit segment. Together, these results demonstrate that contrasting mathematical behaviors may cancel under integration and yield interpretable, finite quantities, offering clear, pedagogical demonstrations of convergence in real analysis. They could prove helpful for pedagogy, for asymptotic analysis, and for applications in number theory, numerical methods, and probability, and could serve to inform analysts and students in these and related fields.
Category: General Mathematics
[22] viXra:2510.0022 [pdf] submitted on 2025-10-05 17:27:40
Authors: Islem Ghaffor
Comments: 1 Page.
In this paper we prove the Collatz conjecture by giving an equivalent formulation of the shortcut Collatz sequence.
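The shortcut Collatz map referred to here is the standard one: T(n) = (3n + 1)/2 for odd n and n/2 for even n. A minimal sketch that iterates it (independent of the paper's claimed proof; the function name is illustrative):

```python
def shortcut_collatz(n):
    """Count steps of the shortcut Collatz map T until the trajectory
    first reaches 1: T(n) = (3n + 1) // 2 if n is odd, else n // 2."""
    steps = 0
    while n != 1:
        n = (3 * n + 1) // 2 if n % 2 else n // 2
        steps += 1
    return steps

# Every starting value up to 10^4 reaches 1 under the shortcut map.
assert all(shortcut_collatz(n) >= 0 for n in range(1, 10_001))
assert shortcut_collatz(3) == 5  # 3 -> 5 -> 8 -> 4 -> 2 -> 1
```

Note the shortcut map folds the "3n + 1 then halve" of the classical sequence into one step, so its trajectories are shorter than the classical ones.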
Category: Number Theory
[21] viXra:2510.0021 [pdf] submitted on 2025-10-05 23:42:08
Authors: Felipe Wescoup
Comments: 12 Pages. (Note by viXra Admin: Please cite listed scientific reference and submit article written with AI assistance to ai.viXra.org)
This paper provides a practical guide for determining the optimal mixed strategy in two-player, zero-sum games. It presents a method for calculating the Nash Equilibrium by starting with the well-understood 2x2 matrix and intuitively extending the logic to 3x3 and larger NxN scenarios. The purpose is not to derive new mathematical theory, but to make the powerful concepts of game theory accessible to a wider audience, such as coaches, athletes, and business strategists. This paper is supplemented by a GitHub repository containing a spreadsheet tool that performs the calculations, allowing for direct practical application of the concepts discussed.
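For the 2x2 case this guide starts from, the optimal mixed strategy has a well-known closed form when the payoff matrix [[a, b], [c, d]] has no saddle point. A minimal sketch (the function name is illustrative, not taken from the paper or its spreadsheet tool):

```python
def mixed_strategy_2x2(a, b, c, d):
    """Row player's optimal mix (p, 1-p) and game value v for the
    zero-sum payoff matrix [[a, b], [c, d]], assuming no saddle point."""
    denom = (a - b) - (c - d)          # a - b - c + d
    p = (d - c) / denom                # probability of playing row 1
    v = (a * d - b * c) / denom        # value of the game
    return p, v

# Matching pennies: play each action half the time; the game is fair.
p, v = mixed_strategy_2x2(1, -1, -1, 1)
assert p == 0.5 and v == 0
```

Larger NxN games need linear programming, but the 2x2 formula is the intuition-building base case the abstract describes.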
Category: General Mathematics
[20] viXra:2510.0020 [pdf] submitted on 2025-10-04 17:56:11
Authors: Yew Kee Wong
Comments: 324 Pages. (Note by viXra Admin: For the last time, please submit article written with AI assistance to ai.viXra.org, please also remove non-academic texts/image)
"Common Prosperity" represents a transformative paradigm shift in our conception of societal progress, decisively moving beyond the narrow confines of Gross Domestic Product (GDP) as the primary measure of success. It envisions a state of holistic well-being where human flourishing encompasses not just material wealth, but also robust physical and mental health, equitable access to quality education and healthcare, meaningful social connections, cultural enrichment, personal safety, and genuine opportunities for self-fulfillment and participation in civic life. Crucially, this prosperity must be fundamentally inclusive, ensuring that the benefits of development are shared equitably across all segments of society, actively dismantling systemic barriers based on gender, race, ethnicity, geography, socioeconomic status, or other identities, and guaranteeing that marginalized and vulnerable populations are not left behind but are empowered to thrive. Furthermore, Common Prosperity is intrinsically sustainable, demanding that the pursuit of current well-being does not compromise the ability of future generations to meet their own needs; this necessitates responsible stewardship of natural resources, urgent action on climate change, protection of biodiversity, and the building of resilient economic and social systems that operate within planetary boundaries. Ultimately, it is a vision of shared global well-being, recognizing the interconnectedness of all nations and peoples, where collective action and international cooperation foster a world where every individual has the foundation to live a dignified, healthy, and fulfilling life, in harmony with each other and the planet.
Category: Social Science
[19] viXra:2510.0019 [pdf] submitted on 2025-10-04 17:45:58
Authors: Yew Kee Wong
Comments: 214 Pages. (Note by viXra Admin: For the last time, please submit article written with AI assistance to ai.viXra.org, please also remove non-academic texts/image)
As we stand firmly within the third decade of the 21st century, the global landscape presents a stark and undeniable paradox: unprecedented technological advancement, interconnectedness, and aggregate wealth coexist with profound and persistent inequalities that fracture societies and destabilize nations. While the post-WWII era lifted billions from absolute poverty, the prevailing model of globalization, often prioritizing unfettered market expansion and short-term gains, has demonstrably failed to ensure that the benefits of progress are broadly and equitably shared. Vast swathes of the global population remain marginalized, denied access to basic necessities, quality education, healthcare, and dignified livelihoods, while simultaneously facing existential threats like climate change, pandemics, and resource depletion that disproportionately impact the most vulnerable. This deepening chasm between the privileged and the excluded is not merely a moral failing; it is the primary source of social unrest, political polarization, mass migration, and geopolitical friction that now define our era. Consequently, the pursuit of common prosperity — a state where sustainable economic development is intrinsically linked to social inclusion, environmental stewardship, and shared opportunity for all, within and across borders — transcends mere aspiration. It emerges as the single most urgent, complex, and defining challenge of our time. Addressing it demands a fundamental reimagining of global cooperation, economic systems, and governance structures, moving beyond outdated paradigms to forge a genuinely equitable and resilient future for humanity.
Category: Social Science
[18] viXra:2510.0018 [pdf] submitted on 2025-10-04 05:59:47
Authors: Yvan-Claude Raverdy
Comments: 4 Pages.
This article examines how we can interpret, or understand, the most counterintuitive elements of Quantum Physics such as complementarity, entanglement, superposition, nonlocality, wave packet reduction, etc., using concrete concepts about particles (standing waves) and Superfluid (granular) Space-Time, which we have studied.
Category: Quantum Physics
[17] viXra:2510.0017 [pdf] submitted on 2025-10-04 08:08:56
Authors: Philip Gibbs
Comments: 15 Pages.
Many of the elements in the periodic table play a part in supporting life, either directly in biochemistry or indirectly in some supporting role. This is a catalogue of how each element is used, updated to reflect recent discoveries in marine biology, microbial metabolism, and structural biochemistry. This paper is a substantial revision and update of an earlier review.
Category: Chemistry
[16] viXra:2510.0016 [pdf] submitted on 2025-10-04 09:38:22
Authors: L. Martino, G. Villacrés, S. Arcidiacono
Comments: 12 Pages.
Feature selection is a crucial task in statistics and machine learning, with direct implications for model interpretability and computational efficiency. This study introduces a unifying approach that combines the four possible sequential wrapper methods employed for variable selection, aiming to exploit their complementary strengths. The proposed procedure computes feature relevance scores and subsequently integrates the outputs from each sequential wrapper method. The underlying idea is simple and efficient. We test it in a controlled experiment with a known ground truth. The results indicate that the ranking obtained by consensus clearly outperforms the individual rankings obtained by the wrapper methods.
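One simple way to fuse several wrapper rankings, not necessarily this paper's exact aggregation rule, is to average each feature's position across the rankings (a Borda-style consensus):

```python
def consensus_ranking(rankings):
    """Fuse several feature rankings (lists of feature indices, best
    first) by mean position; a lower mean position means more relevant."""
    features = set(rankings[0])
    mean_pos = {f: sum(r.index(f) for r in rankings) / len(rankings)
                for f in features}
    return sorted(features, key=lambda f: mean_pos[f])

# Three hypothetical wrapper outputs over features 0..3; feature 0 is
# ranked first by two of the three methods and wins the consensus.
ranks = [[0, 1, 2, 3], [1, 0, 2, 3], [0, 2, 1, 3]]
assert consensus_ranking(ranks) == [0, 1, 2, 3]
```

Averaging ranks lets a feature that one method misranks be rescued by the other methods, which is the "complementary strengths" idea in the abstract.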
Category: Statistics
[15] viXra:2510.0015 [pdf] submitted on 2025-10-04 10:00:17
Authors: L. Martino, J. Lopez-Santiago, J. Miguez, G. Vazquez-Vilar
Comments: 26 Pages.
When neither prior knowledge nor expert opinion is available, non-informative priors provide a practical alternative for conducting Bayesian inference. However, in the context of model selection, genuinely non-informative priors do not exist. In fact, diffuse priors on the parameters can drastically alter the value of the Bayesian evidence, making them effectively highly informative, while improper priors are not allowed at all. Furthermore, in many real-world applications, the use of informative priors can substantially improve computational efficiency by driving sampling algorithms toward regions of high posterior probability. In this work, we introduce a data-driven procedure for automatic prior construction. The underlying idea is to exploit the posteriors of the hyper-parameters from non-parametric models to construct priors for Bayesian inference in parametric models. We test the proposed scheme in four different experiments, two of which involve real astronomical data.
Category: Statistics
[14] viXra:2510.0014 [pdf] submitted on 2025-10-04 17:15:16
Authors: Yew Kee Wong
Comments: 70 Pages. (Note by viXra Admin: For the last time, please submit article written with AI assistance to ai.viXra.org)
Humanity faces unprecedented global challenges—climate change, biodiversity loss, resource scarcity, and social inequity—threatening planetary stability and human well-being. This paper argues that Nature Intelligence (NI), defined as the capacity to understand, emulate, and integrate nature’s time-tested patterns and processes into human systems, is the foundational key to securing a sustainable and thriving future. By examining NI’s core principles, its application to critical challenges, case studies of successful implementation, and strategies for broader adoption, this research demonstrates that NI offers a transformative framework for innovation, resilience, and regeneration. The findings indicate that transitioning to NI-guided systems is not merely advantageous but essential for humanity’s long-term survival and flourishing.
Category: General Science and Philosophy
[13] viXra:2510.0013 [pdf] submitted on 2025-10-04 12:39:32
Authors: Thierry L. A. Periat
Comments: 62 Pages. In French: Les matrices périennes ordinaires - lien avec le problème des trois générations de particules
The belief that one may unify physics, especially the theories concerning electromagnetism and gravitation, motivated my first intuitive ideas in 2003. I reanalyse the way of thinking introduced in the initial document because it is in fact at the origin of the so-called (E) question. The search for answers to this question has allowed the development of diverse methods explaining how to divide (decompose) deformed cross products. It turns out that the kernels of the main parts of the decompositions have remarkable properties, useful in particle physics. This is what this document demonstrates, since it succeeds in proving a link between a set of ordinary perian matrices and the mass matrices proposed twenty-five years ago by theoreticians. The approach I have developed over the past quarter century therefore provides an algebraic and geometric explanation for the matrices which had been proposed by physicists within the inner logic of particle physics. It helps to give particles a concrete face.
FR version (translated): My belief in the possibility of unifying the theories of electromagnetism and gravitation motivated me to present some ideas in 2003. I return here to the reasoning underlying the initial document because it is what justified the birth of the so-called (E) question. The search for an answer to this question led to the development of mathematical methods which all aimed to generalize the initial formula. Repeated and in-depth study of the answers obtained shows that the kernels of the principal parts of the decompositions of deformed cross products have remarkable mathematical properties, usable in physics. This is indeed what this document ends up demonstrating, since it finally establishes a clear link between a particular set of ordinary perian matrices and the mass matrices proposed more than twenty-five years ago by theoretical physicists. The approach I have developed over the past quarter century therefore provides an algebraic and geometric explanation for matrices which until now resulted from the internal logic of the Standard Model of particle physics. It thereby contributes to giving them a more concrete face.
Category: Mathematical Physics
[12] viXra:2510.0012 [pdf] submitted on 2025-10-04 16:27:50
Authors: M.S. Petrovskaya
Comments: 42 Pages. Translation of "Estimates of the residual members of the Hill series", Bull. ITA, IX, 4 (107). Translator: Thomas S. Ligon, ORCID 0000-0002-4067-876X.
Estimates have been obtained of the residual terms of the Hill series for the cases where the coefficients of these series, which are series in powers of $m$ ($m = n_0/(n_1 - n_0)$, where $n_0, n_1$ are the mean motions of the Sun and Moon), are calculated with precision of the 2nd, 3rd, 4th, 5th or 6th power of $m$. There are also new estimates of the residual terms of Hill's series in powers of $m^2$, considered in the paper (Lyapunov, 1954). Estimates were found for the case $|m| \le \sigma$ ($\sigma \approx 0.080849$), where $\sigma$ is the value of the parameter $m$ for the Moon.
Category: General Mathematics
[11] viXra:2510.0011 [pdf] submitted on 2025-10-04 16:19:56
Authors: Bernard Lavenda
Comments: 37 Pages. (Note by viXra Admin: Please submit article written with AI assistance to ai.viXra.org)
We propose Ricci UNsuppressed Gravity (RUNG), a framework that rejects Ricci-flat assumptions in favor of full Riemann dynamics. We present a unified framework demonstrating that sectional curvatures of the Riemann tensor, not Ricci contractions, govern gravitational-wave (GW) physics. Through rigorous analysis of connection coefficients, we identify how Christoffel symbols mediate stress-term suppression while preserving essential physics. Our model reveals that longitudinal "breather modes" ($\ell = 0$) represent spacetime expansion/contraction at sub-luminal speeds ($v_\theta < 1$), distinct from GR's transverse-traceless ($v_\theta = c$) waves. Static curvature stresses, which highlight nonlinear stress-energy effects absent in General Relativity (GR) and restore suppressed terms in static solutions such as Schwarzschild, are shown to have observational consequences even though they are deemed unobservable through projections onto non-timelike surfaces.
Category: Relativity and Cosmology
[10] viXra:2510.0010 [pdf] submitted on 2025-10-03 19:53:48
Authors: Jorma Jormakka
Comments: 11 Pages. (Note by viXra Admin: Please cite and list scientific references of other authorities besides self-citations)
Section 1 of the article shows that the first of Maxwell's equations is not Lorentz invariant when the charge is nonzero and not stationary. The reason for the failure of Lorentz invariance is that, in order for the transformed equation to admit a solution of the untransformed equation, the charge must transform because of the velocity of the inertial frame. This is not possible, for reasons explained in Section 3. Section 2 looks at the Lorentz-covariant formulation of Maxwell's equations that is used in quantum gauge field theories. The equations of this formulation use a Lorentz-invariant operator, but a Lorentz-invariant operator does not imply that the equation is Lorentz invariant. Maxwell's equations are not Lorentz invariant.
Category: Relativity and Cosmology
[9] viXra:2510.0009 [pdf] submitted on 2025-10-03 19:54:08
Authors: Jorma Jormakka
Comments: 12 Pages. (Note by viXra Admin: Please cite and list scientific references of other authorities besides self-citations)
The Schwarzschild metric is not a valid metric. Additionally, it does not make the local speed of light equal to $c$ in all directions. It follows that all physical experiments which claim to verify the theory of General Relativity by using the Schwarzschild solution as a substantial part of the argument are invalid as verifications of the theory. These include the Pound-Rebka experiment, the precession of the perihelion of Mercury, clock dilation in GPS satellites, the Shapiro delay, black holes, and the bending of light in a gravitational field. Alternative explanations that do not use the Schwarzschild metric exist [3][6][19]. By selecting a valid metric that agrees with the Schwarzschild metric on the $x$-axis, the article proves that the Einstein equations are not Lorentz invariant. This fact makes the requirement that equations of motion should be Lorentz invariant irrelevant.
Category: Relativity and Cosmology
[8] viXra:2510.0008 [pdf] replaced on 2026-01-11 15:33:29
Authors: Andreas Ball
Comments: 11 Pages.
Besides Leonardo da Vinci's drawing "Vitruvian Man", the author has also studied da Vinci's painting "The Last Supper" (executed between 1494/1495 and 1497/1498) and discovered several pieces of information from the Gospels hidden in it. Besides various scenes and quotations from the Gospels, connections to an early Renaissance painting are also presented. Furthermore, one can find aspects of hidden criticism referring to the Popes, the first representatives of Jesus Christ on earth, especially for many Catholics. Of interest are the Popes who lived at the time of Leonardo da Vinci.
Category: Religion and Spiritualism
[7] viXra:2510.0007 [pdf] replaced on 2025-12-09 01:04:19
Authors: Muhammad Saad Bhatti
Comments: 2 Pages. (Note by viXra Admin: For the last time, Please cite listed scientific references)
Shor’s algorithm factors large integers in polynomial time by reducing the problem to finding the order of a randomly chosen base modulo N. The algorithm succeeds when the chosen base a has an even order and avoids a trivial root of −1. In this paper, we prove a symmetry property: if a is a successful base for Shor’s algorithm, then so is N − a. This symmetry implies that successful bases always occur in pairs, allowing us to restrict the search range of bases to less than N/2 without loss of generality.
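The claimed symmetry (a successful implies N − a successful) is easy to check numerically for small N; the helper names below are illustrative, not from the paper:

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a modulo n (assumes gcd(a, n) == 1)."""
    k, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        k += 1
    return k

def successful(a, n):
    """A base is 'successful' for Shor if its order r is even and
    a^(r/2) is not congruent to -1 modulo n."""
    if gcd(a, n) != 1:
        return False
    r = order(a, n)
    return r % 2 == 0 and pow(a, r // 2, n) != n - 1

# Check the symmetry: a is successful exactly when N - a is.
N = 15
for a in range(2, N - 1):
    if gcd(a, N) == 1:
        assert successful(a, N) == successful(N - a, N)
```

The pairing follows because N − a is congruent to −a modulo N, so the two bases generate closely related cyclic subgroups; the brute-force check above confirms the behavior the paper proves in general.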
Category: Quantum Physics
[6] viXra:2510.0006 [pdf] submitted on 2025-10-02 23:36:17
Authors: Jorge Martinez Fernandez
Comments: 4 Pages. In Spanish
In this work, two real implementations of sliding-mode controllers (CMD) are shown, based on a model established as a first-order-plus-dead-time (FOPDT) plant. The LabVIEW environment is used to capture sensor data and to generate the control signal sent to a LabVOLT 3501-M pressure-control plant.
Category: Digital Signal Processing
[5] viXra:2510.0005 [pdf] submitted on 2025-10-02 23:15:40
Authors: Ikechukwu Iloh Udema
Comments: 18 Pages.
The root-mean-square mass radii of the proton are produced by extracting the effective mass from different kinds of photoproduction and, strangely, from the center-of-mass energy, without overt exploration of the equation $R_m^2 = 12/m_{eff}^2$ that ought to contain the missing fundamental constant. Thus, by theoretical and computational means, the study's goal is to address the congruence of the frameworks and related methods with the outcomes that could be obtained from them, with the following two of four objectives: 1) to derive an equation defining the fundamental constant missing from the disputed equation in the literature; 2) to justify the argument that the missing fundamental constant obscured the relationship, if any, between QCD and the derived equation. Some of the computed values of the root-mean-square mass radius ($R_m$) of the proton, based on $1.19518 \times 10^{-25}/m_{eff}$ (from Eq. (13)), are 0.644869 fm (effective mass $m_{eff} = 1.06$ GeV), 0.83361 fm ($m_{eff} = 0.82$ GeV), and 0.551246 fm ($m_{eff} = 1.24$ GeV). If the derived equation $R_m^2 = 12\hbar^2 c^2/m_{eff}^2$ bears any relevance to the QCD framework, then there was no basis whatsoever for the omission of the fundamental constant $\hbar c$ (or $\hbar^2 c^2$). Thus the omission, until proved otherwise, only served to obscure the fact that the equation has no bearing on the QCD framework. A definite value of $R_m$ is 1.10168 fm. While $m_{eff}$ showed an inverse power-law relationship with the rest mass, future study may focus on the theoretical determination of the former. PACS Numbers: 40, 12.38, 12.10.Kt. Keywords: classical framework, QCD framework, mass-energy equivalence, root-mean-square mass radius, mass, proton.
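The quoted radii are reproduced by $R_m = \sqrt{12}\,\hbar c/m_{eff}$ with $m_{eff}$ in GeV and $\hbar c \approx 0.197327$ GeV·fm, consistent with the derived equation $R_m^2 = 12\hbar^2 c^2/m_{eff}^2$. A quick numerical check (not code from the paper):

```python
from math import sqrt

HBAR_C = 0.197327  # GeV*fm

def rms_mass_radius(m_eff):
    """R_m = sqrt(12) * hbar*c / m_eff, m_eff in GeV, result in fm."""
    return sqrt(12.0) * HBAR_C / m_eff

# Reproduces the abstract's three values to ~1e-4 fm.
assert abs(rms_mass_radius(1.06) - 0.644869) < 1e-3
assert abs(rms_mass_radius(0.82) - 0.83361) < 1e-3
assert abs(rms_mass_radius(1.24) - 0.551246) < 1e-3
```

This confirms the abstract's point numerically: the dimensionally consistent form of the equation requires the factor $\hbar c$, which the bare $R_m^2 = 12/m_{eff}^2$ omits (it holds only in natural units).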
Category: High Energy Particle Physics
[4] viXra:2510.0004 [pdf] submitted on 2025-10-02 23:13:50
Authors: Vaggelis Talios
Comments: 16 Pages.
With the discovery of atoms by Einstein (1905) and the proof by Rutherford (1911) that atoms have subdivisions, that is, that they consist of other smaller particles, the formation of quantum theory, the theory of the particles found within the atom, began. Initially it was thought that each atom consists of a nucleus, the solid part of the atom, composed of protons (particles with a positive electromagnetic charge) and electrons (particles with a negative electromagnetic charge), around which other electrons rotate. It was then discovered that this assumption was not correct: the nucleus consists of protons and uncharged particles (not electrons), which quantum theory called neutrons, James Chadwick (1932). As research progressed, it was discovered that the protons and neutrons of which the nucleus consists also have subdivisions, the up and down quarks, Murray Gell-Mann (1970). With the discovery of the up and down quarks, which together with the electron were considered to be the elementary particles, that is, the smallest subdivisions of matter, the foundation of the Standard Model began, that is, the theory for the investigation of elementary particles and antiparticles, as a branch of quantum theory. The Standard Model was completed with the discovery of the Higgs particle (2012), a particle that is not an elementary particle but has the property of contributing to the creation of the mass of the remaining elementary particles and antiparticles. The calculations and various elements of subatomic and elementary particles in quantum theory and the Standard Model are based on the Yang-Mills equations formulated in the 1970s, and on the assumption that the same laws apply to subatomic and elementary particles in the microcosm as in the macrocosm.
The successful use of Yang-Mills theory to describe the interactions of elementary particles has relied on a subtle quantum-mechanical property called the Yang-Mills "mass gap". Experiments and computer simulations suggest the existence of this mass gap in the solution of the Yang-Mills equations, but no theoretical proof of this property is known: it has been observed in experiments and confirmed in computer simulations, but it has not yet been understood theoretically. Theoretical physicists believe that explaining the Yang-Mills mass gap will require the introduction of new fundamental ideas, both in physics and in mathematics. Indeed, to stimulate scientists' interest in solving the problem, it was included among the seven Millennium Prize Problems announced by the Clay Mathematics Institute, which offers a prize of one million dollars for the solution of each problem. As science progressed, it was found that while quantum theory rests on very strong and correct scientific foundations, the Standard Model, before its establishment, still has to clarify the case of the Yang-Mills mass gap and many other weak points: whether the electron and the up and down quarks are indeed elementary particles, whether quarks move inside the nucleus of the atom, whether bosons actually exist, the nature of the fundamental interactions, whether the Higgs mechanism for the origin of the mass of the elementary particles is the correct mechanism, etc. [5], [6] and [7]. Regardless of the clarification of the above points, in Section 5 I propose a New Model for describing elementary particles and fundamental interactions to replace the Standard Model.
The New Model I propose clarifies all the unanswered questions of the Standard Model, includes the interaction of gravity, and at the same time solves the problem of the Yang-Mills "mass gap".
Category: Quantum Physics
[3] viXra:2510.0003 [pdf] submitted on 2025-10-01 16:20:27
Authors: Taha Sochi
Comments: 70 Pages.
This is the fourth article in our series "The Dark Sides of Modern Science"; it is about ethics, morality, values and professionalism in science (and in knowledge in general). The remarks stated in the Introduction of the first article of this series (i.e. "Knowledge Production and Authoring") generally apply to this article as well, and hence we do not repeat them here.
Category: General Science and Philosophy
[2] viXra:2510.0002 [pdf] submitted on 2025-10-01 19:33:29
Authors: Kenneth C. Johnson
Comments: 30 Pages.
This document contains implementation notes for the MATLAB class "mpoly" (Multivariate Polynomial), which represents a numeric array (of any nonempty size, any number of dimensions) as a polynomial function (any degree) of a set of independent parameters (any number), or as a truncated Taylor series approximation. The class supports most standard array operations (algebra, indexing, etc.), employing automatic differentiation to calculate series coefficients of function outputs.
Category: Classical Physics
[1] viXra:2510.0001 [pdf] replaced on 2026-04-18 15:38:05
Authors: Luca Martino
Comments: 24 Pages.
In this work, we focus on mixtures with negative coefficients and their applications in computational statistics. Mixtures of probability densities are widely used in statistics and machine learning. While classical mixtures restrict weights to be non-negative, allowing negative weights enables more flexible density approximation. However, negative weights introduce challenges in handling and sampling such distributions. For this purpose, we propose efficient Monte Carlo (MC) methods (including MC quadratures, rejection sampling and importance sampling schemes) for computing integrals and generating samples from these mixtures. A tailored proposal density ensures accurate and efficient generation of (unweighted) samples. Furthermore, we introduce an importance sampling (IS) scheme which employs a mixture with negative coefficients as a proposal density, yielding samples with both positive and negative importance weights. Applications in Gaussian process (GP) based density estimation demonstrate the practical relevance and efficiency of the proposed schemes. An adaptive importance sampling procedure based on GP regression is also proposed. The numerical results provide clear empirical evidence of the accuracy and computational efficiency of the proposed methods.
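Rejection sampling from a mixture with one negative weight can be sketched as follows; the target, weights and envelope below are illustrative assumptions, not the paper's tailored proposal. The scaled positive component dominates the signed mixture everywhere, so it serves as a valid envelope:

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def target(x):
    """Signed mixture f(x) = 2*N(x;0,1) - 1*N(x;0,0.7): it integrates
    to 1 and stays non-negative because the negative component is
    narrower than (and dominated by) the positive one."""
    return 2.0 * normal_pdf(x, 0.0, 1.0) - normal_pdf(x, 0.0, 0.7)

def sample_target(rng):
    """Rejection sampling with envelope 2*N(x;0,1) >= f(x)."""
    while True:
        x = rng.gauss(0.0, 1.0)
        if rng.random() < target(x) / (2.0 * normal_pdf(x, 0.0, 1.0)):
            return x

rng = random.Random(0)
xs = [sample_target(rng) for _ in range(5000)]
mean = sum(xs) / len(xs)
assert abs(mean) < 0.1  # the target is symmetric, so the mean is ~0
```

Since the envelope has total mass 2 and the target has mass 1, about half the proposals are accepted; the paper's tailored proposal aims to raise exactly this acceptance efficiency.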
Category: Statistics