Wednesday, January 28, 2026

Open Letter to President Abinader: A Dominican Method for Integral Calculus

Santo Domingo, January 28, 2026

Mr.

Luis Rodolfo Abinader Corona

President of the Dominican Republic

Cc: Directors and editors of the main national and international media outlets

Dear Mr. President:

I am writing to you and, through you, to the country's media to report a scientific advance that I believe is of national interest, and to request the institutional support needed for the Dominican Republic to take advantage of and disseminate this contribution.

I am Emmanuel Antonio José García. I recently published on arXiv (Cornell) the paper “A Unified Substitution Method for Integration” (link: https://arxiv.org/abs/2505.03754), in which I present the Unified Substitution Method (USM), a mathematical and methodological proposal intended to simplify and speed up the solution of integrals that appear frequently in applied mathematics, engineering, physics, and data science.

Summary of the contribution

The USM is a unified method for integrating expressions with quadratic radicals and half-angle trigonometric compositions. It is built on explicit algebraic identities for the exponentials of the principal inverse trigonometric functions, e^{± i cos⁻¹(y)} and e^{± i sec⁻¹(y)}, from which five parametrized transformations are derived that convert such integrals into rational forms in a single parameter, handling the circular and hyperbolic cases in a coherent way. This framework not only subsumes and generalizes classical techniques, such as Euler's first and second substitutions and the Weierstrass substitution, but also significantly simplifies the handling of branches and signs, offers computational advantages by reducing expression swell, and improves efficiency when integrating mixed structures.
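To orient the reader, the circular identity underlying the method is a standard consequence of Euler's formula on the principal branch (the sec⁻¹ and hyperbolic cases are treated analogously in the paper): e^{± i cos⁻¹(y)} = y ± i√(1 − y²) for −1 ≤ y ≤ 1, which for y = cos θ is simply e^{± iθ} = cos θ ± i sin θ.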

Relevant comparative results

To provide empirical evidence, I ran a benchmark with 100 representative integrals and compared the performance of the USM against Mathematica's `Integrate` function:

  • USM was faster in 82 of 100 cases.
  • USM produced a smaller antiderivative (ByteCnt) in 50 of 100 cases.
  • Incidence of “monster” antiderivatives (≥ 10,000 bytes): USM: 5 cases vs. Integrate: 24 cases.
  • Largest size observed: USM: 19,840 bytes; Integrate: 150,360 bytes. In another mini-benchmark (Example 19), Integrate's byte count exceeded 600,000 (the antiderivative takes up 20 pages!), while USM's did not exceed 5,000 (and the antiderivative fits in half a page).

These results translate into two practical benefits: savings in computation time and more readable, reusable symbolic expressions, which makes them easier to incorporate into engineering pipelines and advanced educational material.

Theoretical connections of the USM

The USM is closely related to mathematical concepts of great practical value. As the German physicist Fred Hucht noted on MathOverflow (the forum where I first presented an early version of the USM):

“The OP's relations are related to the Gudermannian...”,

which connects it with elliptic identities and Jacobi's imaginary transformation. The Gudermannian function is fundamental in applications such as the Mercator map projection, while the parametric structure of the USM resembles the Joukowsky transform (key in aerodynamics for the design of airfoil profiles) and the Tustin transform, used in digital control to discretize dynamical systems.
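For reference, the standard definitions of these objects are: the Gudermannian function gd(x) = 2 arctan(tanh(x/2)), the Joukowsky map J(z) = ½(z + 1/z), and the Tustin (bilinear) transform s = (2/T)·(z − 1)/(z + 1); it is the ½(z + 1/z) structure that reappears in the USM parametrizations discussed elsewhere on this blog.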

Recognition and external reviews

The work has attracted interest and comments from specialists with international track records:

Dr. Oleg Marichev (Wolfram Research, a legendary figure in symbolic integration):

“I was impressed, looking on your files. I saw holes in my work, that you already found and you can fix them (even without understanding many moments). I felt that we can work.”

“You made large improvement to collecting formulas for doable Integrate situation because we with you found wide class of cases for MeijerG.”

“As I wrote, I have built collection with near 4500 cases of MeijerG. If we remove special functions we have subset of such elementary functions. There we have subset of algebraic functions. I am doing re-organization of this collection and see how important and how large subclass Fun[v ArcGun[z]]^n that you found.”

Dr. Sam Blake (PhD, Monash University, researcher; former Wolfram Research engineer, known for his role in cracking the famous Zodiac Cipher):

“That’s a very neat trick… As far as I know this is a new result.”

Daniel Lichtblau (Wolfram Research):

“You are certainly getting nice results, and we'll take a look at it.”

Ninad Munshi (former NASA engineer):

“Complexification formulas are great and it seems like this simplifies the right away.”

Kamila Szewczyk (expert programmer):

“One benefit of your method that I see over Rubi is that the process of applying transformation rules in USM is much clearer and more efficient to evaluate (no need to rely on transformation heuristics).”

Historical and cultural importance of integration

Integration is not just a mathematical technique: it is a tool that has shaped scientific progress. This is underscored by official cultural artifacts (a commemorative coin linked to Ostrogradsky's integration technique and a postage stamp honoring P. L. Chebyshev), which show how states and scientific communities recognize integration as intellectual and cultural heritage.



Why this matters for the Dominican Republic

1. Decentralized innovation: the fact that a contribution in a classical area such as Integral Calculus comes from a Dominican engineer shows that our country can generate original knowledge in high-impact mathematical fields.

2. Technological applications: the reduction in computation time and the lower incidence of gigantic symbolic expressions will benefit software development.

3. Educational potential: adopting a unified methodology could simplify the teaching of Integral Calculus in secondary school and at university, favoring understanding over memorization.

Specific requests

With respect, I ask the President and the competent authorities for the following actions:

1. Institutional recognition and official dissemination. That the Presidency and the corresponding Ministry (MESCYT / national scientific institutions) support the dissemination of this finding and promote its consideration in academic and technological forums.

2. Responsible media coverage. I invite the media to cover the work rigorously, interviewing experts and verifying the figures and results, so that the country can learn about and assess the importance of the advance.

I offer my commitment to collaborate closely with any institutions that request it: I can present the benchmark data and provide teaching material (notes, worked examples, and code).

I firmly believe that mathematics can and should be an engine of social and economic development. The USM is, in my opinion, an opportunity for the Dominican Republic to demonstrate its capacity to produce relevant knowledge and to turn that production into concrete educational and technological advantages.

Thank you for your attention; I remain available for an informational meeting and to coordinate whatever actions may be appropriate.

Sincerely,

Emmanuel Antonio José García

Engineer

Dominican Republic

Monday, December 15, 2025

Weierstrass as a Special Case of the USM Framework

"The world's sneakiest substitution." Michael Spivak


The classical Weierstrass substitution for integrals of the form
\[\int R(\sin\omega, \cos\omega)\,d\omega\]
is a special case of Transform 5 in the USM framework, corresponding to the circular case with parameters \(a = 1\) and \(b = 0\).

Derivation
Let
\[I = \int R(\sin\omega, \cos\omega)\,d\omega.\]
Set \(x = \sin\omega\), so that
\[\cos\omega = \sqrt{1 - x^2} \quad (\text{using the principal square root, e.g., } \cos\omega \geq 0 \text{ for } \omega \in [-\pi/2,\pi/2]),\]
and
\[d\omega = \frac{dx}{\cos\omega} = \frac{dx}{\sqrt{1 - x^2}}.\]
Hence,
\[I = \int \frac{R\!\left(x, \sqrt{1 - x^2}\right)}{\sqrt{1 - x^2}}\,dx.\]
Transform 5 handles integrals involving \(\sqrt{a^2 - (x+b)^2}\) on the domain \(|y| \leq 1\) with \(y = (x+b)/a\). Take \(a = 1\), \(b = 0\) (so \(y = x\)) and let the parameter be \(r\) (as in the paper). The transform gives:
\[x = \frac{2r}{1+r^2}, \quad \sqrt{1 - x^2} = \frac{1 - r^2}{1 + r^2}, \quad dx = \frac{2(1 - r^2)}{(1 + r^2)^2}\,dr.\]
Substitute into the integral
\[\begin{aligned} I &= \int \frac{R\!\left(x, \sqrt{1 - x^2}\right)}{\sqrt{1 - x^2}}\,dx \\ &= \int \frac{R\!\left(\frac{2r}{1+r^2}, \frac{1 - r^2}{1 + r^2}\right)}{\frac{1 - r^2}{1 + r^2}} \cdot \frac{2(1 - r^2)}{(1 + r^2)^2}\,dr \\ &= \int R\!\left(\frac{2r}{1+r^2}, \frac{1 - r^2}{1 + r^2}\right)\frac{2}{1 + r^2}\,dr.\end{aligned}\]
The final expression is exactly the Weierstrass substitution formula:
\[\int R(\sin\omega, \cos\omega)\,d\omega= \int R\!\left(\frac{2r}{1+r^2}, \frac{1 - r^2}{1 + r^2}\right) \frac{2}{1 + r^2}\,dr.\]
On the principal branch where \(\psi=\sin^{-1}(x)=\omega\), this parameter is
\[r=\tan\!\Bigl(\frac{\psi}{2}\Bigr)=\tan\!\Bigl(\frac{\omega}{2}\Bigr).\]

Thus, the Weierstrass substitution emerges naturally from Transform 5 by setting \(a = 1\), \(b = 0\) and interpreting the integrand appropriately. This demonstrates (again!) that the USM unifies and generalizes classical substitution techniques (refer to Section 6 in the paper, which details how the USM also encompasses Euler substitutions 1 and 2), including the half‑angle tangent substitution of Weierstrass. Moreover, USM creates new "Weierstrass-like" substitutions (Transforms 1 & 2) that work for hyperbolic/algebraic regions ($|y| \ge 1$) where the standard $\tan(\omega/2)$ is not typically applied.
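As a quick sanity check, the Transform 5 identities used above can be verified numerically; the following is a minimal Python sketch (independent of the paper's Mathematica code):

```python
import math

# With r = tan(w/2), the rational forms reproduce sin(w) and cos(w).
for w in (0.3, 1.0, 2.4):
    r = math.tan(w/2)
    assert abs(math.sin(w) - 2*r/(1 + r**2)) < 1e-12
    assert abs(math.cos(w) - (1 - r**2)/(1 + r**2)) < 1e-12

# The same identities written with x = sin(w): x = 2r/(1+r^2) and
# sqrt(1 - x^2) = (1 - r^2)/(1 + r^2) on the principal branch (|r| < 1).
for r in (0.1, 0.5, 0.9):
    x = 2*r/(1 + r**2)
    assert abs(math.sqrt(1 - x**2) - (1 - r**2)/(1 + r**2)) < 1e-12

print("Transform 5 (a = 1, b = 0) identities check out numerically")
```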

Wednesday, December 10, 2025

MIT Integration Bee 2023 - Finals - Problem 3

The problem:

$$\int_{-1/2}^{1/2} \sqrt{x^2+1+\sqrt{x^4+x^2+1}}\, dx$$

Solution. We assume $x > 0$.

$$\begin{aligned}I &= \int \sqrt{x^2+1+\sqrt{x^4+x^2+1}} \, dx \\ &= \int \frac{t^2-t+1}{(2t-1)^{3/2}\sqrt{t-2}} \, dt & \left(t = x^2+1+\sqrt{x^4+x^2+1}\right) \\[1em] &= 2 \int \frac{u^4+3u^2+3}{(2u^2+3)^{3/2}} \, du & \left(u^2 = t-2\right) \\[1em] &= \frac{\sqrt{2}}{2} \int \frac{3s^8+12s^6+34s^4+12s^2+3}{8s^3(s^2+1)^2} \, ds & \left(\text{USM Transform 3: } u = \sqrt{\frac{3}{2}}\frac{s^2-1}{2s}\right) \\[1em] &= \frac{\sqrt{2}}{2} \int \left( \frac{3s}{8} + \frac{3}{4s} + \frac{3}{8s^3} + \frac{2s}{(s^2+1)^2} \right) \, ds & (\text{PFD}) \\[1em] &= \frac{\sqrt{2}}{2} \left( \frac{3s^2}{16} + \frac{3}{4}\ln|s| - \frac{3}{16s^2} - \frac{1}{s^2+1} \right) + C \\[1em]
&= \frac{3\sqrt{2}}{32}\left(s^2 - \frac{1}{s^2}\right) + \frac{3\sqrt{2}}{8}\ln|s| - \frac{\sqrt{2}}{2(s^2+1)} + C
\end{aligned}$$

Where:
$$s = \sqrt{\frac{2}{3}}u + \sqrt{\frac{2}{3}u^2+1}, \quad u = \sqrt{t-2}, \quad t = x^2+1+\sqrt{x^4+x^2+1}.$$

Using the symmetry of the even function $f(x)$, we calculate $2 \int_{0}^{1/2} f(x) \, dx$:

$$\begin{aligned}\text{Limits for } s: \quad & x=0 \implies s=1 \\
& x=1/2 \implies s = \sqrt{\frac{2+\sqrt{7}}{\sqrt{3}}}\end{aligned}$$

$$\begin{aligned}\int_{-1/2}^{1/2} f(x) \, dx &= 2 \left[ F(s) \right]_{1}^{\sqrt{\frac{2+\sqrt{7}}{\sqrt{3}}}} \\[1em] &= 2 \left[ \left( \frac{\sqrt{14}}{8} - \frac{\sqrt{2}}{4} + \frac{3\sqrt{2}}{16}\ln\left(\frac{2+\sqrt{7}}{\sqrt{3}}\right) \right) - \left( -\frac{\sqrt{2}}{4} \right) \right] \\[1em] &= 2 \left[ \frac{\sqrt{14}}{8} + \frac{3\sqrt{2}}{16}\ln\left(\frac{2+\sqrt{7}}{\sqrt{3}}\right) \right] \\[1em] &= \frac{\sqrt{14}}{4} + \frac{3\sqrt{2}}{8}\ln\left(\frac{2+\sqrt{7}}{\sqrt{3}}\right) \end{aligned}$$
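A numerical cross-check of the closed form (a minimal Python/SciPy sketch, not part of the competition solution):

```python
import math
from scipy.integrate import quad

# Integrate f numerically over [-1/2, 1/2] and compare with the closed form above.
f = lambda x: math.sqrt(x**2 + 1 + math.sqrt(x**4 + x**2 + 1))
numeric, _ = quad(f, -0.5, 0.5)
closed = math.sqrt(14)/4 + 3*math.sqrt(2)/8*math.log((2 + math.sqrt(7))/math.sqrt(3))
print(numeric, closed)   # both ~1.45866
```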

Saturday, November 29, 2025

Calculus: A Little Story of Unification

One man’s trick may be another man’s method, and there may be more to an apparent trick than first meets the eye. 
— Todd Trimble

In my somewhat crazy obsession with proving/deriving or generalizing everything from the half–angle formulas, one day I sat down to try to generalize my own generalization of the Newton–Mollweide formula. In the process, I found an expression that factored neatly in terms of sines and cosines of half-angles (aha!). Playing with that expression I derived an unusual trigonometric formula for the roots of quadratic polynomials. In a few days I realized that this formula was just one member of a whole family of trigonometric formulas for the roots of quadratic polynomials (very wild formulas, by the way).

If we have several formulas that all produce the same two roots, then we can combine them to generate expressions that are numerically identical, right? That is how Theorems 1–2 in this draft arose. From these two theorems I was able to derive five transformations that achieve:

Unification. They unify the use of complex exponentials with half-angle tangent substitutions, as well as hyperbolic parametrizations and Euler substitutions (1 and 2; the third seems to have been added by someone other than Euler). In Section 6 I show how Euler substitutions 1 and 2 are recovered (up to trivial reparametrizations) by Transformations 2 and 5, respectively.

Automatic sign handling. You do not have to worry about signs depending on the domain, since the branch-wise back-substitution formula automatically takes care of them for you.

Usefulness for CAS. They allow one to solve integrals built from
$$\tan\!\left(\tfrac12\sec^{-1}/\csc^{-1}(\dots)\right),$$
without much difficulty (most CAS systems fail here). Please have a look at the results of this benchmark against Mathematica (MMA). Using a branch-wise back-substitution, the USM (that is what I call this method) beat MMA in speed in $67/100$ cases with an average speed-up of $\times 34$. It produced only $5$ monstrous antiderivatives versus $24$ from MMA: the maximum byte count of USM was $21{,}616$ versus $150{,}360$ for MMA. In another mini-benchmark (Example 19), the byte count of MMA was above $600{,}000$ (the antiderivative takes $21$ pages!) while for USM it did not exceed $5000$ (and the antiderivative fits in half a page).

As an illustration, consider the following integral (Example 5 in the draft):

$$\int \sqrt{\frac{x+1}{x+3}}\,dx \qquad (x \ge -1).$$

Notice that

$$\frac{x+1}{x+3} = \frac{x + b - a}{x + b + a}$$

with $a = 1$, $b = 2$. Apply Transform 2 (upper sign for $x \ge -1$):

$$\sqrt{\frac{x+1}{x+3}} = \frac{1 - t}{1 + t}, \qquad dx = \frac{t^2 - 1}{2t^2}\,dt, \qquad t = x + 2 - \sqrt{x^2 + 4x + 3}.$$

Thus

$$\int \sqrt{\frac{x+1}{x+3}}\,dx = \int \frac{1 - t}{1 + t} \cdot \frac{t^2 - 1}{2t^2}\,dt = -\frac12 \int \Bigl(1 - 2t^{-1} + t^{-2}\Bigr)\,dt = \ln|t| + \frac12\bigl(t^{-1} - t\bigr) + C,$$

hence

$$\int \sqrt{\frac{x+1}{x+3}}\,dx = \ln\!\bigl(x+2-\sqrt{x^2+4x+3}\bigr) + \sqrt{x^2+4x+3} + C.$$

It is instructive to contrast the algebraic economy of USM with standard approaches for this integrand. The classical rationalization $u = \sqrt{\frac{x+1}{x+3}}$ yields the rational form $\int \frac{4u^2}{(u^2-1)^2}\,du$, which typically necessitates a rather cumbersome partial fraction decomposition. Trying to bypass this with a second substitution introduces its own friction: the hyperbolic choice $u = \coth z$ leads to a fairly manageable integration of $\cosh^2 z$, but the back-substitution is algebraically tedious, requiring double-angle expansions and inverse hyperbolic identities to revert to $x$. The trigonometric choice $u = \sec \theta$ leads to the laborious integral $\int \csc^3 \theta\,d\theta$, which usually involves recursive integration by parts or a reduction formula that almost nobody remembers. Crucially, both traditional paths impose a distinct second layer of substitution ($x \to u \to z$ or $\theta$), whereas USM Transform 2 structurally cancels the denominator in a single step, collapsing the integrand immediately to the elementary expression $1 - 2t^{-1} + t^{-2}$.
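To double-check the closed form obtained above, here is a minimal SymPy sketch (not part of the original derivation) verifying that the antiderivative differentiates back to the integrand for $x \ge -1$:

```python
import sympy as sp

# Antiderivative F from Transform 2 and the original integrand.
x = sp.symbols('x')
S = sp.sqrt(x**2 + 4*x + 3)
F = sp.log(x + 2 - S) + S
integrand = sp.sqrt((x + 1)/(x + 3))
dF = sp.diff(F, x)

# Spot-check F'(x) = integrand at a few points with x >= -1 (all differences ~0).
for v in (-0.5, 0, 1, 5):
    print(v, float((dF - integrand).subs(x, v)))
```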

Relation to $y = \frac12\left(t + t^{-1}\right)$ and $t = x \pm \frac1x$

Now observe that setting

$$y = \frac12\left(t + t^{-1}\right)$$

has an effect equivalent to what we did previously using USM.

The starting integral is

$$\int \sqrt{\frac{x+1}{x+3}}\,dx.$$

In the general setup, with parameters $a > 0$ and real $b$, the normalized variable is defined by $y = \frac{x + b}{a}$. For this specific example we have $a = 1$ and $b = 2$, so the normalization is simply $y = x + 2$. If we now apply the substitution $y = \frac12\left(t + \frac1t\right)$, we obtain the same rational form immediately:

$$\begin{aligned} y &= \frac{t^2 + 1}{2t} \quad\implies\quad dx = dy = \frac12\left(1 - \frac1{t^2}\right)\,dt = \frac{t^2 - 1}{2t^2}\,dt, \\[10pt] \sqrt{\frac{x+1}{x+3}} &= \sqrt{\frac{y-1}{y+1}} = \sqrt{\frac{\frac{t^2 - 2t + 1}{2t}} {\frac{t^2 + 2t + 1}{2t}}} = \sqrt{\frac{(t-1)^2}{(t+1)^2}} = \frac{|t-1|}{t+1} = \frac{1 - t}{1 + t} \quad\text{(for } x \ge -1\text{)}, \\[10pt] \int \sqrt{\frac{x+1}{x+3}}\,dx &= \int \underbrace{\frac{1 - t}{1 + t}}_{\text{Radical}} \cdot \underbrace{\frac{t^2 - 1}{2t^2}\,dt}_{\text{Jacobian}}. \end{aligned}$$

More generally, for expressions of the form

$$\sqrt{\frac{x + b - a}{x + b + a}},$$

defining $y = \dfrac{x+b}{a}$ gives

$$\frac{x + b - a}{x + b + a} = \frac{ay - a}{ay + a} = \frac{y-1}{y+1},$$

so the same pattern repeats in the general case. Solving integrands of the type $\sqrt{\frac{x+p}{x+q}}$, where $p$ and $q$ are real numbers, via the substitution $y = \frac12\left(t + \frac1t\right)$ is quite unusual (to my surprise). In the Math StackExchange community, you can find several threads (see here and here for examples) where integrators take significantly more convoluted routes for such integrals, rarely using this substitution.

The natural question is: What is the relation between the substitution $y = \tfrac12(t + t^{-1})$ and the substitution $t = x \pm \frac1x$ (which integrators often use (see here) when dealing with pseudo-elliptic integrals such as $\int \frac{x^2-1}{(x^2+1)\sqrt{x^4+1}}\,dx$)?

Relation between $y = \tfrac12(t + t^{-1})$ and $t = x \pm \tfrac1x$

Both substitutions are, in essence, two presentations of the same underlying rational transformation, just written with different variable names and possibly rescaled.

Begin with

$$y = \frac12\left(t + \frac1t\right).$$

Multiply by $2t$:

$$2yt = t^2 + 1 \quad\Longrightarrow\quad t^2 - 2yt + 1 = 0.$$

Seeing this as a quadratic in $t$, we get

$$t = \frac{2y \pm \sqrt{(2y)^2 - 4}}{2} = y \pm \sqrt{y^2 - 1}.$$

So the inverse of our substitution is

$$t = y \pm \sqrt{y^2 - 1}.$$

Now consider the substitution commonly used for pseudo-elliptic integrals:

$$t = x \pm \frac1x.$$

Take, for concreteness, the plus sign:

$$t = x + \frac1x.$$

Multiply both sides by $x$:

$$tx = x^2 + 1 \quad\Longrightarrow\quad x^2 - tx + 1 = 0.$$

Viewed as a quadratic in $x$, we obtain

$$x = \frac{t \pm \sqrt{t^2 - 4}}{2}.$$

Now compare this with the inverse of our substitution $y = \frac12(t + t^{-1})$, namely

$$t = y \pm \sqrt{y^2 - 1}.$$

If we perform a simple rescaling

$$y = \frac{t}{2},$$

then

$$x = \frac{t \pm \sqrt{t^2 - 4}}{2} = \frac{2y \pm \sqrt{4(y^2 - 1)}}{2} = y \pm \sqrt{y^2 - 1}.$$

But this last expression is exactly the same functional form as the inverse of our substitution. The only difference is which symbol we call the “input” and which we call the “output,” plus that harmless factor of $2$.

So, up to the linear rescaling $y = t/2$ and a relabeling of variables, the equations

$$y = \frac12\left(t + \frac1t\right) \quad\text{and}\quad t = x + \frac1x$$

describe the same algebraic relation between two variables and its inverse.

In particular:

Our substitution uses

$$t \longmapsto y = \frac12\left(t + \frac1t\right).$$

The pseudo-elliptic substitution can be seen as

$$x \longmapsto t = x + \frac1x,$$

and when you solve for $x$ in terms of $t$, you get the same square-root structure as when you solve for $t$ in terms of $y$.
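A two-line SymPy check of this equivalence (a sketch; the printed root order and formatting may differ):

```python
import sympy as sp

t, y, x = sp.symbols('t y x')
print(sp.solve(sp.Eq(y, (t + 1/t)/2), t))   # roots y +/- sqrt(y**2 - 1)
print(sp.solve(sp.Eq(t, x + 1/x), x))       # roots t/2 +/- sqrt(t**2 - 4)/2
```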

We can construct our own pseudo-elliptic integrals. Assume $t>0$ for simplicity. Notice that 

$$\int \sqrt{x-\sqrt{x^2-1}}\,dx \overset{x=t-\frac1t}{=} \int \frac{t^2+1}{\sqrt{t^5-t^3+t^3\sqrt{t^4-3t^2+1}}} \, dt$$

We can return to the original integrand by doing

$$\int \frac{t^2+1}{\sqrt{t^5-t^3+t^3\sqrt{t^4-3t^2+1}}} \, dt \overset{x=t-\frac1t}{=}  \int \sqrt{x-\sqrt{x^2-1}}\,dx.$$

Similarly,

$$ \int \frac{1}{x\sqrt{x^{2}-2}}\,dx\overset{x = t+\frac1t}{=} \int \frac{t^{2}-1}{(t^{2}+1)\sqrt{t^{4}+1}}\,dt$$

or

$$\int \sqrt{\frac{x+b-a}{x+b+a}}\,dx\;\overset{x=\frac12\left(t-\frac1t\right)}{=}\;\int \sqrt{\frac{t^2 - 1 + 2t(b-a)}{t^2 - 1 + 2t(b+a)}} \cdot \frac{t^2+1}{2t^2}\,dt.$$
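As a spot check of the identity for $\int \frac{1}{x\sqrt{x^2-2}}\,dx$ above, a minimal SymPy sketch (assuming $t > 0$):

```python
import sympy as sp

t = sp.symbols('t', positive=True)
x = t + 1/t
lhs = 1/(x*sp.sqrt(x**2 - 2))*sp.diff(x, t)            # integrand in x, pulled back to t
rhs = (t**2 - 1)/((t**2 + 1)*sp.sqrt(t**4 + 1))
for tv in (1.5, 2, 3):
    print(tv, float((lhs - rhs).subs(t, tv)))          # all ~0
```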

Finally, I opened this story with a comment from Todd Trimble that I found on Todd and Vishal's blog: “...One man’s trick may be another man’s method, and there may be more to an apparent trick than first meets the eye.” Let me be so bold as to claim that the method Todd is talking about is very likely USM, and that man is me. Thank you for reading.

Monday, November 24, 2025

Benchmarking USM Transform #3 vs. Mathematica’s Integrate - Part 2

In these tables we benchmark the Unified Substitution Method (USM) change of variables that this arXiv draft (version 1) calls Transformation 3 (this is the “Transform 1” used in the Mathematica code: see this notebook): a half-angle substitution that converts integrals built from tan(½ csc⁻¹((x+b)/a)) and tan(½ sec⁻¹((x+b)/a)) into a rational integrand in a new variable t. The general transformation formula is

$$\int f\!\left[x,\,\tan\left(\tfrac12\csc^{-1}\left(\frac{x+b}{a}\right)\right),\,
\tan\left(\tfrac12\sec^{-1}\left(\frac{x+b}{a}\right)\right)\right]\,dx=
\int f\!\left(a\,\frac{t^{2}+1}{2t} - b,\, t,\, \frac{1-t}{1+t}
\right)\, a\,\frac{t^{2}-1}{2t^{2}}\,dt.\tag{1}$$
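As a quick numerical illustration of (1), here is a minimal Python sketch for one branch (assuming a = 1, b = 0 and 0 < t < 1, which places y = (t²+1)/(2t) on the y ≥ 1 branch with principal inverse functions):

```python
import math

a, b = 1.0, 0.0
t = 0.5                               # 0 < t < 1  =>  y = (t^2 + 1)/(2t) >= 1
x = a*(t**2 + 1)/(2*t) - b
y = (x + b)/a
half_arccsc = 0.5*math.asin(1/y)      # csc^-1(y) = arcsin(1/y) for y >= 1
half_arcsec = 0.5*math.acos(1/y)      # sec^-1(y) = arccos(1/y) for y >= 1
print(math.tan(half_arccsc), t)                  # ~0.5 vs t
print(math.tan(half_arcsec), (1 - t)/(1 + t))    # ~0.3333 vs (1 - t)/(1 + t)
```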


For each integrand in the 10 datasets, we compare Transform 1 + back-substitution against Mathematica’s Integrate by timing both methods (total t-USM time, split into the y≥1 and y≤−1 branches, versus Integrate time) and by measuring the structural size of the resulting antiderivatives (LeafCnt and ByteCnt for USM and for Integrate). The integrands are systematically built from tan(½ arccsc(…)) and tan(½ arcsec(…)), with powers, factors of x, x², x³, and rational combinations such as 1/(1+tan²(…)) and products of the two half-angle tangents, so the benchmark probes exactly the niche Transform 1 is designed for, from simple to highly intricate cases.

Overall, the data show that on many of the “hard” mixed cases USM is much faster (often by an order of magnitude) while producing antiderivatives of comparable or smaller complexity; on very simple, pattern-friendly integrands Integrate can be faster because USM has a fixed overhead; and across the full test family USM timings are more predictable, with the y≥1 / y≤−1 branch split adding only modest extra cost. In short, Transform 1 is a robust, domain-specific integrator for these arccsc/arcsec tan-half-angle families: it typically yields simpler or similar antiderivatives and large speedups on difficult examples, at the price of some overhead on the easy ones.

Conclusions from the benchmark

1. On many “hard” mixed cases, Transform 1 is much faster than Integrate.

In datasets like 3, 5, 7, and 8 (the ones with products and rational combinations of both arccsc and arcsec half-angle tangents), Integrate often takes from a few tenths of a second up to several seconds, while t-USM usually stays in the tens to low hundreds of milliseconds.

In some individual examples we get around one order of magnitude speedup: Integrate is in the 1–10 second range while t-USM is still below about 0.2 seconds.

⇒ For structurally complicated expressions in this class, the Transform 1 route is clearly advantageous.

2. On simple or pattern-friendly cases, Integrate can be faster than Transform 1.

In datasets like 1, 4, 9, and 10, several examples have Integrate times of just a few milliseconds, while t-USM has a relatively fixed overhead (often between about 0.02 and 0.06 seconds).

In those cases, Mathematica recognizes a very simple pattern (such as standard tan or rational trig identities) and wins on raw speed.

⇒ Our transform has a nontrivial constant overhead. It shines when the problem is hard for Integrate, but cannot beat Mathematica’s near-instant pattern match on the easy ones.

3. Runtime variability vs. predictability

Integrate is highly variable: sometimes extremely fast, sometimes very slow, even for similar-looking integrands in the same dataset.

t-USM is more stable: most examples sit in a narrow time band, with far fewer extreme slowdowns.

⇒ Transform 1 gives more predictable performance over this whole integrand family, whereas Integrate is opportunistically very fast but with occasional expensive spikes.

4. Result size and complexity stay comparable and reasonable.

The leaf and byte counts show that:

USM antiderivatives are usually similar or somewhat larger in size compared to Integrate’s results, reflecting the mechanical tan-substitution and back-substitution.

There is no systematic explosion in size: the USM expressions stay in the same general range as the ones produced by Integrate.

⇒ From a “how big and messy is the final formula?” standpoint, Transform 1 is competitive and practical, even if it does not always find the most compact form that Integrate sometimes can.

5. The y ≥ 1 / y ≤ −1 split is reasonable and not a major cost.

The USM y ≥ 1 and USM y ≤ −1 times are usually of the same order, with the y ≥ 1 branch often a bit slower, but not dramatically so.

Summing them to get the total t-USM time roughly doubles the branch time, but that combined cost is still modest compared with the multi-second peaks seen in Integrate.

⇒ The branch-based back-substitution strategy works well in practice and does not dominate the runtime.

Some more specific details
  • USM total time was faster than Integrate in 82 cases.
  • USM produced a simpler antiderivative (smaller ByteCnt) than Integrate in exactly 50 cases.
  • “Monster” antiderivatives (ByteCnt >= 10,000) occurred 5 times for USM and 24 times for Integrate.
  • The largest ByteCnt observed for a USM antiderivative was 19,840, compared with 150,360 for Integrate.

Tuesday, August 19, 2025

A generalization of the law of cotangents

Introduction (classical law of cotangents)
In trigonometry, the law of cotangents is a relationship among the side lengths of a triangle and the cotangents of the halves of its angles.
For a triangle with side lengths \(a',b',c'\) opposite the vertices \(A,B,C\) respectively, let
\[s=\frac{a'+b'+c'}{2}\quad\text{and}\quad r=\text{inradius}.\]
If the angles at \(A,B,C\) are \(\alpha',\beta,\gamma\), then
\[\boxed{\;\frac{\cot(\alpha'/2)}{s-a'}=\frac{\cot(\beta/2)}{s-b'}=\frac{\cot(\gamma/2)}{s-c'}=\frac{1}{r}\; }.\]

In this note, we generalize the law of cotangents to cyclic quadrilaterals.

Setup (cyclic quadrilateral)
Let \(ABCD\) be a cyclic quadrilateral with side lengths
\[|AB|=a,\quad |BC|=b,\quad |CD|=c,\quad |DA|=d,\qquad s=\frac{a+b+c+d}{2}.\]
Set \(\alpha=\angle BAD\), \(\beta=\angle ABC\), \(\gamma=\angle BCD\), \(\varphi=\angle CDA\) (see Figure 1). Let \(\Delta\) denote the area of \(ABCD\).

Figure 1. A cyclic quadrilateral $ABCD$.

Lemma (Half–angle formulas). For \(\alpha=\angle BAD\),
\[\boxed{\;\sin^2\frac{\alpha}{2}=\frac{(s-a)(s-d)}{ad+bc},
\qquad\cos^2\frac{\alpha}{2}=\frac{(s-b)(s-c)}{ad+bc}
\;}.\]

Proof. Let \(\gamma=\angle BCD\). Since \(ABCD\) is cyclic, \(\alpha+\gamma=\pi\). Applying the Law of Cosines in triangles \(ABD\) and \(BCD\) and using \(\cos(\pi-\alpha)=-\cos\alpha\),
\[a^2+d^2-2ad\cos\alpha=b^2+c^2-2bc\cos(\pi-\alpha)=b^2+c^2+2bc\cos\alpha,\]
hence
\[\cos\alpha=\frac{a^2+d^2-b^2-c^2}{2(ad+bc)}.\]

For \(\cos^2(\alpha/2)\):
\[\begin{aligned}\cos^2\frac{\alpha}{2}&=\frac{1+\cos\alpha}{2}
=\frac{2(ad+bc)+a^2+d^2-b^2-c^2}{4(ad+bc)}\\[2pt]&=\frac{(a+d)^2-(b-c)^2}{4(ad+bc)}=\frac{(a+d-b+c)(a+d+b-c)}{4(ad+bc)}\\[2pt]&=\frac{\bigl((a+b+c+d)-2b\bigr)\bigl((a+b+c+d)-2c\bigr)}{4(ad+bc)}\\[2pt]&=\frac{(s-b)(s-c)}{ad+bc}.\end{aligned}\]

For \(\sin^2(\alpha/2)\):
\[\begin{aligned}\sin^2\frac{\alpha}{2}&=\frac{1-\cos\alpha}{2}
=\frac{2(ad+bc)-(a^2+d^2-b^2-c^2)}{4(ad+bc)}\\[2pt]&=\frac{(b+c)^2-(a-d)^2}{4(ad+bc)}=\frac{(b+c-a+d)(b+c+a-d)}{4(ad+bc)}\\[2pt]&=\frac{\bigl((a+b+c+d)-2a\bigr)\bigl((a+b+c+d)-2d\bigr)}{4(ad+bc)}\\[2pt]&=\frac{(s-a)(s-d)}{ad+bc}.\end{aligned}\]
This proves the two identities for \(\alpha\). \(\square\)

Theorem (Generalized law of cotangents for cyclic quadrilaterals)
As a consequence of the half–angle formulas,
\[\boxed{\;\frac{\cot(\alpha/2)}{(s-b)(s-c)}\;=\;\frac{\cot(\beta/2)}{(s-c)(s-d)}=\frac{\cot(\gamma/2)}{(s-d)(s-a)}=\frac{\cot(\varphi/2)}{(s-a)(s-b)}=\frac{1}{\Delta}\; }.\]

Proof. From the lemma for \(\alpha\),
\[\cot^2\frac{\alpha}{2}=\frac{\cos^2(\alpha/2)}{\sin^2(\alpha/2)}
=\frac{(s-b)(s-c)}{(s-a)(s-d)},\]
hence
\[\frac{\cot(\alpha/2)}{(s-b)(s-c)}=\frac{1}{\sqrt{(s-a)(s-b)(s-c)(s-d)}}.\]
By Brahmagupta’s formula, \(\displaystyle \Delta=\sqrt{(s-a)(s-b)(s-c)(s-d)}\), so \(\displaystyle \frac{\cot(\alpha/2)}{(s-b)(s-c)}=\frac{1}{\Delta}\). Cyclic relabeling of \(a,b,c,d\) and \(\alpha,\beta,\gamma,\varphi\) yields the other three equalities. \(\square\)
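As a sanity check, here is a minimal NumPy sketch (not part of the original argument) that verifies the half-angle lemma and the theorem numerically for one cyclic quadrilateral:

```python
import numpy as np

# Place A, B, C, D in order on the unit circle (any increasing angles give a cyclic quadrilateral).
th = np.array([0.3, 1.4, 2.9, 4.8])
A, B, C, D = np.stack([np.cos(th), np.sin(th)], axis=1)

def dist(U, V): return float(np.linalg.norm(U - V))
def angle(U, V, W):                        # interior angle at vertex V
    u, w = U - V, W - V
    return float(np.arccos(np.dot(u, w)/(np.linalg.norm(u)*np.linalg.norm(w))))

a, b, c, d = dist(A, B), dist(B, C), dist(C, D), dist(D, A)
s = (a + b + c + d)/2
alpha, beta = angle(D, A, B), angle(A, B, C)       # angles BAD and ABC
area = np.sqrt((s - a)*(s - b)*(s - c)*(s - d))    # Brahmagupta's formula

print(np.sin(alpha/2)**2, (s - a)*(s - d)/(a*d + b*c))   # lemma: the two values agree
print(1/np.tan(alpha/2)/((s - b)*(s - c)), 1/area)       # theorem at A: the two values agree
print(1/np.tan(beta/2)/((s - c)*(s - d)), 1/area)        # theorem at B: the two values agree
```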

Reduction to the classical law of cotangents (triangle case \(d=0\))
Let \(d=0\). Then \(D\) coalesces with \(A\) and \(ABCD\) degenerates to the triangle \(ABC\) with semiperimeter \(s=\tfrac{a+b+c}{2}\) and area \(\Delta\) (see Figure 2). Taking \(\alpha\) as the angle formed by \(\overline{AB}\) and the limiting tangent at \(A\), the tangent–chord theorem and the limiting relation \(\alpha+\gamma=\pi\) give \(\alpha=\pi-\gamma\), hence \(\cot(\alpha/2)=\tan(\gamma/2)\).

From the generalized theorem, with \(d=0\),
\[\frac{\cot(\beta/2)}{(s-c)s}=\frac{1}{\Delta},\qquad\frac{\cot(\gamma/2)}{(s-a)s}=\frac{1}{\Delta}.
\tag{1}\]
Figure 2. Since $d=0$, $D$ coalesces with $A$.

Let \(r\) be the inradius of \(\triangle ABC\). Since \(\Delta=rs\) and (by Heron) \(\Delta^2=s(s-a)(s-b)(s-c)\), we have
\[r^2s^2=s(s-a)(s-b)(s-c)\quad\Longrightarrow\quad
r^2=\frac{(s-a)(s-b)(s-c)}{s}. \tag{2}\]
Using \((1)\) and \(\Delta=rs\),
\[\frac{\cot(\beta/2)}{s-c}=\frac{1}{r},\qquad\frac{\cot(\gamma/2)}{s-a}=\frac{1}{r}. \tag{3}\]

To obtain the relation at \(A\), set \(\alpha'=\angle BAC\).
Since \(\alpha'=\pi-(\beta+\gamma)\),
\[\cot\frac{\alpha'}{2}=\tan\!\left(\frac{\beta+\gamma}{2}\right)=\frac{\tan(\beta/2)+\tan(\gamma/2)}{1-\tan(\beta/2)\tan(\gamma/2)}.\]
From \((3)\), \(\tan(\beta/2)=\dfrac{r}{s-c}\) and \(\tan(\gamma/2)=\dfrac{r}{s-a}\). Therefore,
\[\begin{aligned}\cot\frac{\alpha'}{2}&=\frac{r\!\left(\frac{1}{s-c}+\frac{1}{s-a}\right)}{1-\dfrac{r^2}{(s-a)(s-c)}}=\frac{r\,\dfrac{2s-(a+c)}{(s-a)(s-c)}}{1-\dfrac{r^2}{(s-a)(s-c)}}\\[4pt]&=\frac{r\,\dfrac{b}{(s-a)(s-c)}}{\dfrac{b}{s}}
\quad\text{(since \(2s=a+b+c\) and by \((2)\))}\\[4pt]
&=\frac{rs}{(s-a)(s-c)}=\frac{rs}{\dfrac{r^2s}{\,s-b\,}}
=\frac{s-b}{r}.\end{aligned}\]
Hence
\[\frac{\cot(\alpha'/2)}{s-b}=\frac{\cot(\beta/2)}{s-c}=\frac{\cot(\gamma/2)}{s-a}=\frac{1}{r}.\]
Relabeling \(a':=|BC|\), \(b':=|CA|\), \(c':=|AB|\) gives
\[\boxed{\;\frac{\cot(\alpha'/2)}{s-a'}=\frac{\cot(\beta/2)}{s-b'}=\frac{\cot(\gamma/2)}{s-c'}=\frac{1}{r}\;},\]
which is precisely the classical law of cotangents for \(\triangle ABC\).
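As a concrete numerical check (the classical 3-4-5 right triangle, added here for illustration): \(s = 6\), \(\Delta = 6\), so \(r = \Delta/s = 1\), and the half-angle tangents of the angles opposite the sides \(3, 4, 5\) are \(\tfrac13, \tfrac12, 1\); hence
\[\frac{\cot(\alpha'/2)}{s-a'}=\frac{3}{3}=1,\qquad\frac{\cot(\beta/2)}{s-b'}=\frac{2}{2}=1,\qquad\frac{\cot(\gamma/2)}{s-c'}=\frac{1}{1}=1=\frac{1}{r}.\]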


Monday, May 26, 2025

USM Transform #3 vs. Mathematica Integrate - Part 1

Warning: The 19-example table is an eye-catching demo, but it is not large or diverse enough to qualify as a “representative” benchmark in the usual software-testing sense. It convincingly illustrates Transform 3’s strengths on its target family—integrals built from $\tan\bigl(\tfrac12\operatorname{arcsec/arccsc}(\cdots)\bigr)$—yet it is too small for reliable statistics and too narrow to say much about performance outside that niche.

USM Transform 3 is described in the draft paper “A Unified Substitution Method for Integration”.

All raw timings and expression statistics were taken verbatim from the diagnostic printouts in USMvsMMAfinal.pdf and USMvsMMAFinal2.pdf . 

How to read the table

  • MMA time – wall-clock seconds for Integrate[…] on the original integrand (labeled Direct time in the notebook).

  • USM time – single figure obtained by adding:

    • parameter-extraction time +

    • conversion ( x → t ) time +

    • USM-upper-branch integration time +

    • USM-lower-branch integration time

  • LeafCnt / ByteCnt – Mathematica’s LeafCount and ByteCount of the returned antiderivative.

  • Any USM entry that beats Mathematica (smaller value) is red bold-faced.

$$\begin{array}{r|r|r|r|r|r|r}\# & \text{MMA time (s)} & \text{USM time (s)} & \text{MMA LeafCnt} & \text{USM LeafCnt (lo--up)} & \text{MMA ByteCnt} & \text{USM ByteCnt (lo--up)}\\\hline1 & 0.0031 & 0.0283 & 311 & \mathbf{\color{red}{92\text{--}96}} & 9920 & \mathbf{\color{red}{2840\text{--}2984}}\\2 & 1.1700 & \mathbf{\color{red}{0.0026}} & 295 & \mathbf{\color{red}{76\text{--}84}} & 9208 & \mathbf{\color{red}{2256\text{--}2544}}\\3 & 6.1800 & \mathbf{\color{red}{0.0025}} & 8962 & \mathbf{\color{red}{64\text{--}72}} & 283576 & \mathbf{\color{red}{1928\text{--}2216}}\\4 & 0.2580 & \mathbf{\color{red}{0.0018}} & 85 & \mathbf{\color{red}{66\text{--}74}} & 2768 & \mathbf{\color{red}{1960\text{--}2248}}\\5 & 0.0113 & \mathbf{\color{red}{0.0098}} & 23 & 32\text{--}36 & 680 & 984\text{--}1128\\6 & 0.0268 & \mathbf{\color{red}{0.0225}} & 40 & 62\text{--}66 & 1248 & 1888\text{--}2032\\7 & 0.2130 & \mathbf{\color{red}{0.0015}} & 57 & \mathbf{\color{red}{47\text{--}53}} & 1808 & \mathbf{\color{red}{1416\text{--}1632}}\\8 & 19.2000& \mathbf{\color{red}{8.1960}} & 325 & \mathbf{\color{red}{250\text{--}278}} & 9984 & \mathbf{\color{red}{7368\text{--}8376}}\\9 & 2.1300 & \mathbf{\color{red}{0.0262}} & 193 & \mathbf{\color{red}{63\text{--}71}} & 6216 & \mathbf{\color{red}{1960\text{--}2248}}\\10 & 4.5200 & \mathbf{\color{red}{0.0274}} & 151 & \mathbf{\color{red}{90\text{--}102}} & 4720 & \mathbf{\color{red}{2736\text{--}3168}}\\11 & 0.0120 & \mathbf{\color{red}{0.0015}} & 58 & \mathbf{\color{red}{42\text{--}46}} & 1776 & \mathbf{\color{red}{1288\text{--}1432}}\\12 & 0.0146 & \mathbf{\color{red}{0.0016}} & 158 & \mathbf{\color{red}{49\text{--}53}} & 4880 & \mathbf{\color{red}{1440\text{--}1584}}\\13 & 0.0122 & \mathbf{\color{red}{0.0112}} & 39 & 44\text{--}48 & 1240 & 1360\text{--}1504\\14 & 0.2250 & \mathbf{\color{red}{0.0023}} & 106 & \mathbf{\color{red}{62\text{--}68}} & 3312 & \mathbf{\color{red}{1888\text{--}2104}}\\15 & 0.0154 & \mathbf{\color{red}{0.0018}} & 47 & \mathbf{\color{red}{43\text{--}47}} & 1448 & \mathbf{\color{red}{1312\text{--}1456}}\\16 & 6.0500 & \mathbf{\color{red}{0.0357}} & 456 & \mathbf{\color{red}{332\text{--}360}} & 13976 & \mathbf{\color{red}{9960\text{--}10968}}\\17 & 0.0068 & 0.0220 & 308 & \mathbf{\color{red}{85\text{--}93}} & 9720 & \mathbf{\color{red}{2608\text{--}2896}}\\18 & 1.0500 & \mathbf{\color{red}{0.0041}} & 524 & \mathbf{\color{red}{109\text{--}117}} & 16272 & \mathbf{\color{red}{3232\text{--}3520}}\\19 & 9.3900 & \mathbf{\color{red}{0.0933}} & 19148 & \mathbf{\color{red}{129\text{--}139}} & 600840 & \mathbf{\color{red}{3880\text{--}4240}}\\\end{array}$$

Interpreting the numbers

  • LeafCount (Lower--Upper) counts the total nodes in Mathematica’s expression tree; ByteCount (Lower--Upper) measures the memory footprint in bytes. Both are crude but serviceable proxies for “formula complexity”. Lower values signal more compact, and usually more readable, antiderivatives.

  • USM total time is purposely holistic: the tiny setup overhead (parameter extraction + conversion) is included together with the actual integration of both branches, so a reader sees the real end-to-end cost of calling the transform.

  • Why two branches? Transform 3 generates an “upper” and a “lower” substitution dictated by the radical’s sign conventions; the faster of the two could in principle be chosen automatically, but here both were timed and summed for fairness.

Average speed-up

$$\begin{array}{r|r|r|r}\# & \text{MMA time (s)} & \text{USM time (s)} & \text{Speed-up} \\\hline1 & 0.0031 & 0.0283 & 0.11 \\2 & 1.1700 & \mathbf{\color{red}{0.0026}} & 450 \\3 & 6.1800 & \mathbf{\color{red}{0.0025}} & 2472 \\4 & 0.2580 & \mathbf{\color{red}{0.0018}} & 143 \\5 & 0.0113 & \mathbf{\color{red}{0.0098}} & 1.15 \\6 & 0.0268 & \mathbf{\color{red}{0.0225}} & 1.19 \\7 & 0.2130 & \mathbf{\color{red}{0.0015}} & 142 \\8 & 19.2000& \mathbf{\color{red}{8.1960}} & 2.34 \\9 & 2.1300 & \mathbf{\color{red}{0.0262}} & 81.3 \\10 & 4.5200 & \mathbf{\color{red}{0.0274}} & 165 \\11 & 0.0120 & \mathbf{\color{red}{0.0015}} & 8.0 \\12 & 0.0146 & \mathbf{\color{red}{0.0016}} & 9.13 \\13 & 0.0122 & \mathbf{\color{red}{0.0112}} & 1.09 \\14 & 0.2250 & \mathbf{\color{red}{0.0023}} & 97.8 \\15 & 0.0154 & \mathbf{\color{red}{0.0018}} & 8.56 \\16 & 6.0500 & \mathbf{\color{red}{0.0357}} & 170 \\17 & 0.0068 & 0.0220 & 0.31 \\18 & 1.0500 & \mathbf{\color{red}{0.0041}} & 256 \\19 & 9.3900 & \mathbf{\color{red}{0.0933}} & 101 \\\end{array}$$

Using the arithmetic mean of the 19 speed-up values


$$\text{speed-up}_i \;=\;\frac{\text{MMA time}_i}{\text{USM time}_i},$$

the average speed-up is


$$\boxed{\displaystyle \overline{\text{speed-up}}\;=\;216.3\ (\text{approximately})}.$$
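For reproducibility, the mean can be recomputed directly from the timing columns above (a small Python sketch that simply re-enters the table values):

```python
# MMA and USM times (seconds) as listed in the table above, examples 1-19.
mma = [0.0031, 1.17, 6.18, 0.258, 0.0113, 0.0268, 0.213, 19.2, 2.13, 4.52,
       0.012, 0.0146, 0.0122, 0.225, 0.0154, 6.05, 0.0068, 1.05, 9.39]
usm = [0.0283, 0.0026, 0.0025, 0.0018, 0.0098, 0.0225, 0.0015, 8.196, 0.0262, 0.0274,
       0.0015, 0.0016, 0.0112, 0.0023, 0.0018, 0.0357, 0.022, 0.0041, 0.0933]
speedups = [m/u for m, u in zip(mma, usm)]
print(round(sum(speedups)/len(speedups), 1))   # ~216.3
```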

Take-aways

  • Speed advantage – USM outruns built-in Integrate on 17 / 19 examples, sometimes spectacularly (Ex 3 is > 2000× faster). Only Ex 1 and Ex 17 favour MMA on raw time. However, for these two examples USM gave much simpler antiderivatives.

  • Compact answers – In 16 / 19 cases the USM result is much smaller (LeafCount & ByteCount). The gains are eye-catching for the hardest problems: Ex 3 shrinks from 8,962 leaves to 64--72, and Ex 19 from 19,148 to 129--139.

  • Overhead is negligible – Parameter extraction and $x \to t$ conversion stay in the microsecond realm; almost all of the USM total is the symbolic integration itself, yet it seldom exceeds a few hundredths of a second.

Bottom line: On this mixed radical-inverse-trig mini-benchmark, the USM (Transform 3) consistently yields faster and simpler antiderivatives than Mathematica’s built-in integrator.