<div dir="auto">To paraphrase Dickens ... they were the good old days ... and the bad old days!<div dir="auto"><br></div><div dir="auto">Daniel's idea reminds me of Robert B-J's comment about Heaviside and Dirac functions. Dirac unit impulse functions are approximated by Gaussians with infinitesimal variance in the Theory of Distributions. And every Probability distribution is a convolution of itself with a Dirac delta ... which is useful because Laplace transforms turn convolution products into algebraic products.</div><div dir="auto"><br></div><div dir="auto">Electrical engineers are used to approximating all kinds of input signals with sums of standard functions ... impulse, step, ramp, sinusoids, Gaussians, that have well known Laplace and Fourier Transforms.</div><div dir="auto"><br></div><div dir="auto">How useful it is to be able to go back and forth between the time and frequency domains! Even in quantum mechanics ... the more compact the support of a wave function, the more spread out its Fourier transform, and vice-versa. That's the wave mechanical basis of the uncertainty principle.</div><div dir="auto"><br></div><div dir="auto">A convenient way to get the mean, variance, and higher moments of a probability distribution (think voter distribution) is by finding the Taylor/MacLauren coefficients of the Laplace or Fourier transforms of the distribution functions. </div><div dir="auto"><br></div><div dir="auto">[Fourier and Laplace transforms differ by a 90 degree rotation in the complex frequency domain.So what I say for one also goes for the other without needing to mention it every time.]</div><div dir="auto"><br></div><div dir="auto">My wife needs the phone ... more later..</div><div dir="auto"><br></div><div dir="auto">W </div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">El mar., 25 de ene. de 2022 9:40 a. m., Kristofer Munsterhjelm <<a href="mailto:km_elmet@t-online.de">km_elmet@t-online.de</a>> escribió:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">On 25.01.2022 06:06, Forest Simmons wrote:<br>
On Tue, Jan 25, 2022, 9:40 AM, Kristofer Munsterhjelm <km_elmet@t-online.de> wrote:

> On 25.01.2022 06:06, Forest Simmons wrote:
>> Thanks Ted and Daniel. Very interesting!
>>
>> In the early 70's we did our Minuteman missile simulations on a
>> room-sized mainframe IBM 360/65 computer with FORTRAN code, double-precision
>> arithmetic ... punched-card interface ... and all. We got one
>> turnaround per night ... night because the Top Secret runs had to be
>> totally isolated from the daytime use of the computer. In 1974 our group
>> got hold of a couple of the minicomputers that were just coming out
>> ... TTY "ticker tape" interface at first, then (unreliable, but more
>> convenient) floppy disks. Very slow, single precision, but interactive
>> BASIC for the spine of the simulation. We employed pseudo-double
>> precision for the numerical integration, and modified BASIC so we could
>> call assembled bottleneck subroutines, etc.
>
> It's things like these that make me think that current computers are
> capable of vastly more than they're currently being used for. Computers
> with less than 1 MB of RAM could be used to calculate missile
> trajectories, run industrial process control, etc. We now have 16 GB or more.
>
> Of course, I know that part of the reason is that programmer time is now
> the scarcest resource. The programs that are developed now (mostly
> user-facing stuff) are much slower than they need to be, in part because
> it would take too much time and effort to optimize down to the bare metal.
>
> -km