Following a good example, I've decided to write some diary pages with thoughts and events.
Oh, in case anybody fails to understand, I'd like to remind them that these pages are copyrighted, and that everything found here may not be redistributed in any other way than over this direct link without my prior consent. That includes family, christianity, and other cheats. The simple reason is that it may well be that some people have been ill informed because others have spread illegal 'copies' of my materials, even with modifications. Apart from my moral judgement, that is illegal, and will be treated as such by me. Make as many references to these pages as you like, make hardcopies, but only of the whole page, including the HTML references, and without changing an iota or tittle...
And if not? I won't hesitate to use legal means to correct wrongs that may be done otherwise. And I am serious. I usually am. I'm not sure I could get 'attempted grave emotional assault' out of it, but infringement of copyright rules is serious enough. And Jesus called upon us to respect the authorities of state, so christians would of course never do such a thing. Lying, imagine that.
Today, Saturday, I've been at work some time, which was fun enough. I tried out the amplifier chip I described on my pages, which I still had from my last place, minus the speaker, which someone must have thought better off demolished, so instead I used a little Sony speaker the size of about six milk packs. Well, maybe twelve. It does its best, and with 100W or so (short-time continuous sine) the amp could blow it to pieces easily, so I must be a bit careful driving it, especially with the software synths I use. But it makes a nice enough noise, and as expected it starts to distort at a volume well over average loud room volume, though way under the floor-moving power the 12-incher I used for a long time was capable of. That one got the 150W or so of music power enough to make the transformers quite hot, survived about full-power analog and digital synth sounds, and made them and my walkman sound stronger than most things I've listened to, probably because of the straightness and capability of the whole chain. The tweeter of the little Sony, a paper-cone one, started distorting most clearly with, I think, a synth organ sound; probably it simply pushed its coil against some solid boundary of its freedom of movement, because it sounded like clipping. Careful, that meant. Of course, the little Sony is far better than any multimedia set, so I'd rather not test it as a fuse.
Suppose some intelligent martian lands on this planet and finds the mess on this wonderful blue planet: what would happen? I had great fun as a teenager doing another version of the radio play I referred to above; especially I remember the sound-effect part, where I found a way to mimic the sound of the original 'War of the Worlds' unscrewing of the martians' spaceship capsule lid. That spooky, hard-to-find sort of sound I made by rubbing my fingers along a thin natural rope from which I suspended my Philips microphone, the best one I had at the time, which I normally used for all recordings of voices or effects. With a bit, or maybe quite a bit, of spring-generated reverb added, that was seriously spooky, like the original.
Suppose one of the little non-ET's would walk into a church and try to communicate. 'Need input.' Space age? Lealy lealy, space age? Yes yes, we've built rockets, and electronics.. Ah, yes yes, electlonics! (this alien has a chinese accent for unknown reasons) The translation computer messed up only a little, so talking should go well. Well well, electlonics, intelesting. How make, electlonics? Oh, chips! Big ones, very powerful, 2 Gigahertz. A little suppressed laughter from the martian side. Yes yes, 2 gigahelz. Good good, intelesting. 9 zeros, light? Even in church such a remark would not be met without competitiveness, so voices dimmed a bit, and the martians would of course have to explain their ideas about life and God and whether they comply with human social rules enough to be permitted near; no martian heathen in our church, of course. Well well, I say, eh, martian man (this particular church person had a dutch accent), what are your views on life and, eh, people? He didn't dare to be straight about the eternal life and God question; let's say it wasn't the actual best christian mankind could come up with, and one has one's doubts, isn't it. The martian tried to be amiable: ah, God! Yes yes, God. Important, peace, love, evelything peace love, God, yes yes.
A bit taken aback by the answer, they might take him seriously as a real christian, our church person crawled back a bit: yes, indeed, important, but what about the law of God, any laws on mars? Laws? Yes yes, laws, of course: no dirty, no nuclear waste (their translation machine wasn't a pitiful one, it had learned quickly, so it all sounded very CNN-like), no colluption, yes yes, many laws. Easy, laws, all know. But why laws, must have happy life! Boling laws, must live nice, yes?
Oh boy, that wasn't the idea here. The martians were allowed to mingle, the government had decided they probably weren't dangerous or a threat to mankind, but that would have to mean they wouldn't just walk over all the earthly human mess and secret rules of conduct, hidden agendas and society's horrible backbones. Then they would have to be punished as dangerous nutcases, sent back, killed, or whatever, but no glasnost, not too much honesty, and please, before you know it they PR the whole of mars against human relational rules and make us all look like morons; couldn't allow that, of course. But they were there, so somehow we'd have to deal with them, and they were too intelligent to brush away, and some had started to understand that some things about them were too important to write them off as homesick ET's; these creatures were real enough.
It's starting to look familiar! I've heard such stories before..
Then the earthling would have to explain the essence of religion, the belief in a higher being who would actually exist and, as most churches officially hold for true, who has sent His Son to save the world, and then how mankind deals with that. What is the motivation for a person to become an official believer, why are people sinners by nature, and what does that mean? How can (western) society claim to be logical, what is its official foundation, and how in practice does power legitimate itself? Why is so much so bad with so much technology and riches around, and so many leaders seemingly meaning so well and with so much to back them up; why are so many people even dying of famine with all we have; who can explain that logically? Mankind must admit its major messup to the martians, and they didn't like that, so of course they sent the poor sods to church. 'There they know why.' Oops, also a familiar picture...
Martians, naturally intelligent (and ugly, of course; God, they were ugly, but that takes just a little getting used to, and the translation machine is really quite nice) as they are, knew what they were getting themselves into, and made pretty clear their replacement for warp drive was quite worth it, so more than a few of the rich and influential wouldn't just dump them in church circuits and have them out of the way.
It took them a few services to get the hang of it all, and they naturally didn't like it much, but it seemed that Jesus figure sort of got their attention, so sweating reverends would have to focus on that highest judge of their own system, and explain why there was such suffering, why God had doomed mankind so much that even that one had to die to show what the world had come to, even though he supposedly was only good, and just, and even peaceable in general. After a while the martians developed enough diplomatic skill to simply avoid asking direct questions about this shameful sin problem, and let them have the idea their system was really something, while in fact of course it sucked at least, as always.
Factorials grow quickly; they are just about the fastest-rising functions, even faster than the exponential. When we take 10 numbers, we have over 3.5 million possible orderings already (10! = 3,628,800), and when we take 69 numbers we approach a figure with about a hundred zeros' worth of possibilities (69! is a 99-digit number).
Why would we care about permutations? Basically because almost everything can be put in some order, and when we do, the number of possibilities is big, and may give us an idea of the probability of having found one certain possible ordering.
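To get a feel for how fast factorials run away, here is a little Python sketch (Python rather than tcl, just for brevity), using only the numbers mentioned above:

```python
import math

# 10 items can be ordered in 10! ways: "over 3.5 million possibilities"
print(math.factorial(10))  # 3628800

# 69! is a 99-digit number, in the neighbourhood of a figure
# with a hundred zeros
print(len(str(math.factorial(69))))  # 99

# Compare with plain exponential growth: 2**69 has only 21 digits
print(len(str(2 ** 69)))  # 21
```

So already at 69 items, the factorial dwarfs an ordinary exponential by nearly eighty orders of magnitude.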
In physics, this sort of thinking, with among other mathematics the statistical kind, caught on about a century ago when describing systems made of particles at the atomic level. Suppose we have an atom with electrons circling around the nucleus, as most of us can envision it. It is almost impossible to distinguish between one electron and another, because even under a complicated STM or electron microscope we can only distinguish where electrons are most of the time, because then there is a sort of lobe of mass we can observe; they move so fast that we can hardly pinpoint electrons at all, let alone distinguish which one is which.
The interesting thing about this physics, which is probably as world-famous now as it was then, is that by reasoning on observation probabilities and taking basic harmonic oscillations as the starting point (like swings), just about every physical phenomenon can be described mathematically using just straight but complicated reasoning, and in great detail, with more than a bit of accuracy.
The same quantum physics theory was used in the second world war, by Einstein among others, to try to predict what would happen with a certain number of kilos of nuclear material brought together in a bomb: whether or not the whole globe would burn, whether the effects wouldn't be too bad for the surroundings of the target cities, or too little to have the intended impact. Little Boy and Fat Man worked. Maybe unfortunately, but certainly as a result the evil japanese system was defeated into unconditional surrender, and quantum physics and the mastering of it were needed to make the war material to make that happen, luckily on the right side of the nazi line. A V2 with such a thing aimed at a major allied country's city would have been a lot worse, so luckily the right physicists and technicians either ran or kept their scientific peace long enough. Einstein and rocketeer von Braun, as we know, left the evil nazi system before the end. I'm not sure Oppenheimer had a good night's rest, but at least two bombs, as I seem to know with prior warning, were effective for their purpose against a very great and world-threatening danger and evil, and luckily none were used thereafter, even though it is not hard to imagine what 50 years of progress in this area must be capable of. Blowing the face of the earth away, for certain, I don't doubt. For real. Scary. Luckily power distribution as it is has given us famine and bin Laden or whoever, but no pakistani thinking themselves above the nuke. Luckily. I remember little Chernobyl and the interesting effect on my greengrocer's holiday job about the whole width of europe away from that not-worst-possible disaster of its kind. I didn't laugh, I'm not stupid.
Suppose we have an atom with a number of orbits and a number of electrons which can occupy a number of places in each orbit. Then the probability of finding a certain electron in one particular orbit, assuming the distribution is even, is one out of all possible orbits. If we then take a certain number of electrons and we want to know the probability of a certain distribution, we at least know all possibilities when we know the total number of ways of putting them in some order in the available places, which we then take as the starting point of the actual statistical distribution, so that we know all probabilities of possible electron distributions add up to 1, or 100%.
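As a toy sketch of that counting argument (the 8 places and 3 electrons here are made-up numbers for illustration, not a real atom), in Python:

```python
from itertools import combinations
from fractions import Fraction
from math import comb

n_places, n_electrons = 8, 3   # toy numbers, not a real atom

# All ways to choose which 3 of the 8 places hold an electron
arrangements = list(combinations(range(n_places), n_electrons))
print(len(arrangements))  # 56, the same as comb(8, 3)

# Uniform assumption: every arrangement is equally likely, so each
# gets probability 1/56, and together they must add up to exactly 1
p_each = Fraction(1, len(arrangements))
total = sum(p_each for _ in arrangements)
print(total)  # 1
```

Knowing the total count of arrangements is exactly what normalizes the statistical distribution: the probabilities have nowhere else to go than to sum to 100%.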
In real physics the distribution probability is not even: for instance, for some time we may have electrons being 'excited', meaning they are in an orbit with a higher number and energy, until they fall back to the ground state and emit light or energy when that happens, and such an excited state is less likely than the normal ground state.
The way the quantum model goes is strange. In fact, the idea of dealing with waves instead of particles in the normal 'hard' sense of the word means that our deterministic world view is challenged at its root: mass isn't just 'there' and tangible, but a statistical process which at larger-than-atomic scale happens to average out well enough not to let tables move about and look like they're in the Star Trek transporter, as atoms and electrons would in a very high-speed recording. They're sort of uncertain in the sense of where exactly they are at any point in time; on average they are at a certain place, but electrons and the parts of the nucleus are hard to pinpoint at an exact place: they are like waves, spread over a certain range.
There is a famous and important fundamental physical law about how we can observe anything at all physically in the particle sense, called Heisenberg's uncertainty principle, which teaches in normal language that we fundamentally cannot know both the position (place) and the speed of a particle at the same time: either we know how fast it moves, but then we cannot measure where it is, or we pinpoint where it is at a certain moment, but then we have no idea how to measure its exact speed at that moment. It is a quite 'hard' observational law, which really has proven to be true in all experiments in the area for a century.
It is stated in a statistical sense, so in hopefully lay enough words: about some particle we measure, we may say there is a probability of 90% that it is between place x and place x+e, with e a small number, while we can determine that the speed, with a probability of for instance 95 percent, is between v and v+d, where d is a small number.
That is reasonable, but it is not like normal physics, where we'd say that we can make our measurement devices for position and speed more exact, and then say: well, we now have a measurement where e and d are both a lot smaller, and when we spend some more money on measurement equipment, we may be 99.9999 percent sure that x and v are accurate within for instance 3 digits, which for instance in electrical engineering are fine enough figures.
Heisenberg tells us that for atomic particles, e and d are fundamentally limited in their smallness: they cannot both be made arbitrarily small at the same time. And if e were zero, d would approach infinity, and vice versa, meaning that if we knew exactly where an electron is, we wouldn't have a clue as to its speed, and fundamentally so; that is, we can theoretically not make a measurement method or machine or a computational theory to find the speed, at all.
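To put a number on that: the principle says dx * dp >= hbar/2, and with momentum p = m*v that bounds the speed uncertainty once a position uncertainty is chosen. A minimal sketch for an electron, where the 0.1 nm localization is just an assumed example (about one atom diameter); the constants are the standard physical values:

```python
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e  = 9.1093837015e-31  # electron mass, kg

dx = 1e-10  # assumed: electron pinned down to ~0.1 nm, an atom's size

# Heisenberg: dx * dp >= hbar/2, and p = m*v, so the speed
# uncertainty is at least hbar / (2 * m * dx)
dv = hbar / (2 * m_e * dx)
print(f"minimum speed uncertainty: {dv:.3g} m/s")  # about 5.8e5 m/s
```

Over half a million meters per second of unavoidable speed uncertainty, just from localizing the electron to atomic size: that is why the table-scale intuition fails so completely down there.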
That means we're left with statistical computations and models instead of laws like Newton's motion laws, which just require us to crack or approximate differential equations. That is hard enough, but when mathematics gives us an answer, that answer is exact, even of infinite accuracy, given that the data are correct.
Not that we're free from differential equations in quantum physics; in fact the basic equation, called Schroedinger's equation, is completely a differential equation, and on top of that, we need to do our job of solving it in a statistical manner.
That, too, is in itself fine enough. I well remember my first physics practicum at university, where I would measure for instance the stretch in a bending experiment, meaning how much the surface of a metal rod is stretched when it is bent by a known force. When one knows the measurement outcomes, and adds up the uncertainties at all stages of the experiment and the computations, one in the end can state with great certainty that, given the experimental setup, the stretch for a certain force applied to the metal rod is so and so much, within an inaccuracy range of for instance -3 to +6 percent, with great probability. And when such an experiment is carried out faithfully, and the measurement equipment is calibrated well, that end result is quite reliable. It means that one takes a measuring tape, measures an object to be 49.5 centimeters, and states that with great probability it is indeed, according to correct meter standards, between 49.4 and 49.6 centimeters. Given a good measurement device, that is reasonably possible, though when we just put a cheap tape in place and use a quick glance, most likely we can say it is between 49 and 50 centimeters, but not much more.
In a physics experiment of some standard, we might say that the uncertainty interval and accuracy are more like 'this front plate is 494.56 +/- 0.03 millimeters wide', which is still nothing much compared with the accuracy obtained in many nuclear physics experiments based on quantum theory. Which should be well noted: these theories and their practical outworking are serious stuff. Many scientists may be quite dusty, but these are the hard sciences, and when for instance an electrical engineer wants to measure a voltage to 6 figures, no one laughs; that is possible, it just takes some effort, or money to buy expensive, accurate measurement equipment. We're talking accuracies of up to cents on a million dollars, even for quite complicated experiments, for instance on the scale of electrons, which is quite an achievement generally, and requires accurate and logical work, not philosophical ideas.
But with all that accuracy, quantum physics is still quirky in that it dictates fundamental uncertainties on the combined measurement of speed and position of a particle. And particles are described as probability densities, meaning they're represented by graphs which say that finding the particle at a certain position has a certain likeliness, usually with smoothly varying graph lines.
Which brings us to the necessity of combined probabilities. What's the probability of throwing a six with a fair die? 1 out of 6, of course. What's the probability of throwing a 6 the next time? 1 out of 6 again, of course! But what is the combined probability of throwing a six first, and then again a six? 1 out of 36, as we know: the product of the two separate, independent probabilities. How is that?
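A quick simulation makes the product rule credible (the number of throws is arbitrary, just big enough for the averages to settle):

```python
import random

random.seed(1)
N = 200_000
six = 0
double_six = 0
for _ in range(N):
    a = random.randint(1, 6)  # first throw
    b = random.randint(1, 6)  # second throw
    if a == 6:
        six += 1
    if a == 6 and b == 6:
        double_six += 1

print(six / N)         # close to 1/6  ~ 0.167
print(double_six / N)  # close to 1/36 ~ 0.028
```

The second throw doesn't know about the first, and that independence is exactly what lets the two probabilities simply multiply.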
Now suppose we want to make a graph of the probability of the next key being pushed on a computer keyboard a certain time after the previous one, statistically speaking. We'd say that chance is pretty much zero for an interval of 1000 years, and mostly small for an hour too; the intermediate time between two key presses is maybe a second, maybe a third of one, and with beginners maybe a few seconds, and then smaller, since most keys are part of a word or sentence or maybe a url, so they are mostly pressed in a row. So we'd have a graph like this:
[Graph: probability on the vertical axis against time between keypresses on the horizontal axis]
Rapidly rising probability for small time values, more or less even probability for, let's say, half a second to a few seconds, and then lower average probability, approaching zero. Probably most computers are never left with an untouched keyboard for more than a month, so that is where the graph would be zero.
The graph was quickly made with bwise, which allows wonderful graphs, but I just wanted to make the idea clear; the lines are the graph, of course. The way to deal with such probability graphs is to say that, for instance between t=0.3 and t=0.6 seconds, the total probability is the integral of the graph between those time values, which is the area enclosed by those boundaries, the horizontal t axis, and the graph itself. Assuming the graph is correctly normalized, meaning the total probability is exactly 1 or 100%, the integral immediately yields the probability for the integration interval, meaning in this case the likeliness of a keypress between 0.3 and 0.6 seconds after the previous one, which could for instance be 0.25 (1 out of 4, or 25%).
To have a lookup graph without having to integrate each time, we could compute a graph of the integral of the above one, directly representing the probability of a keypress between 0 and t seconds, which would be a rising graph of probability. Then the example P(0.3, 0.6) above could be computed as Integral(P)(0.6) - Integral(P)(0.3), because integrals are additive over adjacent intervals, meaning the probability over an interval is the difference of the cumulative integral taken at its two limits.
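A numeric sketch of that subtraction trick, using a made-up exponential density with a mean of half a second as a stand-in for the real keypress distribution (the shape is an assumption, chosen only because it is easy to check analytically):

```python
import math

# Assumed, normalized probability density for the time between
# keypresses: exponential with mean 0.5 s (a stand-in, not real data)
lam = 2.0
def p(t):
    return lam * math.exp(-lam * t)

def F(t, steps=10_000):
    # Numeric integral of p from 0 to t: the cumulative lookup graph
    h = t / steps
    return sum(p(i * h) * h for i in range(steps))

# P(0.3 < t < 0.6) = F(0.6) - F(0.3)
prob = F(0.6) - F(0.3)
print(round(prob, 3))  # analytically exp(-0.6) - exp(-1.2) ~ 0.248
```

Which indeed comes out near the 0.25 (1 out of 4) used as the example above.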
The idea of having a function where we vary the bound of an integral and get all kinds of outcome possibilities is that of a functional integral; for probabilities, the result is known as the cumulative distribution.
I've done my best a bit better with the next graph, which shows what a gaussian probability distribution looks like:
For those with an interest, these are the lines of tcl/tk that draw the graph in the bwise application, or on any canvas whose path name is in the variable mc:
$mc create text 188 400 -text "x" -font "Helvetica 12" -tag gr1;
$mc create text 2 168 -text "y" -font "Helvetica 12" -tag gr1;
$mc create line 2 402 182 402 -width 2 -tag gr1 -arrow both;
$mc create line 2 402 2 182 -width 2 -tag gr1 -arrow last;
for {set t -4} {$t < 4} {set t [expr $t + 0.05]} {
    set x [expr 40*$t] ;
    set y [expr 40 * exp(-$t*$t)];
    set y [expr 400-3*$y];
    $mc create oval $x $y [expr $x+5] [expr $y+5] -fill purple -tag gr1
}
Its width is characterised by what is called the variance, measured between the inflection ('bowing') points of the graph, which can be changed by changing the constant that multiplies x squared in the exponent. Without wanting to get too mathematical (this text is intended for interested audiences of maybe average or so mathematical skill, whatever that is), it is important to be made aware of the value of these statistical buildings in the whole of heavy theoretical physics, which should not be underestimated (it is responsible for more than just the nuclear bomb), preferably with some actual mathematical knowledge, because the whole house of quantum physics and the essence of its success relies on it, and it is essential scientific knowledge, in my not so light opinion.
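For the mathematically curious, a small numeric check (the sigma = 1.5 here is an arbitrary example width) that the 'bowing' points of a gaussian indeed sit one standard deviation away from the center:

```python
import math

sigma = 1.5  # arbitrary example width (standard deviation)

def gauss(x):
    # unnormalized gaussian exp(-x^2 / (2*sigma^2))
    return math.exp(-x * x / (2 * sigma * sigma))

def second_deriv(x, h=1e-4):
    # second derivative by finite differences; it changes sign at the
    # inflection ('bowing') points, which sit at x = -sigma and +sigma
    return (gauss(x - h) - 2 * gauss(x) + gauss(x + h)) / (h * h)

# Curvature is negative just inside x = sigma, positive just outside
print(second_deriv(sigma - 0.01) < 0 < second_deriv(sigma + 0.01))  # True
```

Changing the constant in the exponent moves those inflection points, which is exactly how the width (variance) of the curve is set.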
There are dutch physicists (or was it one?) who won the Nobel Prize for making an effort to cover their and many other people's bases in physics, and chart it out.
Experimentally, the gaussian is known to be the curve one gets when, for instance, measuring some size with a centimeter rule and plotting the distribution of measurements in small ranges around the expectation value (the average): one would usually measure close to the actual size, quite some measurements would be a bit to the left or right of the average, and then a few misses further away, and not many errors greater than a certain distance away, though of course an occasional large mistake can account for some measurement values far from the average.
The statistical, in fact experimental, hard-to-formally-prove-beyond-making-credible law of large numbers makes such measurements, when graphed, converge to the curve shown, when the number of measurements per small range of x values is graphed as a y value; with for instance a few hundred, or even a few million measurements, the graph becomes very smooth and accurate. If you'd graph the amplitude distribution of the noise from, for instance, your FM receiver tuned between stations, you'd also find a gaussian. With a soundcard and a little program that isn't too hard; maybe I'll do the experiment when I have the chance.
One point is that the broader the curve, the more power is in the signal; the variance is a measure for the power when the noise indeed has a gaussian distribution, which should be fine enough in the case of receiver noise. The more probability there is of the signal being away from zero (with the whole graph neatly normalized to an integral of 1 for the total probability), the wider the graph; and the more variance, the more average amplitude, for instance of the speaker playing the noise signal, and thus the more power.
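A small simulated version of that soundcard experiment, with a made-up noise level standing in for the receiver (the sigma and sample count are assumptions):

```python
import random

random.seed(7)
sigma = 0.2    # assumed standard deviation of the noise signal
N = 100_000    # number of noise samples

samples = [random.gauss(0.0, sigma) for _ in range(N)]

# Average power of a zero-mean signal is the mean of the squared
# samples, which for gaussian noise comes out near the variance
power = sum(s * s for s in samples) / N
print(power)  # close to sigma**2 = 0.04
```

So the variance of the distribution and the measured power of the noise are the same number, which is why a wider gaussian means a louder hiss.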
How does the story continue? At the risk of taking on way too much in way too little time and space, I think it is a good idea to make clear that there is very good logic in this whole game, which remarkably enough is, let's say, incredibly powerful at modeling what so many scientists have been measuring for at least a century, and which hasn't fundamentally failed at the intended scale, normally speaking. Not that the model is complete or without limitations: of course relativity complicates the whole picture, and the 'hidden variables' idea isn't cracked anywhere near its mathematical foundation, in the sense that even though Heisenberg's and other laws limit the amount of knowledge we can gather from our experiments and models, it isn't guaranteed that the unmodeled information is nothing but pure noise; there might indeed be information passed between parts of our models which we don't currently know how to characterize or model, but which might be there, real enough.
I'll try to make the story short, just to have the satisfaction of at least building it up to some relevant picture, not knowing who will ever be able to, or want to, follow.
The idea of the gaussian around the expectation value of a measurement is general, and can be taken as the starting point for measuring data of elementary particles, such as their position. In that case we'd have three dimensions, and maybe all of x, y, z and maybe t would have gaussian distributions of the probability of being at the expected point in space, or there would be a spherical distribution where the gaussian is a function of the radius from the expected average.
Now assuming that we want to position particles and deal with the probabilities of where they really 'are' at some point by assuming they are gaussian-distributed around the expectation value, we could compute the probability of finding a particle in a certain region of space as P = Integral(gaussian(position - expectation value)) between the desired distances, which with the above reasoning means we have enough when we have the accumulated three-dimensional probability distribution function centered around the expected position. When we make the relevant simplification of assuming that the error distribution, in other words the measurement errors, or the quantum behaviour or uncertainty of a certain particle, is the same over all the space it can be in, meaning the previous function is translationally invariant, we have one function to give us an impression of the probability distribution of a particle, given where we think it is on average.
That idea gets more interesting when we add the idea of wanting to analyse particle behaviour, after good electrical or optical engineering practice, in the frequency domain. That means we use a fourier transform to characterize repeating or cycling behaviour by principal sine wave analysis, where we take all possible frequencies and record for each frequency how much of it is present in the particle's wave function description. Wave function, because that is the name when a particle is, according to its supposed and proven wave nature, seen as the solution to the wave equation or Schroedinger's equation, which states that everything is like waves traveling through space, normally at the speed of light. It is a fundamental differential equation, which can partly be written as an integral equation, which makes the mathematics surrounding its use possible in the way I'll try to outline.
Fourier analysis means taking a signal and recording which frequencies make up the signal, and for which portion. Like with organ notes. The official, general fourier integral checks and describes every possible frequency component, out of an infinite amount of them. It can be proven that every signal can be fourier transformed as long as it fulfills some reasonable prerequisites, like not being too jumpy; for a signal of limited length, not all frequencies need be present in a sensible and complete fourier analysis.
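A minimal sketch of that idea, using a plain discrete Fourier transform on a made-up two-component 'organ' signal (the signal, its frequencies and the record length are all assumptions for illustration):

```python
import cmath, math

N = 64
# Signal made of two "organ note" components: frequency 3 with
# amplitude 1, and frequency 7 with amplitude 0.5 (cycles per record)
signal = [math.sin(2 * math.pi * 3 * n / N)
          + 0.5 * math.sin(2 * math.pi * 7 * n / N) for n in range(N)]

def dft(x):
    # Plain discrete Fourier transform: for each frequency k, record
    # how much of that frequency is present in the signal
    L = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / L)
                for n in range(L)) for k in range(L)]

mags = [abs(c) / (N / 2) for c in dft(signal)]  # normalized amplitudes
print(round(mags[3], 2), round(mags[7], 2))  # ~1.0 and ~0.5
```

The analysis recovers exactly which 'notes' were put in, and at what strength, which is the whole point of characterizing a wave function by its frequency content.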
The frequency analysis is good if we assume a particle is in some orbit, where the wave wraps around at some point because of the specific length of the orbit; it will probably exhibit standing waves in such an orbit, which can be characterized well by a fourier transform with wavelengths of which the effective orbit length is a whole multiple.
Suppose we want to analyse a particle at a certain position by combining the position gaussian distribution function with a frequency analysis, which normally means a convolution integral with a sine-like function, so that we can measure the function's wiggly behaviour against the example of a certain position and energy and a certain frequency component. By multiplying with every possible or useful frequency component, and also making the position variable by varying the gaussian's expectation value, we get a general set of functions which can describe a particle by its repetitive behaviour and its position in a statistical sense.
This brings us to the idea of boson and fermion particles, the two most fundamental groups of particles in this sense, which arise from the idea of using the orthogonalising representation possibilities of the fourier transform in terms of the phase of the sine or cosine function. Fourier tells us we can reconstruct any frequency component by knowing its phase, or the relative contribution of a sine and a 90-degree phase-shifted cosine function, which gives us two handy and logical faces of the wave function with both the gaussian distribution and cyclic components: one which is antisymmetrical and one which is symmetrical. Those properties are important in subsequent computations, which lead to for instance the Bohr atom model by mere mathematical reasoning and the well-known general wave equation (Schroedinger).
The boson basic wavefunction, a 1-dimensional example (changed tcl code: set y [expr 40 * cos(1*3.1415*$t) * exp(-0.25*$t*$t)];), which is symmetrical around the gaussian expectation value, here at x=0.
The antisymmetrical, or fermion, basis function. Antisymmetrical functions may cancel out when they are combined; even by simple addition one can see that this function plus its reverse, which can be made by taking the other direction on the time scale, cancels out: it is its own opposite when flipped over the y axis.
Mathematically speaking, one can play a game with odd and even functions, that is, functions which stay the same under reversal of the domain, or become inverted. Bosons cannot cancel each other out that way: there is always that one positive peak around the expectation value, which doesn't get cancelled by its opposite.
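That even/odd game can be played on any function at all: every function splits into an even part, which survives flipping over the y axis, and an odd part, which is its own opposite under the flip. A short sketch, with an arbitrary example function:

```python
import math

def f(x):
    # an arbitrary example function, neither even nor odd by itself
    return math.exp(-x * x) * (1 + 0.5 * x)

def even_part(x):
    return (f(x) + f(-x)) / 2   # stays the same when x -> -x

def odd_part(x):
    return (f(x) - f(-x)) / 2   # flips sign when x -> -x

x = 0.8
print(even_part(x) == even_part(-x))   # True: symmetric, boson-like
print(odd_part(x) == -odd_part(-x))    # True: antisymmetric, fermion-like
print(abs(f(x) - (even_part(x) + odd_part(x))) < 1e-15)  # True: adds up
```

The odd part is exactly the piece that can cancel against its own mirror image, and the even part is the piece that cannot, which mirrors the fermion/boson distinction above.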
The interesting part, to begin with, is that these functions form a basis for all functions, which means any function can be 'decomposed', taken to pieces, or built up from combinations of these functions with the right frequencies and variances, with the right weighing factor for each of the probably infinite row of basis function components. Mathematically it can be proven that functions which adhere to reasonable conditions can be approximated to infinite accuracy this way.
The idea is that it is even possible to make an orthogonal (and, after normalisation, orthonormal) basis this way, which means these functions are made into a set with for instance integer frequency components, spaced a single basic wavelength apart, and that that set can approximate any function with a certain position and frequency accuracy, up to infinite accuracy, meaning exactly.
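A numeric check of that orthogonality, for plain sine functions with integer frequencies over one base period (the choice of frequencies 2, 3 and 5 is arbitrary):

```python
import math

def inner(f, g, steps=20_000):
    # Numeric inner product <f, g>: integral of f*g over one base
    # period [0, 2*pi], by a simple Riemann sum
    h = 2 * math.pi / steps
    return sum(f(i * h) * g(i * h) * h for i in range(steps))

def s(k):
    # basis function: sine with integer frequency k
    return lambda x: math.sin(k * x)

# Different integer frequencies: inner product ~ 0, so orthogonal
print(round(inner(s(2), s(5)), 6))  # ~ 0.0
# Same frequency: inner product ~ pi, the normalization factor
print(round(inner(s(3), s(3)), 6))  # ~ 3.141593
```

Dividing each basis function by the square root of that pi makes the set orthonormal, and the weighing factor of each component of a decomposed function is then just its inner product with that basis function.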
In signal processing, similar ideas exist, where they are useful too; sometimes they are called wavelets. In somewhat limited implementation they can usually be part of what are well known as orthogonal filters, and some of the idea, the cosine transform and the translational separation of blocks, is present in JPEG image coding libraries. Mathematically, and application-wise in physics, the idea is at least about a century old, but good application and understanding of how the idea can be applied and used is good and useful today, as with anything of more than little value.
Why bother about all this? Mainly because the equation which seems to so fundamentally govern nature's small building blocks, Schroedinger's equation, is a wave type of differential equation which can be solved through the use of these types of basis functions, and the fact that they can be made into integral functionals, and into sets of similar functions with straightforward and physically useful parameters, makes it possible to base further computations on them. This is no longer nice and easy math level; we're talking popularised top stuff here. Don't get me wrong thinking this is high school level: the actual computations are hard, tedious, possibly mind boggling, and not to be tried at home unless you have a serious engineering or university degree far away from humanities, languages or medical subjects, and are willing to use those skills seriously. And for some of the nicer stuff (gna gna) I found that having actually studied electrical engineering sort of comes in handy (for the signal stuff, and to have a solid enough background in functional analysis, and overview). Let's say this is the basis of page 1 of an advanced theoretical physics course, which never hurts of course. I'm trying to popularise the subject on purpose, but it shouldn't be taken out of context: this is not child's play, though some of the mathematics and thinking are quite worth being brought forward for their intrinsic logic, because often that doesn't happen, and it is remarkable how relatively fundamental mathematics, such as integrals of even and odd functions, can get so far in describing our physical world's makeup so accurately.
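To show the flavour of such a computation (a minimal sketch, assuming the simplest textbook case, a particle in a box with hbar = m = 1, nothing like a production method), the stationary Schroedinger equation can be discretised and solved as a matrix eigenvalue problem, and the numerical energy levels indeed come out as the well known n-squared series:

```python
import numpy as np

# Stationary Schroedinger equation for a particle in a box:
#   -(1/2) psi''(x) = E psi(x),  psi(0) = psi(L) = 0,  hbar = m = 1.
# Finite differences turn -(1/2) d^2/dx^2 into a tridiagonal matrix
# whose eigenvalues are the allowed energies.
N = 500                    # interior grid points
L = 1.0
dx = L / (N + 1)

main = np.full(N, -2.0)
off = np.ones(N - 1)
laplacian = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
H = -laplacian / (2.0 * dx ** 2)

E = np.linalg.eigvalsh(H)[:3]                       # three lowest energies
exact = np.array([(n * np.pi) ** 2 / 2.0 for n in (1, 2, 3)])
assert np.allclose(E, exact, rtol=1e-3)             # E_n = (n pi)^2 / 2
```

The quantisation falls straight out of the boundary conditions, the same way the integer frequencies fell out of the sine basis above.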
In fact it is even more striking with the wave equation, which actually describes a wave in the normal sense, moving through space with a certain speed and a certain frequency; the differential equation is not so far away from the wave equation I use to simulate guitar strings in my string simulator software, which is no coincidence. But then again, that fundamental particles are ruled so accurately, and that the whole of Bohr's shell model can arise out of such considerations, is very remarkable. Yet those bosons and fermions making up the traditional electrons, protons, photons, neutrons and even Higgs particles are described as waves in some sort of confinement, with a certain frequency and energy, quantised by their mathematical appearance and nature's constants, and the models made this way work so well they are still alive and working 100 years after they were invented.
Even nowadays, doing a full computer simulation of a piece of 'space' where all possible wave events can take place, forming the simulation of a little piece of matter of some kind, is hard. My string simulator does coupled harmonic oscillators, which is one way of looking at the problem, and it does so without many conditions or limitations; in other words I don't make assumptions beforehand about the simulation's outcome or about the classes of signals/waves the simulation can deal with. And I am not the worst programmer, yet I can compute sounds from a one dimensional array of maybe a few hundred points in real time with reasonable accuracy, while a piece of matter of a few hundred points in a cubic sense, meaning 10 to the power of 6 or 7 points, would probably be quite inaccurate and slow to simulate and get useful wave patterns from, even on fast computers. Not that that is definitely necessary: certain simplifications, specialisations and assumptions can speed things up, make certain outcomes more accurate and fit a certain problem well, but when particles interact in unknown ways and we want to know how that may happen, exhaustive computations may be in order, which are computer time consuming.
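As a sketch of the kind of brute-force wave simulation I mean (my own toy code here, not the actual string simulator), a 1-D chain of coupled points stepped with the standard leapfrog scheme for the wave equation, with no assumptions at all about what the waveform should look like:

```python
import numpy as np

# 1-D "string" as a chain of coupled points, discrete wave equation
#   y_tt = c^2 y_xx   stepped with the leapfrog scheme, fixed ends.
N = 200                          # points along the string
c, dx, dt = 1.0, 1.0, 0.5        # wave speed, grid step, time step
r2 = (c * dt / dx) ** 2          # must be <= 1 for stability (CFL)

# Initial pluck: a Gaussian bump in the middle, starting at rest.
y = np.exp(-0.5 * ((np.arange(N) - N / 2) / 5.0) ** 2)
y_prev = y.copy()

for step in range(1000):
    y_next = np.zeros(N)         # ends stay clamped at zero
    y_next[1:-1] = (2 * y[1:-1] - y_prev[1:-1]
                    + r2 * (y[2:] - 2 * y[1:-1] + y[:-2]))
    y_prev, y = y, y_next

# The pulse splits, travels and reflects, but stays bounded (stable)
# and doesn't die out (the scheme conserves energy well).
assert np.max(np.abs(y)) < 2.0
assert np.max(np.abs(y)) > 1e-3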
A century ago all this was done in the mind and with paper and pencil (I recently saw Leiden university put a picture in their internal news bulletin of the actual fountain pen Einstein used to write his famous equations with, well well), so it was very necessary to let math do the job and solve equations analytically, which in this area is major work we can benefit from in the computer age as well. One may well be interested only in the steady state of a particle, which is not the same as knowing its every oscillatory wiggle and exact positional and velocity (or momentum) whereabouts; then computations which just yield that outcome, without giving a femtosecond per femtosecond account of every possible oscillation or motion, are preferable. Various techniques exist, and form a significant part of the basis of physics also today, to compute for instance such a steady state, or energy distributions, or zero temperature structure, or statistical data about a certain substance or some chemical property.
Just to make clear I'm at least qualified enough to say I'm not bullshitting
people or going out of my authority league: I did participate in advanced
enough physics courses to have at least seriously covered these subjects
and made some exercises successfully, such as (interestingly enough) one
on Bose-Einstein condensates (as I remember, exercise 3.3 in the Negele-Orland
MIT textbook on Quantum Many-Particle Systems). Fun enough, though I'd
want more handiness in various senses to feel easily at home breaking
ground in the area, though I'm sure even my computer graphics and certainly
my computer simulation (parallel experience, too) and electrical engineering
related advanced theoretical knowledge in various areas can make for a
more than interesting enough starting point. At the time I was surprised
how much my, in my opinion, modest enough linear algebra knowledge was in
demand in the presentations I did.