Theo Verelst Diary Page

Latest: 24 January 2001

I've decided, following a good example, to write some diary pages with thoughts and events.

Oh, in case anybody fails to understand, I'd like to remind them that these pages are copyrighted, and that everything found here may not be redistributed in any other way than over this direct link without my prior consent. That includes family, christianity, and other cheats. The simple reason is that it may well be that some people have been ill informed because they've spread illegal 'copies' of my materials, even with modifications. Apart from my moral judgement, that is illegal, and will be treated as such by me. Make as many references to these pages as you like, make hardcopies, but only of the whole page, including the html references, and without changing an iota or tittle...

And if not? I won't hesitate to use legal means to correct wrong that may be done otherwise. And I am serious. I usually am. I'm not sure I could get 'attempt to grave emotional assault' out of it, but infringement on copyright rules is serious enough. And Jesus called upon us to respect the authorities of state, so christians would of course never do such a thing. Lying, imagine that.
 

Previous Diary Entries
 

January 24, 2001

What about the possibilities in the development of networks? My technical background gives me hard and reliable information and an angle on the physical networks on this earth, which are certainly not irrelevant. How about more theory in the area?

Networks

When I became a part of the goings-on at the section I eventually graduated in, it still had the name it had had for quite some years already: network theory. The main idea I had at the time, apart from the fact that I was invited at some point, was that in the electrical engineering yearbook they advertised themselves as being into mainly the whole path of designing and implementing big silicon chips. I just had, or was getting, a DX7 at the time, and had been building computer circuits for years already, so that was interesting. It looked kind of ambitious in between quite some gray fields in EE (in my perception at the time), and there was even work in the area of audio signal processing, so I thought at some point synthesizer stuff could even be in the research picture. International publications also sounded up to standard.

As is clear from my other pages, I was offered a graphics subject (among others), and in the end my content was, let's call it, stolen and presented by others, and the leading lines implemented wrongly, also by others, but anyhow, my interest in graphics and in still the same ideas were the reasons for continuing. I worked hard, and made my own graduation work, basically with hardly any cooperation, but it sufficed to (in retrospect) even force them to acknowledge my ambitious enough angles and the results enough to get my degree, and enough in the area I wanted.

The idea of network theory was in fact important. I, like anyone else from my and other years, followed the first and second year courses on network theory and associated theory, which is quite strong and, for me, then and now, appealing. It is possible to take an electronic (or computer, which is a special case) network made of idealized electronic components, of any size and composition, and compute with any required accuracy how such a network will respond. It may be worth separate pages or maybe some introduction; this hard-based stuff is pretty much the basis of a lot of success in the completely relevant area of electrical engineering, technically, economically, probably socially, and in many other ways.

If there is such a thing as a thinking style that goes with a certain scientific or trade area, which is probably true, then EEs are at least capable of being both theoretical and practical, have probably the strongest (with few others) mathematical background, and are generally not unaccustomed to multidisciplinarity.

Now what about networks? In that context, even networks can be completely defined, computed, theoretically covered, mathematically founded, and taken to many practical uses. How?

Let's look at an example. Suppose that we understand what a light bulb is, that a current through it is caused by a voltage over it, and that the current depends on the type (the resistance) of the light. And that we know what a battery is, and that we can make connections between such network elements, such as the lights in a christmas tree, or lights in a car, or in a stage light system. Then some will easily understand how to compute the current and voltages in a network of two lamps in series, for instance. In parallel, most will also roughly understand how the network will behave.

Now suppose we have two different lamps in series? Then some can compute the current and the voltages by thinking about ratios. Now imagine a network of a few dozen lights and batteries, of any not-short-circuited structure. Then network theory, consisting of Kirchhoff's laws and Ohm's law, linear algebra, and general mathematics together, can give you the exact solution consisting of all currents and voltages in the circuit. Mathematically exact, and for such networks the amount of work is such that even by hand these computations can be carried out (maybe in days).
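
For the series case, a tiny worked example (the values are made up purely for illustration): two lamps of 2 and 4 ohms in series across a 12 volt battery share one current, and the voltages follow from the ratio of the resistances. A minimal Python sketch:

  # Two different lamps in series across a battery: Ohm's law plus a ratio.
  # All values are invented for the example.
  V_battery = 12.0    # volts
  R1, R2 = 2.0, 4.0   # lamp resistances in ohms

  I = V_battery / (R1 + R2)   # the same current flows through both lamps: 2 A
  V1, V2 = I * R1, I * R2     # voltages over the lamps: 4 V and 8 V
  print(I, V1, V2)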

Additionally, the method to finish these computations can be formalized, that is, they can be written down as rules that can be carried out by a dummy, or the contemporary slave (which is fine with me), the computer.

This is already interesting enough, in my opinion, yet it is nothing complicated, in two meanings, compared to networks containing dynamical elements such as capacitors and coils (inductors). In that case, still, ANY network can formally be analyzed, for instance in the sense of knowing what input yields what output, formally and mathematically exactly. And that, too, interested me at the time.

I was into audio, for instance, and had known for a long time already how to make filters, like bass and treble controls, and found them interesting, though limited. A hot item at the time, in general and for me, was the familiar audio equalizer, the sliders in a row to adjust corresponding frequency ranges in the audio spectrum. I was well aware of the possibility to compensate for the behaviour of my audio chain, including speakers, with such a device. And how, and why, imperfect yet interesting that is.

I could of course get such an equalizer circuit from a book, and in fact I even made some, with interesting enough and 'officially' interesting, though practically quality-limited, parts (coils) in them. In books there were all-capacitor circuits of high quality, and I found it interesting how they work, and in fact I still do. The main problem here is that I always sort of visualized the behaviour of electronics, minus certain computational models, and that this method isn't up to circuits that cannot easily be broken down into components that can be computed with easy enough mathematics.

Let's say it is usually a circuit with a signal path in it, with feedbacks and cross-connections with dynamic elements that are not easily surveyable, and the main point is that they all influence each other, so to accurately compute the overall behaviour, most dynamic elements must be taken into account at the same time. Unless I'd just believe some theory book I didn't have at the time, or some manufacturer's data sheet with example diagrams, which is fine but not insightful.

In short, I didn't like the equalizer circuit diagrams I got my hands on, they were too part-complicated, and I looked for the right theory. Well, after about a year of heavy enough mathematics, coming from high school, and electrical engineering theory, and network theory, it all came together into a satisfactory theory and computation method to tackle the equalizer circuit, and thousands of others, in fact, as I said, practically any circuit, except that non-linear circuits cannot necessarily be computed in theory with mathematical accuracy, though they too can be approximated with any accuracy (all depending on whether they are networks which are 'possible', that is, with no short circuits or unknowns in them).

How does that work? Without now going over the theory, though that is interesting for a wide enough audience, and completely unreadable in most EE books, I'll sum it up. The first point is that a network has nodes and arcs; arcs come together in nodes, and at a node, Kirchhoff says that the amount of incoming current (electrons) must be equal to the amount of outgoing current at any time (unless there is a dynamical element, but then still such an observation can be made, only in two-dimensional, complex numbers). That gives us the equation: incoming currents + outgoing currents = 0, or the sum over all i of I_i = 0, which is nice for computers and mathematicians.

The second point is that we have such an equation for every node in the network, which we can number 1 through n, so we end up with a system of equations (as it is called, basically a set of equations dealing with the same problem), with n equations and n unknowns. That is interesting for mathematicians. Why? Well, assume that the parts of the equations, the variables, are the amounts of current coming to a node i from any other node, of which there are n-1, so we have almost enough to at least have the same number of unknowns as equations. Then we have the sources in the network, which generate a current (or voltage), and they give us the rest of the needed equations, and in the end we then even have more information than formally needed, so normally we can scratch one equation, for let's say the 'ground' node. Formally this sounds more elaborate, and becomes clearer when one dives into the theory with enough background (in linear algebra), but the idea is that a system of equations, if it has the right properties, can be solved by mathematical means.

And given certain conditions, it can always be formally solved, for any problem size, and by formal methods. The question is how we compute the currents between the nodes, which is in fact by observing Ohm's law, which relates resistance, current and voltage. Now the main idea is that by rewriting Kirchhoff's laws, and filling in all the component values at the right spots, the resulting system of equations is solvable for physical networks, and the mathematical methods to do so are well known enough, and accurate.

In short, it is possible to take any network, let a computer program make a set of equations from it, solve (invert) the system of equations, and thus find all currents and voltages in the network.
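
To make that concrete, here is a minimal sketch of the nodal approach for a purely resistive network, in Python with numpy (my choice for illustration; the node numbering and component values are invented, and voltage sources and dynamic elements are left out to keep it short):

  import numpy as np

  # Kirchhoff's current law at every node except ground, Ohm's law for the
  # branch currents, then solve the linear system G*v = i for the node voltages.
  # Invented example: 1 mA injected into node 1, R12 = 1k between nodes 1 and 2,
  # R1g = 2k from node 1 to ground, R2g = 3k from node 2 to ground.
  R12, R1g, R2g = 1e3, 2e3, 3e3
  I_src = 1e-3

  G = np.zeros((2, 2))
  G[0, 0] = 1/R12 + 1/R1g        # diagonal: sum of conductances at the node
  G[1, 1] = 1/R12 + 1/R2g
  G[0, 1] = G[1, 0] = -1/R12     # off-diagonal: minus the shared conductance

  i = np.array([I_src, 0.0])     # injected currents per node
  v = np.linalg.solve(G, i)      # node voltages relative to ground
  print(v)                       # [1.333..., 1.0] volts

Solving this two-by-two system by hand gives the same numbers; the point is only that the recipe is mechanical, so a program can do it for dozens of nodes just as well.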

In my opinion at the time maybe not astounding, but at least remarkable and noteworthy theory, short of beautiful, but appealing. And make no mistake, this also holds for networks, with quite some additional theoretics needed, which contain any number of dynamical elements, and therefore also for the equalizer circuit. Which is even more remarkable, and in the same area of computations that, let's say, turned on theoretical physicists and mathematicians over an extended period of time; in other words, these theoretics are not some remote area of mathematics.

The trick is to use complex numbers, in which the square root of -1 is written as 'j' or 'i', introducing an algebra based on these numbers in which the computations above can also be used, except that for instance a multiply is not just a multiply, but some more math, and that currents and voltages are taken to have real and imaginary parts. Seriously, this works fine, and is complicated enough, but gives results.

Once I had covered this theory in my studies, about a part into the second year I think, I happened to buy a version of the BBC computer, the 32K 6502-based 'Electron', for about $100, got myself a screen and a cassette storage device (fitted with remote control) and started programming in BBC BASIC. In about one or two nights (I did my first year practically in maybe 5 months, my gawd I worked concentrated in that time), I programmed the whole program together to fill in a network, sort of like:

resistor 1 2 1000
capacitor 2 3 1e-6
voltage_source 1 3 1V
like the well-known EE program 'spice' does, where each line contains a component, with the nodes it is connected to, and its value; make the set of equations (in complex numbers), invert the set of equations, extract the solution for a certain pair of input/output nodes, and draw the magnitude of that transfer function over the audio range of frequencies in a nice graph. All that would fit in 32K of BASIC, and work in seconds to maybe minutes for networks of up to, I remember, over 30 nodes, which is more than enough for a few equalizer sliders. It worked.
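
As a rough latter-day sketch of what that program did, in Python rather than BBC BASIC (the netlist keywords, the RC values, and the use of an ideal 1 V drive are my own simplifications for illustration, not the original code):

  import numpy as np

  # Read a small SPICE-like netlist of resistors and capacitors, build the
  # complex nodal admittance matrix at each frequency, drive one node with an
  # ideal 1 V source, and solve for the output node voltage.
  netlist = [
      ("resistor",  1, 2, 1e3),    # 1 kOhm between nodes 1 and 2
      ("capacitor", 2, 0, 1e-6),   # 1 uF from node 2 to ground (node 0)
  ]
  in_node, out_node, n_nodes = 1, 2, 2

  def transfer(f):
      w = 2 * np.pi * f
      Y = np.zeros((n_nodes + 1, n_nodes + 1), dtype=complex)
      for kind, a, b, val in netlist:
          y = 1/val if kind == "resistor" else 1j * w * val   # admittance
          Y[a, a] += y; Y[b, b] += y
          Y[a, b] -= y; Y[b, a] -= y
      # Ground (0 V) and the driven input node (1 V) are known; solve
      # Kirchhoff's current law only at the remaining nodes.
      unknown = [n for n in range(1, n_nodes + 1) if n != in_node]
      A = Y[np.ix_(unknown, unknown)]
      rhs = -Y[np.ix_(unknown, [in_node])].flatten() * 1.0
      v = dict(zip(unknown, np.linalg.solve(A, rhs)))
      return v[out_node]

  for f in (20, 200, 2000, 20000):      # a few points in the audio range
      print(f, abs(transfer(f)))        # magnitude of the transfer function

For this little RC network the printout simply traces out a low-pass curve rolling off above roughly 160 Hz; an equalizer section is the same computation with more nodes.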

So I was into network theory, and I knew all too well that any synthesizer circuit, also the digital ones, could be described, analyzed, and designed using theoretics, including for network properties. In fact even in physics, the idea of networks isn't alien at all.

Not that network theory would necessarily be into this type of research, because by and large the theory I just described is formed and known enough, but there are other challenges, such as applying all this to very large networks (like chips with millions of transistors), to very complicated (non-linear) devices (such as in electronics), or even to computers and communication systems, where other parts of what could just the same be called network theory would apply, such as how networks describe a certain behaviour in the digital sense, and how connections can be described, managed, and (statistically) modeled. Including how a network can be made from smaller parts, the traveling salesman problem and its variations and applications, and how one can compute with probability densities and machines made on the basis of them (like Markov (chain) theory).

An interesting idea in the information theory area (which in my opinion at the time wasn't alien to what I considered network theory), in fact one of the few I learned later on, apart from for instance sampling, Fourier and Shannon theory, is the idea of modeling the information in, let's say, a set of video channels, preferably compressed ones. A compressed video channel made from a normal video signal contains varying amounts of information: when a scene is stable, it takes few bits; during a lot of major scene changes, many more bits per second are needed. Now what can we say if we take all channels on a cable in compressed form, and don't want to be stuck with a maximum cable bandwidth (bits per second) following from the maximum per channel times the number of channels?

In short, we then would like to know what the probabilities are of having certain combinations of traffic per channel, and normally the chance of needing the complete top bandwidth for all channels simultaneously is maybe once a year, as an example, so no one would build cable networks based on that kind of figure. Two channels at full bandwidth peaking at the same time? May happen. And the average?

Then the main observation is that if we buffer the data, that is, delay and store it for, for instance, half a second, viewers would not be too disrupted, and it may be that many short peaks are smoothed out considerably within half a second. Ever seen live events on TV/radio with delays in between various paths?
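
A toy illustration of this statistical multiplexing idea (all numbers invented): if each compressed channel peaks at 8 Mbit/s but only bursts a quarter of the time, the aggregate over 40 channels practically never comes close to 40 times the peak, which is exactly what buffering for a fraction of a second helps to exploit.

  import random

  # Per time slot each channel sends 8 Mbit/s with probability 0.25 and is
  # (nearly) idle otherwise, so it averages about 2 Mbit/s.
  random.seed(1)
  channels, slots = 40, 10000
  peak_sum = channels * 8.0

  worst = 0.0
  for _ in range(slots):
      total = sum(8.0 for _ in range(channels) if random.random() < 0.25)
      worst = max(worst, total)

  print("sum of per-channel peaks :", peak_sum, "Mbit/s")
  print("worst aggregate observed :", worst, "Mbit/s")   # typically far below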

The internet can, as most know, also carry audio and video content, and as most may have noticed, the compression methods are indeed abundant, have delay (and coding time) as a result, and it is interesting to know how much data can be pushed over a phone line (twisted pair), a cable, or a GSM link. Or over a lot of them combined. The new GSM standards in the making make use of varying-bandwidth connections; as an example of the reversed logic: how many channels of GSM radio are taken by a certain user at a certain time, how far does this impact the radio wave spectrum (in terms of where the signals can still be received), and how can all this be managed to make good use of resources?

Network theory.

Well, and information theory. And a lot of mathematics. And computer theory, mind you, NOT informatics, generally. Honesty forbids letting computer science claim or guide these subjects; they do in general not cover the theoretics needed to at least come up with good models. A microsoft problem (no pun intended)? Nah, I don't think so, but all the digital coding standards, and the definitely innovative enough ideas in audio and video coding, are not exactly programmers' jobs to start with, normally. And RealPlayer does work, and I do think it is kind of fun to have higher quality (sounding) audio over a phone line in modem form than music sent directly over it ever sounds. And even pictures to go with it, at 1/30 of the original file size or less, for instance.

Not in general? In fact I didn't have much interest in speech coding, except for theoretical reasons, because I didn't think it would be used much. I did have an interest in the coding of musical instrument signals, mainly because at the time memory wasn't practically enough, and theoretically because it is (I find it) interesting to derive instrument properties and base coding on them.

Now what else about network theory?
In fact, the idea of having a good foundation to design complicated chips and systems with them interested me. In general, and for applications. How is that? In fact: computers. Chips in the beginning were photographical, chemical, physical maybe, mechanical, and eventually strictly electrical devices. The patterns on the silicon were made sort of by using a reverse magnifier with an overhead sheet of what was called a bars or stick diagram. A certain overhead sheet would lead the etching process to make a certain layer of the chip's surface connections. Another overhead sheet would contain the patterns for another layer, and together they can make connections, isolations, resistors, capacitors, and especially: transistors. Millions of them, and that of course was the challenge: the bigger and faster and more complicated the chips, the merrier.

The way to define the chip's behaviour was influenced by the machines they enabled themselves: the computer. Colour screens can contain and present many connection wires, transistor chip areas, etc., and a computer tape can hold the data for all the overhead sheets containing the whole chip layout (official word).

That can be understood, but the main point is that to make complicated chips, more is needed to finish such a process with success. First, the amount of circuitry requires at least a sort of cut-and-paste word processor for chip definitions, and preferably a way to, for instance, intelligently repeat patterns on the chip surface. Also, what is known as 'routing' can be done automatically by a computer, that is, one defines the connections, and a computer program draws in all the little wires at the right place, with options for where to put the bundles. Also, the computer can generate parts automatically from a library, with certain parameters. And finally, maybe the most important in general: the computer can make sure the chip will work after the expensive chip manufacturing has been done, by for instance simulating the whole chip the way it will tick when it is made. A complicated and mainframe-computer-time-consuming job at the time, which sort of gets more complicated as more complicated chips are made on also faster and more advanced computers.

An interesting observation in this area is the number of connections a chip or circuit idea can have. Let's see: n nodes with m connections can be connected in 'n over m' (n choose m) ways, which is a very rapidly growing, factorial-ruled figure. The factorial of 10 is already about 3.6 million, and growing rapidly. That is why even with computers, when circuits are involved, simply going over all possibilities is often not an option, except for small networks. And the growth of the possibilities in terms of network connectivity is higher than exponential, which is higher than the quadratic increase in complexity when transistors become smaller (halved every few years or so).
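
To give that growth some numbers, a small Python check (the node counts are arbitrary): factorials and the 'n over m' binomial coefficients blow up far faster than the sizes involved.

  from math import comb, factorial

  print(factorial(10))                  # 3628800, the 3.6 million above
  for n in (10, 20, 40):
      pairs = n * (n - 1) // 2          # possible connections between n nodes
      print(n, pairs, comb(pairs, n))   # ways to pick n wires out of all pairs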

At sort of the application level, it is interesting how digital (and analog, of course) circuits of large complexity, such as a computer system, can be built up, tuned for a certain application, or how a new kind can be made, such as I aimed for in my graduation research. And even in software, it is clear enough that the ideas of modules, and time behaviour, and parallel execution are not far from how networks have been simulated for a long time already. And in software, too, the idea of dealing with complicated systems is interesting, such as in CASE software, or what various object-oriented ideas are hinting at, because they rarely include time as a parameter. Multitasking programming and network types of software (such as in networking machines and telecom exchanges) do take these issues into account, but they are regularly limited in terms of their structural complexity, though for network machines this is arguable. The OSI model is known enough; the questions lie in the area of applying another type of theory.

The last theory I'll mention associated with networks is, let's call it, event theory. Petri nets, communicating processes, they are probably the major ideas in this area. Mainly, the modeling of communication between (independent) processes can be described formally (mathematically correctly), and computations can be based on equivalence-theoretical means; as a seemingly easy enough example, to prove that a telephone exchange will understand the dialing tone patterns on a phone. And there is simulation theory, such as Petri-net types, or single-shot equivalents, which give vehicles for modeling and simulating event behaviour. An event is an occurrence of change, normally described in a network which carries the effect of the event.
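
A minimal sketch of the Petri net flavour of this, with a made-up phone-like example (places, transitions and names are all invented for illustration): a transition may fire when all of its input places hold a token, consuming them and producing tokens on its output places.

  # Petri net toy: the marking maps places to token counts.
  marking = {"idle": 1, "dial_tone": 0, "digit_received": 0}
  transitions = {
      "pick_up": {"in": ["idle"],      "out": ["dial_tone"]},
      "dial":    {"in": ["dial_tone"], "out": ["digit_received"]},
  }

  def enabled(t):
      return all(marking[p] > 0 for p in transitions[t]["in"])

  def fire(t):
      assert enabled(t), f"transition {t} is not enabled"
      for p in transitions[t]["in"]:
          marking[p] -= 1
      for p in transitions[t]["out"]:
          marking[p] += 1

  fire("pick_up")
  fire("dial")
  print(marking)   # {'idle': 0, 'dial_tone': 0, 'digit_received': 1}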

Networking machines

I think it was Thinking Machines that made the Connection Machine, though I could have confused them with a graphics company or another. Anyhow, quite long ago an attempt was made to build a computer system that, in contrast with mainly all other systems around then and now, was quite different. A computer with list processing at the heart of it, and associativeness as a main computation. The space-like term (it would look good on Star Trek, old generation) 'Xector' sort of refers to a vector (a list) of data referring by reference (per element) to other lists. Nice for relational database stuff, for instance, and fun enough to look at, design-wise.
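
Read that way, a Xector-like structure is not hard to mimic on an ordinary machine; a rough sketch (names and data invented, and of course without the massive parallelism that was the actual point of the Connection Machine):

  # A vector whose elements refer, per element, to other lists: following the
  # references for every element at once is the associative, join-like step.
  people   = ["ann", "bob", "cas"]
  projects = {"ann": ["router", "dsp"], "bob": ["dsp"], "cas": ["graphics"]}

  xector = [projects[p] for p in people]
  print(xector)   # [['router', 'dsp'], ['dsp'], ['graphics']]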

In the modern time, let's say the Cisco, Newbridge, Hewlett-Packard, (IBM?) and some others' area is networking machines to span the world wide web. All those machines that connect up when you or I make a little page request across the ocean, resolve a URL, find an IP node, route data to and from the server and back, they are made by these and some other companies, Cisco being one of the major ones I learned (80% market share even, or so).

There are thousands and thousands of them, and just like the long-existing, world-covering telephone exchange centrals, they need to understand each other's signalling and data languages, prevent parts of the world from being cut off, work 24/7, and be reliable enough. Considering the internet is based on them, and there is not even direct government control, nor even a fixed payment method for their maintenance and use, they are an interesting and important phenomenon.

When phone and video are also carried over them, their relevance is almost complete, and the amount of data carried by them grows considerably.

Recently, I started reading the Cisco design guide and other pages again (I even put a few in ASCII on a floppy), and I also looked at HP's OpenView, which among all other main computer and networking management tasks also offers SNMP and even specific Cisco network control software and user-friendly interfaces.

The Cisco stuff ranges from ethernet bit patterns (I knew them from basic and advanced university network courses), evidently the OSI protocol stack, and various network protocols head to toe, to the more advanced routing, multilevel switching, and network buildup subjects. Much of it I don't completely cover at all, except that I know the outlines, and especially where the actual routing protocols are involved, I had no hands-on experience, except for my own TCP/IP programming on an ethernet.

The HP pages have software demos on them, also for network maintenance for instance, sort of like speedometers in a car: reliability, hit success, server uptime percentage, network link performance, error rates, all graphical and pleasing, except of course that under all this, the bare network knowledge still rules.

My current reading was on open shortest path first (OSPF) ideas and, let's say, how connection information is practically kept in connecting nodes.
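
The 'shortest path first' part is essentially Dijkstra's algorithm run by every router over the shared link map; a small generic sketch (topology and link costs invented, nothing Cisco-specific):

  import heapq

  links = {
      "A": {"B": 1, "C": 4},
      "B": {"A": 1, "C": 2, "D": 5},
      "C": {"A": 4, "B": 2, "D": 1},
      "D": {"B": 5, "C": 1},
  }

  def shortest_paths(source):
      dist = {source: 0}
      queue = [(0, source)]
      while queue:
          d, node = heapq.heappop(queue)
          if d > dist.get(node, float("inf")):
              continue                      # stale queue entry
          for neighbour, cost in links[node].items():
              nd = d + cost
              if nd < dist.get(neighbour, float("inf")):
                  dist[neighbour] = nd
                  heapq.heappush(queue, (nd, neighbour))
      return dist

  print(shortest_paths("A"))   # {'A': 0, 'B': 1, 'C': 3, 'D': 4}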

I read some info about the HP top servers, for instance for business databases with online transaction processing. I think about a few hundred thousand online transactions per minute, reliably, probably including database lookups; that's on top of server traffic. That sort of stuff, compared with a PC running a nice Linux with a server, how is that? Well, for most applications a well-programmed Linux server (such as the Tcl server) with a good enough PC is up to most normal traffic demands, even for heavier sites with T1 connections or so, I think. In terms of bandwidth, a well-programmed current PC should be more than fine, and in terms of lookups and file processing, I don't think there should be a problem either. Of course a real cruncher is more fun as a concept...

Making routers or the like may be quite different. I thought about the subject for some time, and I'm no longer in EE trouble with fast connections; I think I'd like to try some ADSL electronics (I got some Analog Devices samples, for instance a 500 volt common-mode-range fast differential amplifier, and a 4 ns dual comparator, unfortunately with connections at 0.5 mm (!) pitch... I adapted to the idea of 1/20 inch, maybe 1/30, but this is near impossible with normal tools), and my own microcomputer would, bus-structure-wise, easily be up to WAN connection bandwidths.

But resolving so many URLs per second, making sure the machine doing it never chokes up and keeps performing, that is not my area of expertise now, and does require good system thinking. The same with routing and switching protocols, which I'm reading up on again.