Theo Verelst Diary Page

Latest: 4 january 2001

I've decided, after a good example, to write some diary pages with thoughts and events.

Oh, in case anybody fails to understand, I'd like to remind them that these pages are copyrighted, and that everything found here may not be redistributed in any other way than over this direct link without my prior consent. That includes family, christianity, and other cheats. The simple reason is that it may well be that some people have been ill informed because they've spread illegal 'copies' of my materials, even with modifications. Apart from my moral judgement, that is illegal, and will be treated as such by me. Make as many references to these pages as you like, make hardcopies, but only of the whole page, including the html-references, and without changing an iota or tittle...

And if not? I won't hesitate to use legal means to correct wrong that may be done otherwise. And I am serious. I usually am. I'm not sure I could get 'attempt at grave emotional assault' out of it, but infringement of copyright rules is serious enough. And Jesus called upon us to respect the authorities of state, so christians would of course never do such a thing. Lying, imagine that.

Previous Diary Entries

Jan 4, 2001

While waiting for the rain to be over and to do some more reading and communication stuff, I thought I'd cover some more computer circuitry basics. The idea is that with the materials from some pages ago, anyone with reasonable intelligence should be able to at least have some idea of what the basic, fundamental computer building blocks are like, and how they can be put together to make a system. And I mean as in understanding how it may actually work, not some spun-out theoretical model. I was building circuits like this with quite some complexity over 20 years ago already, and I've usually not failed to achieve what I want with them, so my angle probably was at least good enough. All modern computer stuff is at least based on similar circuit ideas and thinking. I read some interesting books about the subject at the time, which I can't find back where I am now; they'd be fun to see again.

I recently found out some more about the members of my previous university section and their whereabouts; well well, luckily their rip-offs are stupid enough. If it wasn't all so miserable it would be amusing to make fun of.

More computer circuits

Suppose we have a few edge-triggered memory cells together, that is, they have a clock input, and they read the data on their inputs at exactly and only the time the clock signal goes from for instance 1 to 0. Then we can make a circuit with them that changes its signals only exactly when the clock changes, and if all circuits use the same clock, the whole circuit will start to change its stored information at exactly the rising or falling edge of that clock signal, let's say the falling edge (1 to 0 transition). Then we could write the signal changes in the circuit down as a table, with all relevant signals on a line, where every new line is after another clock pulse. That makes the circuit, as we have seen in the examples, completely 'logical', that is, its behaviour is strictly predictable, completely understandable, and of a logical nature.

A circuit without inputs would in the end exhibit the same, repetitive behaviour all the time, such as the running light we've seen: counter at zero, first light on, counter to one, second light on (because the decoder circuit for the second light detects the unique counter value '001'), until the last light is on and the counter makes full circle to 0 again. Then the cycle starts again. For other circuits, like a set of gates such as ANDs and inverters in a strictly one-end-to-the-other arrangement without the 'flip-flop' feedback in them, the inputs determine all the signals in the circuits: when the inputs change, the circuit output and intermediate signals change. A circuit where a combination of both the input signals and a remembered signal together determine the new signals in the circuit after a clock signal is the latch we've seen, where the output changes in a way we don't know beforehand yet when the clock signal is toggled. Unless we know the input signal at that moment; then we know that after a small delay time (in the order of maybe a billionth of a second, 0.000000001 s!) the signals will have 'rippled through' the gates made by transistors in the signal path, and have ended up on the outputs of the latches.

In the latter case, it is possible to influence the next 'state' of the circuit by making the inputs different at the next relevant clock signal change. Now suppose we make use of this idea by making such a circuit (a memory which reads its new inputs at exactly the moment the clock signal changes values, alone) drive its own inputs?

This may sound not so logical, a bit of a digital perversion maybe, but logically speaking there is nothing against it: as long as the circuit parts that change and feed back the bits from the outputs of the register cells to their inputs feed the registers with stable and preferably known and understandable patterns at the moment of the next clock signal change, nothing is fundamentally indecent. Such circuits are nicknamed 'synchronous' or equally-timed circuits, because a single clock pulse makes all changes, except at the inputs, occur at the same time. Is this useful?

In fact, it is very useful. Suppose we use the and-or plane approach we've seen, that is, the 'and-plane' decodes or singles out every possible combination of the register output pins; then the 'or' plane can be configured to drive the registers' inputs with any pattern for any output pattern. That should be fun. Suppose we'd figure out how that can easily be done (and it can), we could for instance make the registers output into any running light we can imagine! Also, we could make them count to 9 (and not to 15) and back to zero, and we could even include another one or even more signals into the left hand plane's input set to make a counter count up or down under control of that signal. Let's see.

The next and-or plane figure is made such that the left plane 'selects' every possible combination of 3 signals, that is 000, 001, 010, 011, 100, 101, 110, 111, and makes the corresponding AND output high, and of course all others low (0). Then the right (or-) plane can be used to form the desired overall output patterns; for the first one I've chosen 01, the second 10, and the third 11. As we can see, these are just the lower two bits of the inputs (b0 and b1), but then with one added, i.e. one 'count' step further: in goes 0, out comes 1; in goes 1, out comes 2, etc. Of course with 3 (11b) coming in, out comes 0 (00b) again, although of course we could choose to add another output bit so that it would be '4' (100b), or, even better, '0 and carry' (carry=1b, out=00b) for future cascading purposes. Don't worry, this is not needed now, we'll first get to the essence.

   __      __      __                              y1  y0
   b2  b2  b1  b1  b0  b0                          ^   ^
   |   |   |   |   |   |                           |   |
   V   V   V   V   V   V                           OR  OR
0  *       *       *         AND --> out0    -->       *      1
1  *       *           *     AND --> out1    -->   *          2
2  *           *   *         AND --> out2    -->   *   *      3
3  *           *       *     AND --> out3    -->              0

4      *   *       *         AND --> out4    -->   *   *      3
5      *   *           *     AND --> out5    -->              0
6      *       *   *         AND --> out6    -->       *      1
7      *       *       *     AND --> out7    -->   *          2

Let's now make input signal b2 '1', and suppose it indicates 'up/down' as in that it dictates what the counter circuit we're making is going to count like, and suppose that a '1' means count down instead of up. Then the AND outputs still indicate 0 through and including 3, the same as above, but the output values caused in the right hand side 'or-plane' are such that 3 becomes 2, 2 becomes 1, 1 becomes 0, and eventually 0 becomes 3 (11b), possibly with an indicator that we passed zero (a here omitted 'borrow' signal).

Now it can be imagined we have this circuit made out of 4 inverters, 8 3-input AND gates, and 2 4-input OR gates that we connect up with 2 master-slave memory cells, such that y1 and y0 connect to their inputs and b1 and b0 to their outputs, and we have made a 2 bit synchronous up/down counter! And I assure you that that is very serious, actual computer circuitry that works with completely standard, cheap and fast parts if you'd build it, for instance on a breadboard, and it would easily perform up to 10 million changes per second. No problem.

Then we would just connect a clock pulse, monitor the outputs, and change the b2 signal to make the outputs count up or down. That works theoretically and practically just fine, no tricks, no more logic than just explained. We may want more than 2 bits, and that, too, is not a problem, except that we would need more gates, and another thing is that we may like to 'preset' or 'load' the counter to some pattern we want, for instance to force it to zero or 3.

To do so, we could easily apply the same principles, which is good exercise, and just add more bits to the AND and OR planes, and stay logical. That works fine for, for instance, 4 bits: 16 intermediate AND signals, OR gates with 8 inputs, 5 inverters, no problem. The preset is another story that requires additional thinking. Strictly speaking we can extend the whole approach to any what one may call 'mapping' we like: 8 bits, extra inputs that fix the output pattern, no problem. If we would like to make, let's say, a 16 bit counter, however, which is not exotic at all, things start to get hopelessly out of hand. We would have at least 16 inputs and outputs, and there would be 2 to the power of 16 equals 65536 intermediate AND signals. Now on a chip, that is possible, and not extreme, but the number of inputs that would have to be driven by the input bits is very high, requiring heavy (slow) and space consuming buffer circuits: not desirable.

We would somehow like to not have to decode all intermediate 'AND' signals one-by-one, also when we'd like to set the counter to a fixed input value, which we could imagine by adding b3 as 'do-a-load' input, and then 2 more inputs b4,b5 to set the counter to a specified value. 2 to the power of 5 would already mean 32 intermediate AND signals, which can be done, but simpler methods exist.

Suppose we'd use gate circuits to make what is called a selector circuit:

 A in ====>| AND |====>==
        |->|----/      ||
        |              ||  \
        o              ===>|---\
       / \ invert          | OR >===> S out
      -----            ===>|---/
        |              ||  /
        |  |----\      ||
 B in ====>| AND |====>==

Assume that A and B are a number of bits, for instance 4 or 8 or 16 on a row, and that S (out) has the same number of bits, and that the organisation is such that the big 'OR' block takes the corresponding bits of the two input signals together in OR gates which then each form one output bit; that is, input a1 with b1 forms s1, a2 with b2 forms s2, etc.

The inverter in the 'select' (single bit) signal feeds to the second bit of all the big AND symbols' and gates, so every Ain or Bin bit has its own AND gate, with a distinct output, one input connected to the select signal or its inverse and the other to the corresponding input bit.

When the select input bit is 0, the ANDs in the upper multiple and circuit have one input 1, so they respond to the other input bit (both 1 implies the output is 1), and then the lower AND array has one input 0 because of the select bit, so the and gates always have 0 at their output. The OR array of gates basically for each output ties the two AND array's output bits together over an OR function, and considering one of the two inputs is always zero, the outputs follow the other input bit. As a whole, the select therefore selects either all A inputs to the outputs, or all the B inputs, hence the name. There is nothing against giving the OR more inputs per bit and selecting more inputs (with more select bits), which is good exercise.

The selector can be put between the inputs of the registers and the output of the OR plane used above, to select either the OR plane outputs, just as before, or to 'preset' the registers with the selector B inputs. Then the number of intermediate AND terms doesn't double to make a preset-able counter, and the select signal can be used to (synchronously, that is, at the next active clock transition) set the counter to the desired value.

Bear in mind that the AND-OR plane can be programmed to make any sequence of patterns happen: counting, shifting, semi random, square patterns, any pattern is possible, except that the number of register bits limits the amount of 'memory' the circuit has. We could for instance have 4 register and AND-input and OR-output bits, of which we 'show' only two to the outside world. Then those two bits could make patterns in a sequence that repeats again after at most 16 clock pulses (2 pow 4 = 16), unless there are input bits that make the patterns different.

Strictly for counter bit patterns, the AND-OR circuit can be considerably simplified. We know already that we could also make a different counter circuit implementation by just putting divide-by-two sections in a row, which makes an asynchronous counter (because each section gets its clock pulse from the previous one, there is a little delay in it for each succeeding section).

Previous Section

In my previous diary pages, I've mentioned the university section at electrical engineering I worked in, and that I got kicked out, for actually getting my job done while not sacrificing my all and everything to the leader and his boys, and probably because I was incapable and unwilling to hide their inadequacies, to the point where they did not get their jobs done, in spite of literally millions (of guilders) being pumped into the project.

I don't think I need therapy in the sense that I need to spill my guts on the subject, although I've been known at the time to regularly feel relieved to inform for instance some musician friends about the so-many-th plan-bureau type meeting, completely unfruitful interactions, and the general incompetence on for instance my project leader's part to make any decent product or leadership work for real. Currently it's still more in the direction of indeed being sort of professionally dismayed at such a waste of also my tax money (I paid taxes ten years of my life at least; I've paid enough of it), but mainly wanting it out of the way for good that I was the reason for various projects and their potential success, and that they (at least including that project leader) kicked me out, quite probably cooperating in trying to play my family's game along to have me proclaimed crazy in the medical sense.

If it would have been up to them, my professional qualities would have been written off as theirs should have been: redundant if not at all present in any relevant form, and my credits taken by some treacherous others, not even willing to take me seriously after years of making their theses provably at least worth the paper they were printed on, me being the only one that provably made various essential working designs, innovations and software; and to top it off, the work that I did and even (luckily) presented was supposedly 'lost' and threatened to be wiped from miserable 100$ or so discs if I wouldn't stop it myself, after I was kicked out with nothing.

Emotionally, all this can make me angry, obviously, but that's not that important; I can easily deal with some civil servants playing Kafka or so, that gives me no insurmountable problems at all. But when all relevant leading persons I've worked with decide to try to debase, or simply make non-existent, the work I've done, in the literal sense let others take complete credit for major parts of my work, and leave me virtually without local good references, it starts to become clear that apart from strong suspicions about various types of abuse (not directed at me), there is a real problem going on. Which was one of the reasons to at least at the time come up with some plans to have back essential parts of the credits I deserve, put me back on the maps where I belong, and more effectively at least stand against what I strongly suspect to be the outskirts, in the professional sense, of great evil.

'I'm against child abuse' is not as strong as beating the professional crap out of bastards that want to run a major part of the world also based on their alleged knowledge, skills and economical assets, and making clear there is a constitution and power system to effectively uphold it. Personally, it's kind of hard to apply for top positions in various hot enough and/or fundamentally essential areas of contemporary science without the proper references, which of course would be frustrating, and make it not possible to at least have achieved enough to also make more general issues work in desired directions, because of my credibility.

When people are victims of the kinds of abuse I cannot imagine, and also have to suck up to leaders that even are the actual abusers (sexual child abuse, ritual murders, all kinds of personal emotional warping and abuse, and I mean a lot more serious than being called crazy for being too clever or being obviously unrightfully kicked out of some job), it is no wonder that society will never work that way. And if I can at least get some of that out of the way, preferably by not even working with the kind, and preferably making leadership work of a kind that is at least respectable enough, that is completely worth it. And being aware of the very unmarginal place of both microelectronics and software, they are not bad starting points. Probably others of evil will are knowledgeable of the same. The kind of blackmail and threats that I've witnessed and been near made me aware that these principles are worth consideration, because I am of the firm conviction that if people think it is not worth it or right enough to start with, the world with them in it is for certain from the start already not going to be ok, unless there is hope for better, but that would better be real enough hope.

Anyhow, I traced some of the non-characters I worked with and found out my project leader now works at a non-technical university, trying to head the subjects I, in a literal, often word-for-word sense, worked on.

I've made working graphics software for radiosity and ray-tracing, with elaborate possibilities for research and distributed operation (that is, running on more than one computer at the same time to speed it up), graphical modeling, and simulation of the graphics machine still mentioned on the project pages on the Delft University server. That software is not visible, and the later reports that I in person presented to both E. Deprettere and P. Dewilde are not to be found. The software has been presented at project meetings and discussed in some detail in person with people from Philips, Hewlett Packard, some smaller companies, some other university sections and STW, where I in real time, 'hands on the terminal', made my software work, which was pretty much the only stuff that did really work, so they must have had some hard time trying to get rid of me.

They tried to assign it to someone that actually officially worked for me, in the sense of implementing my directions, starting from my prototype and communication programs. That one's name is still on the project page; mine isn't. Not that it's very honorable for a project that in their area in the end has no working deliverables whatsoever, but it makes the principles clear. My name is in the literature list, except my later reports are missing. And they were about subjects of software relevance that they certainly cannot top, and I made working software that I demoed and proposed to demo and explain many times, but of course they would first have to find a way to get away from the ordeal of, especially in public, acknowledging that. What a mess.

Anyhow, I didn't want to play along with the let-the-nutcase-have-a-little-job kind of semi-mercifulness, so I had to get my credit back for the good ideas, the working software, and the research ideas I had. It is not ironic at all that the time I spent in the library at the time, when I was pretty much worked aside, to at least be taken for a leading scientist in a new and interesting field (nano-technology, brain research), gave me results in terms of research lines that are maybe not copied, but are sometimes in the same words as I phrased them at the time, and in various ways quite like I wrote about them, implemented in major new research groups in Delft, after I wrote and communicated about them. For real. And that was before some of it became internationally fashionable, when I thought about it, and all that has been going on apparently should not have anything to do with me, and certainly didn't mention my name in it.

Interestingly enough I was into theoretical physics, where for a year or so I followed and presented theory and exercises in, among others, a course in quantum many-particle systems. As is known, last year's Nobel Prize in physics was won by a Dutchman, and coincidence or not, the fundamental subjects at stake were the areas I'd been reading into quite a bit, focussing on the main topics of the nature and the historic foundations of particle models.

I'll make the reasons to mention these subjects some more clear further on.

Pleased to meet you, hope you've guessed my name

Must be in the nature of the game; I saw the Rolling Stones clip a few days ago, and realized that some, as it may be called, bible study is indicating certain games and their nature all too well.

What's God's program in history, does He program the world, and what about man: is what a programmer does to a hard disc comparable with what God does to man through His words, including upgrading and correcting? Or does He sometimes just put His foot down (with or without Monty Python noises) and grow new computers?


Here we go, I qualified for it, there's a point to it, and it currently is for free: a little programming course start. Just to make some points, and give good information, we'll look into the C programming language, to have a good idea what for instance makes the upcoming Linux tick, and learn in general about computers, programming, their major modes of operation and some of their idiosyncrasies.

No objects (yet, though I programmed them for at least 7 years, professionally, mind you), no inheritance, messaging, Java, etc., just the hard-core programming that makes these languages themselves tick. Yes yes, Java itself, too, is programmed in C, just like Linux (Unix) and probably by far the major part of Windows XX. I knew someone in the Microsoft programming environment long ago, and I'm quite sure it was C it was about. Oh, and Linux's kernel, that is the core, probably like the original Linux, is not written in C++ but in plain C; I'm not sure about Java, I'd need to check, maybe parts are C++, they seem to be into that at Sun.

I'll make clear later on why I put such emphasis on the 'object-oriented' subject in reverse, as in that I think it is heavily over-rated and misused. As an example, though: those Java applets with for instance communicating nodes are getting to where they are connecting a few parties up and actually communicating some data between them in general format in real time. Wake up, the 80's, man, that's what Unix processes and terminal stubs did ages ago, quite efficiently on ancient machines, even portable (enough). Then there is the (nearly absent) communication matter: webservers and some applications are actually going to where some data is waiting for other data and is pushed back and forth somewhat: that is what computer circuits like I've described were about, about 25 years ago! I don't feel condescending, I'm not saying Java cannot be of interest or use, but some subjects are theoretically and fundamentally so misrepresented and lacking solid foundation that I wondered way too long what they were like.

Well, I know what they are like: streams carrying data, gates and memory cells carrying bits, networks that connect this up into, relatively speaking, incredibly simplistic digital or electrical equivalent circuits. I've been there when an informaticist had to be made into a chip designer (same project). Mind you, I at the time even had official informaticists as friends; no problem (in that area, that is). Anyhow, this guy in an official project is going to reinvent digital communication: session after session about motherf* handshake circuits, clock, data, transfer, ack, finished. And then some alternative configurations. No Sir, pages of rambling, little images, bla the bla and more bla di bla, and in the end it didn't even work to start with. Which is logical when you don't take normal electrical signal considerations into account and don't bother thinking about the essence of the type of clocked circuits I've mentioned in my little digital course. They'd (and still do) sort of make a circuit by taking a bunch of wires, connecting them up in many ways, in the end counting how many wires they would use, and then simplifying the circuit by not wondering about what's on the wires, but reducing their number, even after they had without reason put them there themselves! And as it seems they still do, and sort of took that thinking into software. Amazing.

And I'm not making all too much fun here; this rendering is pretty literal. It can be argued that there would be a scientific purpose, and I'm sure that the ones delivering them the ideas had them (I found and find it easy enough to understand them with simple enough thinking), but after they're finished, it's like page after page of the complete gibberish mathematical crap I've witnessed coming off some section printers: complete waste of paper, and no-one reads them anyway. The ones for some EEC conferences were the worst. In the end they end up at a non-technical university. Wonder why.

Now it may seem to some I'm either being bitter (which I think isn't so; really, ego- and life-wise they meant too little at the time to make that probable), or on purpose getting wonderful and sensitive scientists down, which no one in their right mind would agree on person-wise, but I assure you I'm not overestimating their factual incompetence and the position they had and probably in some areas still have. One was a professor in signal processing, claiming to want to build high-end new types of computers, and program them in scientifically interesting manners. These guys get tons (at least one to start with) a year in probably the hottest positions in these scientific areas in the whole country, supposedly edifying students at postgrad courses into the highest holy grails of computers and digital signal processing. They want projects in the wireless phone area, and may apply for millions of money, informing prospective graduates (not me in this case) they're the country's top of the bill in these areas, and having gotten away with at least some official positions that seem to back these claims up.

And for certain relevant parts over my back. I don't like that. Not as a person, though that I easily get over; not as a professional, it's offensive; not as a scientist, it's insulting; not as a tax payer, it's like nuclear waste with no energy; and certainly not as a thinker about all this: probably students who are not naive in evil (though luckily in good) may find nice points in this stuff, but the results are completely useless in the end, and the ones wanting an education in the area may already find it hard enough to find good material, let alone if they must circumvent these and child-abuser types of professors.

Anyhow, I looked up what some of them were doing, and they're still into the same. Let's say they want to optimize how a highway system can be made, and all they come up with is that they can make a certain number of kilometers of lanes actually connected to the right exits and starting points, hopefully (really, no kidding), compute how many possibly redundant sections they've actually laid down, and then say that with their software, if you drive this fast, with a simulator not of their own, they can hopefully give a not all too good estimate of the traffic flow. And then they're thinking about creating options for the topology; not necessarily optimal ones, no, the ones that their software happens to understand a bit. And seriously, they connect for instance 10 wires up in a computer circuit, cut 5 out, and say they'd then have optimized 200%. Except they fail to mention that those 10 are a pretty arbitrary number, and give no indication at all as to their use.

And really, not kidding in any way at all still: I made them aware years ago that maybe it would be needed to take into account how much information actually goes over these wires, as in so many bits per second. Brilliant? No, not at all, although in the area the concept was not regularly used as a criterion, and a good design program that takes this into account is useful. Anyhow, the latest conference proceedings ramble on for pages about the most simplistic circuits, that they then actually measure in this type of way that I made pretty clear at the time. But still their major subject, to optimize and automatically generate circuits, is not even starting. The software this prof supposedly would have made after a decade or so still produces errors, it still doesn't do any optimization, and the funny part is: they set out to make software to make circuits more efficient, and now they produce results which are very inefficient, and cannot even be changed, i.e. they don't have 'knobs' to change their parameters. So you can use the software to probably not even get a very inefficient circuit, which then cannot even be optimized.

'interesting'. Gmmph.

The point being: I know what they want; it is not so hard, with my and quite some others in the field's background knowledge, to see what they would want to aim for, and it may even work if they'd do it right, and be of some limited, specialistic, niche type of use.

I'll spell some things out, maybe even for them, what do I care, they never make it work enough anyhow. The main POINT is that they want two things: design fast computer circuits that do a certain job, and make sure this is done efficiently. e double f, ... anyhow. The first is about complicated enough computer circuit structures to put computations in a row, and let many computer circuits operate on them at the same time: a sort of assembly line, or as they sometimes call it, a pipeline. That's completely common knowledge in computer design, for decades; really, ask about IBM 370s, Convex supercomputers, some of which I happened to work with, or any mainframe or supercomputer from decades ago, and modern computers too: the Pentium also has a 'pipeline' of, I seem to remember, 5 or so computer units working at the same time in a row. As is not hard to imagine, planning the operation of the pipeline, or of the units that at the same time work on alternative parts of one problem, is not trivial. It takes quite some thinking and programming of a not too easy kind to make that all work right.

The second point is that even though it may be possible to put all computer parts to work most of the time, they may be generally wasting their efforts, so the designer wants to make sure that most of the time they do something useful. That's all understandable, but in digital terms it requires knowledge of the above kind to crack these problems. At the time I was, because of my contract, partially into this, and I already wrote in sections of my reports that the main point is that each unit has a computation rating: how much work of a certain type it can do per second; a communication bandwidth limitation: how many bytes of information it can take in and pour out per second; and finally a storage capacity, that is, how many bytes it can remember in itself. It can be read in my reports (postscript reader needed) on the university server (older diary page and homepage contain information) that I raised these subjects years ago in official form, and checking that against recent (2000) official conference proceedings, they in the end did take that idea as a major criterion.

Now assume we have some problem, say a hundred additions, and some computer parts, let's say 8 adder circuits; what do we do? Anyone can understand that, as long as we have enough numbers to add, we could feed 2 numbers to each adder and produce the sums somewhere, until near the end the last sums are made with some adders idle. Overall efficiency depends a bit on the form of the addition, but let's say we would waste 10 % or so for not having all units at work. Suppose that we make a structure for adding these 100 numbers all together; then it depends how we can access them, that is, how they become available. When they become available one at a time, there's not much point in having more than one adder circuit and a memory (register) to remember the sum 'thus far'. When they arrive two at a time, two adders could sum up both their sum and the running total, etc.
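This scenario is small enough to simulate. A minimal sketch (function name mine): each step, every adder that has a pair available combines two numbers into one, and we count how many adder-slots did useful work versus total capacity.

```python
# Stepwise parallel reduction: n numbers, k adders, each adder folds
# two numbers into one per step. Counts steps and utilization.

def reduce_stats(n_numbers: int, n_adders: int):
    """Return (steps, efficiency) for reducing n_numbers with n_adders."""
    n, steps, useful = n_numbers, 0, 0
    while n > 1:
        active = min(n_adders, n // 2)  # adders that actually get a pair
        useful += active
        n -= active                     # each addition removes one number
        steps += 1
    capacity = steps * n_adders
    return steps, useful / capacity

steps, eff = reduce_stats(100, 8)
print(steps, round(eff, 3))  # 15 steps, efficiency 0.825
```

For 100 numbers on 8 adders this gives 15 steps and about 82.5 % utilization, i.e. the waste from idle adders near the end is in the ballpark the text estimates.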

Those sorts of structures, when they become bigger, are roughly these researchers' official target, what they call 'architectures'. Let's say the computer then lets you specify the number of adder circuits, how many numbers appear at the same time, and for more advanced versions how many you would like to store before feeding them further, and the computer goes krrkrrkkrkrkr and (in this case in no measurable time at all) spits out a circuit diagram with a parts list and computer readable connection information that does the job. Preferably it would tell you how many parts are in there, what they cost, maybe present some alternative connection patterns with different properties, do some nice graphics, mail your circuit to your preferred chip baker, wish you a good day, and you'd be a happy user.
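A hypothetical miniature of such a generator, just to show the flavour (everything here, names included, is my own toy version, nowhere near a real tool): given input signal names, emit a balanced tree of 2-input adders as (output, left, right) triples, plus the parts count. A real generator would add placement, timing, cost and the alternative connection patterns on top.

```python
# Toy 'architecture generator': build a balanced adder tree netlist.

def adder_tree(inputs):
    """Return (netlist, n_adders) for summing the named inputs pairwise."""
    level, netlist, k = list(inputs), [], 0
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level) - 1, 2):
            out = f"s{k}"; k += 1
            netlist.append((out, level[i], level[i + 1]))  # one 2-input adder
            nxt.append(out)
        if len(level) % 2:        # odd signal passes through to the next level
            nxt.append(level[-1])
        level = nxt
    return netlist, k

net, parts = adder_tree([f"x{i}" for i in range(6)])
print(parts)  # 5 adders for 6 inputs (always n - 1)
```

Even this toy shows the appeal: the connection information falls out mechanically once the structure is chosen, which is exactly the part worth automating.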

If this sounds cynical, it is not really just that, because that is the idea behind that type of research and its applications, not that they ever got it to work for any decent real life problem. Of course the real issues are then also of a higher scientific nature. That is, suppose I have N adders in a row, with inputs stretching D adders back, and I want the circuits to be laid out as an array (rows and columns) on the chip: can the program then compute how efficiently every adder could be used? Or to start with, say I have adders with 5 inputs: can it compute which regular structures could actually be made that make the whole addition work without too much delay added in the adders (as in, having to wait for new data)?
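The 5-input-adder question has at least one easily computable piece, which can be sketched as follows (my own framing of it): a regular radix-5 reduction of n inputs needs a certain number of tree levels, and each level adds one adder delay, so counting levels bounds the added delay.

```python
# How many levels of 5-input adders does a regular reduction of n need?
import math

def radix5_levels(n: int) -> int:
    """Tree levels to reduce n values to one with 5-input adders."""
    levels = 0
    while n > 1:
        n = math.ceil(n / 5)  # each adder turns up to 5 values into 1
        levels += 1
    return levels

print(radix5_levels(100))  # 3 levels: 100 -> 20 -> 4 -> 1
```

The hard part the text points at, whether such a structure can be laid out regularly on the chip array and kept fed with data, is of course not captured by this count; that is where the actual research difficulty sits.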

In fact, I'm writing these examples in not the worst way: this is quite seriously what they are into in these fields, except that the ones I'm talking about, in all the years I closely monitored what they produced, have never convinced me that even the first things worked anywhere near decently.

So now it's known computer structures, and the software on them, that they want to deal with. Same level: no point of any interest, and the thing they want (good and optimal implementations with little designer effort) they could probably not even state clearly enough. What is important: they are still dealing with the subjects I made more than enough progress in years ago, and claim the work I did for themselves. Not that that matters much to me: they obviously are nowhere near enough designer capability to know how to make the software and computers they'd want to optimize in the first place, but they are paid and in position and apparently got away with their behaviour thus far, on top of God knows what else. And that too I can live with easily enough, really; computer professor in those areas was not my ambition at the time. But I do want to be able to claim back the credit for the subjects where I raised ideas that I can make something with, for the scientific thinking that I am capable of, and for the fund-worthiness that they lack and I should be able to have.

And I can't easily afford to be as ambitious (strictly content-wise, professionally, and interest-wise speaking) in high profile and risky, multidisciplinary, costly areas such as nano-technology and fundamental brain and brain-activity research with all that shit on my back and not much hard-provable credibility. As a student in my third year or so, I wrote parts of project proposals that ended up almost word for word in official and granted project proposals that I was kept out of, as an example. University politics? Maybe at the beginning, but not with this ending.

My style of leadership, and at least partially my skill in making a project and some related activities actually produce results at the time, probably for a significant part resulted in P Dewilde (my professor at the time, who dangled me along for years while provably giving essentials of my work and ideas away to others without any correction) ending up in a director position of a research institute where someone with my proven computer, interdisciplinary and fundamental physics skills would probably have been more in place.

I'm aware of the idea that a director should also be a manager, and probably not just the content-wise leading person, and should make sure it all comes together. Well, I did, for years. And I did so inspiringly when I wanted to, and it seems that inspiration has been completely missing for years all of a sudden; how come? That too can easily be part of accepted (though not respectable, so I didn't, for real) behaviour, but that ends as soon as I don't agree, at least as far as the image goes, and especially when I have strong suspicions that I've worked in an environment where forms of abuse I don't like reading about may well have been, and unfortunately are, among many. Leuven catholics?

When I take it seriously that grave (sexual) child abuse, and the oppression and silencing of life after it, can be so widespread that even in certain government circles a majority has the nerve to appear on my TV screen with that sort of sin in their life, it makes me wonder what to do effectively about such open visibility of the horrors of Babylon. Which is a major motivation to look for biblical answers, as historically that is the only effective remedy in various ways. Not christian labels and systems, but the actual truth of the prophets', Christ's, and the apostles' teachings about God's laws and their application, both for nations and individuals.