It has occurred to me that in this ALU, and in all combinational circuits or chips, every operation the chip can do happens all the time, whether you ask for it or not. If you have a combinational circuit that can AND things, ADD things, OR things, etc., and you give it some input values, it will perform every operation it knows on that input data but only provide the output of the one operation you ask for (the results of all the other operations simply never reach the output).
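This idea can be sketched in software (Python here rather than an HDL, and with made-up opcode names) as a function that computes every result unconditionally and then uses the opcode like a multiplexer to pick one:

```python
def alu(a, b, opcode):
    # All results are "computed" unconditionally, mimicking the way every
    # gate network in a combinational chip is always driven by the inputs...
    results = {
        "AND": a & b,
        "OR":  a | b,
        "ADD": a + b,
    }
    # ...and the opcode merely selects which result reaches the output,
    # like the select lines of a mux. The unchosen results are discarded.
    return results[opcode]

print(alu(0b1100, 0b1010, "ADD"))  # 22 (AND and OR were computed too)
```

Of course in Python the dict entries are still evaluated one after another; on the chip the three networks of gates really do settle simultaneously.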
I've never read anything about this, but I wonder about it. It seems that a combinational chip operates as a single whole, doing everything it can all the time, with only a part of it being of any use at any given moment.
This concept gives me the idea of a bunch of people all singing in a choir at the same time. The song is the input data. Each person is a different part of the chip performing some calculation on the input data. So some people are singing the song beautifully, some are singing it backwards, some are singing softly and others are singing it loudly. And you, the user, just select one person to listen to out of this cacophony at any given time -- though everyone is singing it all the time in their own special way.
But it is more than that. The parts and functions of a chip do not operate independently; they operate together. Any function is the result of operations of lots of parts of a chip. The output of one part of the chip becomes the input of another.
So the cacophony of the chip isn't each part doing its own thing; it's each part doing things in many combinations with other parts. Tons of combinations of logic are running, some of which may be useful to the function currently asked for, but most of which are not.
This is much different from the typical programming I'm used to. In programming you write a conditional and then choose which logic to perform depending on its outcome. In logic design you perform all the calculations and then use conditionals to choose which result you want.
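The two styles can be put side by side. Here is a sketch, with a hypothetical flag `negate`, of the same function written the software way (the conditional chooses which work to do) and the hardware way (all the work is done, then the conditional selects a result, like a 2-way multiplexer):

```python
def software_style(x, negate):
    # Programming: the conditional decides which branch of logic runs.
    if negate:
        return -x
    else:
        return x

def hardware_style(x, negate):
    # Logic design: both candidate results exist simultaneously...
    negated = -x
    unchanged = x
    # ...and the condition only selects between them, mux-style.
    return negated if negate else unchanged

print(software_style(5, True), hardware_style(5, True))  # -5 -5
```

Both return the same answers; the difference is *when* the work happens relative to the decision.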
21 March 2010
I'm no chip designer, but from what I understand, you're correct. Part of the reason for the discrepancy between programming in silicon and programming in bits has to do with the differences in cost (latency, parallelism,...) for various operations.
Way up at the level of a decent programming language, a single instruction (i.e., a whole line of source code) can take a long time to execute (e.g., connecting to the database, logging in, retrieving data...). Moreover, we can't execute two branches in parallel (at least not without even more overhead). So if we can evaluate a conditional expression and use that output to "filter the future" and only run the necessary branch, then we'll win in total running time.
But way down on the physical chip things are different. A single instruction still takes a long time (which is why we have pipelining, prefetching, etc), but it's rather cheap to execute multiple branches simultaneously. Conversely, "executing a conditional" means jumping somewhere else in memory, blowing your cache, bubbles in your pipelines, etc. So here, it's a lot cheaper to just do everything and then "filter the past" and only keep the relevant results for the next instruction.
This is absolutely the case with SIMD machines like vector processors and graphics cards, which will perform operations on other nearby data even when they know they'll throw *all* of it away. Modern ALUs and CPUs have been on a collision course with GPUs for a while now, so they're all much the same. Decades-old ALUs/CPUs actually used to behave more like programming languages because the costs were so different, but that was long before VLSI.
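The SIMD behavior described above can be sketched with plain Python lists (a stand-in for vector lanes; the function name and mask are made up for illustration): the operation runs on every lane, and a mask then keeps only the lanes we wanted, discarding the rest of the work after the fact.

```python
def masked_double(lanes, mask):
    # Step 1: do the work on ALL lanes, wanted or not -- the SIMD unit
    # applies the same operation across the whole vector.
    doubled = [x * 2 for x in lanes]
    # Step 2: "filter the past" -- keep the new value only where the
    # mask is set, and the old value elsewhere.
    return [new if keep else old
            for old, new, keep in zip(lanes, doubled, mask)]

print(masked_double([1, 2, 3, 4], [True, False, True, False]))  # [2, 2, 6, 4]
```

Lanes 2 and 4 were doubled too; their results were simply masked out afterward.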
Interestingly, another part of the difference has to do with purity. Down on the chip you're dealing with a pure "language" that has no essential side-effects. There are side-effects about latency, but that's it; nothing irrecoverable. Whereas in a programming language, there are all sorts of side effects which are really expensive (not just computer time, but human time and even money,...) or impossible to recover from (lives,...). It's hard to filter the past when the missiles have been launched.
25 March 2010
Yep, that's more or less correct. One of my favorite "programming" languages is Verilog, which is a hardware description language. (VHDL is another.) The program can be run in a simulator, which accounts for all the various traffic going on... Going back to a regular programming language after Verilog seems, well, a bit single-minded!