Monthly Archives: October 2006

Notes on LabVIEW timing

I am entering LabVIEW purgatory. When clocking data in or out, a trigger source is required. To implement acquisition across multiple cards, we can use the RTSI (Real-Time System Integration) inputs via a ribbon cable that connects those boards. To sync with external devices, we can use the PFI (programmable function input) lines. We can also route internal signals, or other clock sources, to some of these PFI ports. The M-series boards that I am using feature 80 MHz, 10 MHz, and 100 kHz timebases (/Dev#/80MHzTimebase, /Dev#/100kHzTimebase, etc.), which can be routed to a counter or to /Dev#/freqout. Freqout is subject to a 4-bit divisor and can be routed to any PFI line, so in this way we can synchronize digital output with digital input, or with something else on the same board. Included is an example VI (not my code) to do this routing.
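The divisor arithmetic is easy to sketch. Here is a minimal Python sketch of the freqout frequencies reachable from each timebase; I am assuming the 4-bit divisor runs from 1 to 16 (the exact divisor range, and the timebase names as dictionary keys, are my own shorthand — check the M-series manual for the real routing names):

```python
# Hedged sketch: which /Dev#/freqout frequencies an M-series board can
# produce by dividing one of its internal timebases by a 4-bit divisor.
# Assumption: the divisor spans 1..16; adjust if your manual says 0..15.

TIMEBASES_HZ = {
    "80MHzTimebase": 80_000_000,
    "10MHzTimebase": 10_000_000,
    "100kHzTimebase": 100_000,
}

def freqout_frequencies(timebase: str) -> list[float]:
    """List the output frequencies reachable by dividing one timebase."""
    base = TIMEBASES_HZ[timebase]
    return [base / divisor for divisor in range(1, 17)]

# e.g. the fast timebase can go no slower than 80 MHz / 16 = 5 MHz,
# which is why the 100 kHz timebase exists for slow clocking.
slowest_fast = freqout_frequencies("80MHzTimebase")[-1]
```

So if you need a clock slower than 5 MHz on a PFI line, you route one of the slower timebases into freqout instead.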

50USD ARM development + contest


For fifty dollars you get the EKK-LM3S811 ARM development board with a built-in OLED display and a JTAG port, so you can debug other targets provided you have the right software. The board supports a virtual UART over the USB interface, so the PC interface is easy. It also comes with a stripped-down version of the Keil suite for development. Finally, there is a design contest with a deadline of 7 February 2007. Seems like a lot of features for the price.

More Apple problems


Just got back from Atlanta where the Society for Neuroscience 2006 conference took place. I suppose it is time to get back to work then. Here are some pictures, but only the good ones.

A mere ten days after my last post about my distaste for Apple, there is more good news: some video iPods were shipped with a pre-installed Windows virus. Although Apple apologizes, they are “upset at Windows for not being more hardy against such viruses.”

IBM strikes again


It is well known that I do not hold Apple Inc. in very high regard. Ever since the switch to Intel, I have not withheld my view that the switch was made not for performance reasons but for economic ones. The claim that IBM was unable to provide powerful enough chips was just a front. Steve Jobs realized that people who bought Apple were no longer buying the computers for their performance; they were buying them for the aesthetics. Apple decided to hop on the Intel bandwagon to save money. Macs were marketed as easy enough for novices to use, the same novices who wouldn’t care about PowerPC vs. x86. Apple has even branched out into the “enterprise” sector lately with sub-par hardware. Our 3.5 TB RAID system crashed this week, two months after installation, due to a faulty Deskstar drive. One drive crashed and trashed the rest of the RAID 5 array, corrupting it beyond the repair capabilities of the operating system tools. The Apple premium support gurus told me to “try DiskWarrior, a 3rd-party product” to rescue the sinking ship. As you can tell, I am not a fan.

Meanwhile, IBM is undeterred by its breakup with Apple and continues to forge ahead. They designed the Xbox 360 Xenon CPU (3-way PowerPC), the PS3 Cell processor (PowerPC with eight symmetric processing engines), and the Nintendo Broadway (PowerPC-based, single core). The PS3 and the Xbox 360, which has already sold ten million units, give the latest Core 2 Duo chips a run for their money in certain fields. The Broadway, the lowest-end chip of the bunch, was offered to Apple as a high-performance laptop chip — just to put things in perspective. Today EETimes posted an article saying that IBM is in the debug phase of its new POWER6 architecture: a dual-core 64-bit system with a hard-wired control unit running at a native 4-5 GHz clock speed. Not to mention that IBM has been shipping dual-core technology since the POWER4 series; compare that to the latest microcode-based Intel chips topping out at 3 GHz for the dual-core units. I guess this is enough for this whiny rant, but then again, everyone has an opinion.

Miller effect revisited one more time


Every six months or so, I encounter the Miller effect and can never remember what it is. Every six months or so, I open my copy of Sedra and Smith to look up the Miller effect, only to forget it over the following six months; hopefully this short write-up will help store it in long-term memory. The idea is this: the differential voltage (Vo-Vi) across the impedance Z depends on the input voltage and the gain, so for the same change in voltage (per change in time) at the input, different gains can create almost any voltage at the output. This dependence on the gain lowers the effective impedance as the gain goes up for the same input signal. Most amplifier stages have some sort of capacitance, and the Miller effect increases the effective capacitance by a factor of 1-G. How much of this effect is seen in real amplifiers depends on the configuration, especially on the feedback network and the resulting output impedance. In resonant circuits the Miller effect, since it depends linearly on the circuit gain, can sometimes be exploited to get away with using smaller capacitors. On the other hand, filter topologies such as Sallen-Key can be used because their pass-band gain is unity, so they do not exhibit the Miller effect.
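As a back-of-the-envelope check, the capacitance multiplication is easy to compute directly. A minimal Python sketch (the function name and the example values are my own, not from any particular text):

```python
def miller_input_capacitance(c_farads: float, gain: float) -> float:
    """Effective input capacitance of a capacitor C bridging a stage
    with voltage gain G: C_in = C * (1 - G).  For an inverting
    amplifier G is negative, so the capacitance is multiplied up."""
    return c_farads * (1.0 - gain)

# A 2 pF feedback capacitance across a gain of -100 looks like
# 2 pF * (1 - (-100)) = 202 pF at the input -- this is why
# high-gain stages lose bandwidth at their input node.
c_in = miller_input_capacitance(2e-12, -100.0)
```

Note that for a unity-gain stage (G = 1) the factor is zero, which is exactly the Sallen-Key observation above.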

On a side note, there is also the Miller theorem, which allows one to break Z apart into Zi and Zo, which are connected from Vi and Vo respectively to ground. This is the method that is often employed when calculating the input impedance of an amplifier, usually at open loop gain.
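The split itself is mechanical. A hedged sketch of the Miller theorem as a function (the standard textbook formulas, with my own names; K must not be 0 or 1):

```python
def miller_split(z: complex, gain: complex) -> tuple[complex, complex]:
    """Split a bridging impedance Z (connected from Vi to Vo) into two
    grounded impedances via the Miller theorem:
        Zi = Z / (1 - K)      seen from the input node to ground
        Zo = Z / (1 - 1/K)    seen from the output node to ground
    where K = Vo/Vi is the stage voltage gain."""
    zi = z / (1 - gain)
    zo = z / (1 - 1 / gain)
    return zi, zo

# 10 kOhm bridging an inverting stage with K = -9:
# Zi = 10k / (1 + 9) = 1 kOhm, Zo = 10k / (1 + 1/9) = 9 kOhm
zi, zo = miller_split(10_000, -9)
```

The Zi term is the one that matters for input impedance at open-loop gain, since large |K| makes Zi very small.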

LASI IC designer


Today I ran across LASI (LAyout System for Individuals), a free generic IC design tool that will generate masks that can be sent out to a fab. With this tool and MOSIS, it is theoretically possible to design and implement an IC at a very low cost — if you are in academia, that is. Included is a basic tutorial (from U. Colorado) that should help explain some of the cryptic features of the software.

( lasi_inst.pdf )

How to count like Feynman

Richard P. Feynman was once noted for being able to out-perform a mechanical calculator salesman in computation by employing logarithmic arithmetic. The first step to doing this effectively is to recognize that a logarithm is a measure of magnitude. With this in mind, we can recognize that multiplication and division of numbers correspond to addition and subtraction in logarithm space. The next step is figuring out how to find logarithms without having to carry around logarithm tables. The method that I find useful involves memorizing the logarithms (I chose base 10) of the first ten or so primes, and then factoring the number that I need to operate on into primes in my head.

log(9) = log(3^2) = 2*log(3)

log(0.125) = log(1/8) = log(1/(2^3)) = log(1) - 3*log(2) = -3*log(2)

Once the number is represented as a power of the base, standard logarithmic arithmetic applies and most mathematical operations become pretty easy to do. Finally, converting the result into something useful is a matter of breaking up the result into known inverse logarithms and adding them up at the end. Give this a try the next time someone tries to sell you a mechanical calculator — as long as it’s not a Curta.
( exp_and_log_rules.pdf )