SN 1007 Efficiency


Badrod

Steve's comments regarding the inefficiencies of the current crop of AIs (the last two paragraphs of the show notes) reminded me of a talk I watched last year by Andrea Liu, a professor at Penn, who presented her research several times early last year. This is her talk at Stanford: Physical systems that can learn by themselves. There are other versions on YouTube.

Basically, as far as I can follow, Liu is suggesting that an analog circuit can be designed to perform the functions of a neural network (image recognition, for example) much more efficiently. Not at the energy efficiency of a human brain, but far better than what current AI systems are consuming.

The electrical engineering content begins at around the 34-36 minute mark of the linked video. I know how Kirchhoff's Laws are used to analyze circuits, but I'm getting lost as to how Liu is applying them to decision making 🤯 ... something to do with the path of least resistance, I gather.
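If I had to guess at the "path of least resistance" part, it may be something like Thomson's principle: in a resistor network, Kirchhoff's laws yield exactly the current distribution that minimizes total dissipated power. Here is a toy sketch of that idea in Python (my own illustration with made-up values, not anything from Liu's talk):

```python
# Toy illustration (not Liu's method): in a resistor network, Kirchhoff's
# laws pick the current split that minimizes total dissipated power
# (Thomson's principle). Two resistors in parallel share a fixed current.
import numpy as np

R1, R2 = 100.0, 400.0    # ohms (made-up values)
I_total = 1.0            # amps forced through the parallel pair

# Kirchhoff/Ohm answer: the current divider rule.
i1_kirchhoff = I_total * R2 / (R1 + R2)

# "Optimization" answer: sweep every split and find the minimum-power point.
i1_candidates = np.linspace(0.0, I_total, 100_001)
power = i1_candidates**2 * R1 + (I_total - i1_candidates)**2 * R2
i1_min_power = i1_candidates[np.argmin(power)]

print(f"Kirchhoff current divider: i1 = {i1_kirchhoff:.4f} A")  # 0.8000 A
print(f"Minimum-power split:       i1 = {i1_min_power:.4f} A")  # 0.8000 A
```

The network doesn't iterate toward that minimum; the physics lands on it essentially instantly, which I assume is where the energy savings would come from.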

Sorry if you've read this far. I guess this is why I didn't share this talk before now. Based on the paltry number of views her talks are getting on YouTube, I'd conclude that either nobody else understands her or she's talking nonsense ... but she has given this talk to audiences at Aspen Physics, Stanford, Berkeley, the Isaac Newton Institute for Mathematical Sciences at Cambridge ... and on and on ... and as far as I can tell, she's not being laughed off the stage.
 
Reactions: floatingbones
Badrod said:
Basically, as far as I can follow, Liu is suggesting that an analog circuit can be designed to perform the functions of a neural network (image recognition, for example) much more efficiently. Not at the energy efficiency of a human brain, but far better than what current AI systems are consuming.
In a way that is not surprising. Anything designed for one specific purpose is likely to be more efficient than a generalised tool doing the same task.
 
Reactions: Badrod
Liu's commentary about energy usage in real vs. digital networks reminded me of my primary gripe with the 2011 demonstration of IBM's Watson on Jeopardy! Do you, Watson, think you can beat champions on Jeopardy? Fine. Do it with the same amount of energy. As far as I remember, energy usage was never discussed in the glorified IBM infomercial posing as a game of Jeopardy! The Wikipedia article notes that Watson was stuffed to the gills with RAM, and that the RAM was filled with Jeopardy! answers, but joules are never mentioned.

@Badrod , are you familiar with analog computers? In the 1950s and 1960s, analog circuits were built to solve differential equations. Initial conditions were supplied with a reference voltage; the "solution" was read with a voltmeter. Equations that lacked a solution would settle into "hunting" (formally known as a limit cycle oscillation). ChatGPT tells me those non-solutions were sometimes labeled "as a 'dead zone', 'stuck state', or simply 'failure to converge.'" Setting up the proper RLC circuit for a particular differential equation probably took over an hour. OTOH, the physics computed the solution very quickly.
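For flavor, here's my own toy re-enactment in Python of the kind of equation one of those RLC setups embodied (made-up component values, not a historical program): a series RLC circuit with a charged capacitor "computes" a damped oscillation.

```python
# Digital re-enactment of what an RLC analog computer did physically:
# a series RLC circuit with an initial capacitor voltage obeys
#     L*C*v'' + R*C*v' + v = 0
# and the "answer" is the decaying voltage read off a voltmeter.

R, L, C = 50.0, 1e-3, 1e-6   # ohms, henries, farads (made-up values)
v, dv = 5.0, 0.0             # initial condition: 5 V on the capacitor
dt, steps = 1e-7, 2000       # time step (seconds) and step count

trace = []
for _ in range(steps):
    ddv = -(R / L) * dv - v / (L * C)  # rearranged from L*C*v'' + R*C*v' + v = 0
    dv += ddv * dt                     # simple semi-implicit Euler step
    v += dv * dt
    trace.append(v)

print(f"voltage after {steps * dt * 1e6:.0f} us: {trace[-1]:+.4f} V")
# R < 2*sqrt(L/C) (about 63 ohms here), so the circuit is underdamped:
# plot `trace` to see the ringing decay the voltmeter would have shown.
```

On the bench, the equivalent took patch cords and that hour of setup; once wired, the "computation" finished at the speed of the physics.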

Analog computers were around in the 1960s and 1970s; they were replaced by tools like MACSYMA that solved equations digitally, on a massive (expensive) computer like a KL-10, a big-a.. PDP-10. Today, you can run Wolfram Research's Mathematica on a Raspberry Pi, far more capable than MACSYMA, for free. MIT-MC (MACSYMA Central in the late 1970s) had 256K words of 36-bit memory, the equivalent of about 1.2MB. Today, you can fire up an RPi5 with 16GB of memory and 2TB of NVMe storage. Times have changed. I don't think Steve's KL-10 simulator could possibly run the MACSYMA of the 1970s.
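For contrast, here's a sketch of the digital-symbolic route that displaced the patch cords, using SymPy as a free, modern stand-in for MACSYMA (same made-up series-RLC equation as above):

```python
# Solving the same family of equations symbolically, MACSYMA-style,
# using SymPy as a modern stand-in.
import sympy as sp

t = sp.symbols('t')
v = sp.Function('v')
R, L, C = sp.symbols('R L C', positive=True)

# Series RLC: L*C*v'' + R*C*v' + v = 0
ode = sp.Eq(L * C * v(t).diff(t, 2) + R * C * v(t).diff(t) + v(t), 0)
print(sp.dsolve(ode, v(t)))
# Prints the closed-form solution with arbitrary constants C1 and C2 --
# the thing the analog machine could only show you as a wiggle on a meter.
```

This runs fine on a Raspberry Pi, which rather makes the point about how times have changed.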

Anyone interested in re-living the golden age of analog computing can get The Analog Thing, a marvelous Maker-designed breadboard for hacking analog circuits:

[Attached image: Analog-Thing.jpg]

Steve could make a really spiffy Portable Dog Tamer with this thing!

I regret that MIT is no longer teaching the physics of analog circuits to its "EECS" undergrads. You can't really understand the world unless you know something about oscillating systems and impedance. It turns out that impedance is quite useful for understanding biomechanics. My EE professors left that as an exercise for the reader; I finally started figuring it out.
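In case anyone wants to see it concretely, the electrical-to-mechanical mapping is just a renaming: R becomes damping, L becomes mass, and 1/C becomes stiffness. A quick Python sketch with made-up numbers:

```python
# The series-RLC impedance and the mass-spring-damper mechanical impedance
# (force over velocity) have the same form; only the symbols change:
# R <-> damping b, L <-> mass m, 1/C <-> stiffness k.
import math

def z_electrical(w, R, L, C):
    """Series RLC impedance (ohms) at angular frequency w (rad/s)."""
    return complex(R, w * L - 1.0 / (w * C))

def z_mechanical(w, b, m, k):
    """Mass-spring-damper impedance F/v at angular frequency w (rad/s)."""
    return complex(b, w * m - k / w)

R, L, C = 10.0, 0.5, 2e-4    # ohms, henries, farads (made-up values)
b, m, k = 10.0, 0.5, 5000.0  # N*s/m, kg, N/m (the "renamed" system)

w0_elec = 1.0 / math.sqrt(L * C)  # electrical resonance: 100 rad/s
w0_mech = math.sqrt(k / m)        # mechanical resonance: 100 rad/s

# At resonance the reactive parts cancel and only the damping term remains.
print(abs(z_electrical(w0_elec, R, L, C)))  # 10.0
print(abs(z_mechanical(w0_mech, b, m, k)))  # 10.0
```

Same math, two worlds, which is why the circuit intuition transfers to biomechanics at all.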

Interesting video, @Badrod . I wish the researcher had been a bit more explicit about KCL or KVL (Kirchhoff's current and voltage laws) in her presentation.
 

Reactions: Badrod
are you familiar with analog computers?
Vaguely. Years ago I listened to an episode of Omega Tau (episode 159) where Markus visited an analog computer museum located in some guy's house. A bit of reading this evening reminded me of something Feynman said about using electronics to model something like a car suspension system (I think that was his example). ... But that explains her slide describing curve fitting.

She (Liu) did mention analog computing in one of her talks, but I'd gotten the impression she was talking about something that would eventually become a ChatGPT. Now that you've got me on the right track, a bit more searching has turned up lots of material on the topic from Veritasium, Microsoft, etc. So, anyway, it's more of a task-specific machine, and the efficiencies would come in when somebody asks the AI to, say, model a bouncing ball on Saturn: instead of throwing a bunch of power-hungry GPUs at the problem, the AI would hand the calculation off to the analog computer.

Oh, and yeah, The Analog Thing is super tempting!
 
Reactions: floatingbones