Technica Cornucopia

Musk's new xAI build-out (the Colossus data center in Memphis) - 100,000 GPUs. I'm impressed by a lot of things in this video, but mostly the awesome modular build. A lot of the credit seems to go to Supermicro, and the video does feel like a commercial for them, but a well-deserved one. Look at the size of the cooling water pipes running through this place!

Side note: Nvidia is now worth more than the stock markets of every country in the world, other than India, Japan, China, and the U.S. But it's just getting started - watch this space.

 
^^ lol, all that anal design and build. Then this... (spelling lol)
[attached image - the misspelling in question]
 
wrt anything awesome by Supermicro: they may be all done forever due to apparent financial malfeasance :oops: Suppliers (including Nvidia) are reallocating components that were destined for SMCI, basically cutting them off. Too bad - about half my builds used their motherboards, and the rest used ASUS, including my current beast, the workstation it replaced, and my wife's system...

https://www.barrons.com/articles/super-micro-stock-delisting-deadline-7b51a727

Cheers!
 
Makes me wonder where the qualfications are defined and valdated. Do you think that someone who meets the qualfications is then certfied. :confused:
That typically means "by the manuf only". Maintenance contracts are often worth more than the original equipment sale. (LOL did you intentionally leave out the 'i' in qualifications and validated and certified? )
 
The video is awesome, but I wonder how long that technology and topology will be adequate.
How soon will it become the bottleneck in these entities' ability to think? Will it be as soon as they achieve true inference engine status? Or maybe at that point the entity will tell us what it needs.
 
... I went to college for a BSEE in the late 60s and started designing IBM mainframe memory systems in the early 70s, DEC in the 80s and 90s, and Stratus in the 2000s. I've seen my fair share across many system architectures...
So if you were with DEC in the '80s and '90s, did you miss the chance to work on MCA?
That always intrigued me. It seemed to have a lot of upside performance potential.
 
The "MCA" packaging was for Aquarius, the only water-cooled complex Digital ever sold. It was developed by the Marlboro "high end" engineering group, honchoed by my neighbor Bob Glorioso, while I was working for the Maynard "mid range" group doing the memory subsystem for an air-cooled Alpha RISC platform we codenamed Crystal, in concert with the Dave Cutler team out in the Greater Seattle Area (Bellevue, specifically) who called it Emerald. I logged a ton of air miles on that effort before it got killed by the Aquarius release to market, which was how things were done at DEC. Another competing platform called Argonaut out of the so-called "workstation" group in Littleton MA also fell to Aquarius.

Fun times. I was with DEC for 18 years before I retired the first time, and enjoyed it a lot. It was never boring, that's for sure!

As for the MCA and water cooling: it was an epic PITA, and IIRC, DEC had to replace every Aquarius sold with an air-cooled follow-on, Aridus.

Cheers!
 
Exaflop, lol. That's a quintillion floating point operations per second. Well, I didn't know what that was either, so of course I looked it up.
My first PC ran at 4.77 MHz - 4,770,000 clock cycles per second. Not sure if it had an FPU, but I doubt it. See image below. Apparently this predated surface-mount components. Gasp! DIPs, lol.

My next machine ran at a screaming 16 MHz. That one did play some games very well. I'm pretty sure it did not perform single-clock floating point operations. I think there was a '387 coprocessor that you could plug in to get that (from memory, and there were bugs).

Not going through all the machines, but that brings us to today. I have a pretty badass pc that I built myself for more money than I'd like to admit here.

I'll just list the clock speed and not the FLOPS, as I have no idea which of these could perform a floating point operation in one processor clock cycle:

Code:
PCjr   =                  4,770,000   (clocks/sec)
386    =                 16,000,000   (clocks/sec)
Today  =              3,000,000,000   (clocks/sec)
El Cap = 1,000,000,000,000,000,000   (FLOPS)
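Just for scale, here's what those numbers divide out to (a quick sketch in C; comparing a clock rate to FLOPS is apples-to-oranges, as noted above, but it's fun anyway):

Code:
#include <stdio.h>

int main(void)
{
    /* Numbers from the table above: clock rates in cycles/sec,
     * El Cap's peak in FLOPS - not directly comparable, just for scale. */
    double pcjr  = 4.77e6;
    double today = 3.0e9;
    double elcap = 1.0e18;

    printf("Today vs PCjr : %.0fx the clock rate\n", today / pcjr);
    printf("El Cap vs PCjr: %.2e (pretending 1 FLOP per PCjr clock)\n",
           elcap / pcjr);
    return 0;
}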

[attached image - the first PC]
 
At my first job out of college over 20 years ago, I worked for a well-known FPGA vendor. We had a soft-core embedded processor IP block for the FPGAs, and the team I was in had a competition to develop the best thing to build around it to showcase the processor. Well, the capability existed to create "custom instructions" in the instruction set, since you could use the rest of the FPGA to be whatever you needed... So we built a floating point arithmetic unit that interfaced via the custom instructions, as the processor didn't have its own built-in floating point unit.

I think the typical timing we'd find using software emulation to perform floating point operations was somewhere in the realm of 1700 CPU cycles. With our unit, we got that down to about 15 IIRC... And of those 15, I think it was only 3 or 4 to actually perform the calculation, the rest was the overhead of passing the command and variables to and from the instruction.

I ended up writing the test program for it. Which was all well and good until we kept getting wrong answers... And I realized that the C compiler wasn't passing variables correctly to the custom instruction. So I had to hand-code that little bit in assembly :oops:
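For anyone wondering what a "custom instruction" looks like from the C side, here's a rough sketch. The function name and cycle counts in the comments are stand-ins (the real intrinsic macro is generated by the vendor's toolchain), but the shape is right: operands get handed to the fabric, and the result comes back like any other instruction:

Code:
#include <stdio.h>

/* Hypothetical stand-in for a vendor-generated custom-instruction
 * intrinsic. On the real soft core this would compile down to a single
 * opcode that hands a and b to the FP unit built in the FPGA fabric
 * (~3-4 cycles for the math, ~15 with operand-passing overhead, vs
 * ~1700 for software emulation, per the post above). Here it's a plain
 * software stub so the sketch compiles anywhere. */
static float custom_fmul(float a, float b)
{
    return a * b;
}

int main(void)
{
    printf("2.5 * 4.0 = %f\n", custom_fmul(2.5f, 4.0f));
    return 0;
}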
 

Ouch, that brings back painful memories of rows of little bleeding dots on my thumbs from memory DIPs flipping over while putting pressure on them to load up system boards and expansion cards.
The DIPs came with the pins slightly spread so you had to get one side set in the socket then push a little sideways to align the opposite side before pushing straight down to seat them.
I worked at a computer store and we had a bunch of corporate research centers as customers. They always wanted the full memory load.

I bought my first PC as an employee so I got a nice discount, but still paid what even today is an obscene amount for my IBM Model 80 @ 20 MHz with a 115 MB hard disk.
But no DOS for me, I bought OS/2 to go with it. Version 1.0. Man, I was really cooking with gas then.

After living the joyous life of each OS/2 upgrade up thru Warp 3.0, I made the decision to cut my losses (I don't give up easy).

When I tried to install Warp 3, the system happily began partitioning and copying from the CD to the hard drive, asked me all the config questions, and finally prepared to reboot and finish configuration. This happened in a lightning-fast 90 minutes or so.
The machine ejected the CD, did a reset, and failed to boot because it couldn't find the CD-ROM drive that it just finished loading from. Repeated attempts got the same result. At least they were consistent!
I was an IBM System Support Rep, so I had some inside connections, but even they could not figure it out.

The bloom was off the rose.
 
So we built a floating point arithmetic unit that interfaced via the custom instructions... I had to hand-code that little bit in assembly :oops:
I have a similar story of hand-coding a 64-bit multiply routine in assembly on some ARM processor. It was some embedded video application I was working on (as a consultant) in the '90s. I don't recall exactly why I did it, but either the compiler wouldn't do it at all, or did it wrong, or did it slow.

I'm pretty sure I was smarter back then...
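For the curious, this is roughly what such a routine has to do - a sketch of the usual decomposition into 32-bit pieces (on a 32-bit ARM the partial products map onto UMULL/MLA instructions); the test values are made up:

Code:
#include <stdint.h>
#include <stdio.h>

/* 64x64 -> 64-bit multiply built from 32-bit halves - the same
 * decomposition you'd hand-code in assembly when the compiler's
 * 64-bit support is missing or broken. The high*high partial
 * product shifts entirely out of the low 64 bits, so it's dropped. */
static uint64_t mul64(uint64_t a, uint64_t b)
{
    uint32_t a_lo = (uint32_t)a, a_hi = (uint32_t)(a >> 32);
    uint32_t b_lo = (uint32_t)b, b_hi = (uint32_t)(b >> 32);

    uint64_t lo  = (uint64_t)a_lo * b_lo;
    uint64_t mid = (uint64_t)a_hi * b_lo + (uint64_t)a_lo * b_hi;
    return lo + (mid << 32);
}

int main(void)
{
    uint64_t a = 123456789ULL, b = 987654321ULL;
    printf("mul64: %llu, native: %llu\n",
           (unsigned long long)mul64(a, b),
           (unsigned long long)(a * b));
    return 0;
}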
 
But no DOS for me, I bought OS/2 to go with it... After living the joyous life of each OS/2 upgrade up thru Warp 3.0, I made the decision to cut my losses (I don't give up easy).
LOL. I cut my teeth on DOS. At one point I set up a complete batch file menu system for my dad so that all he had to do was type a number and hit enter for whatever program he needed to run.

But then most of my teenage years I spent pissing him off, because each time he learned how to use an OS, I'd change. Oh, Dad, you know how to use Windows 3.11? Let's see if you can figure out OS/2 2.1! Got that down! Let's try Windows 95! Oh, you've figured it out? Time for OS/2 Warp.

I think he was glad I left for college 😂

I have a similar story of hand-coding a 64-bit multiply routine in assembly on some ARM processor... I'm pretty sure I was smarter back then...

Yeah, it's one of those things in life. Is it helpful to learn assembly in case you need it? Yes. Do you EVER want to actually use it? No.
 
LOL. I cut my teeth on DOS... each time he learned how to use an OS, I'd change.

Yeah, it's one of those things in life. Is it helpful to learn assembly in case you need it? Yes. Do you EVER want to actually use it? No.
I thought I was a hot-shoe with DOS intrinsic commands and edlin. I could do about anything I needed from a command line. But 20 years later I worked with a guy that I swear could write user apps with REXX.

I actually liked assembler.
In college I did one half of my "Systems Project" in System 370 DOS/VSE assembler. I wrote the inventory management, maintenance, and economic order quantity routines in assembler (quick EOQ refresher below). The other half was COBOL for the operations side with ordering, pick and ship, and invoicing.

My partner wrote in COBOL and RPG II for reports and other maintenance tasks.

I lost a few points for my choice to use "Shure-Kill Cat Traps" as a sample inventory item. Professor was a cat lover (I knew that). We still aced the project.
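Re those economic order quantity routines: for anyone who's forgotten their operations-management class, the classic formula is EOQ = sqrt(2DS/H). A quick sketch with made-up sample numbers (not from the project):

Code:
#include <math.h>
#include <stdio.h>

/* Classic economic order quantity: EOQ = sqrt(2*D*S / H).
 * D, S, H below are hypothetical sample values. */
int main(void)
{
    double D = 1200.0;  /* annual demand, units */
    double S = 50.0;    /* fixed cost per order */
    double H = 2.0;     /* holding cost per unit per year */

    printf("EOQ = %.0f units per order\n", sqrt(2.0 * D * S / H));
    return 0;
}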
 
This...is very bad.

For the first time ever researchers crack RSA and AES data encryption


"A team of Chinese researchers, led by Wang Chao from Shanghai University, has demonstrated that D-Wave’s quantum annealing computers can crack encryption methods that safeguard sensitive global data."
Really makes me want to put my data in the cloud, or anywhere else connected to the outside world.
I already get several "Settlement Notifications" every year, and they seem to be increasing.
Each one offers a different monitoring service that requires you to put all your critical data into their database!
No Thanks !
 
Does anyone else have a copy of the MS-DOS bible, The MS-DOS Encyclopedia? 1988, Microsoft Press.
I'm talking about the one with the historical account of the birth of MS-DOS in the secret room.
 
In my early teens, I was ripping up Starcraft on Windows 98 - offline of course. Rural dial-up was more miss than hit.

Now 25 years later, I'm at work updating our equipment maintenance...on Windows 98. At least our key infrastructure communication software runs on XP.
 
This...is very bad.

For the first time ever researchers crack RSA and AES data encryption
I sent that over to my colleague who is paid [very handsomely] to work on this sort of stuff, and he said it's not significant at all. Academic at this point.

Even as an outsider, nothing in there says that they've been able to break AES-256. It seems to suggest that techniques are getting closer to being able to do so, but not that they're there. More that the "many decades from now" assumption may be shorter than originally thought.

So while I wouldn't say that there's anything good about this, I hesitate to accept that this is "very bad" - at least not if "very bad" means that our current encryption capabilities are woefully inadequate and have already been breached.

Edit: I just looked up said colleague on LinkedIn, and he has a PhD in Electrical & Computer Engineering. So this is a guy with some credibility lol. And dammit, does that mean now I have to start calling him Dr.?
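For a sense of scale on that "decades away" point: the textbook quantum attack on a symmetric cipher like AES is Grover's algorithm, which only halves the effective key length rather than breaking the cipher (the D-Wave work in the article is a different, annealing-based approach). A toy illustration:

Code:
#include <stdio.h>

/* Grover's algorithm cuts brute-forcing 2^n keys to ~2^(n/2) quantum
 * operations - halving the effective key length, not breaking the
 * cipher. (The annealing attack in the article works differently.) */
int main(void)
{
    int sizes[] = { 128, 256 };
    for (int i = 0; i < 2; i++)
        printf("AES-%d: ~2^%d classical, ~2^%d under Grover\n",
               sizes[i], sizes[i], sizes[i] / 2);
    return 0;
}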
 
This...is very bad.

For the first time ever researchers crack RSA and AES data encryption
What a coincidence. I'm designing something right now for a client, and we literally had a convo about it an hour ago. He insists on AES-128, and I told him a lesser authentication was fine and AES wouldn't stand forever.
 
I sent that over to my colleague who is paid [very handsomely] to work on this sort of stuff, and he said it's not significant at all. Academic at this point.
Read the Wikipedia article on MIFARE Classic, which is used on tons of fare cards and other apps in Europe. Hack devices can now clone one in 40 ms, remotely, lol. I'm no expert on this and don't want to be, but it's one of my jobs right now to bone up on it.
 
I sent that over to my colleague who is paid [very handsomely] to work on this sort of stuff, and he said it's not significant at all. Academic at this point.
It is inevitable....
 
Read the Wikipedia article on MIFARE Classic... it's one of my jobs right now to bone up on it.
Yeah, my job is to try to be smart enough to understand what the PhD says JUST enough to know how to translate it into what the customer will understand 😂
 
Surround audio from my PC: OMG, you'd think I asked for something special.

My mobo has an optical output. Realtek audio hardware. I have a Logitech Z906 5.1 speaker system plugged into the optical output. That system has DTS and Dolby Digital decoders. After a couple of weeks, I gave up. It's gonna have to be simulated surround here. OMG.

So, the "standard" for optical output on a pc's optical output is 2-channel. For reasons i don't know, there needs to be some compression to get more channels than that. Dolby Labs stepped up to the plate and offered... a bazillion different audio standards - dolby pro logic, dolby digital, truehd, atmost, dolby digital +, probably more. I guess hardware/software vendors have to pay licensing fees to decode these.

I believe the Z906 hardware should be able to decode. It has all the dolby trademarks splattered on it's enclosure.
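The "why" is basically link bandwidth: consumer optical S/PDIF was sized for stereo PCM, so six uncompressed channels don't fit, while a compressed Dolby Digital bitstream does. Rough payload math, assuming 48 kHz / 24-bit audio:

Code:
#include <stdio.h>

/* Why 5.1 over optical S/PDIF needs compression: the link was sized
 * for 2-channel PCM. Payload math, assuming 48 kHz / 24-bit samples. */
int main(void)
{
    double stereo_pcm   = 2 * 48000.0 * 24;  /* fits on the link */
    double six_ch_pcm   = 6 * 48000.0 * 24;  /* doesn't fit */
    double dd_bitstream = 640000.0;          /* AC-3 5.1 max rate: fits */

    printf("stereo PCM : %.2f Mbit/s\n", stereo_pcm / 1e6);
    printf("5.1 PCM    : %.2f Mbit/s\n", six_ch_pcm / 1e6);
    printf("5.1 AC-3   : %.2f Mbit/s\n", dd_bitstream / 1e6);
    return 0;
}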
 
LOL. I cut my teeth on DOS... Let's try Windows 95! Oh, you've figured it out? Time for OS/2 Warp.
We started with CP/M. I was worried about the crap Microsoft was doing at the time, so I was excited about Warp, which they killed with FUD. At some point I supported an OS/2 system for dialup access, but by then Warp was but a memory.

I had co-workers wait in line for '95. I didn't care at that point, but had to use and support it.
 
Surround audio from my PC: OMG, you'd think I asked for something special.

I run an older Logitech 5.1 system via "direct connection" to my Z790 motherboard with a Realtek chip and get all six channels, with DTS and Dolby support provided by Logitech, at 96 kHz. I haven't tried using the optical connection instead - but are you saying you can't get more than just right/left out of your optical link?

Cheers!
 
Ok, I learned something today: my Realtek audio doesn't do more than 2 channels over its optical link either! Hah!

I built this system last November and simply moved all of the audio cables - optical, digital, etc - from the old beast to the new one without giving it a lot of thought when it just worked. But I drilled a dry hole this evening just trying to get the host to even try to send 5.1 to the Logitech head unit over the optical link, so I'm guessing there's some DRM thing going on, which wouldn't surprise me.

NBD, I get 5.1 over the 6-channel copper connections, which is fine with me.
But I wonder if it's just a matter of installing enabling software (payware, of course) - something like the Dolby Digital Live or DTS Connect real-time encoders...

Asked Google about all this. SMH, been through this a long time ago, and my payware DVD software does work over optical.
https://www.tenforums.com/sound-aud...out-realtek-optical-card-other-than-test.html

Cheers!
 