Windows 11

I liked Notepad back in the 80s, then Notepad++ a little after the turn of the century. But I came across UltraEdit in the late 90s and never looked back. It's a context-aware editor with built-in lint-like capabilities across numerous programming languages, and the 64-bit version can open prodigiously large files...

Cheers!
 
Well... as a hardware design engineer for over four decades, who has had DEC workstations at home since the early '80s (before PCs were ever a thing) and knows what dealing with a crashed drive can be like, I have always verified that my recovery strategies were/are viable. Even when I had VAXstations with their cartridge tape backups that took for fricken ever to write and then verify, I made the investment in assurance.

These days, one cool thing about cloning modern NVMe SSD boot drives is that even a 2TB drive will clone and verify in around 25 minutes (I use Macrium Reflect), and it can be done while still using the system normally. It's so fast and easy that any time I see a big Windows Update I do a clone first, just in case...

Cheers!
PCs were around in the early '80s. However, they ran CP/M, not DOS. I read Byte magazine and window-shopped until I left poverty.
 
I had a KayPro with twin diskette drives that ran CP/M, and I developed an emulator for the IBM 3082 (system power controller for 308X processor complexes) on it to train field engineers. But I didn't/don't consider it a "PC" in the "for the masses" context as it did not have a GUI...

Cheers!
 
DOS-based machines were certainly not for the masses, but they were PCs. Windows 3 changed everything. And of course, AOL (which I originally used in its DOS form LOL).
 
Blech on AOL. I'd started on BBSs, and was able to get real internet access after that.
CompuServe, or executive, or another dial-up? I tried all sorts of those dial-ups: GEnie, one that started with P that eludes me. AOL had a lot of content though, magazines etc. Very nice. Eventually you could tunnel through it to the interwebs, and that was the end of AOL for me.
 
.edu access, then I started working for an ISP competing with AOL. I think I might have a skewed view of AOL.

My dad had Compuserve. I always felt like AOL was the dumb parts of the internet.
 
I always felt like AOL was the dumb parts of the internet.

Everybody thought that :D

The only upside for us from all of those early on-line services inundating us with their CDs was that I collected over a hundred of them in just a few years. Ended up stringing them on the suspension wire for the Spousal Unit's hanging bird feeder collection to confound the squirrels - which it did amazingly well, causing the furry bastids to try chewing them off the wire (which did not work for them)...

Cheers!
 

Go home everyone, trippr won the internet today.
 
Late June 2024:
Also late June 2024:

[attached image]
 

Microsoft’s Copilot AI told a user that 'maybe you don’t have anything to live for'

https://qz.com/microsoft-ai-copilot-chatbot-suicide-joker-1851306048

There is no "I".

It does not "think".

It's currently (very resource intensive) statistical word generation.

Some people I know who use ChatGPT will include instructions in the prompts to not anthropomorphize (avoid treating itself as human). It makes the statistical word generation easier to work with. On occasion, it will anthropomorphize anyway.
 
It's currently (very resource intensive) statistical word generation.
The human brain is also just the sum of a bunch of threshold functions, trained by gradient descent to be a statistical engine. (Barring some metaphysical soul.)

We're much closer to "thinking" machines than I ever expected to see. The level of emergence to date is staggering.
 
just the sum of a bunch of threshold functions, trained by gradient descent to be a statistical engine. (Barring some metaphysical soul.)
We have long been enamored of the analogy between thinking and so-called thinking machines. But mechanistic explanations of human consciousness feel partial, and inadequate.

I'm not referring to a metaphysical soul - at least, I don't think so. Just our limited self-understanding.
 
Neurons are able to periodically fire. Brain neurons basically watch a bunch of neighbors, and when the neighbor firing adds up to a certain threshold, the neuron fires. That output then goes into the neighboring neurons' inputs. The inputs can be weighted, and can be negative. The workings of this have been fairly well understood for ages.
edit: And this is exactly how artificial neural networks work.
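To show how little machinery that takes, here's a rough sketch of one such threshold unit (the weights and threshold are made-up numbers purely for illustration, not from any real model):

Code:
# One artificial "neuron": sum the weighted inputs from neighbors and
# fire (output 1) only if the sum reaches the threshold.
def neuron_fires(inputs, weights, threshold):
    # inputs: firing states (0 or 1) of neighboring neurons
    # weights: per-input weights; negative weights are inhibitory
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two excitatory neighbors plus one inhibitory neighbor:
print(neuron_fires([1, 1, 1], [0.6, 0.6, -0.5], 1.0))  # 0 - inhibited (0.7 < 1.0)
print(neuron_fires([1, 1, 0], [0.6, 0.6, -0.5], 1.0))  # 1 - fires (1.2 >= 1.0)

Stack a lot of those together, let training adjust the weights, and you have an artificial neural network.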

Most of the remaining magic in neuroscience has to do with emergent properties of large networks. As artificial neural networks get larger, and as we find better ways of training them, emergent properties are appearing in ways no one expected. (Google "emergent properties in LLMs" for some interesting reading.) The current rounds of LLMs seem to hit thresholds where they suddenly have new abilities that were never intended. They're just trained to predict the next word/pixel/etc, and yet they seem to be modeling the world, physics, math to do it. This is a big rabbit hole worth looking into if you find it at all interesting.
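As a toy illustration of "just trained to predict the next word" (nothing like a real LLM, which uses a neural network rather than raw counts, but the objective is the same idea):

Code:
# Toy next-word predictor: count what followed each word in some training
# text, then predict the most frequently seen follower. Illustrative only.
from collections import Counter, defaultdict

def train(text):
    counts = defaultdict(Counter)
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    followers = counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

model = train("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # 'cat' - seen twice after 'the', vs 'mat' once

The surprise with LLMs is how much apparent world-modeling falls out of scaling that one objective up.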

There's also lots of magic in the unfolding initial condition of an embryonic brain. This is one place where current ANNs fall short, and likely (definitely?) a reason why they need the entire internet to learn. If your brain started out as a random jumbled mess, you'd have a lot of learning to do, too.

Regarding gradient descent, that's just a fancy term for trying to improve bit by bit over many iterations. Evolution is a form of gradient descent, and has selected how a brand new brain works. Neuronal pruning/training in your brain is also a type of gradient descent, where "good" results are reinforced and "bad" results are... anti-reinforced? You can theoretically get to the same place with either approach. LLM training (and ANNs generally) also uses gradient descent.
The hardest part is deciding what counts as "good" and "bad".
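To put a concrete face on that "improve bit by bit" loop, here's a made-up one-parameter Python sketch (nothing to do with how a brain or evolution literally does it mechanically, just the same nudge-downhill idea):

Code:
# Minimal gradient descent: repeatedly nudge a parameter downhill on an
# error ("badness") measure. Here the best value happens to be 3.0.
def error(w):
    return (w - 3.0) ** 2

def gradient(w):
    return 2.0 * (w - 3.0)        # which direction is uphill, and how steeply

w = 0.0                           # start from an arbitrary guess
for _ in range(100):
    w -= 0.1 * gradient(w)        # small step downhill; 0.1 is the learning rate

print(round(w, 3), round(error(w), 6))  # ~3.0 with ~0 error - settled near the best value

Training a neural network is the same loop with billions of weights instead of one, with the "error" defined by how badly the network predicts the training data.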

In comes big data to the rescue? Want to generate human writing? How about using N terabytes of the real deal to train? Etc.

Anyway, I could go on. I was not very interested in ANNs because they'd basically gone nowhere for 30 years, and I was a skeptic about LLMs more recently, but the more I look at them... It could be the typical stutter forward of technological advancement, but it sure feels like we're at the beginning of what they can do, not the end. We're reaching the limits of capital investment, so the real test will be where optimization takes us in 5-10 years.

edit: And I didn't buy the hype for crypto, driverless cars, or VR. Heck, I didn't even think Facebook could turn a profit (lesson learned). I'm usually pretty down on the new thing.
 
For those who didn't read the linked article in #219, there's a third paragraph where CoPilot says:

[attached screenshot of CoPilot's reply]

Yes, I saw the part where Co-Pilot apparently decided to take on the persona of an evil comic book character for a rather serious question.

There's an old consulting adage: "They don't care how much you know until they know how much you care". You have CoPilot's current answer. Plan accordingly.
 

Nothing like an impromptu drive from FL to NY with a 3yo and 6yo.

I feel bad for MS though; I can't believe that all the articles are still saying it's a "Microsoft outage". Early on it was confusing vs the Azure outage, but it was clear by 7 EDT yesterday who did what.
 
Nah dog, they need their cobol to run fast.
I learned COBOL. That was precisely when I realized I wanted to be an electrical engineer rather than go into computer science. What a drag that language was. I can still see some very wide green-and-white fanfold paper spewing out of a printer with a report I generated. So glad I escaped that.
 
I learned COBOL.

Ugh. Oh sooo long ago I did as well, along with Fortran, as they were the two programming languages required to garner an EE in the very early '70s at the U of Denver. And with all of that, my first job in 1973 was designing memory systems for IBM System 370 mainframe computers, and from that instant onwards I literally never used either language - professionally or otherwise - ever. What was useful from those college years was learning how to write structured self-documenting code...

Cheers!
 