2024-03-28 03:45:56

@72
Lol. Sure. Believe that if you want.

Whether you'll be running it locally I dunno.  Whether it matters, though...well, science is about to accelerate like you wouldn't believe, already is in some ways.

@74
Frankly I have heard of Framework from exactly one place: this forum.  I don't have an opinion on it because the only person I've ever seen feel strongly about it is Ethin.  If I'd owned one maybe I would, but if I need those kinds of specs I can get a tower, and that's what I generally do.  In my opinion, if you have the money for such things and an actual need for those sorts of specs, get two machines.  If you're seriously contemplating Framework you can afford it.  Laptops are for travel. Towers are for absurd compute power.

There's what you need, what you want, and what might be cool to have.  Framework seems like a might-be-cool-to-have.  I don't enjoy tinkering.  Swapping components around all the time doesn't add anything to my life.  I could get something like that and chase the upgrades for the next few years until Intel changes the socket or whatever, and then buy a new laptop anyway.  Or I could just wait and buy a new laptop, and the cost of both is probably less than what I'd have spent with Framework anyway.

@ghost
The reliability of reviews for laptops is questionable at best; if you haven't learned that the hard way, good for you.  By the time you wait long enough to get more than the "it just came out" reviews that say nothing about long-term stability, either the product has changed or it's stopped mattering. So eh.

My Blog
Twitter: @ajhicks1992

2024-03-28 04:01:13 (edited by Ethin 2024-03-28 04:14:14)

I like the idea of Framework, but I think I've made it pretty clear in this topic that I don't swing either way on it since I've never owned one. I've just heard good things about it from people who have used or owned one.

And yes, AI is a gimmick. At least, running it locally is. Don't get me wrong: when applied properly it's incredible. But right now it's all in the cloud, so how powerful your computer is is entirely irrelevant, and at the glacial pace of model minimization, it won't matter for a while. I call it glacial because that's exactly what it is: we see more and more powerful models coming out (Claude just beat ChatGPT on Chatbot Arena), but they aren't models you could ever run locally. More and more powerful models, and more and more demand for insanely huge GPU clusters. That's what we're seeing. Oh, Google has Gemini Nano, which (supposedly) can run on a phone, but I highly doubt it's anywhere near as accurate as GPT-4 or Claude Opus.

The power of your computer won't matter until we either figure out how to keep these models from requiring tens of thousands of GPUs just to run remotely efficiently, or somehow magically fit the power of those clusters into a phone or a laptop/desktop case. And that's assuming these AI companies don't get forcefully shut down by governments for gobbling up 4-8 percent of the US's power grid, which is quickly running dry because they keep opening more and more data centers instead of waiting for the capacity to transmit that kind of power to exist first. Well, either that, or governments start passing stupid legislation that makes running an AI impossible without paying billions or trillions in intellectual property licensing fees because somehow training an AI model is "copyright infringement".  (Maybe, just maybe, this last point might not apply to highly specialized models for things like gene/material synthesis, but we haven't seen the last of these kinds of court cases on LLMs, and we won't unless governments start shutting down copyright maximalists, hard.)
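
To put rough numbers on the "you could never run it locally" point, here's a back-of-envelope sketch (the parameter counts are illustrative guesses, not any vendor's published figures):

```go
package main

import "fmt"

func main() {
	// Back-of-envelope: the weights alone outgrow consumer RAM fast.
	// Parameter counts are illustrative guesses, not published figures.
	models := []struct {
		name   string
		params float64
	}{
		{"hypothetical 7B model", 7e9},
		{"hypothetical 70B model", 70e9},
		{"hypothetical 1T model", 1e12},
	}
	const bytesPerParam = 2.0 // fp16: two bytes per weight, before quantization
	for _, m := range models {
		fmt.Printf("%s: ~%.0f GB just to hold the weights\n",
			m.name, m.params*bytesPerParam/1e9)
	}
}
```
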
Edit: another thing we need to solve is the training problem, i.e. the building-AI-from-scratch problem. Right now you practically have to be a domain expert. If we want AI to really take off, people need to be able to easily create models, even for specialized tasks, without needing PhDs in AI/ML/neural networks/etc.
Edit2: oh, and the "I like to hallucinate" problem. That's another big one. As is the "figure out what's AI-generated and what isn't" problem, though for that one, throwing more AI at it isn't going to solve anything... Is there even a solution to that last one?

"On two occasions I have been asked [by members of Parliament!]: 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out ?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."    — Charles Babbage.
My Github

2024-03-28 04:10:13

@76
Curiosity makes me wonder: do you think it wouldn't be a good idea to invest in a high-spec Mac for the long term because tech changes too fast, or will it be fine because Macs did Arm well first?

2024-03-28 04:14:45

76, agreed on swappable laptops. Who actually enjoys tinkering and swapping out laptop components?  I really don't, and that marketing is totally irrelevant to me and not worth any extra cash. And the sockets for CPUs do change; sure, you get a new board, but now you need new RAM, a new SSD, etc. At that point, why not just buy a new laptop?
As for reviews, the first couple days of reviews mean nothing. But the fully featured reviews that come out after a couple months are good, I think.  In my experience, any major manufacturing defect in a laptop will surface in the first 12 to 24 months, i.e. the warranty period. If you don't have major problems after that, then great. My Toshiba Satellite L50A's battery always showed 0% charge after less than a year.  Then in 2016 to 2017, two years after I started using it, the fan started to grind and got progressively louder.  I took it to the Toshiba shop and paid good money to swap out the battery and fan.  Both worked fine until I got a new laptop, but a battery or fan shouldn't fail that early, and this was a midrange laptop.  On my Eluktronics, all the components are orderable online, so if a fan fails I can just order a new one and put it in. Same with keycaps: I broke one, just ordered a replacement and pressed it in.
Regarding AI models and performance though, do CPU performance increases matter if software doesn't take advantage of them?  I mean, if performance went up 200% but nothing takes advantage of that, and everything already happens instantly, it won't really provide quality-of-life improvements.

A learning experience is one of those things that say, "You know that thing you just did? Don't do that."

2024-03-28 04:37:56

@78
I don't understand the question.  Mac silicon is only special because Apple was in the perfect position to do it.  X86 probably isn't on the way out, but I doubt it'll be the first choice for new stuff much longer.  Amazon already has cheaper datacenter offerings based on Arm that you can rent.

@77
This probably isn't the place for the argument but I'm bored so I'll have it anyway.  None of those problems matter if you can push the AI just far enough that it can start solving them itself.  We haven't yet, but AI is something where every single goddamn time I say "x won't happen until", x happens sooner than I'd expect, and while the headlines are all ChatGPT, lots of people are busy applying this stuff to real science, e.g. searching for medicine.  And while perhaps model minimization needs to come along, and maybe power is an issue, both are problems that can be solved on a 5-10 year time horizon.  Plus, let's be honest: we've gone from f64 to 4-bit quantization in just the last 5 years, to mention only one thing I'm aware of. That's a 16x reduction in weight storage alone.
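
For concreteness, the arithmetic on that as a toy sketch (the 7B figure is just an example, and real 4-bit schemes keep small per-block scale factors, so actual savings land a bit under the raw ratio):

```go
package main

import "fmt"

func main() {
	// Storage per weight: 64 bits as f64 versus 4 bits quantized.
	const params = 7e9 // example model size, not any specific model
	f64GB := params * 64 / 8 / 1e9
	q4GB := params * 4 / 8 / 1e9
	fmt.Printf("f64: %.0f GB, 4-bit: %.1f GB, ratio: %.0fx\n",
		f64GB, q4GB, f64GB/q4GB)
}
```

That prints "f64: 56 GB, 4-bit: 3.5 GB, ratio: 16x".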

Still, you and a lot of other people always talk about how we need to democratize it, and it's just like, no we don't. I am personally extremely glad that it requires expertise; the absolute last thing I want is for anyone to be able to make one. I would literally and genuinely prefer to just give everyone nukes first.  No joke, I view nukes as safer.  Whoever first gets an AI above the threshold of, say, junior programmer wins.  Not something specific; they win everything it is possible to win.  Or, more likely, we botch alignment and get the horrifying endings.

Get that far, though, and we are at most months from the singularity, and we are terrifyingly close to that right now if you consider that, for the most part, it turns out you build giant buildings, stuff them full of computers, and all you have to do for a better AI is, apparently, stuff more computers in.  If I had made a novel out of this and published it a decade ago I'd have been laughed out of even the highly speculative science fiction genre, because you're kidding, right?  They just kept stacking chips in a pile?  That's your plot device?  Everyone is so quick to say gimmick, and also so quick to forget what they'd have thought 5 or 10 years ago if you had described the modern world to them back then.  Clearly AI is more than this, but seriously, "just use enough electricity and your problems are solved" level solutions?  They weren't supposed to even be on the freaking table.  If you look at how all of this works, on the one hand you can always point at a million obstacles, that's true of any field; but on the other hand AI is happening so easily--relatively speaking--that it's almost like having enough compute in one place just naturally wants to let intelligence happen or something.

If I were you--or anyone else--I'd decide what line AI has to cross before you take it seriously. Write it down right now so that you stop changing your mind about what that is. Then check whether it has crossed it already; a lot of the time it turns out it has, and if it hasn't, check every 6 months.  I didn't; my opinion was changed by a bunch of "nah, can't happen" predictions going against me, and then those things happened anyway.  If I had to name mine: probably GPT-4 (not Copilot, just the normal one) passing our coding interviews, the music offerings becoming passable, or, going way way back, when that friend of mine got AI Dungeon to engage in incredibly, unbelievably niche kink roleplay.

I should really blog what I call the idea oracle, and I could continue at length, but put simply: if you give me an AI which can come up with an important scientific insight 1 time out of 100000, I could likely leverage that, plus some relatively simple hierarchical human structures shaped like Mechanical Turk, to filter out most of the junk before passing the possibly-good ones up to the next level of qualification, and so on.  If you had a box that good, it's better than humanity right there; even if you had to check every idea with college-level or higher people it's still better than what we have, and checking 100000 ideas for validity is something we could almost automate.  The other part of this that always gets me is that no one realizes how low the threshold is for it to push us over the line like that.  Imagine 1/10000. 1/1000?  And you can point these at "soft" fields like chemistry, where "synthesize this molecule cheaply", for example, has tens of thousands of ways to do it, so it's not like physics where exactly one solution exists to mine for. Plus we have reasonably good simulators for biology, chemistry, and a lot of other "engineering"-type science that can check the output before anything even gets to the expensive and slow people...
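
To make the funnel concrete, here's a toy version of the arithmetic (the 1-in-100000 hit rate is the number above; the per-tier pass rates are made-up assumptions):

```go
package main

import "fmt"

func main() {
	// Toy funnel: the oracle emits ideas, 1 in 100000 of which is good.
	// Each checking tier forwards only a fraction to the next, costlier
	// tier. The pass rates below are invented for illustration.
	const hitRate = 1.0 / 100000
	ideas := 1.0 / hitRate // average ideas emitted per genuinely good one
	tiers := []struct {
		name     string
		passRate float64 // fraction forwarded onward
	}{
		{"automated simulators/sanity checks", 0.05},
		{"college-level checkers", 0.10},
		{"domain experts", 1.00},
	}
	load := ideas
	for _, t := range tiers {
		fmt.Printf("%-36s %8.0f items to check\n", t.name, load)
		load *= t.passRate
	}
}
```

With even a crude 95%-effective automated first pass, the college-level humans see 5000 items per good idea instead of 100000, and the actual experts only 500.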

My Blog
Twitter: @ajhicks1992

2024-03-28 05:55:10

@76, You do realize that not everyone has the luxury of being able to get a desktop, right?
It's not just for monetary reasons either. Some people don't have the space.

2024-03-28 06:39:15

@81
Yes, I do. But if you are talking about shelling out for a top-of-the-line gaming laptop, then presumably money isn't an obstacle.  If you are talking about shelling out for a Framework that you intend to upgrade, then presumably money is even less of an obstacle than with a top-of-the-line gaming laptop, since upgrading means buying replacement components.  But never mind that as regards Framework specifically, because I dug into this to find actual pricing numbers, it seemed too cheap to make sense, and I kept going until I found the catch.

What Framework actually is: they have some empty bays with USB-C ports in them, and the expansion cards are just fancy USB-C adapters.  So that's your catch.  See here: https://framewiki.net/expansion-cards

So for example their "storage expansion card"? A normal external USB-C drive (who knows which one; I couldn't find that quickly) with some plastic around it so it'll slide into the bay, and they avoid coming out and saying what these are in so many words. Fan-fucking-tastic.  I can't wait to buy one.

Still, I do stand by what I said. I recognize that no, not everyone can get two computers, and cool, great. But if we're talking about just shelling out cash for specs, specs, specs, then presumably you aren't that person, and what I said applies, so...

My Blog
Twitter: @ajhicks1992

2024-03-28 07:01:14

The 250GB and 1TB Expansion Cards give you the performance of an internal drive with the flexibility of an external one. With a USB 3.2 Gen 2 interface, the 1TB card exceeds 1000 MB/s read and write speeds, while the 250GB card reaches 1000 MB/s read and 375 MB/s write speeds. Both are fast enough to run apps and even boot an operating system from, and you can plug them into other computers for high-speed file transfer.
Specs: Weight: 8g; dimensions: 38mm x 7mm x 30mm; Phison U17 flash controller; Micron N28 NAND; exceeds 1000 MB/s read speeds; exceeds 1000 MB/s write speeds; 50% post-consumer-recycled aluminum; 30% post-consumer-recycled plastic.
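
For what it's worth, those speed claims at least square with the interface. A rough sanity check (assuming Gen 2's nominal 10 Gbit/s line rate and 128b/132b encoding, ignoring protocol overhead):

```go
package main

import "fmt"

func main() {
	// USB 3.2 Gen 2 signals at 10 Gbit/s and uses 128b/132b encoding,
	// so the raw payload ceiling works out to roughly:
	const lineRate = 10e9          // bits per second
	const encoding = 128.0 / 132.0 // 128b/132b line code efficiency
	ceiling := lineRate * encoding / 8 / 1e6 // MB/s
	fmt.Printf("theoretical payload ceiling: ~%.0f MB/s\n", ceiling)
}
```

That's ~1212 MB/s, so "exceeds 1000 MB/s" is plausible, if close to the limit.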

Yeah... they're really hiding stuff here lol.

Every record has been destroyed or falsified, every book rewritten, every picture has been repainted, every statue and street and building has been renamed, every date has been altered. And the process is continuing day by day and minute by minute. History has stopped. Nothing exists except an endless present in which the Party is always right.

2024-03-28 08:56:58

@80
I mean, I might get a Mac with high specs. I'm hoping it will stay with me until Apple drops its OS support, which is usually 5-6 years or more? Not clear with silicon yet, since the M1 hasn't been dropped. I'm asking whether that's a good idea, or whether I should settle for something smaller, since tech changes might force me to switch way sooner than I think.
By high specs I mean something like a MacBook Pro with the M3 Pro processor, which is a bit stronger than the normal one, more cores and all that, with 1TB storage and 18GB RAM. Yes, that's high specs by Mac standards, because macOS isn't a memory hog lol.
Curious, though: I thought it would take longer for Arm to become a good choice, not just an interesting "could be good, but not yet" kind of choice, because of x86 app conversion and everything.
I wonder how long until Windows catches up? From how things are going I'd give it 5 years, unless something revolutionary in AI is only doable through Arm, which seems to be their complete focus now.

2024-03-28 09:24:05

It's not even really about tinkering. It's about making smart and informed financial decisions. Is that a Framework or System76 computer? Honestly I'd need to look more into it to know for sure. But I do know that buying a cheap computer every 6 months, a year, 2 years, because they break that often? That is not smart. Buying a thousand-dollar MacBook just to throw it out in a couple years because you can't replace the battery? That's not smart either, and really neither was I for buying one. Computers with swappable parts are more than just "Ooh, look at me, I can build!" They're more practical finance-wise. I'm not the brightest person on the forum, but I do know that putting something together yourself is much cheaper than paying someone to do the labor for you. It also presents fewer opportunities for you to get screwed.

Discord: dangero#0750
Steam: dangero2000
TWITCH
YOUTUBE and YOUTUBE DISCORD SERVER

2024-03-28 11:31:35 (edited by mohamed 2024-03-28 11:34:40)

You can replace batteries though; like anything else, you need to pay for it, and last I checked, hardware parts are available for 10 years from the release date for all Macs.
Also, it's widely known that Macs tend to last. I don't know what you do with your battery, but I had a 2016 MacBook Air for 5 years, and I only had to change it then because it was really getting out of date in 2021, with all the CPU madness that happened. The battery usually lasted me 10 hours, and that wasn't even Apple silicon.

2024-03-28 12:55:08

@86, the problem with your argument is that repairing a Mac is not exactly easy, either in the skills department or financially. Apple is still trying to make repairability as difficult as possible, meaning that very few of the Mac's components these days are actually things you can repair/replace. Keyboard? Yep. Screen? Yep. Battery? Yep. Trackpad? Maybe. Anything else? Probably not (it's probably soldered). And unless something has changed, Apple still charges you $80 or more for their repair kit, which from what it sounds like is really, really complicated and has lots of terms and conditions tacked on top.
I push System76 because I'm familiar with them and they're repairable, not because they come with Linux by default/run Linux really well. Framework sounds neat, but I've never tried it. There's also Tuxedo, which is slightly cheaper than System76 and, from what it sounds like, quite good. Both are user-serviceable brands, with Tuxedo's latest model being all AMD, which is very repairable and an absolute beast (though with the caveat that the battery life is only 3-6 hours or so). Still, I would think that anyone who wants a new computer would want it to be serviceable, if only because sometimes you don't want (or need) to send it back to the manufacturer for repair and just need to replace something simple like the keyboard. Local repair shops may not even know about the brand of computer you have, and if they do, you probably don't want to pay the $300-$800 repair fee they'll throw at you, or hand over your login credentials when they don't need them (some repair shops are notorious for asking for these when it's unnecessary, usually so they can go digging around in your files; it's quite creepy).

"On two occasions I have been asked [by members of Parliament!]: 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out ?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."    — Charles Babbage.
My Github

2024-03-28 15:08:06

Yeah, I understand and agree with the storage and RAM part, and the battery maybe a bit, but paying $200 or something for a battery replacement is something that happens all the time over here, so it's not a surprising concept.
Also, in my specific situation I wouldn't mess with anything at all; I'd just go to an Apple store, which yeah, I realize doesn't exist everywhere, so I see the problem.
Yeah, I hate those shops. My father is usually the one to take my stuff to repair shops when he goes out, and I have to tell him 50 times to insist that the shop never, ever touch the data, because some of them think a factory reset is a smart idea whenever you change any component.
But yeah, you're not supposed to change any specs: no RAM, no storage, certainly no CPU. We just got used to it as a fact, and since most of us don't find it to be that big of a deal, we don't see it as an issue. For example, I've never replaced storage or RAM in a laptop, ever. Sure, I've wished I had more storage more than once, but the solution in my eyes is to just buy it with more storage and call it a day. Not saying that's a solution for everyone, just saying some people don't care.

2024-03-28 16:32:08

@mohamed
If you aren't planning to use the Mac as a Mac and just want to put Windows on it somehow, why bother?

Like, I still don't get what you're asking or trying to say, because either you want to use OS X for something, in which case why are we discussing this, or you want to run Windows on it, in which case you've got to use a VM.

And to everyone else: if I had to guess, even midrange laptops are good for at least 3 years.

@dan
At the end of the day there's a big conflict between replaceable components and efficient/cheap.  People aren't soldering everything because they don't want you to be able to repair it; they're doing it because modern tech is too small and fragile to work otherwise.  It is actually a limitation, not something done to screw you over.

My Blog
Twitter: @ajhicks1992

2024-03-28 17:20:21

@89
No, not actually planning to. I tried it and it was far too unreliable; I have this laptop for that. I'll just use OS X because I like it. I'm asking because you have more perspective on CPUs and improving tech than I do, so I like to see viewpoints. As for me arguing about why Mac and so on, well, I don't know; I felt like being in the discussion because I thought something, so I said it.

2024-03-28 19:27:00 (edited by camlorn 2024-03-28 22:06:12)

@90
I can't say anything in response because you are asking me to compare pizza and cars or something.  If you want to run OS X, get a Mac. Otherwise, don't.  I can't say anything; they're fundamentally different options.

If you are asking about architectures more generally, we have kinda solved X86 emulation on Arm.  It's not as good as X86 on X86 because it's not going to support all the optional extensions, but it's not an interpreter; it basically recompiles your program.  Both Windows and OS X can do it, and I'm sure Linux has one by now too.

Phones are pretty much all Arm, as is lots of embedded stuff big enough to want to run Linux, Mac of course, and there are Arm Windows laptops on the market.  AWS offers Arm servers and charges less for them than for X86 ones; Google also sells them to you.  For the vast majority of software, you support Arm by toggling an option and rebuilding, with no code changes.  Most modern software is in higher-level languages anyway, and anyone doing cross-platform development already has to support both architectures.  It'll take a while, you shouldn't buy Arm right now unless it's Apple, and it is possible that Intel or AMD will surprise us.  But as things stand, X86 is losing market share, and there are lots of reasons to believe that will continue, not least of which is that people can license the Arm architecture and build their own chips to meet their specific needs.  Given the choice of cheaper and longer battery life without losing performance, or going X86, I know which laptop I'd buy.  Workstations will be the last to fall, but as Apple has shown, SoC Arm workstations are quite viable.
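
As one concrete example of "toggle an option and rebuild" (Go here purely for illustration, since its cross-compiling happens to be built in; other toolchains need a bit more setup):

```go
// hello.go: the same source builds unchanged for either architecture.
package main

import (
	"fmt"
	"runtime"
)

func main() {
	fmt.Println("built for", runtime.GOOS+"/"+runtime.GOARCH)
}
```

Building for X86-64 versus Arm is just "GOARCH=amd64 go build hello.go" versus "GOARCH=arm64 go build hello.go". Not a single line of source changes.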

My Blog
Twitter: @ajhicks1992

2024-03-28 22:01:53

RISC-V is another interesting development, and may be even better since it's an open standard. I was thinking about the Apple M machines for a while to try Arm, but if they aren't totally open, I'm not interested. Besides, Apple can go pound sand as far as I'm concerned, particularly after the dick move in the EU.

Grab my Adventure at C: stages Right here.

2024-03-28 22:14:20

Risc-v probably isn't going to take off in the PC/server market.  The primary driver there isn't tech, it's social.  Arm has a chance because it was able to bootstrap to significant market share, and most people at the level of manufacturing chips don't care about openness or anything like that.  Risc-v has a chance, but it will take a very long time to prove itself sufficiently to get Microsoft on board, for example, and of course Apple makes all their own chips and chose Arm.  Probably its biggest chance was if China chose it, but China seems to be going Loongson instead, and seems to have enough leverage that everyone is going along with that, e.g. Rust and the C/C++ compilers are all supporting it.

My Blog
Twitter: @ajhicks1992

2024-03-28 22:34:58 (edited by Ethin 2024-03-28 22:39:35)

@93, RISC-V may actually get into the PC/server market, purely because of how it works. It's not just about openness; it's also the fact that, in order to build a custom ARM chip, you need a license from ARM or they'll sue you into non-existence. RISC-V has no license fee requirement: anyone can build a custom emulator or chip. If I remember right, a RISC-V laptop has already been built, though I haven't heard anything about it since its initial debut, so perhaps it didn't go anywhere. Ultimately, though, RISC-V in the PC space wouldn't surprise me one bit, and I can very, very easily see it breaking into the phone/IoT markets.

"On two occasions I have been asked [by members of Parliament!]: 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out ?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."    — Charles Babbage.
My Github

2024-03-29 02:34:29

@94
Anyone with the resources to design a quality chip and manufacture it at large enough scale that it becomes a product does not care much about the Arm license, because compared to all the other stuff involved in making that happen, it's a rounding-error-level problem.  I get the sentiment, but a lot of you around here keep trying to think of hardware like it's software, and it's just not.  If I license Arm I get a starting point, tons of devtools, lots of software, etc., and then I go get my chip foundry, and by the time I'm done getting a chip foundry, the fact of having to pay for the IP is just not super relevant.  Say Risc-v is the future: we'd still have to spend a decade with someone footing the massive bill to optimize the designs to hell and back, and then convince everything and everyone to support it, get MS on board, get phone vendors on board, get people chasing down all the compilers and getting them up to par...

Plus the Arm IP fees don't pass on to end users.  I can buy Arm processors all day long and use them in my products and not deal with it.  Whoever is pushing Risc-v also needs a reason to manufacture their own stuff, kinda.

Like, I don't agree with the "let's tinker with everything" desktop Linux mindset, but at least there you can; it's a valid perspective.  But you can't tinker with the CPU more just because it's Risc-v. It offers zero benefit to the end user; it just complicates things to get big companies who are already paying Arm to make this instead, and then we all have to support it, and then... and maybe once we do all that it can really take off.  And while all of this is going on, everyone is still paying Arm for the next decade anyway, not benefiting from that supposed advantage.

Arm didn't happen overnight.  Everyone feels like it did because of the M1s, but it took at least a decade to prove itself, become mature, and reach the point it's at; and it only did so because it offers practical advantages like power usage.  Risc-v isn't really even in the earliest phases of wide market adoption, and does not, to my knowledge, offer advantages over Arm as significant as what Arm had over X86.

My Blog
Twitter: @ajhicks1992