As someone who lived through that era, I can assure you that throughput is no deterrence to shitheads, morons, asshattery, and annoyance.
(Also, if you think Fediverse or even Reddit mods are bad, let me introduce you to the 1988 BBS Sysop.)
That’s been my take on the whole ‘use gopher/gemini!’ bandwagon. Nice idea, but the solution to the problem leads to more problems that need solutions, and we’ve come up with solutions to those, but on other protocols.
And I mean, if I stab someone in the face with a screwdriver, the misuse of the screwdriver isn’t in some way specific to the screwdriver and thus nobody should use screwdrivers.
Same thing with all the nonsense a modern website does: HTTP is fine, it’s just being used by shitheads. You could make a privacy-respecting website that’s not tracking you or engaging in any sort of shifty bullshit, but someone at some point decided that was the only way to make money on the Internet, and here we are.
Yeah, now you get mean people, a drive-by malware installer, AI generated ads, and 4mb of JS that tries to scrape every detail about you so they can make a profile they can sell to (dis)information brokers.
Truly, an improvement.
(People have always sucked, the Internet just lets you interact with more people so…)
Their whole writeup is somewhere between “trust me bro” and “enough holes you can legally sell it as swiss cheese”.
I’m utterly confused as to who the target market for this is, since their current userbase clearly does not care if shit’s encrypted or not, and any even remotely privacy-oriented person is going to have the exact same take you did.
Alternately, what’d be really neat would be an easy way to mostly automate a webpage setup for someone using the free hosting options that do exist.
Like, a tool that handles deploying something to GitHub Pages or Cloudflare Pages or whoever else offers basically free web hosting, without being nerdy to the point that you need a 6,000-word document to explain the steps to get a webpage from an HTML editor to actually being hosted.
Or, IDK, maybe going back to ye olde domain.com/~username/ web hosting could be an interesting project to take on, since I’m sure handling file uploads like that should be trivial (lots and loooots of ways to do that). Just have to not end up going Straight To Jail offering hosting for people, I suppose.
It’s not an attack on certain minority groups unless you consider “normal average person” a minority group.
It’s just a little bit of the old nerd superiority complex leaking out with a new word attached to it.
The resurgence of a lot of pre-web protocols is interesting, but I’m not entirely sure it’s going to be a sticky thing beyond a novelty.
Also 100% agree with the first comment that on an article about the small web half the content is YouTube videos being hilariously tone-deaf ironic. If only there were some other method of sharing videos with people. Perhaps some sort of tube that’s peer-to-peer? A PeerTube, if you will.
blaming enshittification on “normies”
What’s really annoying is it’s straight out of the corpo playbook.
“We’re not responsible for ______, you are because you didn’t do enough ________”.
The most blatant is “global warming” and “ate too much meat/didn’t recycle enough/made poor choices with your car” and so on.
It’d be nice if people would stop trying to blame the worst offenses perpetrated on people by billionaires and their pet corporations on personal choices, because it’s hot liquid bullshit.
No I mean the modem on my hard drive not the wifi.
Hell, Intel has lost my confidence they can even fucking fab a CPU correctly at this point, never mind anything else.
I’m almost exclusively AMD based at this point despite them being less than uh, reliable (see: the year long fight I’ve had with my 7700x being unstable which was only resolved, amusingly, by jacking up the voltage). Also, my 1700x was hilariously awful, but I’m willing to shrug and call that new architecture woes and not be too judgy about that one.
I’m reservedly enthusiastic about Qualcomm’s entry (for like the 4th time) into desktop processors, and hope that this time they can keep improving performance, and actually support things for more than five damn minutes before going ‘welp, only supporting the new CPU!’ like they do with their mobile chips. Also that they actually live up to their promises to provide full driver support and support parity in the Linux kernel, so you can get rid of Windows on them.
Fair points on VR games being fairly social. I was more thinking of the in-person social experience, which is still involving some portion of people sitting around stuffing their face into a headset and wandering off into their own world.
IMO, this is something that AR/MR stuff could do a great job of making more social by adding the game to the world, rather than taking the person out of the world to the game but, of course, this also restricts what kind of games you can do so is probably only a partial solution and/or improvement on the current state of affairs.
I also agree that it’s way too expensive still, and probably always will be because the market is, as you mentioned, small.
PCVR is pretty much dead despite its proponents running around declaring that it’s just fine like it’s a Monty Python skit. And the tech for truly untethered headsets is really only owned by a single (awful) company and only because the god-CEO thinks it’s a fun thing to dump money on which means it’s subject to sudden death if he retires/dies/is ousted/has to take time off to molt/has enough shareholder pressure put on him.
Even then, it’s only on a second generation (the original Quest was… beta, at best) and is expensive enough that you have to really have a reason to be interested rather than it being something you could just add to your gaming options.
I’d like VR to take off and the experiences to more resemble some of the sci-fi worlds that have, or take place in, a virtual reality world, but honestly, I’ve thought that would be cool for like 20 years now and we’re only very slightly closer than we were then; we just have smaller headsets and somewhat improved graphics.
Train to Busan, Parasite, Unlocked, Wonderland, Anatomy of a Fall and Close have been ones I’ve seen recently that I liked.
I think some of those are available on Netflix, but as I don’t use Netflix I can’t say for certain which ones, though.
Edit: I just realized some of those are vague and will lead to a billion other movies lol. The first 4 are S. Korean, the last two are French and they’re all from 2020 or newer so anything not from there or older isn’t the right one.
You’re not wrong (and those are freaking enormous dies that have to cost Apple a goddamn fortune to make at scale), but like, it also isn’t an apples-to-apples comparison.
nVidia/Intel/AMD have gone down the maximum-performance, fuck-any-heat/noise/power-usage path. They haven’t given a shit about low-power optimizations or investing in designs more suited to low-power implementations (an M3 Max will pull ~80W if you flog the crap out of it, so let’s use that number). IMO the wrong choice, but I’m just a computer janitor that uses the things, I don’t design them.
Apple picked a uarch that was already low power (fun fact: ARM was so low power that the first test chips would run off the board’s standby power and would boot BEFORE they were actually turned on) and then focused in on making it as fast as possible with the least power as possible: the compute cores have come from the mobile side prior to being turned into desktop chips.
I’m rambling, but: until nVidia and the x86 vendors prioritize power usage over raw performance (which they did with Zen 5, and you saw how that spiraled into a fucking PR shit mess), you’re going to get next year’s die shrink with more transistors, using the same power for slightly better performance. It’s entirely down to design decisions, and frankly, x86 (and to some degree nVidia) have painted themselves into a corner by relying on process node improvements (which are very rapidly going to stop happening) and modest IPC uplifts to stay ahead of everyone else.
I’m hoping Qualcomm does a good job staying competitive with their ARM stuff, but it’s also Qualcomm and rooting for them feels like cheering on cancer.
Power consumption numbers like that are expected, though.
One thing to keep in mind is how big the die is and how many transistors are in a GPU.
As a direct-ish comparison, there are about 25 billion transistors in a 14900K, and 76 billion in a 4090.
Big die + lots and lots of transistors = bigly power usage.
I wouldn’t imagine that the 5000-series GPUs are going to be smaller or have fewer transistors, so I’d expect this to land in the ‘die shrink lowers power usage, but more transistors raise it’ zone.
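If you want to eyeball that tradeoff: dynamic power scales roughly with switching capacitance (which tracks transistor count), voltage squared, and clock speed. All the numbers below are made up for illustration, not real 5000-series specs:

```python
# Rough proportionality for dynamic power: P ~ C * V^2 * f,
# where C tracks transistor count. Numbers here are illustrative only.

def relative_power(transistors, voltage, freq_ghz):
    return transistors * voltage**2 * freq_ghz

old = relative_power(transistors=76e9, voltage=1.05, freq_ghz=2.5)
# Hypothetical die shrink: each transistor runs at a lower voltage,
# but the new chip packs ~30% more of them at a higher clock.
new = relative_power(transistors=100e9, voltage=0.95, freq_ghz=2.8)

print(f"power ratio new/old: {new / old:.2f}")  # comes out around 1.2
```

So even with the per-transistor savings of a shrink, total board power can still go up ~20% once you pile on more transistors and clock them higher, which is exactly the zone these cards keep landing in.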
Videos are unfortunately the way a LOT of quality content is delivered now and banning any and all videos (relevant or not) is probably not the way to go.
GN, L1T, HUB and so on are super high-quality stuff that’re tech related and that’s basically how they deliver their content. A blanket ban would kill way too much good shit, imo.
movie industry that’s been complete trash for a while now.
This is not a callout of you in particular so don’t get offended, but that’s really only true if you look at the trash coming out of Hollywood.
There’s some spectacularly good shit coming out of like France and South Korea (depending on what genres you’re a fan of, anyways), as well as like, everywhere else.
Shitty movies that are just shitty sequels to something that wasn’t very good (or yet another fucking Marvel movie) are a self-inflicted wound, and not really a sign that you can’t possibly do better.
Well, that’s the doomer take.
The rumors are that the 80 series card is 10% faster than the 90 series card from last gen: that’s not a ‘10%’ improvement, assuming the prices are the same, that’s more like a 40% improvement. I think a LOT of people don’t realize how shitty the 4080 was compared to the 4090 and are vastly mis-valuing that rumor.
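Quick back-of-envelope on that, assuming (my number, pick your own) the 4080 sits around 78% of 4090 performance:

```python
# All numbers are assumptions for illustration, not benchmark results.
perf_4090 = 1.00
perf_4080 = 0.78                 # assumed: 4080 at ~78% of a 4090
perf_new_80 = 1.10 * perf_4090   # rumor: new 80-series is 10% over the 4090

# Gen-on-gen uplift for the 80-series tier, assuming the same price point
uplift = perf_new_80 / perf_4080 - 1
print(f"80-series uplift: {uplift:.0%}")  # ~41%, i.e. roughly that 40% figure
```

The exact number moves with whatever gap you assume between the 4080 and 4090, but the point stands: measured against its own tier, 10%-over-the-flagship is a big jump, not a small one.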
I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DON’T BOTHER’ take is wrong. My gaming has moved almost entirely to my ROG Ally and you know what? Shit is just as fun and way more convenient than the 7700x/3080 12GB desktop, even if it’s 1080p low and not 1440p120. If the only thing a game has going for it is ‘ooh, it’s pretty’, then it’s unlikely to be one of those games people care about in six months.
And anyways, who gives a crap about AAAAAAAAAAAAA games? Indie games are rocking it in every genre you could care to mention, and the higher budget stuff like BG 3 is, well, probably the best RPG since FO:NV (fight me!).
And yes, VR is in a shitty place because nobody gives a crap about it. I’ve got a Rift, Rift S, Quest, and a Quest 2 and you know what? It’s not interesting. It’s a fun toy, but it has zero sticking power and that’s frankly due to two things:
If you could justify spending the kind of money that would lead to having a cool VR experience, then yeah, it might be more compelling but that’s been tried and nobody bought anything. Will say that Beat Saber is great, but one stellar experience will not sell anyone on anything.
And AI is this year’s crypto which was last year’s whatever and it’s bubbles and VC scams all the way down and pretty much always has been. Tech hops from thing to thing that they go all in on because they can hype it and cash out. Good for them, and be skeptical of shit, but if it sticks it sticks, and if it doesn’t it doesn’t.
As someone who’s been buying (though not intentionally) exclusively AMD laptops for the past 9 years or so, yeah, this resonates pretty well with the user experience.
I mean, none of the laptops are bad or defective or whatever, but the quality of support, the feature enablement, and just the general amount of time it takes to get things pushed out have always been shit compared to Intel stuff.
AMD can’t manage firmware and software fixes for shit, regardless of product line and if I were an OEM, I’d probably be pissed at their stupid slow bullshit too.
Example: 2022 G14 was totally getting USB4. Got a beta for it, and then Asus went ‘Fucking AMD isn’t helping or providing stuff we need, so this beta is all you’re getting, go yell at them.’ Is that the whole story? Maybe not, but it certainly feels perfectly reasonable based on how AMD has supported everything prior to that as well, so I tend to think it’s enough of the story to be true.
Good hardware (mostly), and it’s reliable enough and it does the job, but it’s very much a dont-expect-support kind of experience past the first couple of months after release. (And yes, I know the OEM carries a good portion of responsibility there, but if there’s not a new firmware/microcode/etc from AMD to fix an issue, then what are they supposed to ship to you?)
This here. The most important things on your computer are your session cookies, which are, well, accessible with permissions your user account already has.
Dudes don’t care about making your shit into a botnet, or putting a rootkit in your firmware, or whatever other technically complex thing you care to think about: they’re there to steal your shit, and the most valuable shit you have is sitting there out in the open for the taking for anyone who makes it past a very very low bar of ‘make the user do something stupid’.
It’s such a reasonable policy I’m finding it hard to believe, unless there’s a clause that they get a kidney, or are allowed to show up and break your ankles, or are taking ownership of your firstborn child or something.