Yes you can get dial-up, DSL, cell network data, or even satellite! These services are clearly equivalent to cable or fiber in the ISP marketplace.
Let’s just take NYT for example. Subscription costs $325/year. Why would I ever pay that much? It’s not 1954. I’m not sitting down with my morning coffee and reading the damn thing front to back. I’m reading maybe one article a week from 15 different sources. Am I supposed to pay $5000/year just to cover my bases?
As with everything else in [CURRENT YEAR] the value proposition is so absurdly out of step with reality that fixing it basically relies on rolling out the guillotines.
Looks like the headset she’s wearing in that video (EPOC X, a 14-channel EEG headset) is available from Emotiv for $1k, and the software she’s using to map controls (EmotivBCI) is something they provide for free. They have 2- and 5-channel headsets for cheaper and 32-channel caps that are more expensive. Seems pretty consumer-ready to me, but I’m sure your EEG activity data gets shared with Emotiv, which isn’t ideal.
Commenting to remind myself later because I’d love to check into this. My hands are achy from years of overuse, so an alternative to physical controls would be amazing.
While I agree for the sake of clarity, a bigger problem is that the data only goes back less than 2 months. Has the number of installs been steady at 7k for a long time? Or does it fluctuate wildly like this occasionally for reasons totally unrelated to laws?
VTOL VR is awesome too. The problem with a lot of games that support VR is they don’t support the controllers to the same extent. Playing VR with an Xbox controller instead of the motion tracking Index controllers just ain’t the same.
I guess I’m wondering if there’s some way to bake the contextual understanding into the model instead of keeping it all in VRAM. Like if you’re talking to a person and you refer to something that happened a year ago, you might have to provide a little context and it might take them a minute, but eventually, they’ll usually remember. Same with AI, you could say, “hey remember when we talked about [x]?” and then it would recontextualize by bringing that conversation back into VRAM.
Seems like more or less what people do with Stable Diffusion by training custom models, LoRAs, or embeddings. It would just be interesting if it were a more automatic process as part of interacting with the AI - the model is always being updated with information about your preferences instead of having to be told explicitly.
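Roughly what I’m picturing, as a toy sketch: old conversations sit in a store, and whatever matches what you just said gets pulled back into the prompt. The bag-of-words similarity and the `generate()` call here are just placeholders, not any real library API.

```python
# Toy sketch of the "hey, remember when we talked about X?" idea.
# Pure standard library; cosine over bags of words stands in for real
# embeddings, and generate() stands in for the actual model call.
import math
from collections import Counter

def bow(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    """Past conversation snippets; in practice you'd persist these to disk."""
    def __init__(self):
        self.chunks: list[str] = []

    def add(self, chunk: str) -> None:
        self.chunks.append(chunk)

    def recall(self, query: str, k: int = 1) -> list[str]:
        ranked = sorted(self.chunks,
                        key=lambda c: cosine(bow(query), bow(c)),
                        reverse=True)
        return ranked[:k]

def generate(prompt: str) -> str:
    # Placeholder: swap in whatever local model or API you actually run.
    return f"(model response to: {prompt[:60]}...)"

memory = MemoryStore()
memory.add("We talked about EEG headsets like the Emotiv EPOC X and mapping controls.")
memory.add("We talked about VTOL VR and how Index controllers beat an Xbox pad.")

user_msg = "hey remember when we talked about that EEG headset?"
recalled = "\n".join(memory.recall(user_msg))  # bring the old chat back "into VRAM"
print(generate(f"Relevant past conversation:\n{recalled}\n\nUser: {user_msg}"))
```

The “always being updated” version would be more like periodically fine-tuning a LoRA on your chat logs, which is a lot heavier than just retrieving old text on demand.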
But mostly it was just a joke.
It’s amazing the way you NOTICE TWO THINGS.
Basically, the more VRAM you have, the better their contextual understanding (their memory) is. Otherwise you’d have a bot that maybe only knows how to contextualize the last couple of messages.
Hmm, if only there was some hardware analogue for long-term memory.
Of course!
smacks forehead
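Joking aside, that exchange does map onto two real strategies: the naive bot only keeps the last few messages in its window, while the “long-term memory” version is basically just writing the history to disk and reloading it later. Toy illustration only - the file name and the 4-message limit are made up:

```python
import json
from pathlib import Path

HISTORY_FILE = Path("chat_history.json")  # the "long-term memory" is just a file on disk
SHORT_TERM_LIMIT = 4                      # pretend the context window only fits 4 messages

def load_history() -> list[dict]:
    return json.loads(HISTORY_FILE.read_text()) if HISTORY_FILE.exists() else []

def save_history(messages: list[dict]) -> None:
    HISTORY_FILE.write_text(json.dumps(messages, indent=2))

def short_term_context(messages: list[dict]) -> list[dict]:
    """What the bot that only contextualizes the last couple messages sees."""
    return messages[-SHORT_TERM_LIMIT:]

history = load_history()  # survives restarts, unlike whatever was in VRAM
history.append({"role": "user", "content": "remember when we talked about EEG headsets?"})
save_history(history)
prompt_messages = short_term_context(history)  # only this slice has to fit in the window
print(f"{len(history)} messages on disk, {len(prompt_messages)} sent to the model")
```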
I mean I have 64 GB, but I’m not wasting it on browser tabs. I’ve got people at work who never close anything: they’ll have 15 tabs, 28 PDFs, and 7 Excel spreadsheets open 24/7 because it takes them an hour to remember where they saved them otherwise.
Literally me when I hear them complain about their slow computer:
I was in the same boat as you about 5 years ago - I had been stubbornly using iTunes, but it was so slow, and the store was just an annoyance that kept getting in the way of actually listening to my music. I ended up choosing MusicBee over Winamp or foobar2000 because it has all the library management stuff (even a sync-to-mobile-device function) and a great interface right out of the box.
just don’t close the tab
My RAM is screaming.
Hey, it’s my God-given right to take it up the ass from corporations. I’m voting for Trump so he can reverse this radical Marxist fascist rule on day one.