In the 2000s we had AdSense. So now we’re getting… AISense?
Yoko, to Shinobu, uh… 🤔
The people of Israel live. Glory to Ukraine 🇺🇦 ❤️ 🇮🇱
Whenever I get lazy I just throw some seasoned chicken drumsticks in my air fryer and then add some Uncle Ben’s rice. Almost zero effort.
There’s also group C, which I was part of: whenever they ask you to load or unload, you just say you’ve just pooped, or you scratch your butt, and they’ll immediately offer to do it for you instead.
Typical Xonotic players: https://www.youtube.com/watch?v=K8JD2g4N9JA
I believe that’s the Orangutan opening
It still works using just a web browser. Some might just prefer a native app, which Google is currently rolling out.
I love it, I use it on all of my devices at home and it works flawlessly.
Me see cat, me upvote 🐈
I don’t have a Xiaomi tablet but you could try what has been suggested in this thread: https://old.reddit.com/r/miui/comments/18tz52u/how_to_remove_this_3_dots_new_in_hyperos_mi_pad_6/
This would be a meme by itself:
lacks some cheese IMO
okay I’ll watch it just for the hilariously long title
for the math homies, you could say that NaN is an absorbing element
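To make the “absorbing element” quip concrete, here’s a small Python sketch showing that NaN propagates through the usual arithmetic operations, much like 0 absorbs under multiplication (with one well-known exception):

```python
import math

# NaN behaves like an absorbing element for arithmetic: combining it
# with anything via +, -, *, / (or most powers) yields NaN again,
# the way 0 * x == 0 makes 0 absorbing for multiplication.
nan = float("nan")
results = [nan + 1, nan * 0, nan - nan, nan / 2, nan ** 2]
print(all(math.isnan(r) for r in results))  # True

# One caveat: it's not *strictly* absorbing, since in Python
# nan ** 0 evaluates to 1.0.
print(nan ** 0)  # 1.0
```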
Omae wa mou shindeiru (“You are already dead”)
Let them fight among themselves and prove time and time again that patents are idiotic and hinder innovation.
Chess, because we have !anarchychess@sopuli.xyz.
Yup, they already forced Google to announce that they’ll add such a choice screen for the search engine and web browser on Android: https://www.neowin.net/news/google-will-add-new-search-and-browser-choice-screens-for-android-phones-in-europe/
It’s only a matter of time before Microsoft does so too.
ollama should be much easier to set up!
ROCm is decent right now; I can do deep learning stuff and CUDA programming with it on an AMD APU. However, ollama doesn’t yet work out of the box with APUs, though users report that it works with dedicated AMD GPUs.
As for Mixtral 8x7B, I initially couldn’t run it on a system with 32 GB of RAM and an RTX 2070S with 8 GB of VRAM, so I figured I’d try another system soon. [EDIT: I actually got the default version (mixtral:instruct) running with 32 GB of RAM and 8 GB of VRAM (RTX 2070S).] That same system also runs CodeLlama-34B fine.
So far I’m happy with Mistral 7B: it’s extremely fast on my RTX 2070S, and it’s not really slow when running in CPU mode on an AMD Ryzen 7. Its speed is okay-ish (~1 token/sec) when I try it in CPU mode on an old ThinkPad T480 with an 8th-gen i5 CPU.
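For anyone curious how easy the ollama setup described above actually is, here’s a minimal sketch (Linux install; model names are the ones mentioned in the comments — first run downloads the model, and mixtral needs far more RAM/VRAM than mistral):

```shell
# Install ollama via its official install script (Linux).
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with Mistral 7B; downloads the weights on first run.
ollama run mistral

# The larger Mixtral 8x7B instruct variant discussed above.
ollama run mixtral:instruct
```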
Buy Now Pay Later is what’s exacerbating this. People are dumb, have short attention spans, and are statistically bad at basic math. When they see a purchase they can make without paying anything up front, they’ll hit buy, and they’ll do it many times, since the e-commerce platform will usually recommend other products they’re likely to want. They won’t do the calculation to see whether they can really afford the split payments plus the interest.
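The calculation people skip is trivial. A hypothetical example (all numbers made up for illustration — a $200 purchase split into 4 installments with an assumed flat 6% fee per installment):

```python
# Hypothetical BNPL cost breakdown; every figure here is an assumption
# for illustration, not data from any real BNPL provider.
price = 200.00       # sticker price of the purchase
installments = 4     # number of split payments
fee_rate = 0.06      # assumed flat fee applied to each installment

per_payment = price / installments * (1 + fee_rate)
total_paid = per_payment * installments

print(f"per payment: ${per_payment:.2f}")  # $53.00
print(f"total paid:  ${total_paid:.2f}")   # $212.00 vs the $200 sticker price
```

Four “painless” $53 payments quietly turn a $200 purchase into $212 — and that’s before any late fees.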
Or even better, just get a vibrating cock ring.