Oh yeah, I know what you mean! I keep unconsciously reaching for the stick 😂
It definitely gets easier, in my experience. A lot of the things that take conscious effort right now will become reflexes with more practice. Right now you are building that experience, and there isn’t really a way to speed it up: you just need to do each action dozens, even hundreds, of times until you do it without thinking.
Driving a manual car is definitely more complex than an automatic one: you literally have one more thing to manage. But don’t worry about it, you will change gears a lot during your practice sessions and build experience quickly. In a few months you probably won’t think much about gears, and in a few years you’ll be managing them without giving it a single thought.
Fun anecdote: I recently got a new car, an automatic, after only ever driving manuals. For a few days I couldn’t figure out how to pull away smoothly, and I was very confused… until I realized that pulling away mostly involved the clutch on my previous car. The first movements of my right foot used to be about keeping the rpm under control while releasing the clutch, which just isn’t needed in an automatic. I was simply applying the old muscle memory to the new car without realizing it!
As others have already said, that is a lot of pasta. If you regularly cook volumes like that, it would really make sense to invest in a large pot as well. A cheap 10-liter pot will do just fine for boiling pasta, and it sounds like you would get plenty of use out of it.
Do you cook your pasta in a large pot, with plenty of boiling water and a good amount of salt? I usually stir once, just after adding the pasta, and I never have noodles sticking together.
It is more of a “For typical cards, expect very competitive options from AMD. If you want top performance, buy Nvidia.”
In theory it allows them to focus more on the cards that actually get bought, and thus they could make those cards better products.
As the great operation begins.
It is a hardware failure. Screens are complex and sensitive parts that are exposed to a lot of (ab)use. What is cryptic about that?
Well, if
Then we can be quite confident that your connection is indeed encrypted!
And of course, you’re welcome!
If the timestamps line up, maybe Wireshark just doesn’t manage to follow the entire exchange. What could happen is that Wireshark sees the SSH handshake, and everything after that is just encrypted gibberish, so the rest of the traffic shows up as “some kind of TCP”.
Do you see an SSH handshake, followed by random crap on the same ports?
(I’m not a Wireshark expert, just an IT guy trying to help!)
TCP sits at a lower level than SSH; SSH normally uses TCP as its underlying transport. TCP itself is not encrypted, but it can of course carry encrypted data.
Are those packets not part of the same SSH connection according to Wireshark?
Does life suck, or not?
Yeah, that’s what I thought too. The horrors are described well; they just typically aren’t described through their physical form, because, as you say, the human mind cannot comprehend it. There is a lot more focus on impressions, comparisons, and effects than on a real physical description. Personally I thought it was quite neat!
AI is a field of research in computer science, and LLMs are definitely part of that field. In that sense, LLMs are AI. On the other hand, you’re right that there is definitely no real intelligence in an LLM.
It would also be very hard to compete with products that are this mature. Linux, Windows, and macOS have been under development for a long time, by a lot of people. If you create a new OS, people will inevitably compare your immature product with those mature ones. With the same resources and time, maybe your new OS would beat them, but you don’t have that. So at launch you will have fewer optimizations, features, security audits, and compatibility fixes, and few people would actually consider using your OS.
That is true, but from a human perspective it can still seem non-deterministic! The behaviour of the program as a whole will be deterministic if all inputs are always the same, in the same order, and there is no multithreading. On the other hand, a specific function that is called multiple times with the same input may occasionally give a different result.
Most programs also have input that changes between runs, so the same input record may arrive at a different point in the execution, and can therefore produce a different result as well.
That exact version will end up making “true” false any time it appears on a line number that is divisible by 10.
During compilation, “true” would be replaced by that statement, and within the statement, “__LINE__” would be replaced by the number of the current line. So at runtime you end up with the line number modulo 10 (% 10). In C, something is true if its value is not 0. So for lines 4, 17, 116, or 39, for example, it ends up being true. For line numbers that are divisible by 10, the result is zero, and thus false.
In reality the compiler would optimise that modulo operation away and pre-calculate the result during compilation.
The original version behaves differently from run to run; this version would always give the same result… unless you change a line and recompile.
The original version is also almost always truthy, while this version would be false quite often. You could reduce the likelihood by increasing the 10, but make it too high and it will never trigger.
One downside compared to the original version is that the value of “true” can be 10 different things (anything between 0 and 9), so you would get a lot more weird behaviour since “1 == true” would not always be true.
A slightly more consistent version would be
((__LINE__ % 10) > 0)
They are very busy charging an arm and a leg for crappy software with shit support.
Makes sense, you clearly thought about this! From a world-building perspective I do have a follow-up question: a day of 86.4k seconds is essentially a convention, with no deeper reason behind it. In a society that throws out the hours and minutes, why keep our second? It seems like it would have made sense for them to define the day as 100k of some new (slightly shorter) unit. That would have given them 10 “hours” of 100 “minutes” of 100 “seconds”.
Why split the day into 8?
You definitely have a point with base-12 though. If base-10 wasn’t so ingrained already, base-12 would be a very logical choice. You can even count to 12 easily on one hand, using your thumb to keep track of where you are and counting on the segments of each of your 4 other fingers.
DocFx could do what you’re looking for. You write your content in Markdown, and it generates an interactive, customizable site.