
  • 1 Post
  • 28 Comments
Joined 2 years ago
Cake day: April 11th, 2022


  • It doesn’t help that they keep deprecating and changing standard functionality every other version. It’s as if they can’t make up their minds and everything is subject to change. Updating to the most recent release can suddenly produce tens or hundreds of compiler warnings and errors, and things may no longer behave the same. Then you look up the new documentation and realize you have to refactor a large part of the codebase because, for whatever reason, the “new way” is vastly different.





  • It’s capitalism. Nobody wants to take risks anymore, because game development is now a huge industry with many investors who demand that their developers play it safe and aim for mass appeal. Then even the indies (those who view game dev primarily as a business) see that what the big players are doing “works”, so they strive to do the same, because otherwise they couldn’t stay competitive. Only devs who make games because it’s fun, and who don’t really think about the industry or the current market, can make good games. They are a small minority at this point, because capitalism favors the business-minded kind.


  • I feel the same. I play very few games anymore for this reason. Most of the games I do play are very niche, because they at least try to be unique, like Fates of Ort or Dust Riser. I also tend to play a lot of open-source re-implementations like OpenRCT2, OpenMW or Augustus (Caesar III).

    Game stores are flooded with crap that tries to appeal to the widest possible audience, which makes it really difficult to find actually good games. Even most indies only produce slop. Nintendo used to make really interesting games back in the day. I still have my Game Boy and 3DS with some of my favorite games, so at least I can still enjoy those.







  • I have a similar theory. All “AIs” in use right now are really just giant matrices that you multiply vectors against. It’s the same concept that’s used in computer graphics all the time, which is why GPUs are so good at training and running them. To me, a more “real” intelligence would need to grow and develop on its own, completely organically, without any human input, and not just be a math problem. It would have to be dynamic and fluid, much like a real brain. Neurons would need to function more like individual entities and behave like real neurons, rather than being items in an array used in simple floating-point operations. If you can express any core part of an “AI” as a simple function, it’s not really an AI.

    Note that I’m not an expert. I just spent some years experimenting with different types and combinations of conventional neural networks and reading research papers, and eventually came to the conclusion that they’re a dead end due to their static nature. This realization actually made me lose interest in “AI”, because these things are really just “smart” input mappers that can take well-educated “guesses” at what the output might be.
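    The “giant matrix” view above can be sketched in a few lines: one layer of a conventional neural network is nothing more than a matrix-vector multiply, a bias add, and an element-wise function. A minimal sketch (all sizes and numbers here are made up for illustration):

    ```c
    #include <stdio.h>

    #define OUT 2
    #define IN  3

    /* One "layer": y = ReLU(W*x + b) -- just floating-point math
       on arrays, exactly the kind of work GPUs are built for. */
    void layer(const float W[OUT][IN], const float b[OUT],
               const float x[IN], float y[OUT]) {
        for (int i = 0; i < OUT; i++) {
            float acc = b[i];                 /* start with the bias   */
            for (int j = 0; j < IN; j++)
                acc += W[i][j] * x[j];        /* matrix-vector product */
            y[i] = acc > 0.0f ? acc : 0.0f;   /* element-wise ReLU     */
        }
    }

    int main(void) {
        const float W[OUT][IN] = {{1, 0, -1}, {2, 1, 0}};
        const float b[OUT] = {0.5f, -1.0f};
        const float x[IN]  = {1.0f, 2.0f, 3.0f};
        float y[OUT];
        layer(W, b, x, y);
        printf("%g %g\n", y[0], y[1]);  /* prints "0 3" */
        return 0;
    }
    ```

    A whole “deep” network is just many of these stacked, which is why the entire thing reduces to a chain of matrix multiplications.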







  • Westmere Xeon processors are still quite OK imo. I have an old enterprise machine with one; 12 threads at 2.6 GHz is still quite usable for many things. I mostly use it to compile larger software. But personally I’d argue that Loongson is already far better than Intel/AMD, since Loongson is based on MIPS, which is a RISC design, while Intel/AMD still cling to their bloated and way too power-hungry CISC crap. Plus, today most performance comes from parallelism and cache size rather than core frequency, and Loongson already has 128- and 256-bit vector instructions in its ISA, which is pretty decent. Maybe they can figure out a 512-bit vector extension that doesn’t severely throttle the CPU when using it before Intel can, lol.
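    To illustrate what 128-bit vector instructions buy you, here is a minimal sketch using GCC/Clang vector extensions (an assumption: that the compiler targets a machine with 128-bit SIMD, e.g. SSE on x86 or LSX on LoongArch — the source types and values below are made up for the example):

    ```c
    #include <stdio.h>

    /* Four 32-bit floats packed into one 128-bit vector: the compiler
       can lower the "+" below to a single vector-add instruction
       instead of four scalar adds. */
    typedef float v4f __attribute__((vector_size(16)));

    int main(void) {
        v4f a = {1.0f, 2.0f, 3.0f, 4.0f};
        v4f b = {10.0f, 20.0f, 30.0f, 40.0f};
        v4f c = a + b;  /* one instruction, four element-wise adds */
        printf("%g %g %g %g\n", c[0], c[1], c[2], c[3]);  /* prints "11 22 33 44" */
        return 0;
    }
    ```

    A hypothetical 256-bit type would simply use `vector_size(32)` and process eight floats per instruction — that per-instruction width, plus more cores and bigger caches, is where the parallelism-driven performance mentioned above comes from.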