One of the reasons the Xbox 360 was rushed to market was that Nvidia stopped making the GPU for the original Xbox.
Microsoft didn’t have a choice; they couldn’t keep making the Xbox.
You don’t want to be locked in by a 3rd party like that.
Surely a company like Microsoft or Sega has enough weight to throw around to get a contract obligating the GPU provider to keep supplying GPUs for X years after the console’s launch, right? Maybe that was an oversight on the original Xbox, but I don’t see why they couldn’t do that now.
That’s how you get elevated, government-contract prices: the supplier has to be compensated for keeping factories making a product past its natural shelf life, and for the risk of being stuck with obsolete technology.
You would think, right? Apparently not…
I’ve not heard of this, but I imagine the reason is probably that they think they can make them more cheaply and with fewer restrictions.
Probably to avoid any issues with licensing and proprietary code and drivers.
deleted by creator
A quick google didn’t give me any hits. What are you referring to?
deleted by creator
Same reason Apple started making their own CPUs
Huh? Source on this? Their latest arcade board line uses Intel CPUs and Nvidia GPUs. That’s the only hardware they make, right?
deleted by creator
If they’re trying to make a consumer device with a sales price under $100 (the article says the Mega Drive Mini 2 is $75) that is very hard to do with a modern desktop processor and GPU. It’s also much more horsepower than they need for Saturn or Dreamcast games.
which is weird, because a Raspberry Pi 4 can emulate a Dreamcast at full speed nicely.
To me it sounds like this is saying they’d have to make their own hardware if they wanted to make Dreamcast or Saturn Mini, not that they intend to do so.
deleted by creator
Who says they do?
deleted by creator
When you’re dealing with that vintage of software it’s not a matter of rebuilding the program to target a different architecture. You’re talking about emulation.
They were probably opting instead to remanufacture chips that support the original instruction set, with none of the quirks of emulation.
The smaller the price-point target, the bigger the chunk of it each embedded component eats.
The markup/licensing for the GPU in a little device can be a significant percentage of that little device’s potential profit margin.
A $6 markup isn’t significant on a $1000 thing, whereas it IS significant when the target price is only $10 and you’re left with $4 for your entire contribution.
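The arithmetic above can be sketched in a few lines (the $6 markup and the $1000/$10 price points are just the figures from the example):

```python
def markup_share(price: float, markup: float = 6.0) -> float:
    """Return the fixed markup as a fraction of the sale price."""
    return markup / price

# The same $6 markup at two price points:
print(f"$1000 device: {markup_share(1000):.1%} of the price")  # 0.6%
print(f"$10 device:   {markup_share(10):.1%} of the price")    # 60.0%
```

So the identical component cost is noise at one price point and most of the margin at the other.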
I’ve discovered that the good heart-rate sensors cost in the $30 range, whereas the shit ones are much, much cheaper.
So if I were to put a fitness tracker into production that used the proper equations and didn’t spy on people (as Fitbit and smartwatches do), then I’d need to either create a shit product or bite the bullet and get the good sensors.
I can’t afford to create my own sensors, see?
Maybe Sega can afford to make their own GPU.
Do keep in mind that there has already been an open-source hardware GPU, IIRC, some years ago (shit performance, but if it’s public IP and can be significantly improved on, then it’s possible):
https://duckduckgo.com/?q=opensource+hardware+gpu&t=fpas&ia=web