It may come as a surprise to many, but people think of video game consoles as separate devices from the “normal” computers they use: smartphones, tablets, desktops, and so on. However you want to categorize it, a video game console is nothing short of a computer, just like your smartphone, your desktop, your microwave, and even any traffic light you have ever driven past. These days video game consoles are converging with the idea of the traditional “computer”: the “PC”. While “PC” had its use back in the old days of IBM, calling your desktop a “PC” and your PS4 something else is plain wrong. Likewise, refusing the label “PC” to anything that isn’t a Windows desktop form factor is also wrong.
People often differentiate between Windows and OS X as “PC” and “Mac”, but OS X and Windows both run on Intel processors, so it’s hard to say why a Mac is not considered a “personal computer” when it runs on the same microprocessor family that Windows does. The same goes for modern video game consoles: both the PS4 and the Xbox One run on what most people would call “PC” hardware, namely the x86-64 instruction set architecture (implemented in their case by AMD chips). In simpler terms, at the bare metal a PS4 runs on pretty much the exact same thing that your Windows, OS X, or Linux desktop does. It makes no sense to waste money and effort trying to keep the idea of the “video game console” separate from every other computing platform that can run games, which could be anything from an old NES 6502 chip to an overclocked Haswell i7.
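The claim above is easy to check for yourself. This is a minimal sketch using Python’s standard library: on a typical modern desktop it reports the same x86-64 architecture family that the PS4 and Xbox One CPUs implement (the exact string printed depends on your machine and OS).

```python
import platform

# Report the instruction set architecture of this machine.
# On most modern desktops this is an x86-64 identifier such as
# "x86_64" (Linux/macOS) or "AMD64" (Windows) -- the same ISA
# family used by the PS4 and Xbox One.
arch = platform.machine()
print(f"This machine's architecture: {arch}")
```

On an ARM-based tablet or phone you would see something like "arm64" instead, which is exactly the point: the architecture string, not the marketing label, tells you what kind of computer you are holding.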
The era of the “video game console” is coming to an end, and how long it rides out depends on how much longer the marketers and engineers of video games want this seemingly distinct “video game platform” to appear to exist. The marketing works because most consumers simply don’t know that a PS4 and a desktop can execute the same machine instructions, or that their Windows-based Surface tablet can play Wii and PS2 games through something called emulation.
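Emulation is less magical than it sounds: an emulator is a program that fetches, decodes, and executes another machine’s instructions in software. Below is a toy sketch of that fetch-decode-execute loop. The three-opcode instruction set here is invented purely for illustration; a real emulator decodes the target console’s actual opcodes (for example, MIPS instructions for the PS2).

```python
def run(program):
    """Interpret a list of (opcode, operand) pairs on a toy
    one-register machine -- the core loop every emulator shares."""
    acc = 0   # accumulator register of the emulated machine
    pc = 0    # program counter
    while pc < len(program):
        op, arg = program[pc]   # fetch and decode
        if op == "LOAD":        # execute: load a constant
            acc = arg
        elif op == "ADD":       # execute: add a constant
            acc += arg
        elif op == "JNZ":       # execute: jump if accumulator non-zero
            if acc != 0:
                pc = arg
                continue
        else:
            raise ValueError(f"unknown opcode: {op}")
        pc += 1
    return acc

# A two-instruction program: load 2, then add 3.
print(run([("LOAD", 2), ("ADD", 3)]))  # prints 5
```

The same loop, scaled up to a real instruction set and paired with emulated graphics and sound hardware, is how a tablet built on one architecture can run games compiled for another.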
What this does is give people the wrong impression of what a computer is and fails to teach them anything. People will think their magical video game boxes are unique, and they’ll spend $1,000 USD on them to watch Blu-ray movies and play the latest *insert console name here* games. Manufacturers could instead build genuinely dedicated game machines with their own distinct computer architectures, but they won’t, because designing and fabricating specialized chips and new hardware costs too much money. What do they do instead? They add a few compute units to an AMD graphics processor, pair it with a locked-down AMD x86-64 CPU on a semi-custom chip, design a specialized motherboard, slap on a cool name, and call it all a PS4. The deception is that consumers think a PS4 is different from any other computer hardware because it has an “awesome graphical interface”, which a 2012 gaming rig could easily have had as well.
In the older days game consoles actually DID have their own, more independent hardware and computing platforms. The PS2 is one example: it was built around a MIPS-based CPU, a lineage that ended with the PS3. The PS3 itself is pretty much a computing platform of its own, in the sense that you can’t simply buy its hardware off the shelf and slap it together for less than the system costs, the way you can with a PS4 or Xbox One. A home-built x86-64 machine won’t run PS4 or Xbox One games out of the box, but that’s beside the point: as long as people stay uneducated, big companies make money by selling you mushed-together hardware with a nice software interface to run their games on.
They won’t be able to continue down this path much longer before people realize that a “video game console” and a “computer” are really the same thing, and that it’s absurd to slap together commodity hardware and an interface, charge you extra, and market it as if it were “uniquely manufactured just for consumers”.