Nintendo Wii: Sold like gangbusters.
64-bit Processors: The computing standard.
Battlestar Galactica: Considered one of the greatest sci-fi series of all time.
Facebook: Continues to be the world’s leading social media platform, with literally BILLIONS of users.
High Definition: HD only got even more HD.
iPhone: Set the standard for smartphone form factor and function to this day, 16 years later.
To be fair, a lot of these are accurate, or at least were at the time.
-
Multi-GPU just never caught on. There’s a reason you don’t see even the most hardcore gaming machines running SLI today.
-
The Wii’s novelty wore off fairly quickly (about the time Kinect happened), and it didn’t have much of a lasting impact on the gaming industry once mobile gaming slurped up the casual market.
-
Spore is largely forgotten, despite the enormous hype it had before release. It’s kind of the Avatar of video games.
-
It took years for 64-bit to become relevant to the average user (and hell, there are still devices being sold with only 4GB of memory even today!). Plenty of Core 2 Duo machines still shipped with 32-bit versions of Windows, and people didn’t notice or care: basically no apps average people used were 64-bit native back then, and you were lucky to have more than 4GB in your entire machine, let alone need more than that for one program.
-
Battlestar Galactica (2003) fell off sharply after season 2 and its ending was some of the most insulting back-to-nature religious tripe that has ever had the gall to label itself as science-fiction.
-
Downloading movies over the internet ultimately fell through the cracks outside of piracy. Most people stream films and TV now, and people who want the extra quality tend to buy a Blu-ray disc rather than download from iTunes (can you even still do that with modern shows?)
-
I definitely know people who didn’t get an HDTV until 4K screens hit the market, and people still buy standard-def DVDs. Hell, they’re still outselling Blu-rays close to 20 years later. Calling HD a dud is questionable, but it was definitely not seen as a must-have by the general public, partly because that shit was expensive back in 2008.
-
The Eee PC and the other netbooks were only good when they were running a lightweight operating system like Linux or Windows XP. Once Windows 7 Starter became the operating system of choice for netbooks, the user experience fell off a cliff and people tired of them. Which is a shame, because I love little devices like UMPCs.
-
The original iPhone was really limited for 2007. No third-party applications, no 3G support, no voice memos, you could only get it on a single carrier… the iPhone family did make a huge impact in the long run, but it wasn’t until the 3GS that it was a true competitor to something like a Symbian device.
The only entry on this list that’s really off the mark is Facebook, which even at the time was quickly reshaping the world. And I say that as someone who hates Zuck’s guts and has proudly never had a Facebook account.
Are you joking? I thought the Wii was a wild success, I remember it being very popular.
Multi-GPU video cards (not multiple video cards) might be making a comeback.
Possibly, now that we have much tighter integration between different chips using die-to-die interconnects like Apple’s “UltraFusion” and AMD’s “Infinity Fabric” to avoid the latency and microstutter issues that came with old-fashioned multi-GPU cards like the GTX 690 and Radeon HD 7990.
As long as software can make proper use of the multiple processing units, I think multi-GPU cards have a chance to make a comeback… at least if anyone can actually afford the bloody things. Frankly, GPU pricing is a bit fucked at the moment even before we consider the idea of cards with multiple dies.
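To make “proper use of the multiple processing units” a bit more concrete, here’s a minimal sketch of the explicit work-splitting software has to do once the driver no longer pretends two dies are one GPU. This assumes a PyTorch build with CUDA support; the workload is made up purely for illustration:

```python
import torch  # assumes a PyTorch build with CUDA support

# Hypothetical workload: a big batch of matrix multiplications.
batch = torch.randn(1024, 256, 256)

if torch.cuda.device_count() >= 2:
    # Split the batch and run each half on its own GPU. This explicit
    # split is what SLI/CrossFire drivers used to fake behind the
    # application's back, with all the microstutter that implied.
    partials = []
    for i, part in enumerate(batch.chunk(2)):
        part = part.to(f"cuda:{i}")
        partials.append(torch.bmm(part, part))
    # Gather the partial results back onto one device to combine them.
    out = torch.cat([p.to("cuda:0") for p in partials])
else:
    out = torch.bmm(batch, batch)

print(out.shape)  # torch.Size([1024, 256, 256])
```

The point is that the split, the device placement, and the gather are all the application’s problem now, which is exactly why multi-GPU only works when software is written for it.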
> Downloading movies over the internet ultimately fell through the cracks outside of piracy. Most people stream films and TV now, and people who want the extra quality tend to buy a Blu-ray disc rather than download from iTunes (can you even still do that with modern shows?)
> I definitely know people who didn’t get an HDTV until 4K screens hit the market, and people still buy standard-def DVDs. Hell, they’re still outselling Blu-rays close to 20 years later. Calling HD a dud is questionable, but it was definitely not seen as a must-have by the general public, partly because that shit was expensive back in 2008.
I feel like both of these are wrong.
Streaming is just the next step after downloading movies. Downloading movies was never really a “thing”, and this is really just some tech writer not understanding the term streaming. Netflix had just launched as a streaming video service in 2007, and that’s the hyped thing they are referring to. They are not referring to downloading movies on iTunes or whatever. If they are, then my argument is that no one ever hyped that up, especially not in 2008.
This is also for PC, not TV. https://web.archive.org/web/20081216002805/http://store.steampowered.com/hwsurvey/ shows that 1024x768 was the majority, followed by 1280x1024. Sadly I can’t pull stats from earlier because Valve decided to use Adobe Flash for these stats until 2017. But just 10 years later, https://web.archive.org/web/20180331131102/http://store.steampowered.com/hwsurvey shows 1080p had become 75% of the PC gaming base. HD certainly should have been as hyped as it was. It’s become the absolute standard. If you don’t have at least a 1080p monitor you are far behind.
Also I wanted to address this:
> The Wii’s novelty wore off fairly quickly (about the time Kinect happened), and it didn’t have much of a lasting impact on the gaming industry once mobile gaming slurped up the casual market.
It absolutely did have a lasting impact. It changed Nintendo forever. It’s seen as one of the best examples of a Blue Ocean strategy working. It’s why we had the Wii U and thus eventually the Switch, which in turn influenced the Steam Deck. The Wii also directly influenced its competitors to create the failed Kinect and the failed PlayStation Move. It was the best-selling home console from 2006 to 2013, when the PlayStation 4 launched, which in turn was overtaken in 2017 by the Switch as the best-selling console.
Lastly, this article was largely wrong and clearly written without substance. If you look at the original magazine (https://issuhub.com/view/index/1138), they don’t even go into detail or expand on their thoughts. It’s a fluff page.
-
Honestly feels like satire reading this today
The Wii was extremely popular. For years, it outsold every other console by a huge margin.
Netbooks absolutely were overhyped, and the market for them died really quickly. They were barely usable, and by 2010 when tablets really started hitting the market, there wasn’t a space for them anymore.
HDTVs weren’t overhyped, they were just expensive, and in 2008 there wasn’t that much content to take advantage of it. I had a 32" 720p TV that I paid nearly $700 for in 2007. Now, you can get a 40-something inch 4K TV for a little over $200, and there’s plenty of content to make it worthwhile (though the real-world benefit of 4K on such a small set is debatable).
The first iPhone was so incredibly polarizing at the time. The hype machine leading up to that announcement was unlike any other product launch I can recall, so it was never going to live up to that kind of hype. And while it was limited in features for its time, it was clear more was on the horizon. And given how it revolutionized not only the phone market but also the web as a whole, we know how it all ended up.
The Wii was overhyped though. Most players never bought any game other than Wii Sports. I had an unlocked Wii and played all the good titles, and there are no more than ~10 of them. Most Wii games (looking at you, NFS) felt like half-baked mobile ports.
And the Wii U sales showed that. Yeah, the Wii sold to tons of casuals, but hardly any of them upgraded, even though the Wii U was a much more capable system.
The most frequent question I hear to this day when talking with former Wii owners is “What’s the benefit of the Wii U and why would I need to upgrade?” That’s a question I have never heard in relation to any other game console. Or have you ever heard the sentence “What’s so special about a PS3 if I already have a PS2? Why would I need to upgrade?”
And this set up the Wii U to be such a huge commercial flop that Nintendo effectively cancelled their stationary game console line.
I would say it was seriously overhyped, similarly to the Netbooks. It was a fad, it was cool, boatloads of non-techy-people bought them, and none of them bought the successor so it all died quickly.
For the rest I agree though.
The Wii had a ton of great games outside of the Nintendo-specific ones. The Conduit 1 and 2, GoldenEye, tons of fighting games, and it gave us No More Heroes. The Force Unleashed arguably had its best edition on the Wii (this is mostly subjective, but there’s a strong consensus that the Wii’s version held up). Its main appeal over other consoles, I think, was how diverse the games could be: silly games like Boom Blox and de Blob, and niche ones like Endless Ocean for all the marine-biologist kids.
Granted, I grew up with some of these games, and I’m not trying to say that the Wii’s extensive library is all stellar. But there are many gems among it. The Wii’s popularity drew a lot of attention to games that would just be scrolled past as shovelware on other online stores (Xbox Live, mostly). Few of these would have made it outside of Xbox Live Arcade or whatever it was called, but on the Wii they got digital and sometimes even physical editions. Also, because of how wide its demographic was, it had a few surprisingly decent Barbie-esque and horse-care games. I mean, it had so many games made for it that it only stopped getting new releases in 2020.
The Wii U was an attempt to bridge the gap between the success of their portable line, the DS, and the Wii. Growing up, all any kid ever wanted was to get their consoles connected. But then when the Wii U finally came out and was marketed, its main selling point was that you could play your game on the tablet while someone else was using the family TV. I mean really, it was exactly what every 10-14 year old into Nintendo was talking about, right up until Nintendo actually made it.
Part of it was marketing; I remember a lot of people being surprised that it wasn’t just the gamepad being sold, but a whole console with it.
It’s crazy that it failed, honestly, but at the same time it’s totally understandable. You can’t try to be both a home console and a “portable” one when the portable part has to stay tethered to the Wii 2. It was the genetic blueprint for everything the Switch became.
The Wii U was an amazing platform specifically because of the second screen. It lent itself nicely to asymmetric multiplayer (Rayman Legends is so good on the Wii U), and the ability to play without using the TV is also pretty nice.
Yeah marketing was certainly an issue, but the other issue is that the Wii was mostly owned by casuals and casuals don’t need to upgrade ever.
Hence why the Wii received its last game (Just Dance 2020) a whole year later than the Wii U received its last game (Just Dance 2019).
God, GoldenEye online was so amazing though for a long time. Only reason I got a Wii.
In 2008 there was still new stuff being shot in a 4:3 aspect ratio in some places, never mind HD. I remember when Top Gear did the polar challenge the year before; it was a real showcase for HDTV.
Dang I had no idea .LGBT was a top level domain
It’s a mastodon instance. Calm down.
…so?
Sorry I’m confused at your comment there. Care to elaborate?
All I’m saying is I think it’s cool ‘lgbt’ can be used in this way
Sorry if I was unclear
My mistake. The internet has programmed me to assume any reply in the context of lgbt has hostile intent. ❤️
Just to clarify their point, they weren’t pointing out the mastodon instance, they were pointing out the LGBT TLD (top level domain). It’s interesting, and not widely known, that anyone can make whateverwebsite.lgbt now. You could own the africangrey.lgbt domain for $11.99/yr if you wanted. Nothing to do with mastodon.
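If you’re curious whether a TLD like .lgbt is actually live, one quick sanity check (a sketch, assuming the third-party dnspython package is installed) is to ask for the TLD’s nameserver delegation:

```python
import dns.resolver  # third-party: pip install dnspython

# Every real TLD is delegated from the DNS root via NS records,
# so this query succeeding means .lgbt is a live top-level domain.
for ns in dns.resolver.resolve("lgbt.", "NS"):
    print(ns)
```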
The only ones that look wrong to me are the iPhone, the Wii, and downloading movies from the internet. The rest are dead on. HD literally did exactly what they said: as soon as it got popular, it was already on to 2K, then 4K, then 8K.
They called out 64-bit computing explicitly, citing the need for more 64-bit apps. Dead on as well. When Apple went 64-bit only there was a huge uproar, even from me, because even in 2018 that was a huge problem.
On Facebook, they literally agree it’s fun and distracting, but say it’s not revolutionary, so there’s no reason for the hype. It’s the same thing as with TikTok.
I’m really surprised with how accurate it is honestly.
I disagree. HD lasted a super long time. That there would be a new standard after HD was never in question. As far as standards go, it lasted a very long time and did about as well as any standard could.
64-bit was an absolute necessity. That it was a lot of work to switch to does not mean it was overhyped.
I don’t like Facebook, but that doesn’t mean its success can be ignored. It became the biggest social network and was regularly mentioned in the same breath as Google and Microsoft, so I can’t see how it was overhyped, however much I dislike them.
The point is you should read the article rather than going off of headlines. Each thing in the list states as much.
The HD paragraph literally states exactly what happened. As soon as HD made it to the mainstream (actual TVs, laptops, monitors), it was already outdated. They were saying not to overhype it because this will keep happening. And they were completely right.
They complained about 64-bit being overhyped because it was. Until you could get almost any app in 64-bit, it was useless for all but the most tech-savvy.
No one is ignoring the success of Facebook. They’re saying that Facebook as a social network was overhyped. It wasn’t the first, and there was nothing remarkable about it. Just because Cavendish bananas are the most popular and most successful bananas of all time doesn’t mean other very good, very tasty bananas didn’t exist before them. Cavendish bananas are just successful.
Nah, the iPhone is also overhyped and overpriced
always has been
HD is still the standard for most? I don’t know of a single person who uses a 4K TV. 4K is still in the early-adopter phase.
EEE PC!!! I miss the age of netbooks - I had a similar one, the MSI Wind - my favorite computer ever :')
I had an HP Mini 311: Atom CPU, 11.6" screen, 3GB of RAM, wifi/bluetooth, a dedicated NVIDIA GPU, and an HDMI port. It was impressive at the time in 2008 and could play 1080p using the GPU alone. I loved the form factor. Since then I favour 13" or 14" laptops; I don’t want/need a 17.3" laptop with a DVD drive and all!
Since I love playing devil’s advocate, here’s a couple of points in their defense:
Multi-GPU video cards: Pretty much dead; it’s just not efficient.
64-bit computing: At the time it was indeed slightly overhyped, because while your OS was 64-bit, most software was still 32-bit, games in particular. So games couldn’t really use more than 4 GB of memory (there’s a quick sketch of the address-space math after this list). And that stayed standard for multiple years after this article: this was 2008, 64-bit Windows had been out for ages, and yet 3 years later the original Skyrim release was still 32-bit. Games shipping 64-bit binaries was a huge thing at the time. Now most software is 64-bit and yes, NOW it’s standard.
High definition: Depends, did they mean HD or Full HD? Because the former certainly didn’t last long for most people; Full HD replaced it real quick and stayed around for a while. Of course, if they meant Full HD then hell no, they were hella wrong: it’s been mainstream for a while and only now is it being replaced by 1440p and 4K UHD.
iPhone: The FIRST one as a singular product really didn’t live up to the hype. It was missing features that old dumbphones had. Of course the overall concept very much did revolutionize the phone market.
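On the 64-bit point above, the 4 GB ceiling is just pointer-width arithmetic. Here’s the promised sketch (plain Python, purely illustrative) showing the build width and the address-space math:

```python
import struct

# Pointer width of the running interpreter:
# 4 bytes on a 32-bit build, 8 bytes on a 64-bit one.
bits = struct.calcsize("P") * 8
print(f"This is a {bits}-bit Python build")

# A 32-bit pointer can address at most 2**32 bytes = 4 GiB
# (and a 32-bit Windows process got only 2 GiB of user address
# space by default, which is why big games hit the wall so early).
print(f"32-bit address space: {2**32 // 2**30} GiB")
print(f"64-bit address space: {2**64 // 2**60} EiB")
```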
Well, to be fair, changes like switching to 64-bit are always very slow (especially if they’re not forced by completely blocking 32-bit). But I don’t think it was overhyped; it just takes time, and more RAM was definitely needed to achieve the kinds of games/apps we have now.
Well by 2008 we’d had consumer-grade 64-bit CPUs for 5 years and technically had had 64-bit Windows for 3, but it was a huge mess. There was little upside to using 64-bit Windows in 2008 and 64-bit computing had been hyped up pretty hard for years. You can easily see how one might think that it’s not worth the effort in the personal computer space.
I feel like it finally reached a turning point in 2009 and became useful in the early to mid 2010s. 2009 gave us the first GOOD 64-bit Windows version with mass adoption, and in the 2010s we started getting 64-bit software (2010 for Photoshop, 2014 for Chrome, 2015 for Firefox).
It was different for Linux and servers in particular, of course, where a lot of open-source stuff had official 64-bit builds in the early 00s already (2003 for Apache, for example).
They were right about Facebook.
> HD only got even more HD.
That’s exactly what they were saying, no? “even more HD” = UHD.
> Downloading movies from the internet (is wildly overhyped)
lol what does that even mean
They’re probably referring to streaming. I don’t know when this article came out, but considering they’re talking about literally the first iPhone I guess we can assume it’s 2007 or 2008, and Netflix started streaming back in 2007.
Apart from that, perhaps they’re referring to when Amazon started offering movies to buy or rent online, but I don’t know when they started doing that.
The first movie available to buy on iTunes was High School Musical in March 2006. Apple then launched the iTunes Movie Store in September and by the next year (sometime around when this was probably written) was bragging about having sold a couple million movies.
Anyway, I think they’re likely talking about downloading proper rather than streaming.
It’s funny how piracy was a step ahead of the industry. People around 2008 were hyping downloading movies from the internet, when pirate sites had been doing that since the early 2000s, or perhaps even earlier.
Consumerism, it seems, is always overhyped.
[amazing, everything you said was wrong](https://youtu.be/2sRS1dwCotw)
They got Spore right.
To be fair, Spore was overhyped: it was fun enough, but not the total gamechanger it was forecast to be. Will Wright had two amazing strikes with SimCity and then The Sims, followed by a whole pile of very middle-of-the-road simulation games, so it wasn’t that hard to foresee.
And EEE PCs occupied the uncomfortable niche where they didn’t do a lot that your phone couldn’t, while being extremely limited compared to a £300 ‘proper’ cheapo laptop. That’s not really a business model.
So yeah, that’s two things anyone could have seen coming, versus eight where they’re so massively, completely wrong they couldn’t have failed harder if they tried. They’d have done better calling this list ‘things which are not massively overhyped’.
I was going to say that I agree iPhones and smartphones are of no benefit, they don’t do anything (I’m being sarcastic), but looking closer at the list, how did they get it all so very wrong?
Don’t need 64-bit for more than 4GB? Every new computer should have 32GB, and 64GB is not unreasonable.
Don’t need full HD? How does 8K resolution sound with 16K being developed?
I question their basic knowledge of and experience with technological advancements for higher-demand, more complicated workloads, and advancements in security protections like 64-bit memory address randomization that can’t exist on 32-bit hardware.
> Every new computer should have 32GB
My parents do not need 32GB of RAM to browse the internet.
Haha, I mean I’m using up 8 gigs of RAM with 10 Chrome tabs open; 16GB is definitely starting to look more lightweight than I’d like at the moment. I built my PC in 2020 with 16GB, and around the end of 2021 I ended up upgrading with two more sticks, just to keep things running smoothly!
And your RAM usage is exactly why I say 32GB, or 4GB per core: to make sure everything runs while not all the hardware is in use, so everything stays at max performance.
For example, if an 8-core 32GB system is using 6 cores and 14GB, then all programs are running as they should with ease. But if a 16GB system is using 14GB, there’s not enough memory and it might already be paging to disk.
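For what it’s worth, you can check where your own machine sits relative to that rule of thumb with a few lines of Python (a rough sketch; assumes the third-party psutil package):

```python
import psutil  # third-party: pip install psutil

mem = psutil.virtual_memory()
total_gib = mem.total / 2**30
avail_gib = mem.available / 2**30

print(f"Total RAM:     {total_gib:5.1f} GiB")
print(f"Available RAM: {avail_gib:5.1f} GiB ({100 - mem.percent:.0f}% free)")

# The rule of thumb above, roughly: once available memory gets
# close to zero, the OS starts paging and everything slows down.
if avail_gib < 2:
    print("Close to paging territory.")
```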
“All I need is my Dell OptiPlex from work on Windows XP” - that reviewer.