power to the people

During this recent (and most impressive) string of PR failures on Microsoft’s part, all sorts of people came out of the woodwork on both sides of the argument. Among them was one engineer at Microsoft, who released a statement on pastebin explaining and defending some of the company’s less popular policies. But Microsoft’s policies aren’t what bothered me (their failure to competently explain their position was rather annoying, though); what got to me was a particular phrase in the text:

Microsoft is trying to balance between consumer delight, and publisher wishes. If we cave to far in either direction you have a non-starting product. WiiU goes too far to consumer, you have no 3rd party support to shake a stick at.

To me, this statement is indicative of the ongoing conflict between publishers and consumers, not just in the gaming industry, but virtually any market. Until about a generation ago, the gaming industry seemed to be more consumer-friendly than others. I shouldn’t be surprised that didn’t stick.

The phrase “too far to consumer” in particular irks me. There should be no such thing as “too far to consumer”. The consumers have the money. They decide what they want to buy. A publisher cannot demand I do business on their terms. They are asking for my business. If they lay down demands and prerequisites, they don’t get my business.

Consumers need to stand up and let the publishers know this. They need to stop lying down and saying, “oh well, that’s how it is, may as well get used to it.” I am far from a principled person, and I have caved to some things, like Mass Effect 3, even though it meant having to put up with Origin. But other things, like Blu-ray, I refuse to embrace because of its oppressive DRM. But the notion that publishers need to wage war against consumers, and wrap it in a cloak of “cloud computing” and “all-digital platform” half-truths, is absurd. At least Steam was open and up-front about its strategy to become an exclusively digital distribution service. Rather than make the same claim, Microsoft was attempting to say that they would somehow be superior to Steam while levying restrictions that were more burdensome, with few advantages.

On a wider scale, publishers and developers have adopted a “divide and conquer” strategy for their products over the past several years. This represents a more subtle, but equally effective, approach to belittling their consumer bases. Often it’s compounded with a pseudo-currency system designed to force people to buy in bulk and end up with leftover credits that can’t be spent anywhere else.

Publishers and distributors should be begging for customers’ business. They should be asking nicely for those crisp dollars, and if they want me to give them more dollars, they should be providing me with content that is worth those dollars. When a developer finishes work on a chunk of DLC they should be stopping and asking themselves, “is this worth the money we want them to pay for it?”

But the sad reality of economics is that things aren’t worth a value determined by the cost of the raw goods to produce them, or the man-hours spent, or the weight of the Brazilian royal family in 1871 divided by the rolling average of the number of car accidents in the developing world. They’re worth what people will pay for them. If people demonstrate they are willing to pay $90 for a used video game, publishers and retailers will charge them that. It’s up to the developers and publishers to put out content worth the price tag, but it’s up to consumers to stand up and say “no” when it’s not worth it.

wii woes, part 3

Nintendo has managed to correct its wrongs in recent years, but the list of things they’ve failed to do right is still considerable. With the Wii U, many old mistakes were learned from…and many new ones are in the process of being downplayed, if not ignored.

Part Tri Ultimate: Nintendo Network
Online gaming has been something of a mystery to Nintendo. In the 90s, a small but thriving community existed in the form of Satellaview. While the service’s user base never exceeded 120,000, it had a loyal core that helped keep it alive well into 2001, just 18 months before the debut of Xbox Live. In 1999, Nintendo launched RANDnet as a successor service to support the 64DD; unfortunately both failed.
Perhaps feeling burned by the winding down of Satellaview and the downfall of RANDnet, Nintendo refused to even consider the possibility of online gaming as they went into the sixth generation. While a broadband adapter was released for the Gamecube, only seven games supported it, and only four of those supported online play. The Gamecube’s online community–if it could be called that–scraped by, barely existing for about six years before Nintendo delivered a coup de grace in anticipation of the Wi-Fi Connection service.
But WFC was just another blundering stepping stone for Nintendo. The service wasn’t conceived until after the DS and Wii had reached the market, and the software was difficult to deploy to both platforms. Nintendo’s solution was to put it in the game cartridges, which only created more problems. With no centralized piece of data to rely on, it was necessary to make use of friend codes.
Oh yes, friend codes. Their legacy is so damning and tainted I won’t even go into it here.

With this generation Nintendo has made their first real attempt at creating an online service to compete with Microsoft and Sony. Behold, Nintendo Network. Finally, a service with a centralized profile, a messaging system, and the ability to join online games in a manner similar to that on competing platforms.

But it’s still not quite enough. The Network lacks a real method of mass interactivity; the Miiverse seems to want to emulate environments like Sony’s Home, but is really just a visual representation of a message board. The board itself lacks many features that have long been integrated into even the most basic forums. Direct responses are not an option; one can only respond to the main post in a thread and hope that anyone else addressed will see the message. The one function that is both unique to NN and useful is the ability to post a screenshot of a game to the forums. This is actually something I would love to see in other services.

The system is also heavily fragmented; Nintendo leaves virtually every aspect of it up to the publishers of each game. While this is great for publisher freedom, it means the user has a very inconsistent experience. Some games may support parties, some may support voice chat. There are no cross-game parties or chats. These are things that need to change for this service to compete.

Even headset support itself leaves much to be desired. There is no Bluetooth support; only 3.5mm headsets will work, and even then coverage is spotty. Really the only good choices are Nintendo’s first-party headset or one made by Turtle Beach specifically for the Wii U. Even then, headsets can only be used with the gamepad, as the Pro Controller lacks a 3.5mm port. This all adds up to create a distinct impression of a colossal lack of planning. At the very least, adding a connector port to the Pro Controller would be greatly appreciated; Bluetooth headset support would be ideal, however unlikely.

From Nintendo’s point of view, the Network is a huge leap forward, bringing them closer to their competitors’ online gaming and social webs. From outside, though, it’s less significant. I would really call it a Planck step, personally. But it’s a step. Now if they can just take a few more…

wii woes, part 1

Let’s be honest. The Wii U is not doing well. There are a lot of reasons for this. Some are Nintendo’s fault, some aren’t. More importantly, some of these reasons can be compensated for. Some can’t.

Perhaps Nintendo’s single biggest error with the Wii U has been its marketing. The name “Wii U” was a terrible choice. It carries the implication that the product is either an add-on to, or an upgraded version of, the Wii. Many people are still under the impression that it is nothing more than a tablet that works with the Wii. The direct result is that many people don’t feel inclined to buy it. Nintendo hasn’t done enough to differentiate the new from the old.

While the name can’t be changed (at least, not without causing even more confusion), Nintendo can always retool their marketing, and make customers more aware that this is a new product. Meanwhile, there are far bigger issues that need to be confronted by Iwata and company.

Part The First: Third Party Support

This is where Nintendo has traditionally trailed far behind its competitors. Ever since the Nintendo 64, they have struggled to maintain connections with other publishers and developers while Microsoft, Sony and others shovel dozens of games with long-running consumer bases onto their consoles.

At this point the Wii U is stuck in a vicious feedback loop. Currently, Black Ops 2 has an online player base of about 2,000-4,000 players on a daily basis; Xbox Live tallies about 200,000 on an average day. As a result, Activision feels less inclined to support the platform, including by releasing DLC on it. That, in turn, means less DLC can be sold. So far none of the Black Ops 2 DLC has been released on Wii U.

In a similar boat, the Wii U release of Injustice has received significant content support, but still little in comparison to its brethren. The DLC that has come to the platform has all arrived considerably late. On top of this, Injustice lacks a very particular feature: the ability to play with friends online. The only available option is to play against random opponents (or not so random, in the case of ladder games). One cannot simply pick their friends off a list and play them. This can only be done in local multiplayer.

In September 2012, the Mass Effect trilogy was released as a bundle for Xbox 360 and PlayStation 3. While it wasn’t much more than a convenient package for 360 customers, it allowed PS3 owners to experience the first game for the first time. The trilogy was not released on Wii U, and there are currently no plans to do so. A reworked version of Mass Effect 3 was released on the platform, making use of the gamepad. It received good reviews; however, it only includes DLC that was already on the market beforehand, and EA does not plan on releasing any more of the paid content that came out afterward.

Speaking of local multiplayer, there are some games that omit online play entirely, even when it seems like an obvious inclusion. Tank! Tank! Tank! is one of these games. Despite having broad appeal and a variety of game modes, the best that can be done is four-player local. At this point, the upcoming Arkham Origins is not planned to have any multiplayer at all. While I’m not particularly interested in multiplayer with regards to the Arkham games, no doubt it comes as a slap in the face to the millions of Wii U owners who plan (or were planning) to buy it on that system.

While the userbase is lacking compared to competing platforms, the fact remains that a product never placed on the shelf can never be sold. A publisher certainly isn’t going to build consumer confidence when its customers feel punished for buying its product. The community has been practically begging publishers to release DLC, with responses that can be generously described as indifferent and ambiguous. Then those same publishers turn around and state that upcoming games will not have comparable feature sets because of the lack of sales, seemingly baffled as to the cause.

Someone needs to break this cycle. While the Wii U and Nintendo Network aren’t what everyone wanted, on the whole it’s been a step forward for them. Nintendo finally has a system and a network that can sustain the functionality its predecessors long lacked. It’s time for the publishers to take the risk. Put the content out, and people will buy it. They’ve been begging for the privilege to do just that.

I for one will likely be buying the upcoming Call of Duty: Ghosts on Wii U. While the PC version will likely receive more content, have a far larger player base and let me do things like listen to music while playing–not to mention the natural advantages of shooters on PC–my desire to see this system move forward trumps that. If the publisher is going to take the risk of putting content on the system, I as the consumer will take the risk of buying that content, hoping they will see that it’s worth their while to invest further. Ultimately, the antidote to these vicious cycles comes down to hope and trust.

vita excrucio

I’m now convinced that Sony is determined to come up with the most sadistic way to store data possible, and see if people still buy it.  It’s like they’ve created their own tech-centric version of Opus Dei that drives its followers to inflict suffering upon themselves in some futile chase to gain the approval of God.

Let’s look at Sony’s recent history of storage formats, shall we?

Start in 1975, with their Betamax cassette.  While a significant jump in quality over the competing VHS, Beta had two major drawbacks.  The first was limited capacity, a logical expense for the tape’s higher resolution (roughly 50% greater than VHS).  The second was much more detrimental–Sony’s jealous and suffocating guarding of the Betamax technology.  Few outsiders were allowed licenses to produce the cassettes and players/recorders, hindering the supply of releases and increasing the cost.  Beta died a slow, lingering, and unsurprising death at the hands of the much more widely available VHS.

In 1992, Sony decided to take another swipe at a format completely under their control, and the MiniDisc hit the market, an attempt to combine the best attributes of magnetic and disc media.  Sony again failed to open the technology to consumers, producing only a handful of MiniDisc read/write drives (not counting the MiniDisc players themselves, some of which could be used to write data to cartridges).  While it was (sort of) a hit in Japan, MiniDisc failed to gain any traction in America or Europe.

Skip ahead another decade, to Sony’s introduction of the Universal Media Disc to coincide with its PSP system.  While it had promise with its high capacity of 1.8GB, Sony again closed off avenues of production, amid piracy concerns.  Intent on making a portable disc-based gaming system, they designed it in a very similar physical format to the MiniDisc, encasing the actual disc in protective plastic that turned out to be rather flimsy and fragile.  Once again, Sony’s attempt to stifle piracy failed, as it was relatively easy to use a PSP as a reader connected to a computer and directly rip images of games off the UMDs themselves.

Meanwhile, the Memory Stick was pushed to market in 1998, for use in Sony’s digital cameras and the (late) Clié PDA.  While a formidable technology in terms of performance, once again Sony stifled production in the name of control, and the Memory Stick was limited almost entirely to use in Sony’s own products.

The company managed to change their fortunes with the breakout Blu-ray technology.  Designed as a successor to DVDs, Blu-ray competed directly with the lower-capacity, but cheaper, HD DVD.  This time, Sony did not fully control the format, working to design and implement it with over a dozen major manufacturers to give it a boost in market penetration.  Beyond high-definition movies and mass storage, however, Blu-ray has seen little innovation.

Now Sony is preparing to paint themselves into a new corner with the looming Vita.  The new handheld will make use of a proprietary memory format and a new storage format for the games themselves.  Sony has already commented that memory use will be highly compartmentalized–games that run on internal memory will not have access to external memory, and vice versa.  Furthermore, the use of yet another controlled format will only drive prices up–it has already been announced that 32GB will cost $120, as opposed to a same-capacity microSD card, which costs a third of that; for a fairer comparison, look at 32GB Memory Stick Pro Duos, which are in the $80-90 range.  This is not going to persuade many people to buy memory at such a high cost.  Sony’s fear of piracy is understandable, but they could have saved consumers, and themselves, untold costs if they had simply gone with an established memory format instead of designing and implementing their own.  We have yet to see how this one holds up, but I’m not holding my breath.
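To put those prices in per-gigabyte terms, here’s a quick back-of-envelope comparison using only the figures quoted above (the microSD and Memory Stick numbers are my reading of “a third of that” and “the $80-90 range”, not exact market quotes):

```python
# Cost per gigabyte for the 32GB options mentioned above.  Prices come from
# the post itself; the microSD and Memory Stick figures are approximations.
prices_32gb = {
    "Vita proprietary card": 120.00,
    "microSD": 40.00,
    "Memory Stick Pro Duo": 85.00,
}

for name, price in prices_32gb.items():
    print(f"{name}: ${price / 32:.2f}/GB")
# Vita proprietary card: $3.75/GB; microSD: $1.25/GB; Memory Stick Pro Duo: $2.66/GB
```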

past masters

I like to think of myself as having been a Nintendo fanboy since at least 1991, and I still prefer my Nintendo consoles to my others, for various reasons.  But what’s been bugging me lately is their library.  Not their new releases, which we all know are somewhat limited.  No, it’s their overall library on the Wii and 3/DS/i that I’m talking about, specifically the digitally-distributed sort.

Nintendo started off the Wii launch with its most promising venture–its entire collection of previously-released games.  The Virtual Console, paired with the Wii Shop Channel, opened the door for classics of all shapes and sizes to pour through the floodgates, bringing a wave of nostalgia to longtime fans–and bringing some dejected oldies to the attention of a new generation of players.

But as of right now those floodgates remain in a rather unfortunate state.  Of late I’ve noticed a distinct lack of new releases on either the Wii Shop Channel or the eShop on the 3/DSi.  Okay, I noticed it several months ago, but this time I decided to run some numbers.  They don’t make for a great outlook.

Total Nintendo Download Releases.

At present, 392 games are available on the Wii’s Virtual Console service.  This amounts to roughly 3 new games every two weeks since the first games were pushed out on November 19, 2006.  This doesn’t seem so bad in the end–400 games is a lot to choose from.  But it is paltry compared to the vast libraries Nintendo has built over the past three decades.  Just counting the NES, SNES, and N64, about 1,958 games have been released over the (many) years, depending on where you get your list.  This number is highly debatable, and it is impossible to build a comprehensive list of all releases, so I will round down to 1,900 to be safe.  Out of this number, Nintendo has tapped a mere 20% of the product pool, as it were.  But that’s not the entire library, either.  Games for the Master System, Genesis, TurboGrafx, Neo Geo, Commodore 64, and arcade machines have also been made available.  While it is impossible to account for all games released on the last two platforms, all the other systems combined total nearly 3,500 games.  This reduces the Virtual Console’s library to a mere 11% of its total potential.
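For anyone who wants to check the arithmetic, here’s a rough sketch using the figures above (the end date is an assumption pegged to roughly when I ran these numbers, not an exact cutoff):

```python
from datetime import date

vc_titles = 392                    # Wii Virtual Console games available
launch = date(2006, 11, 19)        # first VC games pushed out
as_of = date(2011, 11, 19)         # assumed "at present" date, about five years later

weeks = (as_of - launch).days / 7
print(f"Release rate: {vc_titles / (weeks / 2):.1f} games per two weeks")

nintendo_catalogue = 1_900         # NES + SNES + N64, rounded down
full_catalogue = 3_500             # adding Master System, Genesis, TurboGrafx, Neo Geo
print(f"Share of Nintendo's own back-catalogue: {vc_titles / nintendo_catalogue:.1%}")
print(f"Share of the full potential library: {vc_titles / full_catalogue:.1%}")
```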

The Virtual Console has also been made available on the 3DS eShop.  Game Boy, Game Boy Color, and NES games have been pushed to the platform, but so far fewer than fifty titles are available.  The GB and GBC alone open the possibility for over 1,100 games.  I’m hoping Nintendo sees more potential in it than they have previously.

Beyond this, Nintendo has neglected some of the aforementioned systems on the 3DS eShop.  The Master System, Genesis, and SNES would work beautifully on the 3DS–in fact, I would be far more likely to play SNES games on the 3DS, because I’m not keen on using the Classic Controller with my Wiimote.  Interestingly, the Virtual Boy could also be implemented on the 3DS, making use of the true 3D screen, and possibly even with full-color graphics.  This is something I would like to see Nintendo do (but that’s a very long list).

Note: This chart does not include PSN.

Now for the damning comparison.  Nintendo’s efforts have foundered compared to its two competitors. While Microsoft has made fewer games available overall on Xbox Live, it is home to the most original titles.  PSN features the fewest original works, possibly due to the burden of cost being shifted to the developers, but Sony has made oodles of games available for download via the service.  The chart below visualizes the release rate for Nintendo and Microsoft only.  Sony is not accounted for, since virtually all PSP games, a vast proportion of PS3 games, and a considerable number of original titles are available for download.  In addition, TurboGrafx and Neo Geo games are available on PSN in greater numbers than what Nintendo has to offer, along with Dreamcast games.

A little perspective.

While the graph shows Nintendo in a solid lead over Microsoft, it’s worth noting Microsoft has released fewer than 40 of its original Xbox games on Live. Both Xbox Live and PSN are making much more headway with original works, and have fostered better connections with indie developers.  While this isn’t surprising given Nintendo’s history with third-party developers, that doesn’t make it any less dismaying.  Nintendo is only leading the pack with sheer numbers of regurgitated titles, instead of working with independents to help create innovative and quirky motion-controlled (or 3D) games that could revitalize their catalogue.

If Nintendo’s long-term strategy is to lean on its golden oldies, so be it.  But they had better grab an oar and start paddling, because the propeller has long since given out.  If not, they had better start building a whole new boat, and set sail posthaste, because their rivals have seen much more ocean.

de-1337

I’ve not been the biggest fan of the recent succession of Modern Warfare games.  Don’t get me wrong, I’m not the hipster MW hater much of the internet has come to be, hating the franchise simply for the sake of hating the franchise.  I loved the first game, not least because it had a nicely executed OHSHI- moment that really made the player confront mortality, a rarity in shooters.  The second game changed the main character into something of a superhero, gunning down enemies in a James Bond-style chase down a snow-laden mountainside, and surviving a knife wound to the chest only to use that same knife to kill an enemy.

Modern Warfare 3 does a better job of using multiple viewpoints to tell the story, including one segment in which all the player does is literally walk a few feet before that character dies in a sudden terrorist attack.  But I have other fish to fry.

I would say that Activision is worrying me with the way this series is going, but that would be something of an understatement.  So far little progress has been made with it, other than innovative storytelling.  The games use virtually the same engine, with only minor tweaks (though Infinity Ward claims they are major advancements on par with the invention of the third axis).  Modern Warfare 2 and 3 are little more than glorified map packs–indeed, several of the maps are virtual copies taken from previous games, and many map structures appear to have been copy-pasted into “new” maps.  About the only part of the engine that shows improvement is the lighting, and even that is debatable, as shadows are blocky and pixelated, roughly the level of quality I would expect from a low-end Wii game.

Find the differences. I dare you.

Then there’s Elite.  This aspect of the game worries me the most.  The idea is this: you pay a subscription fee, and you get access to all new content–map packs, weapons, whatever–released during the subscription period.  Considering Activision has held a pretty strong record of releasing maps for a rough price of $5 apiece, this sounds like a great way to save some money.

Maybe.

I’m skeptical that this will actually be to the players’ advantage, even ignoring the fact that this is Activision.  For Black Ops, the publisher managed to push out four map packs in a year’s time, which at $15 a pop would actually have justified a subscription service such as Elite.  But this is not a pace they’ve kept with other games.  Even were this not the case, Elite creates an incentive for the publisher to go slow: the slower they push out content, the more they milk the subscribers for.  And again, being Activision, I can absolutely see them pumping the teats dry.
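For what it’s worth, the break-even math is simple. The $15-per-pack figure comes from Black Ops’ actual pricing mentioned above; the yearly fee below is a placeholder I’m assuming for illustration, not Activision’s announced price:

```python
# Back-of-envelope break-even for an Elite-style subscription.
map_pack_price = 15.00        # what Black Ops charged per map pack
subscription_price = 50.00    # hypothetical yearly fee, assumed for illustration

break_even_packs = subscription_price / map_pack_price
print(f"Subscription pays off after {break_even_packs:.1f} packs in a year")
# At Black Ops' pace of four packs a year ($60 bought individually), the
# subscriber comes out ahead; at a slower pace, the publisher does.
```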

Already cracks are showing in Elite’s armor.  The service has not been functional since the game’s launch a full week ago, with Activision blaming the outage on heavy use overloading the servers.  Worse, they have moved the PC release of Elite into the “indefinite” category, and it’s unlikely the service will ever be available to the master race.  This will force PC consumers to buy map packs as they are released; with a minor price increase, or by breaking packs up into fewer maps or less content each, this opens up another hose through which Activision can siphon cash like so much crude at the bottom of the Gulf.

antique marvels: homebrewing

Today, Sony’s PlayStation devices are mostly free from the blight of open-source software.  The XrossMediaBar organizes everything in a simple interface that is easy to navigate with a controller.  Facebook, Twitter, Last.fm, YouTube and more have been integrated into the operating system, allowing enormous control at one’s fingertips.

But it wasn’t always like this.

Once upon a time, Sony was friendly to the homebrew community.  The original PlayStation had a sizeable, if largely unseen, army of players and programmers making their own games at home to share with friends.  In 1997, Sony capitalized on this by releasing the Net Yaroze, a development-kit PlayStation for home programmers to toy with.  The Net Yaroze was popular, even being used in programming competitions in the United States, Europe and Japan.  Many home-developed games appeared on demo discs of PlayStation-oriented magazines.

With the coming of the PlayStation 2, Sony continued the practice by producing a kit to install and run Linux on the unit, much in the same way as one might install it onto a home computer.  While the intended purpose of this kit was also to spur development, it was mostly used to convert the systems into home servers of one type or another.

But fortunes began to wane when the PlayStation Portable arrived.  Many attempts at homebrewing were shut down by Sony, with each new system update bringing more frustration to the homebrewers.  Most stack attacks and unsigned-code methods are now ineffective, choking off what could have greatly expanded the PSP’s market: cheap, simple, innovative games–the same mechanism that powers the smartphone software market.

The rare Linux-armed Portable, seen in a laboratory.

The death warrant was signed by the PlayStation 3 in 2010.  Initially the console was distributed with the ability to run Linux without issue, as was specified in the original user license.  But on April 1, 2010, Sony released a patch for the PS3 that removed the ability to install these operating systems, claiming it needed to close a security loophole exposed by George Hotz, aka Geohot, who had created a custom firmware using Linux.  Lawsuits are now in progress against Sony, claiming that by doing this, the company violated its user license.

With the coming release of its newest product, the PlayStation Vita, Sony has revealed that games made to use external memory, or games running on its proprietary memory card, will not be able to access its internal memory.  Conversely, games made to run on internal memory, or those downloaded from the PlayStation Store, will not have access to external storage.  Any games that may need access to a memory card for saving will require that the memory card be present before the game is started.  This prevents virtually all known forms of stack attacks or unsigned-code methods by closely limiting where game software can read and write while running.

The future for homebrewing on Sony’s consoles is looking bleak.  But if history has taught us anything, it’s that there is always a chance…that Sony may someday see the light and welcome its community back with open arms.  But while it is entirely up to Sony to make this decision, one thing is for sure: the homebrewers will continue to operate, whether they are scorned or sanctioned.

Antique Marvels will return after the commercial with: Platformers.

roots

So.

Origin is the hot topic of the day, as is Battlefield 3.  Consumer whore that I am, I decided I couldn’t wait to nab a copy of BF3 to play on launch day, but I’m wary of Origin (for obvious reasons).  I’ve read statements from EA claiming that Origin would not be required for any of their games to run, which apparently have all been deleted from the internet–either that, or my hat isn’t working anymore, because apparently Origin is and will be required for (virtually) every game to run, including Battlefield 3.  Oh well, I figured.  I purchased it on Impulse in an attempt to get around this issue.

Not only did it not work, it backfired like a CIA operation.  After a four-hour, 13GB download (yeah, my internet’s not great), I attempted to run BF3 through Impulse and got…a game key.  That was it.  A small window in the top center of my screen showing my game’s key, which, when clicked, caused Origin to launch.  Origin then prompted me for said key, after which point it began downloading the game.  Even after installing via Origin, I could not launch the game from Impulse.

What is the point of selling the game through a third-party vendor when it can’t be used by that vendor’s software?  I understand EA’s desire to have Origin on the market, and to have its own social network.  But it’s one thing to have Origin running in the background while the game runs, and entirely another to force someone to re-download a game that has already been legitimately purchased through another storefront.  Another four hours later, I was finally able to play the game.

Thirteen gigabytes of wasted bandwidth.

At this point I was confronted with the Battlelog, perhaps one of the most confusing elements of the game so far.  Rather than building a server browser into the game, as has been done for many years now, EA and/or DICE seem to think it a good idea to make players manage their avatars and find servers from their web browser, which then launches the game itself and loads everything up.  This means that switching servers requires quitting the game and re-launching it, with no shortcuts in the middle.  Meanwhile, the consoles have a fully functional server browser within the game, with almost all the options available to the PC players.

One argument I’ve heard is that this system allows for closer support for changes and patches.  Isn’t this what Origin is for?  Steam pushes updates to games automatically, so if Origin doesn’t do that, what’s the point?  EA didn’t want to play Valve’s game of providing easy-to-get DLC and updates, so if they’re not going to use Origin for that, is this just their digital iron maiden to put players in?  Seems like they’ve decided ActiBlizzard’s douchebaggery made them look too good, and they had to start a pissing contest.

Another defense is that this allows for simplification and a more unified interface for the players.  Again…isn’t this what Origin is for?  Origin allows direct access to the EA store via what is basically…a browser window.  Couldn’t this be rolled into Origin, even in the exact state it’s in now?  Apparently not.  As the rule goes, it can’t be too easy or logical…somewhere it has to get convoluted, just to screw with people.

And what happens when this website is shut down?  Certainly they won’t keep it running indefinitely.  Someday the userbase will decline to a point at which EA decides it’s not worth running the Battlelog anymore, and then…no more play.  If all servers were centrally hosted by EA, this argument might be baseless, but they’re not.  They’re hosted by whoever feels like hosting one, to avoid the issues Activision ran into with peer hosting in Modern Warfare 2.  But this is pointless if one can’t find the servers to log in to, and unless EA/DICE decides to patch the game to include a server browser, that isn’t going to happen.  I’m becoming convinced that the developers and publishers out there want to destroy the PC platform for no reason other than sadism, or perhaps boredom.

duke flukem

Duke Nukem Forever was put up on a Steam sale last week, so I decided to pick it up.  I’ve put four hours or so into it by now, and I have to say, I’m not impressed.

To be fair, I’ve so far only put a few hours into the game.  But already it’s flailing about like a frog reanimated with raw electricity.  The game runs on a heavily modified Unreal Engine 2, so much so that the increase in detail and model complexity almost makes it look like Unreal Engine 3 (which makes me wonder why Gearbox didn’t go with that out of the box).  This is probably the cause of its performance issues, because I’ve had to scale back the game’s settings considerably.  While my system isn’t bleeding-edge, it’s not a pushover either, and the fact that the game’s high settings drag it down to 10 fps tells me something in there is overloading the processors.  This framerate drag can even leak into play at lower settings, but oddly at random times.  It can be corrected by just sitting and waiting for a minute, but you can’t always do that in heavy combat, and even so, having to stop and wait for the frames to catch up can really hurt the motivation to continue playing.

Interestingly, the skybox isn’t a clear image.  When looking around at normal magnification, it looks perfectly fine, but when you zoom in the view–even the basic “iron sights”-esque zoom–it’s quite obviously fuzzy, as if the developers didn’t intend (or want) the player to actually look at it.  This is a bit puzzling, because it wouldn’t have taken much more to fully detail the image, even on the off-chance of being looked at.

The shadows in the game behave oddly, to say the least.  When the focus point of the player’s view changes, it seems all shadows are redrawn, so that objects seem to glow.  This has distracted me on more than one occasion, making me think a “glowing” object was my objective when it was just a minor lighting glitch.  This even happens with some “permanent” objects, such as buildings and terrain, not just items that are sitting around.  I can only surmise that this is also a result of the heavy modifications made to the game engine.

Above: The shadowing issue in action.

Probably the worst design decision the team made was to graft platforming and puzzle elements onto the game.  In one stage, you come across a statue of Duke which must be used to reach the next floor of a building.  The statue’s hitbox is small enough that one can very easily fall off the arms, making the task tedious and annoying.  Later Duke must utilize a crane to continue his progress, and like most other puzzles in the game, the solution isn’t very clear.  At least six times in the first six chapters, I have had to look up YouTube videos to solve these awkward puzzles, and each time I felt stupid for not noticing the solution, even though it was badly designed.  Don’t get me wrong, these features can work in a shooter, but they don’t really fit into this game at all.  Duke is all about fast and furious gunplay with weapons of absurd destruction.  Breaking the momentum with tedious puzzles and framerate issues can kill the game, and in this case, probably does for many players.

What strikes me as particularly funny is that another game feels more like Duke than…well, Duke.  BulletStorm is a game focused purely on gunplay, and capitalizes on this with skillshots that grant points for particularly interesting, unusual, or simply skillful kills of various flavors.  These skillshots make the game far more enjoyable by encouraging creativity with kills, while having almost zero emphasis on plot or puzzle elements.  I get the distinct feeling that BulletStorm is the game Duke Nukem Forever wanted to be.  It’s certainly the one I enjoyed more.

id-ridden

Recently the gaming world was rocked by an earthquake of indescribable magnitude.  That earthquake was the sound of one of the great titans falling–id Software, or more specifically that great god of gaming John Carmack.

For the past four years, id has been trumpeting the development process of their new engine, id Tech 5.  This new engine would support enormous textures (as high as 128,000×128,000), live streaming of textures into the game world, automatic optimization of resources to make cross-platform development easier, and dozens of upgrades to increase the visual quality of games, such as multi-threading and volumetric lighting.
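To appreciate why streaming matters here, consider the raw size of one of those megatextures. This is a back-of-envelope sketch assuming uncompressed 32-bit color, not id’s actual on-disc format:

```python
# Rough size of a single uncompressed 128,000 x 128,000 texture, assuming
# 4 bytes per texel (32-bit RGBA).  This is why id Tech 5 streams texture
# pages on demand rather than loading whole textures into memory.
side_texels = 128_000
bytes_per_texel = 4        # assumed; the shipped data is heavily compressed

size_bytes = side_texels * side_texels * bytes_per_texel
print(f"{size_bytes / 2**30:.0f} GiB uncompressed")   # roughly 61 GiB
```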

Comparison: id tech 3, 4 and 5

But when the hammer hit the nail, some bad things happened.  Immediately the PC version of the game suffered from horrendous texture pop-in; if the player shifted their focus of view for even a moment, high-resolution textures would be moved out of memory, thanks to the texture streaming that the studio so staunchly stood behind.  The result is a world of constantly smeared textures, which looks so much like the work of an overused highlighter that even Joystiq had to take a shot at them.

Compounding the problem, the game had virtually no settings the player could access.  Literally.  The options are more or less like those found in console games, where there is no need to let players alter every aspect of the game’s performance.  A patch now allows more settings to be accessed, but given that configurability is one of the chief advantages of the PC platform, this is something that should have been there from the start (and is there, in virtually every other PC game released in the last fifteen years).

This is all baffling when taken in the context of who made the game: John Carmack, one of the foremost PC developers of all time, and probably the most ardent crusader for the platform in this day and age.  At this year’s QuakeCon, he spoke of the differences between PC and console development and mentioned (as he has many times before) that the limitations of consoles hold back PC game development, because games essentially must be developed for the weakest platform and can only be scaled up or adapted for the PC.  From the way he constantly brings this up, it seems the logical solution is to abandon console development entirely and focus, with religious zeal, on the PC platform.  But he insisted (or at least, someone insisted) on creating an engine and a game for all three platforms.  The end result is…well, you can see for yourself.

Making things even worse, id has recently revealed that their test builds of the game ran on machines with drivers that had been customized.  There are no words for how foolish a plan like this is.  This would be like custom building a car engine that ran on a homemade concoction of fuel, and then complaining when that engine failed to run properly on the fuel that people actually sell at gas stations.  AMD and nVidia, stunningly, have released driver updates that have greatly improved RAGE’s performance–but this arrangement is backward.  Developers don’t dictate to hardware vendors what their drivers should do.

And now, as can be read in the above link, id is saying that the PC isn’t the “leading platform”?

A titan has truly fallen today.  Let’s hope the PC pantheon can hold itself up.