off the block

Earlier this week, one of the gaming industry’s biggest figures began his exit. Markus Persson, better known to fans as Notch, sold his baby off and decided to move on. Reactions run the gamut, naturally, especially in this land of the internet where hyperbole is the only accepted form of communication. In less than six years, what started as a pet project for Notch grew into a community of tens of millions and a company worth $2.5 billion. If my math is right, that means Minecraft made him over a million dollars a day, not counting sales of the game itself. Account for those sales and you add another billion or so to the total, raising the daily breakdown to roughly $1.5 million–all for something that started as a mere personal amusement. Mind-boggling.
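
Don’t take my word for the math; here’s a rough sanity check (treating both the timespan and the extra billion as the loose figures they are):

```python
# Back-of-the-envelope check on Notch's daily take.
sale_price = 2.5e9             # reported acquisition price
with_sales = sale_price + 1e9  # plus roughly a billion in game sales
days = 6 * 365                 # call it a full six years, to be generous

print(f"Sale alone: ${sale_price / days / 1e6:.1f}M per day")
print(f"With sales: ${with_sales / days / 1e6:.1f}M per day")
```

Close enough.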

It’s not about the money. It’s about my sanity.

In his personal blog post, Notch revealed that once the sale is finalized, he will leave Mojang, along with its other two founders, and go back to his own personal tinkering. Many reacted with shock that he would abandon something so popular, so large and so profitable. His closing line, “it’s not about the money. It’s about my sanity,” exposes a larger problem that has surfaced in the gaming industry in recent years: it’s incredibly easy for someone to become famous, and sometimes the wrong people get put on a pedestal.
With the release of Minecraft, Notch’s little project almost instantly began to get attention. Just two and a half years after the game’s first public alpha release, Notch and 5,000 fans gathered at the Mandalay Bay in Las Vegas to celebrate it. Less than a month after that, Jens Bergensten was given complete creative control over Minecraft, leaving Notch functioning more or less as a CEO. In his free time he pursued new projects like Cobalt, Scrolls and a very experimental concept called 0x10c. While it showed promise, 0x10c eventually failed to pan out, and Notch cancelled it. Already he was finding his personal limitations; he now had the time and money to pursue whatever project he desired, but he didn’t always have the ability to make it work.
Of course, this isn’t the first time this sort of thing has happened. History is replete with ordinary individuals who do something otherwise innocuous and suddenly gain massive fame and wealth from it. In recent decades, people like George Lucas, John Carmack and Mark Zuckerberg have risen from obscurity to worldwide fame in a matter of months. But therein lies the rub–not everyone is cut out to be famous. Our most recent example of the wrong person is Phil Fish, another game developer. Fish created Fez, a very complex game built on very simple basics, which became another indie hit and propelled him into the spotlight. Fish turned out to be impulsive and vindictive, often doling out harsh personal insults to people who criticized him. In perhaps the best-remembered meltdown of recent Internet history, Fish went on a Twitter rampage and cancelled development of his game’s sequel.
Notch saw Phil Fish, and worried that he might become that man. The realization crystallized when Mojang announced changes to the Minecraft EULA and users across the internet instantly targeted Notch, who had not been involved in those changes. He had become something different to his fans and his players, and he wasn’t what they wanted him to be. He would rather sit in obscurity and toil on his own pet projects, which he is now free to do for possibly the rest of his life. Will he be remembered as the man who flew in out of the night, changed indie gaming forever, and then vanished? Or as the man who started a phenomenon, then handed his baby to a company with a less-than-stellar track record with acquisitions? We’ll see.

what the players want

A few days ago, news trickled down to the outlets that the Steam controller’s design had been tweaked once again, triggering waves of debate running the gamut from high praise to condemnation and everything in between. If nothing else, the design history of the still-unreleased Steam controller is further proof that the gaming consumer base is fickle, and doesn’t always seem to know what it wants.

A cursory search will bring up a great deal of information (and even more opinion) regarding the state of innovation in the gaming industry, in all imaginable forms. Most of the opinion demands more innovation, insisting that gaming has lost all creativity and become nothing more than an assembly line for money. And it seems to be true–each year’s Call of Duty game looks virtually indistinguishable from its predecessor, except in minor gameplay elements. Each new Mario game is written off by many as “just another Mario game,” introducing only the occasional new character and sometimes a third dimension. People just don’t see a whole lot going on other than the gaming equivalent of injection-molded mass production. Even though Microsoft reportedly spent “hundreds of millions” designing the Xbox One controller, it just doesn’t look very different to the average buyer.

But the facts tell a very different story. Over the years, many people and companies have had very different ideas about what a game controller should be. With the recent change announced by Valve, many are asking questions like “why does every game controller look like an Xbox gamepad?”, apparently forgetting that the Steam controller started out as a much more radical departure from the norm and has gradually migrated toward a more traditional design, most notably dropping the expensive touchscreen in favor of a few fixed-use face buttons. Beyond this, the gaming consumer base has shown, to the shock of many, no real desire for innovation–games praised for being innovative and original frequently just don’t sell. The adage just doesn’t die: you vote with your wallet, and when innovative titles fail to sell, it tells developers and publishers to go back to the established formulas and design ethos.

This applies in the world of controllers, too. Everyone compared the Wii U’s Pro Controller to that of its competitor, but they didn’t seem to see that controllers these days have been more or less boiled down to their essence. Hardware designers have had over thirty years to experiment, to come up with new layouts and concepts and see how they play out with consumers. Some succeed; many die out. These ideas are even somewhat cyclical: the Sega Genesis had a concept controller during its design phase that strongly resembled today’s Wiimote. The PlayStation controller originally had a much more radical design before settling into the DualShock layout that is now a staple not just of the PlayStation, but of its parent company as a whole. Even when new concepts seem to take off, they never quite reach orbit. Microsoft’s Kinect seemed like a great idea at the time, but after a while the novelty wore off, and even putting it in every box and forcing its use couldn’t keep it afloat. People just wanted controllers in their hands, it seemed.

Even the company known for innovation, Nintendo, has been caught in this net many times, most notably with the N64 controller, which routinely lands on lists of the worst controllers of all time (either for Nintendo specifically, or the industry as a whole). Early in development, the PlayStation 3 used a totally new controller design, often referred to as the “boomerang,” that was lampooned by many as ridiculous, though it was rumored to be very comfortable to use. Why did Sony ditch the futuristic design? “…there are so many players who are used to the PlayStation controller; it’s like a car steering wheel and it’s not easy to change people’s habits.”

People like what’s familiar. You want both new customers and longtime fans to be able to pick up the controllers on your new console easily, without a long adjustment period. Controllers have reached the point where almost all needs are covered: two joysticks for analog movement in most games, a d-pad for older games and menus, a four-button cluster on the face, two sets of shoulder buttons, and an extra button under each analog stick, all arranged to be easy to reach with thumbs and fingers. There’s really not much room for improvement anymore, unless you’re a pro gamer in need of extra convenience. Just as evolution has dead ends, so does hardware design. And until we find something that surpasses the utility and ease of use of current controllers, that’s what will continue to be made and sold.

some assembly required

Just when I think I’ve seen the complete progression of a particular branch of gaming evolution, another step appears just to surprise me. As if the industry weren’t already overly focused on DLC, in the past few years it seems determined to take the concept to its logical conclusion, stepping even beyond the sad reality we’ve grown accustomed to.

Earlier this year, Warner Bros. Interactive went back on its plans to provide DLC support for Arkham Origins on Wii U and canned all story DLC that had previously been planned. Apparently feeling that screwing over the maligned Wii U was not enough, it decided to shaft its other customers in a different way: bugs and performance issues that had been prevalent in Arkham Origins would be ignored in favor of the aforementioned story DLC. Not only is WB comfortable releasing an incomplete game, it’s clearly very okay with leaving it that way. At least Bethesda (eventually) patches its games.

In another part of the realm, Ubisoft’s reputation as a top independent publisher has been slipping rather quickly of late. With Rainbow Six: Patriots languishing before being leaked (and then cancelled), and Siege not even in alpha, it’s now been six years since the franchise last saw a major entry. Meanwhile, the Splinter Cell games–once a hallmark of tactical stealth action–have slowly been corrupted, with Blacklist being a fairly boilerplate shooter in which stealth is an afterthought. The Division has looked promising, but its release remains nebulous, and in any case this is mostly irrelevant after issues with Ubisoft’s other major release: Watch Dogs. It was a very ambitious title, and early trailers showed a gorgeously detailed, living world for the player to explore; actual gameplay videos were considerably less impressive. More damning, modders found a way to restore the game to its previous level of graphical quality, which didn’t stop Ubisoft reps from denying that any downgrade had ever occurred. In video reviews, both TotalBiscuit and birgirpall noted glaring issues with the game (to be fair, birgirpall’s whole draw basically revolves around the “I broke <insert game>” schtick). Ubisoft isn’t letting such trivialities stall its plans, however, and rolled out the first piece of Watch Dogs DLC just ten days after the game itself was released.

Even smaller developers are feeling the fever. Until recently, From Software held the (publicly stated) belief that games should be released as whole products and had denied any plans to release DLC for its games. As of two weeks ago, that position has radically reversed, with not just DLC but a DLC trilogy announced for Dark Souls II. This could easily have been marketed as episodic content, but these days “DLC” is the buzzword, so I guess that worked just as well for them. Never mind that “DLC” increasingly carries a negative connotation among gamers, who often feel defrauded or let down by such content.

No list of gaming sins would be complete without two-time national champion EA (winner of back-to-back “Worst Company in America” polls). At the height of the anti-Call of Duty movement, DICE boldly declared that it would “never charge for Battlefield map packs.” While the DLC released for Battlefield 3 was certainly more than just maps, it still left a bitter taste in my mouth to be asked for $15 a pop. To date I have not purchased a single one; the two I do have were acquired only because they were given out as freebies. For the same reason, I chose not to throw my money at Battlefield 4, which was itself little more than a major patch for Battlefield 3. Nevertheless, they managed to thoroughly break the game, and eight months later EA and DICE are still cleaning up the mess. Even after this, and after stating as recently as last winter that Battlefield would never become an annual franchise, DICE and Visceral Games are preparing to release Battlefield: Hardline, widely regarded as little more than DLC packaged and marketed as a separate product.

At this point, it’s pretty clear that DLC is taking priority over actual products. If developers at least tried to steer closer to episodic content, it might be understandable. But this is a different trend, one I can’t see ending well for gaming in general. It seems like it’s up to the hyperconservatives to hold the line on this one…which is an odd feeling for a liberal.

Subscribe for exclusive early alpha access to upcoming content for this blog post.

the state of art

We are fast approaching a tipping point in the realm of video games. I like to think of it as a singularity, because in the near future many people will have to come to terms with the evolution of games as a whole. In the past (and present), many have defined art as purely noninteractive–you look at a painting and interpret it, but you don’t add or subtract paint from it. Video games, on the other hand, are frequently described as “interactive art,” a term that is itself controversial. But it’s this label of “art” that will have the most effect on their future.

The recent release of Wolfenstein: The New Order inevitably fell afoul of censorship laws in several countries. The German version of the game is devoid of Nazi symbols such as swastikas, and it faced heavy resistance from the Australian Classification Board, known for its long list of banned games. This is nothing new: from their inception, games have faced higher standards than other forms of media, owing to their perception as toys and the likelihood of exposure to children. In the case of Germany, the ban stems from a much wider cultural mindset: Nazi symbols are banned in all media, except in educational or artistic depictions. Portrayals of violence are similarly restricted, with some interesting workarounds as a result (the German release of Half-Life famously replaced its human enemies with robots).

Therein lies the rub: if Wolfenstein is not art, what is it? Game developers’ art departments are now as large as those of big Hollywood movie studios. Millions are spent on AAA games such as this, with a not-insignificant fraction going to the people designing clothing, billboards, vehicles and buildings, all to create a living world the player can immerse themselves in. We have games that range from hyper-realistic to highly stylized and everything in between. There are even games that feature quadrilaterals as main characters, and abstract stories based entirely on wandering through a desert. Games have made us question our morals, and have turned our worlds upside down.

But how “interactive” is it, really? You press buttons to make the protagonist move, shoot and operate objects, but in the vast majority of games there is a very explicit path with a fixed beginning and end. The developers clearly have a defined story they want to tell, and there is only one way to experience it. Some games go so far as to be little more than playable movies (not naming names here). These are the most forced form of the medium, in which players experience exactly and only what the developers want them to experience. Even games like Fallout, which appear to present the player with limitless possibilities, have an ending that doesn’t change (or changes very little) regardless of the player’s decisions and actions. The player may influence the movement of the brush, but in the end the painting looks more or less the same.

To me, there is no doubt that games are art. But the debate likely won’t end anytime soon, particularly considering the age gap between the average gamer and the average legislator. Just as their parents didn’t see television as an artistic medium, today’s legislators often don’t see video games that way. Will it take the 30 years our generation needs to enter politics before the issue is seen in the right light? I hope not, but it certainly feels that way.

adaptations

As the seventh generation wound down, everyone had high hopes. The Wii had smashed even the most enthusiastic expectations, bolstered by a generous library of classics. Microsoft cemented its place as a major player, with successful franchises and the continuing expansion of Xbox Live. The PlayStation 3 managed to pick itself up and race to a close third. Together, the three managed to eclipse the combined sales of their predecessors, and as the eighth (and current) generation loomed on the horizon, everyone was riding high waves of success and victory (and income). Those waves have since crashed on the rocky shores of a new land, and those who thought they would stick a perfect landing have had something of a rude awakening.

Everyone has taken a step (or several) back from the ambitious offerings of a few years ago. Microsoft recently announced a Kinect-less Xbox One package (as well as finally announcing free access to services like Netflix), doomsayers are predicting the demise of the Vita, and the Wii U…it really doesn’t need to be said. So far the only real successes are the PlayStation 4 and Nintendo 3DS, both of which are enjoying incredible popularity and acclaim. This feels to me like the strongest example yet of Darwinian gaming: vicious competitors adapting to 1) keep themselves alive a bit longer, and 2) gain an advantage over their neighbors, to keep themselves alive even longer.
Almost all of this is attributable to marketing blunders or bad design decisions that went uncorrected. Nintendo’s Wii U seems to be loved by most of its owners, but horrible–in some cases nonexistent–marketing left many prospective buyers confused about what it even was. Even highly acclaimed, well-designed games like The Wonderful 101, Sonic Lost World, and Pikmin 3 haven’t done much to shore it up against the onslaught of its competitors. Wii U ports like Deus Ex: Human Revolution and Batman: Arkham City have demonstrated the strong possibilities of using the gamepad to augment the user experience, but the idea hasn’t caught on. More than ever before, Nintendo is relying on its first-party offerings to keep the life jacket inflated.
Meanwhile, Sony has had wild success with the PlayStation 4, but its little cousin the Vita has been struggling, despite the handheld’s greatest strength: remote play. The appeal of playing your PS4 over LTE while on break at work is undeniably strong…yet it doesn’t seem to have motivated many cross-sales. Sony has at least seen this and put out a bundle for sale in Europe–but it has no plans to market one in North America. They’ve seen the light, but seem to be looking at it from an odd angle, or perhaps at the wrong wavelength.
Listening to Joystiq the other day, I heard an interesting idea posited: what if Nintendo radically changed its strategy and changed the Wii U as we know it? What if it dropped the gamepad, made the Virtual Console a subscription service, and scaled down the hardware until it was essentially a “Nintendo Ouya”? Legions of fans would gladly fork over cash every month to play Nintendo classics; that much is beyond doubt. There are still dozens of games that could put huge momentum behind the Virtual Console, but Nintendo has so far failed to tap the torrent. Meanwhile, Sony puts out tons of games each year that are freely accessible as long as you’re willing to pay a recurring fee, a model that seems to be prospering. (Of course, PS+ has far more useful features than just that.) This is something Nintendo could adapt to, exploit, and probably carry right into the end zone.
But I feel like I may as well ask animals to stop trying to cross the highway.

wasteband

Storage is cheap these days. It’s not rare to find terabyte drives in low-end desktops, and many people have several multi-terabyte drives to store oodles of data. In particular, many gamers have large drives to store games downloaded from Steam, GOG, Origin, Uplay, or any of the many services out there. There’s no doubt about it–games are getting bigger. But is bigger better?

Recently Bethesda announced that the upcoming Wolfenstein: The New Order will require 47GB of hard drive space. It’s already spilling over onto dual-layer Blu-rays, and the Xbox 360 version will span four discs. This brings back some old memories, not all of them good.

It’s one thing if there is actually enough content to justify such a large download, but is there? Titanfall on PC is a 48GB download–and of that, 35GB is uncompressed audio covering every single language the game supports. That’s not “lossless compressed audio,” or even “high bitrate audio.” Uncompressed. Respawn claimed this was to accommodate lower-spec machines, but this reasoning is (to use a technical term) bullshit. We’re in the days of six- and eight-core computers, when even low-end dual- and quad-cores have cores sitting idle with nothing to do, and decompressing such files is trivial even for a cheap entry-level cell phone. This isn’t excusable, it’s just laziness.
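
If that sounds like hand-waving, it’s cheap to verify. Here’s a minimal Python sketch–with zlib standing in for a real audio codec, purely for illustration–that times a single-threaded decompression pass over roughly 100MB of data:

```python
import time
import zlib

# ~100 MB of synthetic, repetitive stand-in data. (A real game would
# be decoding an audio codec like Vorbis or Opus; zlib is just an
# easy, built-in proxy for "compressed bytes in, raw bytes out.")
data = bytes(range(256)) * (100 * 1024 * 1024 // 256)
compressed = zlib.compress(data, 6)

start = time.perf_counter()
restored = zlib.decompress(compressed)  # one thread, one core
elapsed = time.perf_counter() - start

assert restored == data
print(f"{len(data) / 1e6:.0f} MB in {elapsed:.3f}s "
      f"(~{len(data) / 1e6 / elapsed:.0f} MB/s on a single core)")
```

On anything resembling a modern CPU, the reported rate lands in the hundreds of megabytes per second or better–orders of magnitude faster than audio ever needs to stream.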

Rather than an isolated incident, this is on its way to becoming the norm. Max Payne 3 will cost you 29GB if you want it on PC. Battlefield 4 demands 24GB before DLC is even added. For comparison, World of Warcraft was roughly 25GB at its worst, before Blizzard rolled out a patch that hugely optimized the game and pared it down to size. Skyrim is a diminutive 6GB, and still looks good (which is not to say it can’t get better). Meanwhile, Rocksteady recently commented that the Batmobile alone in Arkham Knight would take up half the available memory of an Xbox 360. I’m left wondering how much space a game like Grand Theft Auto V will consume. I have a 1.5TB hard drive that is mostly taken up by games, and I’m not keen on shoving another monster in there.

Storage limits aren’t the only concern, either. Most internet providers impose limits on users’ activity, namely through data caps. In some cases, downloading even a couple of games like Max Payne 3 or The New Order will put someone over their limit, resulting in throttled speeds or huge overages on their bill. I had to download and install Titanfall three times before I could launch it, meaning I burned through nearly 150GB of data. While I (no longer) have a cap to worry about hitting, many users aren’t so lucky. And what about patches? MachineGames recently decided that 47GB isn’t enough space, and will be applying a 5GB day-one patch to its monstrosity of a game.

Beyond simply optimizing files, where is the future? My money is on procedural generation. While its engine is simplistic, Minecraft can generate vast worlds from a binary that is a mere 100MB. At 148MB, Daggerfall uses procedural generation to create a world that would cover most of England. Going off the deep end, you find .kkrieger, a first-person shooter contained entirely within an executable 95 kilobytes in size.
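
The principle is easy enough to sketch. Here’s a toy illustration in Python–emphatically not how any of the games above actually work, and the multiplier constants are just arbitrary primes for mixing coordinates into the seed–showing a whole terrain conjured from a single integer, with nothing stored on disk:

```python
import random

def height_at(seed: int, x: int, z: int) -> float:
    """Deterministic pseudo-random height for one world coordinate."""
    # Mix the coordinates into the seed so each point gets its own
    # reproducible stream of randomness.
    rng = random.Random(seed ^ (x * 73856093) ^ (z * 19349663))
    return rng.random()

def chunk(seed: int, cx: int, cz: int, size: int = 4) -> list:
    """Generate a size-by-size block of terrain heights on demand."""
    return [[height_at(seed, cx * size + x, cz * size + z)
             for x in range(size)]
            for z in range(size)]

# The same seed always reproduces the same terrain, so no map data
# ever needs to be shipped or stored.
assert chunk(42, 0, 0) == chunk(42, 0, 0)
for row in chunk(42, 0, 0):
    print(" ".join(f"{h:.2f}" for h in row))
```

Layer in smoother noise and rules for biomes, cities and dungeons, and that single seed scales up to a Daggerfall-sized province or a Minecraft world, at a storage cost of roughly nothing.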

Even ignoring hard drive space, games are beginning to hog memory. Watch Dogs, The New Order, Titanfall, and Call of Duty: Ghosts all require at least 4GB of RAM, largely to hold their enormous textures. There is a dire need for engines that can run games at these quality levels more efficiently; hopefully one arrives soon.

Procedural generation seems to be the buzzword of the gaming industry’s future. It cuts down on file sizes, streamlines development, and makes every experience unique. I know I wouldn’t mind my second playthrough of a game being a little different than the first; it certainly seems to have worked well in Left 4 Dead (albeit in the limited form of the AI Director).

It’s either that, or we’re looking at multi-Blu-ray titles hitting the Xbox One and PlayStation 4 before long. Time to start checking prices on hard drives.

virtual consolation

Nintendo is having a very odd month so far. I have to wonder if they woke up on the wrong side of the bed, or perhaps misread their horoscope. Maybe the Earth’s magnetic field is just out of alignment. I don’t know, I’m not a fortune teller.

So far this generation their modus operandi seems to be that they don’t quite know what they’re doing. They’re out of their element, trying to catch up with the times and not quite succeeding. So like any desperate geriatric, they seem to be doing anything they can think of to get the attention of the hip youngsters. Problem is, they’re not thinking of the right things.

The failure to market the Wii U properly is only now reaching the synapses of Nintendo’s upper management. To his credit, rather than shifting blame and making meaningless promises, Satoru Iwata imposed a 50% pay cut on himself, which will no doubt motivate him and his management to make greater strides in improving the appeal of their product.

At the top of the list right now is the Virtual Console. It’s an abysmal failure. What could have been a flood of titles has been woefully underutilized, and even when it has been utilized, it’s been in the most backward way possible. Currently there are a total of 66 titles available on the Wii U Virtual Console. This pales in comparison to its predecessor, which had 186 titles available at the same point in its life cycle. Nintendo seemed to notice this vacuum a couple of months ago, but has so far made very little progress, despite (or maybe because of) some lofty promises. Only now is it getting around to releasing A Link to the Past on Wii U, a full two months after its sequel hit the 3DS. This would have been a perfect double sale to promote both products and both platforms, but I guess it wasn’t too high on Nintendo’s list.

It’s understandable that licensing is a major issue in these matters (something that kept EarthBound in the vault for so long), but for first-party titles such licensing issues are far less significant, or even nonexistent. Across the Wii U and 3DS Virtual Consoles, there are a grand total of 11 Mario games. Nintendo could easily have launched with every NES, SNES, Game Boy and GBC entry, ready to blow the doors off. But it didn’t. Super Mario Bros. 3, one of the most highly acclaimed games in the series, is still in the “TBA” category. That’s just sad.

It’s also understood that Nintendo is obsessively meticulous about the titles that are ported to the Virtual Console. It strives for perfection, and many titles go through a rigorous process to ensure as few issues as possible for the people who play them. But a great number of titles were successfully released on the Wii VC; that much is beyond debate. Wii games also run in emulation on the Wii U. Yet to play older VC games on the Wii U, you have to switch the console into Wii mode. If the games are already running in a shell of some sort, why is it not possible to have the console switch over automatically, and switch back when you’re done? (This is a serious question; if there is a real reason, I would like to know it.)

Worse than this, it is not possible to use any Wii U hardware to control these games, meaning that to play SNES games one must resort to the somewhat awkward arrangement of connecting a controller to another controller. What makes this particularly ludicrous is that it’s now possible to play in Wii mode on the gamepad, using it as a primary screen. What does this mean? It means that if you do this with a game that requires the Classic Controller, you must sit there staring at a 6″ screen, with a controller plugged into a Wiimote that does nothing but sit next to you. It’s ridiculous. If games are going to be playable on the gamepad’s screen, why not at the very least allow the use of the gamepad’s controls?

Of course, if this were actually done it would remove most of the incentive to buy Virtual Console games on the Wii U. And we all know where Nintendo cashes its checks.

Another market Nintendo seems to have forgotten about is the 3DS Virtual Console. While a total of 131 games have been pushed to the platform thus far, the catalog is broad but shallow. So far only Game Boy, Game Boy Color, NES, Game Gear and (for a select few) Game Boy Advance titles are available. Where are the SNES games? I would be happy just to be able to play Super Metroid, EarthBound, or A Link to the Past on the go. For that matter, where is a cross-buy option? It’s entirely possible to buy a PS Vita, PS3 or PS4 game from any computer and have it installed while you’re out; meanwhile, one cannot buy a 3DS game through a Wii U, or vice versa–despite the rather glaring fact that games from both platforms are visible in either eShop. I can see The Wonderful 101 in my 3DS eShop, but I can’t view its details or choose to buy it, let alone have it installed and ready by the time I get home. That alone would be an incredible convenience. So of course Nintendo hasn’t done it.

Oddly, the most recent announcement has been DS games coming to the Virtual Console. This is an interesting move; I’m curious what the market is for playing DS games on a home console, but I’m willing to wager it isn’t as significant as the market for N64 and GameCube games. We’ll see how this year pans out for Iwata.

birth complications

Tomorrow, the much-anticipated PlayStation 4 will finally make its way off shelves and into waiting buyers’ hands (in America, at least). Players will head home, plug in their new consoles, and get ready to exchange intellectual debates via the PlayStation Network. Next week, the event will repeat with the Xbox One.

Or, maybe not. Turns out there’s a snag with the two consoles’ launch processes. Just a minor snag.

The PlayStation 4 will require a day-one patch before the console can be used. Note the choice of words here–it’s not just that it can’t go online, or use features like Remote Play, or share screenshots and videos (although those are also covered). It can’t be used. The day-one patch, which updates the operating system to version 1.50–implying that it rolls up a number of semi-major updates–must be applied just to enable the Blu-ray drive. Repeat: the Blu-ray drive is not a functioning feature out of the box.

In the same vein, Microsoft has announced that the Xbox One will require a similar patch on launch day. It hasn’t gone into detail, but according to senior director Albert Penello, out of the box the unit will be capable of literally “nothing”; the patch is “required for your Xbox One to function.” If what Penello said is accurate, the console simply will not work at all without this update.

No two ways about it: we live in the era of the day-one patch. There is some justification for it–if release day is looming and features are still missing, it’s a perfectly valid tactic to keep working on the missing content and push it out on launch day. But this is getting ridiculous. Both systems fail to function entirely without the update; the PS4’s patch specifically enables the Blu-ray drive, the very thing the system is built around. It’s akin to selling a car without a transmission and telling customers to have their dealer install one when the car is purchased.

Part of me wonders if this isn’t some twisted anti-piracy scheme. A measure like this effectively prevents anyone from making use of their system before launch day (although Sony has made the 1.50 patch available for download now, to be installed from a flash drive, saving the user the trouble of downloading it through the PS4). With a number of recent incidents of retailers breaking street dates on both hardware and software, I can see why Microsoft and Sony would be concerned. The practice of banning users who log on early doesn’t always go over well, and this provides another method of keeping people out. But it’s still a load of shit.

Or, I don’t know. Maybe it’s related to other issues.

games for windows, unplugged

One of the defining elements of the success of the Xbox and Xbox 360 has been their online service, Xbox Live. At the start it was barebones and hardly more than functional, even by the standards of the time. Nevertheless, it launched the modern era of online console gaming, and it is widely praised for its ease of use, robustness, and large user base. Naturally, Microsoft wanted to capitalize on this by carrying it over to Windows, and thus was born Games for Windows Live.

Naturally, the people in charge of GFWL apparently took the long list of XBL’s successes and felt motivated to completely contradict, if not entirely invalidate, them. The GFWL software was clunky, difficult to use, and often redundant. The only thing worse than the standalone executable was the in-game overlay. Often it would fail to appear, or demand an update that essentially locked up the computer it was running on. To make things worse, the overlay and the standalone program ran on separate codebases; often both required individual updates, or games could not be played online (or sometimes at all). But GFWL’s greatest magic trick was making save files disappear into thin air. I had this happen myself in Grand Theft Auto IV, when it lost a save with 36 hours of progress.

At any rate, by the beginning of this year the writing was on the wall, and publishers and developers were beginning to read it. Arkham Asylum and Arkham City have already dropped GFWL entirely, as has BioShock 2; Arkham Origins ditched it mid-development. Capcom has just hopped on the trolley, announcing that it will begin removing the software from its games as well.

Now, it looks like the great experiment is over. With the decision to integrate Xbox Live into Windows 8, things already looked ominous. Two months ago, Microsoft announced that the Points system would be discontinued. Information leaked on an update page for Age of Empires Online stated that the GFWL service itself would be discontinued by July 2014 (the page was quickly updated to omit this information). The next stage was to close the GFWL Marketplace, ending purchases of existing titles on the service and effectively putting it on life support until the coup de grace can be administered sometime next year. Now it’s just a matter of time.

This is all something of a modern Shakespearean tragedy–or a comedy, I can’t decide. It’s both saddening and hilarious, the way Microsoft took what could have been the next revolution in online gaming and marched it into a boondoggle of Market Garden proportions, and is now beating a tactical withdrawal to the fortress of Xbox Live, hoping to reform and reorganize for another charge. But with Valve’s offensive rolling across the terrain like a Soviet tank division, and competing armies gathering on the fringes of the battlefield, I don’t foresee another major assault by Microsoft anytime soon.

But things change. With a new general, the tide may yet turn.

beta moderne

I feel it’s time for a PSA.

Prior to the current century, the software development cycle was well understood. A product went through a few major phases–pre-alpha, alpha, beta, release candidate, and final release–during which bugs and other issues were progressively weeded out and new features added. Alpha and beta releases were conducted purely under test conditions; interested parties would fill out applications with their system specs, and testers would be picked from among the applicants. The purpose was clear: those chosen were, in effect, software testers, expected to provide detailed feedback on their experience, in particular anything that didn’t work as planned. The developers would take this information, make the appropriate changes to the code, push out patches, and await more feedback.

This understanding seems to have been lost at some point in the past several years.

These days, “betas” are more like previews. They are sold as sneak peeks, bundled with preorders (Battlefield 4 and Bad Company 2 did this) or as a bonus with an entirely unrelated game (access to the Halo 3 multiplayer beta came with copies of Crackdown). It’s become marketing.

To me, it’s absurd. It’s like selling tickets to a feature composed entirely of a movie’s dailies, with no clean-up or post-processing applied. Would people have paid as much to see the raw footage of Lord of the Rings? I don’t think so. Every product needs polish, and no one wants to buy an unfinished product (unless they plan on finishing it themselves).

A beta is an unfinished game; that’s all there is to it. The people playing it are expected by developers and programmers to test it and report issues, but they are expected by publishers and retailers to simply buy it. Many of those getting into these betas don’t go in with the mindset that the game is incomplete. Forums become jammed with complaints that the game fails to launch, or textures pop in too much, or certain skills don’t function correctly. Proper bug reports aren’t filed, even when an in-game interface for reporting bugs is readily available and its use is actively encouraged.

One memorable beta I took part in was Wrath of the Lich King’s. What made it memorable was the constant complaining in-game. Chat didn’t go more than a few minutes without someone whining about a mob, encounter, or effect not triggering as advertised, or textures not loading properly, or something else not working as it should. Did these people file a bug report, or contact a moderator? No, they just bitched in chat or on forums that were rarely (if ever) monitored by developers.

At times like this, I wish all betas were closed. But then, the primary advantage of an open beta is a much larger sample size, covering as many different hardware configurations as possible. It strikes me as something of a conundrum. Publishers aren’t abandoning the marketing of betas anytime soon, and as long as they sell them like products or previews, many of the players involved will fail to treat them for what they are: goddamned betas.