A positive approach intended to question confidence. That is at the heart of the matter today. I have been involved in such tracks before, but in a slipping age of technology, where greed-driven (or bonus-driven) changes let some executives hide behind the excuse of giving new young Turks a start in the business, we need to wonder whether they were looking at the world through chartreuse glasses.
I have seen the stupidity (for lack of a better word) of software firms pushing out software, sometimes just to keep a deadline, whilst the product was nowhere near ready. In a few cases they thought the product was truly ready and the QA department messed up in a royal kind of way. There is of course the third option, where a product was tested, was deemed good, and things still pop up afterwards. These are the three faces of QA the user meets, and I have seen them all!
The third one is the clearest one. Development does its work, the QA department did all the tests and then some, and when released things still go a little awry. Weirdly enough, this tends to happen in parts of the program that people seldom use, like that weird, off-the-wall setting that only 0.000001% of all Microsoft Word users ever touch. Microsoft had no idea, and at some point it gets fixed. This is just a flaw. Name a product, anything in the range of Microsoft Office, Adobe Photoshop, Oracle, SPSS, Sybase or SAS Miner, and it will have them. These programs are simply too large to be 100% tested, and even when that happens, there is the interaction with another program, or with an operating system update, that will throw a spanner in the cogs. You only need to search for issues with Windows 8.1 or iOS 8.2 to see that things just happen. In layer zero we have the hardware, in layer one the operating system, in layer two the application, and in layer three the peripherals (printer, keyboard, mouse and joystick), one massive sandwich to check! In any of these interactions things can go wrong, and a QA department needs to sift through it all. Of course, even if all of that works correctly, there is the fourth layer, the user him/herself, who then decides to dunk that layered sandwich in tea. Boy oh boy can they mess up their own system! No software can truly prepare for that!
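To make that sandwich concrete: even a toy inventory of the four layers explodes combinatorially. The sketch below (Python, with purely illustrative layer entries, not anyone's real test plan) counts the interaction paths a QA department would in principle have to cover.

```python
from itertools import product

# Hypothetical, simplified inventory of the four "layers" described above.
# Every name here is illustrative only.
hardware = ["desktop", "laptop", "tablet"]            # layer zero
os_versions = ["Windows 8.1", "iOS 8.2", "Ubuntu"]    # layer one
applications = ["word_processor", "photo_editor"]     # layer two
peripherals = ["printer", "keyboard", "mouse", "joystick"]  # layer three

# Every interaction path QA would, in principle, have to check.
combinations = list(product(hardware, os_versions, applications, peripherals))

print(len(combinations))  # 3 * 3 * 2 * 4 = 72 paths, for even this toy stack
```

And that is before the fourth layer, the user, multiplies it all again.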
Yet in all this, QA needs to have high standards, which are proven when we see the third option in all this. Options one and two are an entirely different mess! For the outsider it is often impossible to tell what on earth happened. I had the inside scoop on an event where something was marketed as ready, yet the program was nowhere near that. Deadlines for stakeholders had to be met, and some figured that a patch afterwards via the BBS systems would do the trick. So basically a flawed product went to the shops. I remember those days; that was long before any level of fast internet. I was a trendsetter then by owning a 64 kbit/s modem, yes, I was a speed demon in those days! LOL!
You see, legally the consumer is in a messy situation. Product liability laws are not that strong unless health and lives are placed in peril. Beyond that, you would think that consumers are protected when it involves fraud, yet when we consider that part of fraud is ‘deception intended to result in financial or personal gain’, any case goes south really fast once the defence becomes ‘the consumer was offered a refund’ and ‘Your honour, our costs are massive! We are doing everything to aid the consumers, offering them a refund immediately’. Consider this together with the definition ‘intentional perversion of truth’: the keyword ‘intentional’ can usually be swayed too easily, undermining the case for fraud. At its core, getting people to sign on in the first weeks and getting that revenue on their books can mean the survival of such a company, so some accept the costs of what happens in order to remain on the game board.
The other situation is where the Quality Assurance (QA) department messed up. Here is the kicker: for the outsider it is impossible to tell which scenario played out. Without working at the place, one can make educated guesses, but that is as good as it gets. For example, Ubisoft had a net result of -66 million in 2013, and their stock fell from grace in 2008 from $32 to $3.80 per share, a not too healthy drop of roughly 88%. The interesting part here is that when we look at their games over those years, we see Prince of Persia, the language coaches on DS, which were novel (especially Japanese), Assassin’s Creed II, Tom Clancy’s Splinter Cell: Conviction and a few more. This is the interesting part: here we see a few excellent games, a Prince of Persia that brought a forgotten franchise back to life, Assassin’s Creed II, which was so far above the original that it mesmerised a massive player population, Prince of Persia: The Forgotten Sands, which upped the ante of Prince of Persia by a lot, and Assassin’s Creed: Brotherhood, which gave us even more challenges. Yet these good games could not hide the fact that Ubisoft had produced so many games over that time, many of them far below great, that it impacted their stock. Is their value back to $16 because of their games? So what about Assassin’s Creed: Unity? Is stock the reason for the lacking game? I personally would state no! I think lacking games drop the stock. Yet this is an emotional response, because stock is driven by demand and rejection: as great games are made, people want a share of that rabid bunny; if the games are nowhere near, the stock gets rejected. In this case it is about the games, because Ubisoft is gaming! This is also why E3 is such a big deal, and even though I was not impressed with their E3, ‘For Honor’ clearly shows that Ubisoft has some gems in their arsenal, or should that be ‘had’?
For Honor is a new and likely high-in-demand game; the presentation was extremely well received. I am not much for those types of games, but I also looked with anticipation at a lovely challenge. The issue here remains that it is online, so timing and decent players are required to make this a good experience. Yet beyond that new title, I would see their catalogue as a collection of predictable sequels that have become indistinguishable from their other titles. Sequels sharing bits from other sequels with an interchangeable codebase, with too many triggered scripts. We remain with a blurred sense of gaming. I stated it a few years ago: by adding too many Prince of Persia moments into Assassin’s Creed, we end up not playing Assassin’s Creed. If I wanted that, I would have bought Prince of Persia! So why these games?
Well, there is of course method to my madness (and my madness is purely methodical). You see, Assassin’s Creed II and Splinter Cell: Conviction were amazing achievements. I can still play these two today and have loads of fun. They had set a standard, and even though Assassin’s Creed: Brotherhood was a step up, certain flaws were never dealt with, flaws that became part of the engine for 5 iterations of the game. Did you notice that in the second premise I went from new game to iteration? That part matters too! With the Splinter Cell series we went from Conviction to Blacklist. Again, it was a step forwards, but now we get the issue that QA messed up by not properly testing the replayability part of the game, leaving players in the lurch, making the game a mess if I wanted to play a ‘NewGame+’. It is a little thing, with far-reaching consequences. What was great became good: a step forward, hindered by one and a half steps back, which is the faltering part. Ubisoft needed a QA department with teeth; as I see it, they did not have one, or Marketing got involved. There is in all honesty no way to tell how that came to pass.
Yet this is not about Ubisoft, because Rocksteady Studios outdid it all with Batman: Arkham Knight, making Warner Bros. Interactive Entertainment extremely unhappy, as I see it. A game that should be heralded as a new legendary release got a 50% rating on Steam and 70% from Gamespot; these are not good numbers, they are ratings that resemble coffin nails. Not a good thing at all. In my view, this is a massive fail by their QA department. However, when we accept the statement from Kotaku.com, we get “The moment I’m inside the batmobile, it’s not surprising to see it dip to 15 frames-per-second“. Did QA really not see that? So is it Marketing or is it QA? No matter what answer I give here, it is pure speculation; I have no facts, just personal insight from 30 years of gaming. No matter where it lies, QA should not have signed off on it, not at such drops of quality. Which gets us back to the non-liability of these firms. ‘Res ipsa loquitur’, or in slightly more English “the thing speaks for itself“: the plaintiff can create a presumption of negligence by the defendant by proving that the harm would not ordinarily have occurred without negligence. Yet, what harm? The only harm the game does is the spending of funds which are refundable; the only harm left sits with the maker of the game. So there is no case. What is the case is that until these firms properly invest in QA, we get to go through buying and returning a lot more. Yet these companies realise this, and they take a chance that the gamers (which tend to be a loyal lot) will hold on to the game and just download the patch. So basically, the first-hour gamers become the sponsors for the development of an unfinished game. That is how I personally see it.
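For what it is worth, a frame-rate floor is exactly the kind of thing an automated QA gate can catch. The sketch below is a minimal, hypothetical check (not Rocksteady's or anyone's actual pipeline): given per-frame render times in milliseconds, it flags any sustained window whose average falls below a minimum frame rate, so a Batmobile-style dip to 15 fps would fail the build.

```python
def flags_fps_drop(frame_times_ms, min_fps=30.0, window=60):
    """Return True if any `window` consecutive frames average below min_fps.

    frame_times_ms: per-frame render times in milliseconds, e.g. from
    playtest telemetry. All thresholds here are illustrative.
    """
    if len(frame_times_ms) < window:
        return False
    for i in range(len(frame_times_ms) - window + 1):
        avg_ms = sum(frame_times_ms[i:i + window]) / window
        if 1000.0 / avg_ms < min_fps:
            return True
    return False

# Steady 60 fps (16.7 ms per frame) passes; a two-second dip to
# 15 fps (66.7 ms per frame) gets flagged.
steady = [16.7] * 300
dipping = [16.7] * 120 + [66.7] * 120 + [16.7] * 60
print(flags_fps_drop(steady), flags_fps_drop(dipping))  # False True
```

A real pipeline would feed this from recorded playtest sessions per scene; the point is only that the check is cheap to automate, which makes signing off on a 15 fps Batmobile even harder to explain.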
In my view, the game suffered; what could have been great will soon be forgotten. Yet, what happens when it is not a videogame? What happens when it is not a game, but business software? You see, the Donoghue v Stevenson case gives us that a maker can be held responsible for personal injury or damage to property, yet what happens when neither is the case?
It is a very old UK case in torts, where a Mrs Donoghue was drinking a bottle of ginger beer in a café in Paisley. A dead snail was in the bottle; because of that she fell ill, and she sued the ginger beer manufacturer, Mr Stevenson. The House of Lords held that the manufacturer owed a duty of care to her, which was breached, because it was reasonably foreseeable that failure to ensure the product’s safety would lead to harm of consumers. This is a 1932 case that is still the key case of torts and personal harm involving negligence. Yet with video games there is no visible harm, only indirect harm, and the victims there have little say in this: the direct victim is offered a refund, and the competitor missing out on revenue has no case, as lost revenue is neither injury nor damage to property. Now we get the issue that if the buyer buys goods which are defective, he or she can only have a claim under the contract of sale against the retailer. If the retailer is insolvent, no further claims will be possible. So, with Arkham Knight, when 2,500 copies are returned, a large shop will not go insolvent, but you get the idea: if the shop needs to close its doors, you are left out of your money.
Here we get the crux: a maker of a game/program has pushed an inferior product to market. It will offer compensation, yet if the shop closes (that is a massively big if), the buyer is out in the cold. Now, the chance of this ever happening is unrealistically small, but the need to set rules of quality, the need for standards, is becoming increasingly important. With games they are the most visible, but consider a corporation now pushing a day-one product to get enough revenue to tailor a patch which the customer needs to download. An intentional path to stay afloat, to buy time. Where do you stand when you got pushed to solution 2 because solution 1 was a month away, only to discover the flaw in the program, which gets freely adjusted in week 23? That is 22 weeks without a solution, a situation that also hindered the sale of solution 1, which was fine from day one onwards.
Not only is much better QA required, the consumer should also receive much stronger protection against these events. That could just be me.
Now to the real issue connected to this. Assassin’s Creed: Unity became a really bad joke last year.
It went so far that Ubisoft offered a free game because (source: Express) “UBISOFT have confirmed some Xbox One fans who have previously applied patch 3 for Assassin’s Creed: Unity are now being hit by a 40GB download when trying to use the latest title update”. 40GB is massive; that comes down to roughly eight DVD movies, and well over 10% of the console’s usable hard drive space, which gives us the image that one game has a clear impact on the total space of the console. Also be mindful of the term ‘patch 3’, which implies that patches one and two had already been applied, so is there clarity on the reasonable assumption that there is an issue with both the release and QA here? In my view, delayed release or not, the game should never have been released to begin with.
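The back-of-the-envelope arithmetic for that 40GB figure, assuming a 4.7GB single-layer DVD and the Xbox One's advertised 500GB drive (of which roughly 365GB is usable to the player), runs as follows:

```python
PATCH_GB = 40.0
DVD_SINGLE_LAYER_GB = 4.7   # standard single-layer DVD capacity
DRIVE_ADVERTISED_GB = 500.0  # Xbox One launch model, as advertised
DRIVE_USABLE_GB = 365.0      # approximate space actually free to the user

print(round(PATCH_GB / DVD_SINGLE_LAYER_GB, 1))    # ~8.5 DVDs' worth of data
print(round(100 * PATCH_GB / DRIVE_ADVERTISED_GB))  # 8% of the advertised drive
print(round(100 * PATCH_GB / DRIVE_USABLE_GB))      # ~11% of the usable space
```

So one patch for one game eats about a ninth of what the player can actually store.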
Don’t get me wrong: with the new AAA games, the chance of a patch becomes larger and larger. You see, QA can only get us a certain distance, and an issue on a console is a lot less likely than an issue on your PC (with all its possible hardware combinations), yet the amount of fixes shown here is way off the wall. Now we see a similar thing happening to the PC edition of Arkham Knight. Warner Brothers have decided to recall the game; all sales have stopped at present. The issues we see on gottabemobile.com are “Warner Brothers’ forums are filled with complaints about the game including Error CE-34878-0 issues on the PS4, various issues with the Batmobile including this one on Xbox One, issues with cut scenes, Harley Quinn DLC problems on the PS4, Batman season pass problems, problems launching the game, problems with the game’s well-known Detective Mode, missing Flashpoint skin, problems with missions, problems saving the game, and more”.
Now we get the question: was this properly QA-ed? Was a proper quality test made? Because the size and nature of the issues as reported give off a negative testing vibe, which I consider to be extremely negligent! As such we must wonder: should such levels of non-functionality be allowed? Can the law allow the release of a product where, as alleged, ‘no harm has been caused’, by an industry hoping that users will wait quietly as a game gets finished at the consumers’ cost?
Now that the Nextgen consoles are all set up to download in the night, how long until makers start gaming ‘customer expectations’ and release a 90% game? How long until corporations work on a business model that relies on consumer sponsoring whilst they contract even better profits? We also need to be careful: patches will always be a factor, and I have no issue with that, but the list of games needing massive patches keeps on growing: AC: Unity, GTA V, Arkham Knight, Destiny, and the list goes on a little longer. I am only mentioning the patches over 3GB (one is well over 6GB), and in this light Destiny gets a small pass, as that game is all about multiplayer, which is a dimension of errors all on its own. The Elder Scrolls Online wins this year with a 16GB patch, again all about online play, but overall the gaming industry seems to be adopting the bad traits of Microsoft, which is definitely not a good idea.
For now we seem to accept it, especially as the Nextgen systems are relatively new, but that feeling will change sooner rather than later, and at that point someone official needs to step in, which might end up being a lot more official than the game makers bargained for. Especially as games outside of the US can be up to 70% more expensive, we are entitled to some proper consumer protection against these levels of negligence, protection that currently exists only in a limited scope.