Tag Archives: Sybase

Another Brick in the growing Wall

The wall of profit is going up nicely in France. Even as I would like to take another gander at how the western media is all about ignoring the Houthi drone attacks on Saudi Arabia, it seems that we will get more on that soon enough. As I see it, at present 5 attacks have been ignored by the western media; they are all about calling Saudi Arabia the big nasty, even though there is no factual evidence, merely biased opinion on several fronts. Today is not about that. Today is about France (the land of wine, cheese and Citroën). This place is pushing a few boundaries, and even as we think that things are still open to discussion, it seems that the mighty bosses of banks (one in particular) have made their choice. I mentioned it a little over a week ago, yet all were easily persuaded to ignore it. Now that we are given ‘French parliament passes “Huawei Law” to govern 5G security‘ (at http://telecoms.com/498728/french-parliament-passes-Huawei-law-to-govern-5g-security/), we see an optionally much larger change. This might be the first step in changing the landscape on a much larger scale, and as far as I can tell it is just the beginning. There is an important notice to all this, and it opposes the UK point of view by almost 180 degrees. In the UK, Alex Younger (big boss of MI6), aka El Capitano de observadores furtivos, is of the mind that important infrastructure should never be in foreign hands. This is a policy issue and I do not oppose this choice. It is the short-sighted and stupid American view of shouting anti-Huawei accusations without proof that I object to. Now we see on the other side (France) that Mathieu Duchatel gives us “the French government is creating a regulatory environment that helps reduce its vulnerability to foreign intelligence collection“, which is another policy approach. I tend to like this one more than the one Alex Younger gives, but both are valid points.
Yet the one Duchatel gives us leaves the players with more options.

To see this, we need to go back to 1993, when Sybase and Microsoft dissolved the partnership they had and Microsoft received a copy of the SQL Server code base. This was the best approach, and after this we see that Microsoft set its own designers to evolve its SQL Server, a choice that ended up making it a direct competitor of the code Larry Ellison pushed for (the solution we know as Oracle), and whilst he went sailing across the oceans, MS SQL Server got to be lean and mean. Even as we see flaws, we see that Microsoft created a much larger market than we thought possible. It is that path Europe and America needed for 5G. So as the Yanks decided to screw themselves six ways from Sunday, Europe has a much better approach, and now we see the path where France has opened up a dialogue to enable that solution down the track. It is a solution that would assist Huawei as well: using the Huawei 5G path as a benchmark, France et al could deploy a non-Chinese 5G solution that is set to the Chinese standards, and that would suit China (read: Huawei) in a few ways. It all goes from bad to worse for America. What everyone seems to forget is that Azure in China is Shanghai Blue Cloud Technology Co., Ltd., a wholly (or is that holy) owned subsidiary of Beijing 21Vianet Broadband Data Center Co., Ltd., and it now implies that with the accelerated evolution of 5G via Huawei, the best upgrades to implementation and facilitation of 5G will come from 21Vianet and not from Microsoft. Just as Sybase gave the keys to Microsoft in the 90’s, we now see the opposite, where the business advantage will be with the Blue Cloud bosses; together with Huawei they now have a much larger advantage than anyone realises.
Even as there is a shift in China through players like BitTitan, I believe that Huawei is still preparing for a much larger innovation, giving 21Vianet, when that kicks off, an overnight advantage that Microsoft cannot equal, not for a much longer time, leaving Microsoft losing momentum to a much larger degree.

If you want proof, then I have to admit that I cannot give it; the market seems to facilitate a larger shift and it is not some hidden gem that no one else found. I believe that the Sybase example is what we face today. As Mathieu Duchatel is setting the new policy, we see policy that is accepted over most of the EU, so as Germany, Spain and Italy accept this push, most of the EU nations will follow; they are willing to drop America like a bad habit in all this. The US overplayed its hand and now they will face the consequences of choice. In this the UK must soon make up its own mind. The path Alex Younger opted for was not wrong, but it is a larger choice that could impede economic growth to a larger degree for a much longer time, two elements the UK does not really have at present.

The SCMP article (at https://www.scmp.com/news/china/diplomacy/article/3020354/while-weighing-5g-security-risks-france-predicts-it-can-manage) shows us another solution for France, and somehow I believe Credit Agricole had been preparing for this step a little longer than most others. France needs to be on top of this as Paris 2024 is coming near soon enough, implying that a multi-billion euro scheme for 5G will be announced before year’s end to get anywhere near ready, and it seems that the Credit Agricole dividend is about to push upwards to a much larger degree. And when we get to the end of the article, where we see: “5G infrastructure poses more complex problems. The distinction between core and edge is no longer as relevant, as many software operations will operate in the cloud“, we get introduced to the benefit and advantage that Beijing 21Vianet Broadband Data Center Co. now gets to have; Microsoft forgot that most cannot get to China (for simple linguistic considerations), a limitation that does not exist in the other direction. And now, as the cogs connect, we see how the market takes a shift. Remember when I made the joke (and connection) that the cloud is merely someone else’s computer? Everyone is so eager to muddy the water by claiming it is so much more complex. OK, to the smallest degree it is.

To see my point of view, consider the NASA mainframe room that was there for the moon landing (and perhaps a little more); now consider my old mobile: that 2011 mobile needs 5% of its available processing power to do what that entire NASA room did. The mobile that followed 4 years later was 400% more powerful with 1600% more storage, and the one that followed was close to 300% more powerful than the previous one with an additional 1600% more storage. The market shifted THAT fast.
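
As a playful sanity check on those percentages, here is a minimal sketch; the generation figures are the rough estimates from the paragraph above, not measured benchmarks:

```python
# Compound the stated generational jumps; figures are the article's
# rough estimates, not measured benchmarks.
def apply_jump(value, percent_more):
    """Return value increased by percent_more percent."""
    return value * (1 + percent_more / 100)

power = 1.0    # the 2011 handset as the baseline unit of processing power
storage = 1.0  # baseline unit of storage

power = apply_jump(power, 400)       # next model: 400% more powerful
storage = apply_jump(storage, 1600)  # and 1600% more storage

power = apply_jump(power, 300)       # the one after: ~300% more powerful
storage = apply_jump(storage, 1600)  # another 1600% more storage

print(power)    # 20.0  -> twenty times the 2011 baseline
print(storage)  # 289.0 -> nearly three hundred times the baseline
```

Two product generations compound to a twentyfold power jump and an almost three-hundredfold storage jump, which is exactly the speed the paragraph is on about.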

So when we see a data center now, consider that a dozen racks with terabyte storage could be replaced by ONE drive; yes, there is an Exabyte drive now, one drive with a million terabytes (well over a million in binary units). We are nowhere near replacing the entire data center, yet in 10 years that center could be replaced by one large tower. It might look a little different (I always loved the Cray systems, they come with a place to sit and heating), but that so-called ‘cloud’ will be in one clear, specific location (just as it is now) and that is the issue:

it is the location of someone else’s computer that is the issue. Soon it will no longer be in America; China is now in a position to offer the same, optionally cheaper, and then the American BS will start with ‘It needs some vague quality seal of approval‘ (a SAS marketing trick we saw 20 years ago).

It is at that point that the entire mess becomes ugly real fast, and we are already pushing in that direction. The problem is not China, or America. It will be the policy considerations on where data is allowed to be; a lot of cloud issues on data locations are still open to discussion. The problem is not the hardware; it will be the place with the most logical policy in place that will be the main player for the next stage, and it seems that France has been keeping busy on becoming that European location. I reckon that China does not care, as long as they get the business, and that is where we see the American failure on getting the business. They planned on greed when pragmatism was the only solution to push the market forward. Now as most nations start waking up to the loss of pragmatism, we see the consideration, to be a player or a tool, and some are realising that they banked on the wrong horse and that the American horse is about to become a ‘horse no show’!
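
The Exabyte-versus-terabyte figure a few paragraphs back is easy to check with back-of-the-envelope unit arithmetic:

```python
# Unit arithmetic for the exabyte-versus-terabyte comparison above.
TB = 10**12   # decimal terabyte
EB = 10**18   # decimal exabyte

print(EB // TB)  # 1000000 -> exactly one million decimal terabytes

# In binary units (TiB/EiB) the figure is "well over" a million:
TiB = 2**40
EiB = 2**60
print(EiB // TiB)  # 1048576
```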

Whether it was merely some bank, some policy, or a larger linked consideration, this time the French have played a good long-term game and they have every chance to reap the benefits of that game. We have yet to see how it all plays out and Paris 2024 will be the big test, but as the issue stands, the French are pushing forward; it is there that I found some references to Credit Agricole, the DGSE, and a very large billion dollar option. Even as 21Vianet and its subsidiaries are not mentioned, neither is Azure in any way; it all falls to the one mention of ‘Microsoft Corporation‘. This might all be true, but I still seek confirmation; on a stage this large, 21Vianet could not have gone unmentioned, the same for the entire Azure part. The line “the proliferation of real-time data from sources such as mobile devices, web, social media, sensors, log files, and transactional applications, Big Data has found a host of vertical market applications, ranging from fraud detection to scientific R&D“ makes the absence of certain players either short-sighted, or the elements of that article were unreliable. I believe it to be a little of both.

I wonder how the game unfolds; I reckon we will know a lot more by the end of the year.

 


Filed under Finance, IT, Law, Media, Politics, Science

When it is with us

Larry Elliott raises an interesting question regarding Huawei; it is an issue I raised a few times over the last months, even last year. I made a reference going back to December 2018 (at https://lawlordtobe.com/2018/12/06/tic-toc-ruination/), where in ‘Tic Toc Ruination‘ I quoted “In a statement, the UK telecoms group has confirmed it is in the process of removing Huawei equipment from the key parts of its 3G and 4G networks to meet an existing internal policy not to have the Chinese firm at the centre of its infrastructure“, all at the behest of spymaster incredibili Alex Younger. Yet actual evidence of Chinese activities was never presented. Alex does something else, and in retrospect to his French, American and Canadian peers, something that is actually intelligent. He gives us: “the UK needed to decide if it was “comfortable” with Chinese ownership of the technology being used.” This is at the foundation of “We can agree with Alex Younger that any nation needs to negate technological risk, we could consider that he seemingly had the only valid opposition against Huawei, as it was not directed at Huawei, but at the fact that the tech is not British, the others did not work that path, and as we see that technology is cornered by the big 7, those in the White House with an absent person from both Apple and Huawei. We have accepted the changed stage of technology and that might not have been a good thing (especially in light of all the cyber-crimes out there), also a larger diverse supplier group might have addressed other weak spot via their own internal policies, another path optionally not averted.” The issue is that ‘the tech is not British‘, so finding a temporary solution for British technology to catch up is an essential move.
Larry gives us: “why a country that emerged from the second world war with a technological edge in computers and electronics should require the assistance of what is still classified as an emerging economy to construct a crucial piece of national infrastructure”, which is a very correct stance. The issue is that some got lazy and others got managed by Excel users; getting it somewhere else is just cheaper. The combination has now created a technology gap that spans part of 4G and pretty much the entire 5G stage, and that is before my IP comes into play. I found the niche that others forgot, in commerce and cyber security, as the gap is about to increase, and for me the limitation is that only Huawei and Google have the optional stage where the problem can be solved (read: properly addressed). I am certain that there is more; I have not gone deep enough with what I found, implying that my window of opportunity is not that big. Larry Elliott goes on in his article taking to bat a few issues from 1967 onwards that gave rise to the UK loss; you should read it, as it is a really good article (at https://www.theguardian.com/technology/2019/may/05/the-huawei-incident-points-to-a-deeper-lesson-for-great-britain). There is one element that was missing: the stage of the 90’s where the computer market moved from innovative to iterative, which is perhaps the larger (read: largest) failure. The advantage that places like IBM had was equalled within 3 years by makers like ASUS. A market of printed circuit boards moved from US/UK-held companies to places like ASUS pretty much overnight; the people jumped to the competitive player that produced high-end main boards. A company that started in 1989 owned the gamers and PC builders within 10 years; at that point ASUS was the number one choice. It was not merely the high quality; it was the fact that architectures set in motion in one year were offered in upgraded form within a year.
It is seen in “Intel itself had a problem with its own 486 motherboard. Asus solved Intel’s problem and it turned out that Asus’ own motherboard worked correctly without the need for further modification. Since then, Asus was receiving Intel engineering samples ahead of its competitors” (David Llewelyn, ‘Invisible Gold in Asia: Creating Wealth Through Intellectual Property‘, p143). By the time the people were ready, Asus had its Pentium II boards with one interesting nuance: unlike IBM’s, the board supported more processors, so the board for the P2-350 also supported the P2-450. By spending an additional $35 on a better board, you could start with the P2-350 and upgrade to the P2-450 a year later; a person would save $525 and extend the life of their PC by 2 years.
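
That upgrade arithmetic can be written out. Note that the replacement-system and CPU prices below are hypothetical fillers chosen only to reproduce the $35 and $525 figures in the text; they are not verified 1990s list prices:

```python
# Worked version of the board-upgrade saving; the $800 system price and
# $240 CPU price are hypothetical, picked to match the article's figures.
board_premium = 35       # extra cost of the upgrade-friendly board
new_system_cost = 800    # assumed cost of replacing the PC instead
cpu_upgrade_cost = 240   # assumed price of the P2-450 CPU a year later

net_saving = new_system_cost - (board_premium + cpu_upgrade_cost)
print(net_saving)  # 525 -> the saving claimed in the text
```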

It was an innovation that saved people money, an issue that IBM never cared for. The iterative market got overwhelmed by the Taiwan titan ASUS and the market in the UK and US started to slide. As I personally see it, the market was handed to executives measured by revenue; they were unwilling to take the big fight and decided to settle for $100K less income and zero risk, and after 2-3 years they would move on, degrading the market as a whole; that is how I see it. Now that the newest market requires actual knowledge and know-how, we see a lack of non-Asian players. Yet Larry focusses on the part that matters most for the UK: there is no manufacturing vision (read: a lack of vision), a vision that would be essential for 5G. It is the one exponentially growing market for the next decade, and as such, not having a game to play will make you miss out on it all. So there are two options: forfeit the game, or find a partner to build that market with. In that, Huawei would be the best fit; they are the most advanced. The alternative is finding an Ericsson or Nokia alternative; they are both chasing Huawei, so finding a solution with Huawei implies that Huawei creates another competitor for Ericsson and Nokia, which would suit them best, yet at that point the UK solution will be fighting over the same pie as Sweden and Finland are. Sybase did that trick with MS SQL Server and it did them a lot of good (for a while). The biggest part is that the UK needs to take a long-term strategic stand on manufacturing, and that is where the floor tends to fall from under your feet. The UK has shown to lack that vision too often and now it will come at a much greater cost.

In the end the problem is not merely catching up with Huawei; it will be about remaining innovative with the products, optionally surpassing them. That has been a problem for almost 20 years, and fending off bad habits is a time-consuming, as well as an energy-consuming, effort. For most the problem is not merely remaining innovative, it is identifying innovation when it is offered, and there we see that the UK has had its own moments of Titanic proportions when it came to missing out. If we look into history, we see that British innovation was an annual event at the very least; this has diminished to thrice a decade at present. With 5G incoming, the idea of having enthusiasts with a Raspberry Pi and adding a 5G kit would be stellar: consider 19 million enthusiasts, and if only 0.1% have an innovative idea, that still adds up to 19,000, with the chance of 190 patents. That is a multi-billion market right there, and it is not a men-only world either; you merely need to look at how JK Rowling and Joy Mangano got their boots on the floor to realise that this is a stage that is up for everyone to rule. Our problem is that every money maker seems to rely on 100% success at minimal (read: zero) investment. It might seem good business, but that is exactly how we lost the markets to indie developers in Asia and India.
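
The enthusiast arithmetic works out as stated; the one-in-a-hundred patent conversion below is my reading of the 19,000-to-190 step, not a figure from the text:

```python
# The Raspberry Pi innovation funnel from the paragraph above.
enthusiasts = 19_000_000
idea_rate = 0.001    # 0.1% have an innovative idea
patent_rate = 0.01   # assumed: 1 in 100 ideas becomes a patent (190/19000)

ideas = round(enthusiasts * idea_rate)
patents = round(ideas * patent_rate)
print(ideas)    # 19000
print(patents)  # 190
```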

In the end, the tools we create are what enable a person to advocate and test: ‘What if I did it this way?‘ That is the question that makes for the innovation worth an easy 7-figure number, and in that field no dream is too wild, because finding people who do not yet realise that something would make their lives easier is not that hard; you only need to see that they lacked merely one element, or another part, to make it a better solution. That alone is worth a bundle, and that is where the UK and several nations lost out. We forgot that this element requires creative thinking and actual creativity; as the schools cut those classes in favour of science and business, that is when we saw the change of leaders into sheep. Following the work of others so that perhaps we might get a new idea does not work, not without a clear link to creativity and art. We lost 50% of the equation and started to think that this part would fill itself in (automatically); that is where we lost. The solution was with us, and we forgot about the us part.

In that light I always remember Jeff Minter. Some laugh and make a reference to the mutant camels, but the truth is that he was all about creativity, and the list of his achievements is long, very, very long. He has been around from the earliest Sinclair ZX to the PS4. If some Britons had one percent of his creativity, the UK economic hardship would be over; it is that simple. And even as we focus on the 5G needs and how the UK needs its own 5G solution (which is true), the UK can only do that by focusing on harnessing creativity that will lead to optional solutions. Whilst that part remains missing, the UK can merely hope to replicate what exists, not create what others forgot; seeing that is an essential first step for those trying to sell you the story of a new technology.

And there is a second part: it is not ‘what does it innovate?‘, it is ‘What else could it be used for?‘ That is the larger part in all this. I always go back to the example from 1991. There was a company called WordPerfect and it had an excellent word processor. There was a secretary who found herself in a place where the budgets were not there, so they were confined to cheaper non-PostScript laser printers (an issue in those days), as the PostScript version was often thousands more expensive. So she did what no one had considered: she used the WP equation editor to type the company name and a few other things and added them to the letter; now (because of WP innovation) the letters suddenly looked like they came from high-end, expensive laser printers. Her work looked 200% better than anyone else’s in the company. The mere application of ‘What else could it be used for?‘ is exactly the stage that some walked past when they forgot what 5G also enables and, more important, what it will allow for, and there is the innovation worth billions. That is where creativity gets us; the lack of it leaves us with too little, or with advantage gained by pure chance. The chances lost were with us, or basically with the decision makers who did not comprehend the impact and cut creativity too far from education, and whoever followed in their footsteps is now required to clean up that mess.

Good luck with the attempt!

 


Filed under Finance, IT, Media, Politics, Science

Questioning Assurance

A positive approach intended to question confidence: that is at the heart of the matter today. I have been involved in such tracks before, but in a slipping age of technology, where we see greed-driven (or bonus-driven) changes, where some executives hide behind the excuse of giving young Turks a start in the business, we need to wonder whether they were looking at the world through chartreuse glasses.

I have seen the stupidity (for lack of a better word) of software firms pushing out software, some to make sure they kept some deadline, whilst the product was nowhere near ready. In a few cases they thought the product was truly ready and the QA department had messed up in a royal kind of way. There is of course the third option, where a product was tested, was deemed good, and things pop up anyway. These are the three faces of QA the user encounters, and I have seen them all!

The third one is the clearest one. Development does its work, the QA department did all the tests and then some, and when released, things go a little awry. Weirdly enough, this tends to happen in parts of the program that people seldom use, like that weird, off-the-wall setting that only 0.000001% of all Microsoft Word users tend to use. Microsoft had no idea, and at some point it gets fixed. This is just a flaw. You name a product, like anything in the range of Microsoft Office, Adobe Photoshop, Oracle, SPSS, Sybase or SAS Miner; they all have them. These programs are just too large to get 100% tested, and even when that happens, there is the interaction with another program, or with an operating system update that will then throw a spanner in the cogs. You only need to search for issues with Windows 8.1 or iOS 8.2 to see that things just happen. At layer zero we see the hardware, at layer one we get the operating software, at layer two we see the application, and at layer three we get the peripherals (printer, keyboard, mouse and joystick): one massive sandwich to check! In any of these interactions things can go wrong, and a QA department needs to sift through it all. Of course, even if all of that did work correctly, we see the fourth layer, which is the user him/herself, who then decides to dunk that layered sandwich in tea. Boy oh boy can they mess up their own system! No software can truly prepare for that!
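
That massive sandwich can be made concrete: the number of configurations QA faces is the cross product of the layers. The entries below are illustrative placeholders, not a real test matrix:

```python
# Cross product of the four layers described above; the entries are
# illustrative placeholders, not an actual QA matrix.
from itertools import product

layers = {
    "hardware":    ["desktop", "laptop", "tablet"],
    "os":          ["Windows 8.1", "iOS 8.2", "Linux"],
    "application": ["word processor", "spreadsheet", "browser"],
    "peripheral":  ["printer", "keyboard", "mouse", "joystick"],
}

configurations = list(product(*layers.values()))
print(len(configurations))  # 108 -> 3 * 3 * 3 * 4, before user behaviour
```

Every extra option in any one layer multiplies the total, which is why 100% testing of a large product is simply not realistic.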

Yet in all this QA needs to have high standards, which are proven when we see the third option in all this. Options one and two are an entirely different mess! It is often impossible for the outsider to tell what on earth happened. I had the inside scoop on an event where something was marketed as ready, yet the program was nowhere near that. Deadlines for stakeholders had to be met and some figured that a patch afterwards via the BBS systems would do the trick. So basically a flawed product went to the shops. I remember those days; that was long before any level of fast internet. I was a trendsetter in those days by owning a 64Kb modem, yes I was a speed demon in those days! LOL!

You see, legally the consumer is in a messy situation. Product liability laws are not that strong unless health and lives are placed in peril. Beyond that, you would think that these consumers are protected when it involves fraud, yet when we consider that part of fraud is ‘deception intended to result in financial or personal gain’, we see any case go south really fast when the defence becomes ‘the consumer was offered a refund’ and ‘Your honour, our costs are massive! We are doing everything to aid the consumers, offering them a refund immediately’. Consider part of this with the ruling ‘intentional perversion of truth’: the keyword ‘intentional’ can usually be swayed too easily, weakening the case of fraud. But at the core, getting people to sign on in the first weeks, getting that revenue on their books, can mean the survival of such a company, so some accept the costs of what happens to remain on the game board.

The other situation is where the Quality Assurance (QA) department messed up. Here is the kicker: for the outsider to tell which scenario played out is impossible. Without working at a place, it is an impossible task to tell; one can make estimated guesses, but that is as good as it goes. For example, Ubisoft had a net result of -66 million in 2013; they fell from grace in 2008 from $32 to $3.80 per share, a not too healthy drop of roughly 88%. The interesting part here is that when we look at their games over those terms, we see Prince of Persia, the language coaches on DS, which were novel (especially Japanese), Assassin’s Creed II, Tom Clancy’s Splinter Cell: Conviction and a few more. This is the interesting part: here we see a few excellent games, a Prince of Persia that would bring back to life a forgotten franchise, Assassin’s Creed II, which was so far above the original that it mesmerised a massive player population, Prince of Persia: The Forgotten Sands, which upped the ante of Prince of Persia by a lot, and Assassin’s Creed: Brotherhood, which gave us even more challenges. Yet these good games could not counter the fact that Ubisoft had produced so many games over that time, many of them far below great, that it impacted their stock. Is their value back to $16 because of their games? So what about Assassin’s Creed: Unity? Is stock the reason for the lacking game? I personally would state no! I think lacking games drop the stock. Yet this is an emotional response, because stock is driven by demand and rejection: as great games are made, people want a share of that rabid bunny; if the games are nowhere near, the stock gets rejected. In this case it is about the games, because Ubisoft is gaming! This is also why the E3 is such a big deal, and even though I was not impressed with their E3, ‘For Honor’ clearly shows that Ubisoft has some gems in their arsenal, or should that be ‘had’?
For Honor is a new and likely high-in-demand game; the presentation was extremely well received. I am not much for those types of games, but I also looked with anticipation at a lovely challenge. The issue here remains that it is online, so timing and decent players are required to make this a good experience. Yet beyond that new title, I would see their line-up as a collection of predictable sequels that have become indistinguishable from their other titles: sequels sharing bits from other sequels with an interchangeable codebase, with too many triggered scripts. We remain with a blurred sense of gaming. I stated it a few years ago: by adding too many Prince of Persia moments into Assassin’s Creed, we end up not playing Assassin’s Creed; if I wanted that, I would have bought Prince of Persia! So why these games?
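
A quick check of the share-price arithmetic above, using the article's rounded figures:

```python
# Share-price arithmetic from the Ubisoft example above.
peak = 32.00   # 2008 peak per the article
low = 3.80     # post-fall price per the article

drop_pct = (peak - low) / peak * 100
print(f"{drop_pct:.1f}%")  # 88.1% -> closer to 88% than to 90%
```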

Well, there is of course method to my madness (and my madness is purely methodical). You see, Assassin’s Creed II and Splinter Cell: Conviction were amazing achievements. I can still play these two today and have loads of fun. They had set a standard, and even though Assassin’s Creed: Brotherhood was a step up, certain flaws were never dealt with, flaws that became part of the engine for 5 iterations of the game. You see that in the second premise I went from new game to iteration? That part matters too! With the Splinter Cell series we went from Conviction to Blacklist. Again, it was a step forwards, but now we get the issue that QA messed up by not properly testing the replayability part of the game, leaving players in the lurch, making the game a mess if I wanted to play a ‘NewGame+’; it is a little thing, with far-reaching consequences. What was great became good: a step forward, hindered by one and a half steps back, which is the faltering part. Ubisoft needed a QA department with teeth; as I see it, they did not have one, or Marketing got involved. There is in all honesty no way to tell how that came to pass.

Yet this is not about Ubisoft, because Rocksteady Studios outdid it all with Batman: Arkham Knight, making Warner Bros. Interactive Entertainment extremely unhappy, as I see it. A game that should be heralded as a new legendary release got a 50% rating on Steam and 70% on GameSpot; these are not good numbers, they are ratings that resemble coffin nails. Not a good thing at all. In my view, this is a massive fail by their QA department. However, when we accept the statement from Kotaku.com, we get “The moment I’m inside the batmobile, it’s not surprising to see it dip to 15 frames-per-second“. Did QA really not see that? So is it Marketing or is it QA? No matter what answer I give here, it is pure speculation; I have no facts, just personal insight from 30 years of gaming. No matter where it lies, QA should not have signed off on it, not at such drops of quality. Which gets us back to the non-liability of these firms. ‘Res Ipsa Loquitur’, or in slightly more English, “the thing speaks for itself“: the plaintiff can create a presumption of negligence by the defendant by proving that the harm would not ordinarily have occurred without negligence. Yet, what harm? The only harm the game causes is spending funds which are refundable; the only harm there is, is for the maker of the game. So there is no case. What is the case is that until these firms properly invest in QA, we get to go through buying and returning a lot more. Yet these companies realise this, and they take a chance on the gamers (who tend to be a loyal lot) holding on to the game and just downloading the patch. So basically, the first-hour gamers become the sponsors for the development of an unfinished game. That is how I personally see it.

In my view, the game suffered; what could have been great will soon be forgotten. Yet, what happens when it is not a videogame? What happens when it is not a game, what happens when it is business software? You see, the Donoghue v Stevenson case gives us that a maker can be held responsible for personal injury or damage to property, yet what happens when neither is the case?

It is a very old UK case in torts, where a Mrs Donoghue was drinking a bottle of ginger beer in a café in Paisley. A dead snail was in the bottle, and because of that she fell ill, and she sued the ginger beer manufacturer, Mr Stevenson. The House of Lords held that the manufacturer owed a duty of care to her, which was breached, because it was reasonably foreseeable that failure to ensure the product’s safety would lead to harm of consumers. This is a 1932 case that is still the key case in torts on personal harm involving negligence. Yet with video games there is no visible harm, there is only indirect harm, and the victims there have little say in this: the direct victim is offered a refund, and the competitor missing out on revenue has no case, as lost revenue is neither injury nor damage to property. Then we get the issue that if the buyer buys goods which are defective, he or she can only have a claim under the contract of sale against the retailer. If the retailer is insolvent, no further claims will be possible. So, with Arkham Knight, when 2,500 copies are returned, a large shop will not go insolvent, but you get the idea: when the shop needs to close its doors, you are left out of money.

Here we get the crux: a maker of a game/program has pushed an inferior product to market. It will offer compensation, yet if the shop closes (that is a massively big if), the buyer is out in the cold. Now, the chance of this ever happening is unrealistically small, but the need to set rules of quality, setting the need for standards, is now becoming increasingly important. With games they are the most visible, but consider a corporation now pushing a day-one product to get enough revenue to tailor a patch which the customer needs to download: an intentional path to stay afloat, to buy time. Where do you stand when you got pushed to solution 2 because solution 1 was a month away, only to discover the flaw in the program, which gets freely adjusted in week 23, so 22 weeks without a solution, a situation that also hindered the sale of solution 1, which was fine from day one onwards?

Not only is much better QA required, the consumer should also receive much stronger protection against these events. That could just be me.

Now to the real issue connected to this. Assassin’s Creed: Unity became a really bad joke last year.

It went so far that Ubisoft offered a free game because (source: Express) “UBISOFT have confirmed some Xbox One fans who have previously applied patch 3 for Assassin’s Creed: Unity are now being hit by a 40GB download when trying to use the latest title update”. 40GB is massive; it comes down to roughly 10 DVD movies and well over 10% of the console’s usable hard drive space, which gives us the image that a single game has a clear impact on the total space of the console. Also be mindful of the term ‘patch 3’, which implies that patches one and two had already been applied, so is it not a reasonable assumption that there is an issue with both release and QA here? In my view, delayed or not, the game should never have been released to begin with.
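As a rough back-of-the-envelope check on those proportions, a minimal sketch (the drive and DVD figures are my own illustrative assumptions, not from the article; a launch Xbox One shipped with a 500GB drive of which roughly 365GB is usable after the system reserve):

```python
# Back-of-the-envelope: how big is a 40GB patch relative to the console?
# All figures below are assumptions for illustration only.
PATCH_GB = 40
DVD_MOVIE_GB = 4.0   # a typical single-layer DVD movie, roughly
DRIVE_GB = 500       # advertised Xbox One drive size
USABLE_GB = 365      # approximate space left after the OS reserve

print(f"Patch = {PATCH_GB / DVD_MOVIE_GB:.0f} DVD movies")          # 10 DVD movies
print(f"Patch = {PATCH_GB / DRIVE_GB:.1%} of the advertised drive") # 8.0%
print(f"Patch = {PATCH_GB / USABLE_GB:.1%} of usable space")        # 11.0%
```

Under those assumed numbers the ‘10 DVD movies’ and ‘well over 10%’ claims line up with the usable (not advertised) drive space.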

Don’t get me wrong, with the new AAA games the chance of a patch becomes larger and larger. You see, QA can only get us so far, and an issue on a console is a lot less likely than an issue on your PC (with all kinds of hardware combinations), yet the amount of fixes shown here is way off the wall. Now we see a similar thing happening to the PC edition of Arkham Knight. Warner Brothers have decided to recall the game; all sales have stopped at present. However, the issues we see on gottabemobile.com are “Warner Brothers’ forums are filled with complaints about the game including Error CE-34878-0 issues on the PS4, various issues with the Batmobile including this one on Xbox One, issues with cut scenes, Harley Quinn DLC problems on the PS4, Batman season pass problems, problems launching the game, problems with the game’s well-known Detective Mode, missing Flashpoint skin, problems with missions, problems saving the game, and more”.

Now we get the question: was this properly QA-ed? Was a proper quality test made? Because the size and nature of the issues as reported give off a negative testing vibe, which I consider to be extremely negligent! As such we must wonder: should such levels of non-functionality be allowed? Can the law allow the release of a product where, as the industry alleges, ‘no harm has been caused’, an industry hoping that users will wait quietly as a game gets finished at the consumers’ cost?

Now that the Nextgen consoles are all set up to download games overnight, how long until game makers start testing the limits of ‘customer expectations’ and release a 90% game? How long until corporations work on a business model that relies on consumer sponsoring whilst they pocket even better profits? We also need to be careful: patches will always be a factor, I have no issue with that, but the list of games needing massive patches keeps on growing: AC: Unity, GTA-V, Arkham Knight, Destiny, and the list goes on a little longer. I am only mentioning the patches over 3GB (one is well over 6GB), and in this light Destiny gets a small pass, as that game is all about multiplayer, which is a dimension of errors all on its own. The Elder Scrolls Online wins this year with a 16GB patch, again all about online play, but overall the gaming industry seems to be adopting the bad traits of Microsoft, which is definitely not a good idea.

For now we seem to accept it, especially as the Nextgen systems are relatively new, but that feeling will change sooner rather than later, and at that point someone official needs to step in, which might end up being a lot more official than the game makers bargained for. Especially as games outside of the US can be up to 70% more expensive, we are entitled to some proper consumer protection against these levels of negligence, levels of protection that currently only exist on a limited scope.
