Tag Archives: IBM

The Good, the Bad, and North Korea

This article is late in the making. There is a need to be first, but is that enough? At times it is more important to be well informed. So let’s start with the good. The good is that if there is a nuclear blast, North Korea need not worry. The game maker Bethesda made a management simulator called Fallout Shelter. On your mobile device you can manage a fallout shelter, gather the goods of food, energy and water, manage how the people procreate and who gets to procreate, fight off invaders and grow the population to 200 people. So with two of these shelters, North Korea has a viable solution for not becoming extinct. The bad news is that North Korea has almost no smartphones, so there is not a device around to actively grow the surviving community. Yes, this matters, and it is important to you. You see, the Dutch had some kind of media tour around 2012. No cameras were allowed, yet the images came through anyway, because as the cameras were locked away, the military and the official escorts were seemingly unaware that every journalist carried a mobile with the ability to film. The escorting soldier had never seen a smartphone in his life. So a year later, we get the ‘fake’ news in the Dutch newspaper (at https://www.volkskrant.nl/buitenland/noord-korea-beweert-smartphone-te-hebben-ontwikkeld-niemand-gelooft-het~a3493503/) that North Korea had finished ‘their’ own locally made smartphone. This is important, as it shows just how far behind North Korea is in certain matters.

The quote “Zuid-Koreaanse computerexperts menen dat hun noorderbuur genoeg van software weet om cyberaanvallen uit te voeren, zoals die op banken en overheidswebsites van eerder dit jaar. Maar de ontwikkeling van hardware staat in Noord-Korea nog in de kinderschoenen“ translates to: “South Korean computer experts believe that their northern neighbour knows enough about software to instigate cyber-attacks, such as those on banks and Government websites earlier this year. But the development of hardware in North Korea remains in its infancy“. I believe this to be a half-truth. I believe that China facilitates to some degree, but it is keeping its market on a short leash. North Korea remains behind on several fronts, and that would show in other fields too.

This is how the two different parts unite. You see, even though America had its hydrogen bomb in 1952, it did not get there in easy steps; it had massive support on several fronts as well as the brightest minds this planet had to offer. The same could be said for Russia at the time. The History Channel, of all places, gives us “Opponents of development of the hydrogen bomb included J. Robert Oppenheimer, one of the fathers of the atomic bomb. He and others argued that little would be accomplished except the speeding up of the arms race, since it was assumed that the Soviets would quickly follow suit. The opponents were correct in their assumptions. The Soviet Union exploded a thermonuclear device the following year and by the late 1970s, seven nations had constructed hydrogen bombs“, so we get two parts here. Even though the evolution was theoretically set at 7-10 years, the actual deliverable device would not come until much later. The other players, who had nowhere near the academic and engineering capacity, would follow close to 18 years later. And that is merely an explosion, something North Korea is only now claiming to consider. With the quote “North Korea’s Foreign Minister has said the country may test a hydrogen bomb in the Pacific“, we need to realise that the operative word is ‘may‘. Even then, a large time lapse is coming. Now, I am not trying to lull you to sleep. The fact that North Korea is making these steps is alarming on a much larger scale than most realise. Even if a test fails, there is a chance that, because of failed safety standards, a concept often alien to North Korea, the radiation could damage the biological environment beyond repair; it is in that frame that Japan is, for now, likely the only one that needs to be truly worried.

All this still links together. You see, the issue is not firing a long-range rocket; it is keeping it on track and aiming it precisely. Just like the thousands of Hamas rockets fired on Israel with a misfire percentage of roughly 99.92%, North Korea faces that same problem in a much larger setting. ABC touched on this in July, but never gave all the goods (at http://www.abc.net.au/news/2017-07-06/north-korea-missile-why-it-is-so-difficult-to-intercept-an-icbm/8684444). Here we see: “The first and most prominent is Terminal High Altitude Area Defence, or THAAD, which the US has deployed in South Korea. THAAD is designed to shoot down ballistic missiles in the terminal phase of flight — that is, as the ballistic missile is re-entering the atmosphere to strike its target. The second relevant system is the Patriot PAC-3, which is designed to provide late terminal phase interception, that is, after the missile has re-entered the atmosphere. It is deployed by US forces operating in the region, as well as Japan.” You see, that is when everything is at a 100% setting, and North Korea is not there. One of the most basic parts here is shown to undergrads at MIT. Here we see Richard C. Booton Jr. and Simon Ramo, executives at TRW Inc., the firm whose pieces would later end up in military heavyweights like Northrop Grumman and the Goodrich Corporation. So these people are in the know, and they give us: “Today all major space and military development programs recognize systems engineering to be a principal project task. An example of a recent large space system is the development of the tracking and data relay satellite system (TDRSS) for NASA. The effort (at TRW) involved approximately 250 highly experienced systems engineers. The majority possessed communications systems engineering backgrounds, but the range of expertise included software architecture, mechanical engineering, automatic controls design, and design for such specialized performance characteristics as stated reliability“. That is the name of the game, and North Korea lacks the skill, the numbers and the evolved need for shielded electronic guidance. In the oldest days it could have been done with 10 engineers, but the systems have become more complex, and their essential need for accuracy required evolution, all items lacking in North Korea. By the way, I will add the paper at the end, so you can read for yourself what other component(s) North Korea is currently missing out on. All this is still an issue, because even as we see that there is potentially no danger to the USA and Australia, that safety cannot be given to China and Japan; even if Japan is hit straight on, it will affect and optionally collapse part of the Chinese economy, because when the Sea of Japan or the Yellow Sea becomes the ‘Glowing Sea’, you had better believe that the price of food will go up by 1000% and clean water will become a reason to go to war. North Korea, no matter how backwards they are, is a threat. When we realise just how many issues North Korea faces, we see that all the testosterone imagery from North Korea is basically sabre rattling, and because they have no sabres, they will try to mimic it with can openers. As you realise that America is the only player that is an actual threat to them, we need to see the danger for what it is: a David and Goliath game where the US is the big guy and North Korea forgot their sling, so it becomes a one-sided upcoming slaughter. It is, as I see it, diplomacy in its most dangerously failed stage. North Korea rants on and on, and at some point the US will have no option left but to strike back.
So in all this, let’s take one more look, so that you get the idea even better.

I got this photo from a CNN source, so the actual age is unknown, yet look at the background at the sheer antiquity this desktop system represents. In a place where the leader of North Korea should be surrounded by high-end technology, we see what looks like an antiquated Lenovo system, unable to properly play games from the previous gaming generation. And that is their high technology?

So here we see the elements come together. Whether or not you see Kim Jong-un as a threat, he could be an actual one to South Korea, Japan, China and Russia. You see, even if everything goes right, there is a larger chance that the missile suffers a technical failure and crashes prematurely; I put that chance at 90%. So even if it were fired at the US, the only ones in true peril are Japan, South Korea, Russia and lastly China, which only gets the brunt if the trajectory changes by a lot. After which the missile could accidentally go off. That is how I see it: whatever hydrogen bomb element they think they have, it requires a lot of luck for it to go off at all, because North Korea lacks the engineering capacity, the skills and the know-how, and that is perhaps scarier than anything else, because it would change marine biology as the fallout washes into the Pacific Ocean for decades to come. So when you consider the impact on sea life that Hiroshima and Nagasaki had for the longest time, now consider the aftermath of a bomb hundreds of times more powerful, set off by a megalomaniac who has no regard for safety procedures. Those are the actual dangers we face, and the only issue is that acting up against him might actually be more dangerous; we are all caught between the bomb and an irradiated place. Not a good time to be living the dream, because it might just turn into a nightmare.

Here is the paper I mentioned earlier: booten-ramo



Filed under IT, Military, Politics, Science

A legislative system shock

Today the Guardian brings us the news regarding the new legislation on personal data. The interesting part starts with the image of Google and not Microsoft, which is a first item in all this; I will get back to it. The information we get with ‘New legislation will give people right to force online traders and social media to delete personal data and will comply with EU data protection‘ is actually something of a joke, but I will get back to that too. You see, it is the caption with the image that should have been at the top of all this. With “New legislation will be even tougher than the ‘right to be forgotten’ allowing people to ask search engines to take down links to news items about their lives“, we get to ask the question: who is the protection actually for?

The newspaper gives us this: “However, the measures appear to have been toughened since then, as the legislation will give people the right to have all their personal data deleted by companies, not just social media content relating to the time before they turned 18“, yet the reality is that this merely enables new facilitation for data providers to keep a backup with a third party. As I personally see it, the people in all this will merely be chasing a phantom wave.

We see the self-assured Matt Hancock standing there in the image, and in all this I see no reason to claim that these laws will be the most robust set of data laws at all. They might be more pronounced, yet I question how facilitation is dealt with. With “Elizabeth Denham, the information commissioner, said data handlers would be made more accountable for the data “with the priority on personal privacy rights” under the new laws“, you see that the overseer will always respond in the aftermath, meaning that the data has already been created.

We can laugh at the statement “The definition of “personal data” will also be expanded to include IP addresses, internet cookies and DNA, while there will also be new criminal offences to stop companies intentionally or recklessly allowing people to be identified from anonymous personal data“; it is laughable because it merely opens up venues for data farms in the US and Asia whilst diminishing the value of UK and European data farms. The mention of ‘include IP addresses‘ is funny, as the bulk of the people on the internet are on dynamic IP addresses; it is a protection for large corporations that sit on static addresses. The mention of ‘stop companies intentionally or recklessly allowing people to be identified from anonymous personal data‘ is an issue, as intent must be shown and proven; ‘recklessly’ needs to be proven as well, and not on the balance of probabilities but beyond all reasonable doubt, so good luck with that idea!
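To make the dynamic-IP point concrete, here is a minimal sketch of why an IP address alone is a shaky personal identifier under dynamic addressing. All addresses, timestamps and subscriber names are invented for illustration.

```python
# Sketch: a DHCP-style pool re-issues the same address to different
# subscribers over time, so a log keyed on IP alone conflates people.
lease_log = [
    ("2017-08-01 09:00", "81.2.69.142", "subscriber-A"),
    ("2017-08-01 21:00", "81.2.69.142", "subscriber-B"),  # same IP, new lease
    ("2017-08-01 21:05", "81.2.69.200", "subscriber-A"),  # same person, new IP
]

# Group the subscribers seen behind each address.
people_per_ip = {}
for timestamp, ip, subscriber in lease_log:
    people_per_ip.setdefault(ip, set()).add(subscriber)

# One address maps to two different people within a single day.
print(sorted(people_per_ip["81.2.69.142"]))  # ['subscriber-A', 'subscriber-B']
```

Without the lease log (which sits with the ISP, not the data farm), the address identifies a moment in time, not a person.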

As I read “The main aim of the legislation will be to ensure that data can continue to flow freely between the UK and EU countries after Brexit, when Britain will be classed as a third-party country. Under the EU’s data protection framework, personal data can only be transferred to a third country where an adequate level of protection is guaranteed“, I wonder: is this another twist in anti-Brexit? You see, none of this shows a clear ‘adequate level of protection‘, which tends to stem from technology, not from legislation; the fact that all this legislation is about ‘after the event‘ gives rise to that doubt. So as I see it, the gem is at the end, when we see “the EU committee of the House of Lords has warned that there will need to be transitional arrangements covering personal information to secure uninterrupted flows of data“; it makes me wonder what those ‘transitional arrangements‘ actually are and how the new legislation is covering policy on this.

You see, to dig a little deeper we need to look at Nielsen. There was an article last year (at http://www.nielsen.com/au/en/insights/news/2016/uncommon-sense-the-big-data-warehouse.html), in which we see: “just as it reached maturity, the enterprise data warehouse died, laid low by a combination of big data and the cloud“; this is actually more important than most realise. It is partially seen in the statement “Enterprise decision-making is increasingly reliant on data from outside the enterprise: both from traditional partners and “born in the cloud” companies, such as Twitter and Facebook, as well as brokers of cloud-hosted utility datasets, such as weather and econometrics. Meanwhile, businesses are migrating their own internal systems and data to cloud services“.

You see, the actual danger in all that personal data is not the ‘privacy’ part; it is the utilities in our daily lives that are under attack. Insurance, health protection, they are all set to premiums and econometrics. These data farms are all about finding the right margins, and the more they know, the less you get to work with, and they (read: their data) will happily move to wherever the cloud takes them. In all this, the strong legislation merely transports data. You see, the cloud has transformed data in one other way, the part Cisco could not cover. The cloud has the ability to move and work with ‘data in motion’, a concept that legislation has no way of coping with. The power (read: the 8-figure value of a data utility) lies in being able to do that, and the parties needing that data, personalised, are willing to pay through the nose for it; it is the holy grail of any secure cloud environment. I was actually relieved that it was not merely me looking at that part; another blog (at https://digitalguardian.com/blog/data-protection-data-in-transit-vs-data-at-rest) gives us the story from Nate Lord. He gives a few definitions that are really nice to read, yet the part he did not touch on to the degree I hoped for is that the new grail, the analysis of data in transit (read: in motion), is a cutting-edge application; it is what the Pentagon wants, it is what the industry wants and it is what the facilitators want. It is a different approach to real-time analysis, and with analysis in transit those people get an edge, an edge we all want.
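A tiny sketch of what ‘data in motion’ means in practice; this is my own illustration, not IBM Streams or any vendor product. Each record is analysed as it flows past and is never stored, so there is no stored record to regulate afterwards.

```python
# Each value is folded into a running aggregate and then discarded;
# no list of raw records ever exists, only the derived result.
def running_mean(stream):
    count, total = 0, 0.0
    for value in stream:
        count += 1
        total += value
        yield total / count  # the only thing that leaves the pipeline

sensor_feed = iter([3.0, 5.0, 10.0])  # stands in for an unceasing source
for mean in running_mean(sensor_feed):
    pass

print(mean)  # 6.0: the insight survives although no record was stored
```

Real stream processors do this at scale with windows and joins, but the principle is the same: the analysis happens in flight.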

Let’s give you another clear example that shows the value (and the futility of legislation). Traders profit by being first, which is the start of real wealth. Whoever has the fastest connection gets the cream of the trade, which is why trade houses pay millions upon millions to get the best of the best. The difference between 5 ms and 3 ms can result in billions of profit; everyone in that industry knows that. So every firm has a Bloomberg terminal (at $27,000 per terminal). Now consider the option that someone could get them that data a millisecond faster, so their automated scripts could beat the wave of sales and get a much better price. How much are they suddenly willing to pay? This is a different kind of arms race; it is weaponised data. The issue is not merely the speed; it is the cutting edge of being able to do it at all.
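To make the millisecond claim tangible, a back-of-envelope sketch; every number here is an assumption for illustration, not market data.

```python
# Assumed figures, purely illustrative: a small favourable price move per
# millisecond of head start, applied across a large daily traded volume.
price_edge_per_ms = 0.0001      # $ per share per millisecond (assumption)
shares_per_day = 50_000_000     # shares routed via the fast link (assumption)
latency_gain_ms = 5 - 3         # the 5 ms vs 3 ms gap from the text

daily_edge = price_edge_per_ms * latency_gain_ms * shares_per_day
print(f"${daily_edge:,.0f} per trading day")  # $10,000 per trading day
```

Scale those assumed per-share edges and volumes up to institutional size and compound them over a trading year, and the race to shave milliseconds explains itself.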

So how does this relate?

I am taking you back to the quotes “it would amount to a “right to be forgotten” by companies, which will no longer be able to get limitless use of people’s data simply through default “tick boxes” online” as well as “the legislation will give people the right to have all their personal data deleted by companies“. The issue here is not being forgotten, or being deleted. It is about the data not being stored at all, and data in motion is not stored, which shows the futility of the legislation to some extent. You might think that this is BS, but consider the quote by IBM (at https://www.ibm.com/developerworks/community/blogs/5things/entry/5_things_to_know_about_big_data_in_motion?lang=en); it comes from 2013, so IBM was already looking at these matters close to 5 years ago, as were all the large players like Google and Microsoft. With “data in motion is the process of analysing data on the fly without storing it. Some big data sources feed data unceasingly in real time. Systems to analyse this data include IBM Streams“, we get part of it. Now consider: “IBM Streams is installed on nearly every continent in the world. Here are just a few of the locations of IBM Streams, and more are being added each year“. In 2010 there were 90 streams on 6 continents, and IBM Streams is not the only solution. As you read that IBM article, you also read that Real-time Analytic Processing (RTAP) is a real thing; it already was then, and the legislation that we now read about does not take care of this form of data processing. What the legislation does, in my view, is not give you any protection; it merely limits the players in the field. It only lets the really big boys play with your details. So when you see the reference to the Bloomberg terminal, do you actually think that you are not part of the data, or ever forgotten? EVERY large newspaper and news outlet would be willing to pay well over $127,000 a year to get that data on their monitors.
Let’s call it Reuters Analytic Systems (read: my speculated name for it), which gets them a true representation of all personalised analytical and reportable data in motion. So when they type the name they need, they will get every detail. In this light, the news from three weeks ago on the ITPro side (at http://www.itpro.co.uk/strategy/29082/ecj-may-extend-right-to-be-forgotten-ruling-outside-the-eu) sounds nice, yet the quote “Now, as reported by the Guardian, the ECJ will be asked to be more specific with its initial ruling and state whether sites have to delete links only in the country that requests it, or whether it’s in the EU or globally” sounds like the real deal, while this is all about data at rest; the links are all at rest, so the data itself will remain, and as soon as HTML6 comes we might see the beginning of the change. There have been requests in that direction, with “This is the single-page app web design pattern. Everyone’s into it because the responsiveness is so much better than loading a full page – 10-50ms with a clean API load vs. 300-1500ms for a full HTML page load. My goal would be a high-speed responsive web experience without having to load JavaScript“, as well as “having the browser internally load the data into a new data structure, and the browser then replaces DOM elements with whatever data that was loaded as needed“. It is not mere speed; it would allow for dynamic data (data in motion) to be shown. So when I read ‘UK citizens to get more rights over personal data under new laws‘, I just laughed. The article is 15 hours old and I instantly considered the issues I have shown you today. I will have to wait until the legislation is released, yet I am willing to bet a quality bottle of XO Cognac that data in motion is not part of it; better stated, it will be about stored data. All this whilst the new data norm is still shifting, and with 5G mobile technologies stored data might actually phase out to become a much smaller dimension of data.
The larger players knew this and have been preparing for it for several years now. This is also part of the new need for the AI that Google so desperately wants, because such a system could ascertain and give weight to all data in motion, something IBM is currently not able to do to the extent they need.
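The ‘deleted yet not forgotten’ problem can be sketched in a few lines; names and figures are invented for illustration. Once a record has been folded into a result derived in transit, deleting the stored copy changes nothing.

```python
# Data at rest: a stored, deletable record per person (invented figures).
stored = {"alice": 1200.0, "bob": 800.0}

# Data in motion: a value derived while the records flowed past.
segment_average = sum(stored.values()) / len(stored)

del stored["alice"]  # the 'right to have personal data deleted', exercised

print(stored)           # {'bob': 800.0}: the record is gone...
print(segment_average)  # 1000.0: ...but what was learned from it is not
```

A deletion right regulates the dictionary; it never reaches the derived value, which is exactly where the commercial worth sits.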

The system is about to get shocked into a largely new format; that has always been the case with evolution. It is just that actual data evolution is a rare thing. It merely shows me how far legislation is behind on all this. Perhaps I will be proven wrong after the summer recess; it would be a really interesting surprise if that were the case, but I doubt it will happen. You can see (read about) that for yourself after the recess.

I will follow up on this, whether I was right or wrong!

I’ll let you speculate which of the two I am, as history has proven me right on technology matters every single time (a small final statement to boost my own ego).



Filed under Finance, IT, Law, Media, Politics, Science

When the trust is gone

In an age where we see an abundance of political issues and an ever-growing need to sort things out, the news that was given visibility by the Guardian is the one that scared and scarred me the most. With ‘Lack of trust in health department could derail blood contamination inquiry‘ (at https://www.theguardian.com/society/2017/jul/19/lack-of-trust-in-health-department-could-derail-blood-contamination-inquiry), we need as a first stage a very different sitting in the House of Lords. You see, the issues (as I am about to explain them) did not start overnight. In this I am implying that a sitting with, in the dock, Jeremy Hunt, Andrew Lansley, Andy Burnham and Alan Johnson is required. This is an issue that has grown from both sides of the aisle, and as such there needs to be a grilling where certain people are likely to get burned for sure. How bad? That needs to be ascertained, and it needs to be done immediately. When you see “The contamination took place in the 1970s and 80s, and the government started paying those affected more than 25 years ago“, the UK is about to get a fallout of a very different nature. We agree that this spans the terms of Richard Crossman, Sir Keith Joseph, Barbara Castle, David Ennals, Patrick Jenkin, Norman Fowler and John Moore. Yet we need to realise that this was an age before computers, before certain data considerations and before a whole league of other measures that are commonplace at this very instant. I remember how I aided departments with an automated document system relying on 5.25″ floppies, with capabilities below what WordStar or PC-Write ever offered. And none of those systems had any reliable data storage options.

The System/36 was flexible and powerful for its time:

  • It allowed 80 monitors and printers to be connected. All users could access the system’s hard drive or any printer.
  • It provided password security and resource security, allowing control over who was allowed to access any program or file.
  • Devices could be as far as a mile from the system unit.
  • Users could dial into a System/36 from anywhere in the world and get a 9600 baud connection (which was very fast in the 1980s) and very responsive for connections which used only screen text and no graphics.
  • It allowed the creation of databases of very large size. It supported up to about 8 million records, and the largest 5360 with four hard drives in its extended cabinet could hold 1.453 gigabytes.
  • The S/36 was regarded as “bulletproof” for its ability to run many months between reboots (IPLs).

Now, why am I going to this specific system, as the precise issues were not yet known? You see, in those days any serious level of data competency was pretty much limited to IBM; at that time Hewlett-Packard was not yet at the level it would reach four years later, and the Digital Equipment Corporation (DEC) had revolutionised systems with VAX/VMS, which became the foundation, or better stated, to which true relational database foundations were added through Oracle Rdb (1984), which would actually revolutionise levels of data collection.

Now, we get two separate quotes (not from the article): “Dr Jeremy Bradshaw Smith at Ottery St Mary health centre, which, in 1975, became the first paperless computerised general practice“, as well as “It is not developed or intended for use in any inherently dangerous applications, including applications that may create a risk of personal injury. If you use this software or hardware in dangerous applications, then you shall be responsible to take all appropriate fail-safe, backup, redundancy, and other measures to ensure its safe use“; the second one comes from the Oracle Rdb SQL Reference manual. That second part seems a bit of a stretch; consider the original setting of this. When we see Oracle’s setting of data integrity, consider the elements given (over time) that are now commonplace.

System and object privileges control access to application tables and system commands, so that only authorized users can change data.

  • Referential integrity is the ability to maintain valid relationships between values in the database, according to rules that have been defined.
  • A database must be protected against viruses designed to corrupt the data.

I left one element out for the mere logical reasons.
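Referential integrity, the second element listed above, can be shown in a few lines. This is a minimal sketch using SQLite; the table and column names are invented for the example and have nothing to do with the systems discussed here.

```python
import sqlite3

# The database itself rejects child rows that point at a non-existent
# parent, so invalid relationships can never be stored.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked
conn.execute("CREATE TABLE patient (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE treatment (
    id INTEGER PRIMARY KEY,
    patient_id INTEGER NOT NULL REFERENCES patient(id))""")

conn.execute("INSERT INTO patient VALUES (1, 'A. Smith')")
conn.execute("INSERT INTO treatment VALUES (10, 1)")  # valid parent: accepted

try:
    conn.execute("INSERT INTO treatment VALUES (11, 999)")  # no such patient
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # the engine refuses the orphaned row
```

In the 1970s and 80s this kind of rule either lived in application code or did not exist at all, which is exactly the era the article is about.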

Now, in those days the hierarchy of supervisors and system owners was nowhere near what it is now (and often nowhere to be seen), referential integrity was a mere concept, and data viruses were mostly academic; that is, until we got a small presentation by Ralf Burger in 1986. Those were the days of the Chaos Computer Club and my trusty CBM-64.

These elements show you that data integrity existed for academic purposes, yet the designers, still in their data infancy, often enough had no real concept of rollback data events; some would only be designed much later, and the same goes for the application of databases to the extent that was needed. It would not be until 1982 that dBase II came to the PC market from the founding fathers of what would later be known as Ashton-Tate. George Tate and Hal Lashlee created a wave that would give us dBase III, and with the creation of Clipper by the Nantucket Corporation came a massive rise in database creations as well as a growth of data products that had never been seen before, the player that in the end propelled data quality towards the state it is in nowadays. With the network abilities within these products, nearly any final-year IT person could have a portfolio of clients, all with custom-built, data-based products. Within 2-3 years (which gets us to 1989), a whole league of data quality, data cleaning and data integrity issues would surface in millions of places, all requiring solutions. It is my personal conviction that this was the point where data became adult, where data cleaning, data rollback and data integrity checks became actual issues that were seriously dealt with. So here, in 1989, we are finally confronted with the adult data issues that for the longest of times were correctly understood by no more than a few niche people, who were often enough disregarded (I know that for certain, because I was one of them).

So as for the essential tools that could have, to some degree, prevented the events we see in the Guardian with “survivors initially welcomed the announcement, while expressing frustration that the decades-long wait for answers had been too long. The contamination took place in the 1970s and 80s“: certain elements would not come into existence until a decade later.

So when we see “Liz Carroll, chief executive of the Haemophilia Society, wrote to May on Wednesday saying the department must not be involved in setting the remit and powers of an inquiry investigating its ministers and officials. She also highlighted the fact that key campaigners and individuals affected by the scandal had not been invited to the meeting“, I am not debating or opposing her in what could be a valid approach; I am merely stating that to comprehend the issues, the House of Lords needs to take the pulse of events and of the steps taken by the Ministers who have been involved over the last 10 years.

When we see “We and our members universally reject meeting with the Department of Health as they are an implicated party. We do not believe that the DH should be allowed to direct or have any involvement into an investigation into themselves, other than giving evidence. The handling of this inquiry must be immediately transferred elsewhere“, we see a valid argument, yet if we were to receive testimonies from people like the ministers of those days, how many would be aware of and comprehend the data issues that were not even decently comprehended at the time? These data issues are clearly part of all of these events, as will become clear towards the end of the article.

Now, be aware, I am not handing out some kind of free pass, or suggesting that those who got the bad blood should be trivialised, ignored or even sidetracked; I am merely calling for a good and clear path that allows for complete comprehension and for the subsequent need of actual prevention. You see, what happens today might be better, yet can we prevent this from ever happening again? In this I have to make a side step to a non-journalistic source, where we see (at https://www.factor8scandal.uk/about-factor/): “It is often misreported that these treatments were “Blood Transfusions”. Not True. Factor was a processed pharmaceutical product (pictured)“. So when I see the Guardian making the same bloody mistake, as shown in the article, we should ask certain parties how they could remain in that same stance of utter criminal negligence (as I personally see it), giving rise to intentional misrepresentation. When we see the quote (source: the Express) “Now, in the face of overwhelming evidence presented by Andy Burnham last month, Theresa May has still not ordered an inquiry into the culture, practice and ethics of the Department of Health in dealing with this human tragedy“, with the added realisation that the actual culprit was not merely data, yet the existence of the cause through Factor VIII is not even mentioned, the Guardian steered clear via the quote “A recent parliamentary report found around 7,500 patients were infected by imported blood products from commercial organisations in the US“, and in addition the quote “The UK Public Health Minister, Caroline Flint, has said: “We are aware that during the 1970s and 80s blood products were sourced from US prisoners” and the UK Haemophilia Society has called for a Public Inquiry. The UK Government maintains that the Government of the day had acted in good faith and without the blood products many patients would have died.
In a letter to Lord Jenkin of Roding the Chief Executive of the National Health Service (NHS) informed Lord Jenkin that most files on contaminated NHS blood products which infected people with HIV and hepatitis C had unfortunately been destroyed ‘in error’. Fortunately, copies that were taken by legal entities in the UK at the time of previous litigation may mean the documentation can be retrieved and consequently assessed“. With the sources the Express and the New York Times, we see for example the quote “Cutter Biological, introduced its safer medicine in late February 1984 as evidence mounted that the earlier version was infecting hemophiliacs with H.I.V. Yet for over a year, the company continued to sell the old medicine overseas, prompting a United States regulator to accuse Cutter of breaking its promise to stop selling the product“, with the additional “Cutter officials were trying to avoid being stuck with large stores of a product that was proving increasingly unmarketable in the United States and Europe“. So how often did we see a mention of ‘Cutter Biological‘ (or Bayer Pharmaceuticals for that matter)?

In the entire Arkansas Prison part we see that there are connections to cases of criminal negligence in Canada in 2006 (where the Canadian Red Cross fell on its sword) and Japan in 2007, as well as the visibility of the entire issue at Slamdance 2005. So as we see the rise of inquiries, how many have truly investigated the links between these people and how the connection to Bayer pharmaceuticals kept them out of harm’s way for the longest of times? How many people at Cutter Biological have not merely been investigated, but also indicted for murder? When we get ‘trying to avoid being stuck with large stores of a non-sellable product‘ we get the proven issue of intent. Because there were no recall and destroy actions, were there?

Even as we see a batch of sources giving us parts in this year, the entire visibility from 2005-2017 shows that the media has given no, or at best dubious, visibility in all this; even yesterday’s article at the Guardian shows the continuation of bad visibility with the blood packs. So when we look (at http://www.kpbs.org/news/2011/aug/04/bad-blood-cautionary-tale/) and see the August 2011 part with “This “miracle” product was considered so beneficial that it was approved by the FDA despite known risks of viral contamination, including the near-certainty of infection with hepatitis“, we wonder how the wonder drug got to be, or remain, on the market. Now, there is a fair defence that some issues would be unknown or even untested to some degree, yet ‘the near-certainty of infection with hepatitis‘ should give rise to all kinds of questions, and it is not the first time that the FDA is seen to approve bad medication, which gives rise to the question why they are allowed to be the cartel of approval, as big bucks is the gateway through their door. When we consider the additional quote of “By the time the medication was pulled from the market in 1985, 10,000 hemophiliacs had been infected with HIV, and 15,000 with hepatitis C; causing the worst medical disaster in U.S. history“, how come it took six years for this to get decent amounts of traction within the UK government?

What happened to all that data?

You see, this is not merely about the events. I believe that if any old systems could be retrieved (a very unlikely reality), how long would it take for digital forensics to show, within the erased (not overwritten) records, that certain matters could have been known from these very early records? Especially when we consider the infancy of data integrity and data cleaning, what other evidence could have surfaced? In all this, no matter how we dig in places like the BBC and other places, we see a massive lack of visibility on Bayer Pharmaceuticals. So when we look (at http://pharma.bayer.com/en/innovation-partnering/research-focus/hemophilia/), we might accept that the product has been corrected, yet their own site gives us “the missing clotting factor is replaced by a ‘recombinant factor’, which is manufactured using genetically modified mammalian cells. When administered intravenously, the recombinant factor helps to stop acute bleeding at an early stage or may prevent it altogether by regular prophylaxis. The recombinant factor VIII developed by Bayer for treating hemophilia A was one of the first products of its kind. It was launched in 1993“, so was this solution based on the evolution of getting thousands of people killed? There is the sideline “Since the mid-1970s Bayer has engaged in research in haematology focusing its efforts on developing new treatment options for the therapy of haemophilia A (factor VIII deficiency)“. So in all this, whether valid or not (depending on the link between Bayer Pharmaceuticals UK and Cutter Biological), the absence of these two names in all the mentions is a matter of additional questions, especially as Bayer became the owner of it all between 1974 and 1978, which puts them clearly in the required crosshairs of certain activities, like depleting bad medication stockpiles. Again, not too much is being shown in the several news articles I was reading.
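To make the ‘erased, not overwritten’ point concrete, here is a minimal toy sketch. It is my own assumption about how many record-oriented systems of that era behaved (a deletion flag in front of the payload), not a reconstruction of any NHS system, and the record text is invented for illustration:

```python
import struct

# Toy flat-file record: 1-byte "deleted" flag followed by a 20-byte payload.
RECORD = struct.Struct("<B20s")

def write_record(buf, offset, text):
    """Store a live record (flag = 0) at the given offset."""
    RECORD.pack_into(buf, offset, 0, text.ljust(20).encode())

def erase_record(buf, offset):
    """'Delete' by flipping the flag; the payload bytes stay untouched."""
    buf[offset] = 1

def undelete(buf, offset):
    """Forensic read: ignore the flag and recover the payload anyway."""
    _flag, payload = RECORD.unpack_from(buf, offset)
    return payload.decode().strip()

store = bytearray(RECORD.size * 2)
write_record(store, 0, "batch record 1984")
erase_record(store, 0)
print(undelete(store, 0))  # the "erased" payload is still fully readable
```

Until the slot is reused and overwritten, the data is simply hidden, not gone; that is exactly the gap digital forensics exploits.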
When we see the Independent, we see ‘Health Secretary Jeremy Hunt to meet victims’ families before form of inquiry is decided‘; in this case it seems a little far-fetched that the presentation by Andy Burnham (as given in the Express) would not have been enough to give an immediate green light to all this. Even as the Independent is hiding behind blood bags as well, they do give the caption of Factor VIII with it, yet we see no mention of Bayer or Cutter. There is a mention of ‘prisoners‘ and the fact that their blood was paid for, yet no mention of the events in Canada and Japan, two instances that give rise to an immediate and essential need for an inquiry.

In all this, we need to realise that no matter how deep the inquiry goes, there is a dangerous situation in the amount of evidence that could have been wiped, or set asunder from the eyes of the people by the administrative gods of Information Technology as they were between 1975 and 1989. One part came unwillingly through the evolution of data systems; another seems to be the intent of the reporting media, as we see the utter absence of Bayer Pharmaceuticals in all of this, whilst there is a growing pool of evidence through documentaries and other sources that seems to lose visibility as the media grows a habit of presentations that merely skate around the subject. Until the inquiry becomes an official fact, we see a lot less than the people are entitled to, so is that another instance of the ethical chapters of the Leveson inquiry? And when this inquiry becomes an actuality, what questions will we see absent or sidelined?

All this gets me back to the Guardian article as we see “The threat to the inquiry comes only a week after May ordered a full investigation into how contaminated blood transfusions infected thousands of people with hepatitis C and HIV“, so how about the events from 2005 onwards? Were they mere pharmaceutical chopped liver? In the linked ‘Theresa May orders contaminated blood scandal inquiry‘ article there was no mention of Factor VIII, Bayer (pharmaceuticals) or Cutter (biological). It seems we need to accept that ethical issues have been trampled on, so a mention of “a criminal cover-up on an industrial scale” is not a mere indication; it is an almost given certainty. In all that, as the inquiry gets traction, I wonder how both the current and past governments will be adamant to avoid skating into certain realms of the events (like naming the commercial players), and when we realise this, will there be any justice for the victims, especially when the data systems of those days have been out of use for some time and the legislation on legacy data is pretty much non-existent? When the end balance is given, in (as I personally see it) a requirement to consider replacing whatever Bayer Pharmaceuticals is supplying the UK NHS, I will wonder who will be required to fall on the virtual sword of non-accountability. The mere reason being that when we see (at http://www.annualreport2016.bayer.com/) that Bayer is approaching a revenue of 47 billion (€46,769M) in 2016, should there not be a consequence for the players ‘depleting unsellable stock‘ at the expense of thousands of lives? This is another matter that is interestingly absent from the entire UK press cycle. And this is not me just speculating; the sources show a clear absence whilst the FDA reports show other levels of failing. It seems that some players forget that lots of data is now globally available, which seems to fuel the mention of ‘criminal negligence‘.

So you have a nice day, and when you see the next news cycle with bad blood, showing blood bags and making no mention of Factor VIII or the pharmaceutical players clearly connected to all this, you just wonder who is doing the job for these journalists, because the data, as it needed to be shown, was easily found in the most open of UK and US governmental places.



Filed under Finance, IT, Law, Media, Politics, Science

Confirmation on Arrival

Last week, I gave you some of the views I had in ‘Google is fine, not fined‘ (at https://lawlordtobe.com/2017/06/28/google-is-fine-not-fined/). I stated: “This is not on how good one or the other is, this is how valid the EU regulator findings were and so far, I have several questions in that regard. Now, I will be the last one keeping governments from getting large corporations to pay taxation, yet that part is set in the tax laws, not in EU-antitrust. As mentioned the searchers before, I wonder whether the EU regulators are facilitating for players who seem more and more clueless in a field of technology that is passing them by on the left and the right side of the highway called, the ‘Internet Of Things’“. Five days later we see that my views were correct; again and again I have shown that looking behind the scenes is essential to see the levels of misinformation and betrayal. Now in ‘To tackle Google’s power, regulators have to go after its ownership of data‘ (at https://www.theguardian.com/technology/2017/jul/01/google-european-commission-fine-search-engines) we see: “The Google workshop at the Viva Technology show last month in Paris, which brought together players who shape the internet’s transformation“; this is what it always has been about. Who owns the data? Evgeny Morozov gives us a good story on what should be and what should not be; he pictures a possible upcoming form of feudalism, all drenched in data. It is no longer merely about data and applicability; it is more and more about governments becoming obsolete. The EU is the first evidence of this. The EU is regarded as something that sits on top of governments, yet that is not the case. It seems to be replacing them through orchestration.
Mario Draghi is spending massive amounts of funds none of them have, yet in all this, yesterday we see “The European Central Bank has been dealt a heavy blow after inflation in June tumbled further below target, despite extreme measures from policymakers to stoke the economic measure” as well as “Unless price rises are stronger, ECB chief Mario Draghi has signaled that he is unlikely to scale back the mammoth levels of support for the economy“. So it is he and the ECB who are now setting the precedent of spending, printing money without any value supporting it. So is it ‘wealth distribution‘ or ‘wealth abolishment‘?

If we agree that this economy has failed, if we believe that this way of life is no more, when we accept that a quarter of this planet’s population will be dead in roughly 25 years, what would come next? I would not presume to know that answer, yet can we imagine that if the dollar stops, we would need something else? In that case, is data not a currency?

Now, I am perfectly happy to be utterly wrong here, I am also weirdly unsettled with the notion that our money is dwindling in value day after day. Now let’s get back to the ‘view’ of Morozov. When we see “Alphabet has so much data on each of us that any new incoming email adds very little additional context. There are, after all, diminishing returns to adding extra pieces of information to the billions it already possesses. Second, it’s evident that Alphabet, due to competition from Microsoft and Amazon, sees its paying corporate clients as critical to its future. And it’s prepared to use whatever advantages it has in the realm of data to differentiate itself from the pack – for example, by deploying its formidable AI to continue scanning the messages for viruses and malware“, we see more than just an adjustment in strategy.

Yet, I do not completely agree. You see, data is only truly valued when it is up to date, so as data rolls over for new data, new patterns will emerge. That would be an essential need for anything moving towards an AI; data in motion and evolving data are essential to the core of any AI, and that timeline is becoming more pressing than some realise.

When we consider a quote from a 2006 article relating to a 2004 occurrence, “Google published a new version of its PageRank patent, Method for node ranking in a linked database. The PageRank patent is filed under its namesake, Lawrence Page, and assigned to The Board of Trustees of the Leland Stanford Junior University; US Patent 7,058,628“, we should consider that the value it has will diminish (read: be reduced) in 2024 (for Google that is). There is of course another side: this was ‘version 2‘, so others would be able to get closer with their own version. In six years, as the patent ends, it will be open to all to use. No matter what some have, you only need to switch to Bing for a few days to see how struggling and incomplete it is. When you realise that Microsoft has no way at present to offer anything close to it, you get the first insight into how high the current Google value is and how much it scares governments and large corporations alike.
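For context on why that patent mattered: the core idea can be sketched in a few lines of power iteration. This is a simplified illustration of the published PageRank concept, not the patented method as Google runs it; the example graph and the standard 0.85 damping value are my own choices:

```python
# Minimal PageRank sketch via power iteration (illustrative only).

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = sorted(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Every page keeps a small base share, plus what its inbound links pass on.
        new = {p: (1.0 - damping) / n for p in pages}
        for p in pages:
            out = links[p]
            if out:
                share = damping * rank[p] / len(out)
                for q in out:
                    new[q] += share
            else:
                for q in pages:  # dangling page: spread its rank evenly
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
print(max(ranks, key=ranks.get))  # "c": it is linked from both other pages
```

The insight was that a link is a vote whose weight depends on the voter’s own rank, which is exactly what the iteration converges on.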

Now we get to the ‘ground works’ of it. From this we can see that Google seems to have been the only one working on an actual long-term strategy, something others stopped doing a long time ago. All we see from Microsoft and IBM has been short-term, masquerading as long-term goals, with 70% of those goals falling into disrepair and becoming obsolete through iteration (mainly to please the stakeholders they report to). Is it such a surprise that I, or anyone else, would want to be part of an actual visionary company like Google? If Google truly pulls off the AI bit (it has enough data) we would see a parsing of intelligence (read: Business Intelligence) on a scale never witnessed before. It would be like watching a Google Marine holding a 9mm, whilst the opposite is the IBM Neanderthal (read: an exaggeration, the IBM would be the Cro-Magnon, not Neanderthal) holding a pointy stick named Watson. The extreme difference would be that large. In all this, governments are no longer mentioned. They have diminished into local governments organising streams of data and facilitating consumers, mere civil servants in service of the people in their district. Above that, those levels of workers would become obsolete; the AI would set structures and set resources for billions. We went from governments to organisations, we left fair opportunity behind and moved to ‘those who have and those who have not‘, and they are soon to be replaced by the ‘enablers and obstructers‘, and those who are the latter would fall into the shadows and fade away.

Am I Crazy?

Well, that is always a fair argument, yet in all this we have Greece as an initial example. Greece is possibly the only European nation with a civilisation that would soon become extinct twice. So we see reports of lagging tourism revenue on top of highly regarded rises in GDP, rises we know are not happening as the revenues are down by a larger margin (source: GTP): Greek revenue is down by 6.8 percent, which is massive! This gives stronger notions that the ‘beckoning of Greek bonds‘ is nothing more than a façade of a nation in its final moments of life. The fact that the ECB is not giving it any consideration in its trillion spending could also be regarded as evidence that the ECB has written off Greece. So tell me, when was the last time that nations were written off? Some of the press is now considering the works of former ‘rock star’ Yanis Varoufakis. Yet in all this, when did they actually change the landscape by investigating and prosecuting those who got Greece into the state it is in now? In the end, only the journalist who released a list of millionaires pulling their money out of Greece went to prison. As such, Greece is the first piece of evidence that governments are no longer the powers they once claimed they were, and the fact that fewer and fewer government officials are being held to account when it comes to larger financial transgressions is also a factor in why the people of those nations no longer give them any regard.

The second view is in the UK. Here we see ‘U.K. to End Half Century of Fishing Rights in Brexit Slap to EU‘; in this, Bloomberg gives us “Prime Minister Theresa May will pull Britain out of the 1964 London convention that allows European fishing vessels to access waters as close as six to twelve nautical miles from the U.K. coastline“. In here we also see “This is an historic first step towards building a new domestic fishing policy as we leave the European Union — one which leads to a more competitive, profitable and sustainable industry for the whole of the U.K.“, which is only partially true. You see, Michael Gove has only a partial point, and it is seen with: “Britain’s fishing industry is worth 775 million pounds and in 2015 it employed 10,162 full-time fishermen, down from about 17,000 in 1990. In almost three decades, fleet numbers dropped a third to 6,200 vessels and the catch has shrunk 30 percent“. The part that is not given is that from 1930 onwards engineering made strides in the field of ship engines, not large strides but massive ones. A ship and its crew can catch fish, yet it is the engines that allow for the nets to be bigger and for the winches to be stronger to hoist those filled nets. In the ‘old’ days 2,000 horsepower, which amounts to roughly 1.5 megawatts, was a really powerful vessel. Nowadays, these boats start at well over 300% of what was, so not only are the ships larger, able to hold more fish and pull more weight, these ships are also getting more efficient at finding fish. I personally witnessed one of the first colour-screen fish radars in 1979. In this field technology has moved far beyond that, almost four decades beyond it. If there is one part clearly shown, then it is the simple fact that technology changed industries, which has been a given for the better part of three generations.
Not merely because we got better at what we do or how we do it; as fishing results show that catches have been down by 30%, there is the optional element that there is less to catch because we got too efficient. It is a dwindling resource, and fishing is merely the first industry to see the actual effects that lack of restraint is leading to.

So when we see a collapsed industry, can we blame governments? Who can we blame, and is blame an actual option? In this, is there any validity to the idea that this part of government has surpassed its date of usefulness? Perhaps yes, and there is equal consideration that this is not the case, yet the number of consumers keeps growing, and as available resources go down we see the need for other solutions.

This is merely a first part. As we now move into the US and their 4th of July part, I will now look at other sides as well, sides we stopped considering. You see, there is opposition and it is growing. CNBC gives us one side to this with ‘Google Deep Mind patient data deal with UK health service illegal, watchdog says‘ (at http://www.cnbc.com/2017/07/03/google-deepmind-nhs-deal-health-data-illegal-ico-says.html). Three points were raised: “A data sharing deal between Google’s Deep Mind and the U.K.’s National Health Service “failed to comply with data protection law“, the U.K.’s Information Commissioner’s Office (ICO) said“, “The deal between the two parties was aimed at developing a new app called Streams that helped monitor patients with acute kidney disease” as well as “the ICO said that patients were not notified correctly about how their data was being used“. Now, we can agree that an optional situation could exist. So does Elisabeth Denham have a point? For now let’s agree that she does. I would reckon that there has been a communicative transgression (this is how she plays it), yet is she being overly formal, or is she trying to slice the cake in a different way? The strongest statement is seen with “For example, a patient presenting at accident and emergency within the last five years to receive treatment or a person who engages with radiology services and who has had little or no prior engagement with the Trust would not reasonably expect their data to be accessible to a third party for the testing of a new mobile application, however positive the aims of that application may be.” OK, I can go along with that; we need certain settings for any level of privacy to be contained, yet... there is no yet! The issue is not Google; the issue is that the data protection laws are there for a reason and now they will hinder progress as well.
As health services, and especially the UK NHS, need other means to stay afloat, with costs weighing it more and more to the bottom of an ocean of funding shortages, the NHS will need to seek other solutions that set an upward movement whilst the costs are slowly being worked on. It will take a long time and plenty of cash to sort it out; Google is merely one player who might solve a partial issue. Yet the news could go in other directions too. Google is the largest, yet not the only player in town. As people seem to focus on marketing and presentations, we see IBM and to a smaller extent Microsoft, and we all forget that Huawei is moving up in this field and gaining momentum. The cloud data centre in Peru is only a first step. It is only the arrogance of Americans to think that this field is an American field. With Peru, India and China, Huawei is now active on a global scale. It has hired the best of the best that China has to offer, and that is pretty formidable. There is no way that Huawei could catch up with Google in the short term, yet their services are now at a stage where they can equal IBM. As we see a race for what is now at times called the IoT landscape, we see the larger players fight for the acceptance of ‘their IoT standard’, and even as we see IBM mentioned, we see clearly that Google has a large advantage in achievements here and is leading in the number of patents in this field. As Huawei is pretty much accepting the Google IoT standard, we see that they can focus on growth, surpassing IBM, Qualcomm and Intel. In this, Huawei will remain behind Apple in size and revenue, but as it is not truly competing in that field, Huawei might not consider Apple a goal; yet as they grow in India, Huawei could surpass the Tata group within two years.

So how does this matter?

As we see the steps (the not incorrect steps) of Elisabeth Denham, and the acts we saw in the Guardian on how regulators are trying to muzzle and limit the growth and activities of Google, how much influence do they have with Huawei? Even as we see that Huawei is privately owned, there have been a few articles on Ren Zhengfei and his connection to the Chinese military. It has spooked the US in the past, and consider how spooked they will get when Huawei grows its service levels in places like Greece, Spain and Italy. What will the EU state? Something like “your money smells, we will not accept it“? No! The EU is in such deep debt that they will invite Huawei like the prodigal son being welcomed home. So whilst everyone is bitching about how Google needs to be neutered, those people allow serious opponents and threats to Google’s data future to catch up. Huawei is doing so one carrier at a time, and they are doing it in a global way.

So as we see all kinds of confirmations from media outlets all over the world, we seem to forget that Google is not the only player in town. With growth in EU nations like Spain, where a new Android-based set-top box (STB) makes Huawei a competitor for Telefonica, Vodafone and Orange, it now has a growing beachhead into Europe with decent technology for a really affordable price. In a place where they all complain that there is no economy, Huawei is more than a contender, growing business where others had mere presence and sustainable levels of revenue. It is merely a contained view of how the EU regulators seem to be fumbling the ball for long-term growth, whilst handing opportunity to China (read: Huawei), who will be eagerly exporting to Europe the products they can.

In all this, CoA can be seen as a mere confirmation: a Course of Action by regulators, the Court of Appeal for Google, the Cost of Application for Huawei, the Coming of Age for Business Intelligence and the Center of Attention that Google is calling upon itself, whether intentional or not. We are left with the question whether, at this point, the limelight is the best for them; we will leave that to Mr. Alphabet to decide.


Filed under Finance, IT, Law, Media, Politics, Science

Google is fine, not fined

Yup, that’s me in denial. I know that there will be an appeal, and it is time for the EU to actually get a grip on certain elements. In this matter I speak with some expert authority, as I have been part of the Google AdWords teams (not employed by Google though). The article ‘Google fined record €2.4bn by EU over search engine results‘ (at https://www.theguardian.com/business/2017/jun/27/google-braces-for-record-breaking-1bn-fine-from-eu) is a clear article. Daniel Boffey gives us the facts of the case, which is what we were supposed to read and get. Yet there is another side to it all, and I think people forgot just how terribly bad the others are. So when I read “By artificially and illegally promoting its own price comparison service in searches, Google denied both its consumers real choice and rival firms the ability to compete on a level playing field, European regulators said“, let’s start with this one and compare it to the mother of all ….. (read: Bing). First of all, there is no ‘Shopping’ tab. So there is that! If I go into their accursed browser (read: Internet Explorer), I get loads of unwanted results. In light of the last few days I had to enter ‘Grenfell .co.uk‘ a few times, and guess what, I get “Visit Grenfell, Heart of Weddin Shire” in my top results, a .org.au site. The place is in NSW. Did I ask for that? Google gives a perfectly fine result. Now, I am not including the top ads, as advertisers can bid for whatever solution they want to capture. So let’s have a look at Bing Ads. First, I can choose to be visible in Aussie or Kiwi land, I can be visible globally, or I can look at specific locations. So how do you appeal to the Australian and Scandinavian markets? Oh, and when you see the Bing system, it is flawed, yet it uses all the Google AdWords terms and phrases: callout extensions, snippets. They didn’t even bother to give them ‘original’ Bing names. And I still can’t see a way to target nations.
So when we see a copy to this extent, we see the first evidence that Google made a system that a small-time grocery shop like Microsoft cannot replicate at present. We can argue that the user interface is a little friendlier for some, but it is lacking in several ways, and soon, when they are forced to overhaul, you get a new system to learn. So when the racer (Micro$oft) comes in an Edsel and is up against a Jaguar XJ220, is it dominance by manipulating the race, or should the crying contender consider coming in an actual car?

Next, when I read ‘rival firms the ability to compete on a level playing field’, should the EU regulator consider that the other player does not have a shopping tab, and that the other players have a lacking advertisement management system that requires massive overbidding to get there? Then we get the change history. I cannot see specifics like ‘pausing a campaign‘, which seems like a really important item to show; for the most part ALL changes are important, and the user is not shown several of them.

In the end, each provider will have its own system; it is just massively unsettling how this system ‘mimics’ Google AdWords. Yet this is only the beginning.

The quote “The commission’s decision, following a seven-year probe into Google’s dominance in searches and smartphones, suggests the company may need to fundamentally rethink the way it operates. It is also now liable to face civil actions for damages by any person or business affected by its anti-competitive behaviour” really got me started. So, if we go back to 2009, we see the BBC (at http://news.bbc.co.uk/2/hi/business/8174763.stm) give us “Microsoft’s Bing search engine will power the Yahoo website and Yahoo will in turn become the advertising sales team for Microsoft’s online offering. Yahoo has been struggling to make profits in recent years. But last year it rebuffed several takeover bids from Microsoft in an attempt to go it alone”. In addition there is “Microsoft boss Steve Ballmer said the 10-year deal would provide Microsoft’s Bing search engine with the necessary scale to compete“. Now he might well be the 22nd richest person on the planet, yet I wonder how he got there. We have known that the Yahoo system has been flawed for a long time. I was a Yahoo fan for a long time; I kept my account for the longest of times, and even when Google was winning the race, I remained a loyal Yahoo fan. It got me what I needed. Yet over time (2006-2009) Yahoo kept lagging more and more, and Tim Weber, the business editor of the BBC News website, stated it the clearest: “Yahoo is bowing to the inevitable. It simply had neither the resources nor the focus to win the technological arms race for search supremacy“. There is no shame here; Yahoo was not number one. So as we now realise that the Bing search engine is running on a flawed chassis, how will that impact the consumer? Having a generic chassis is fine, yet you lose against the chassis of a Bentley Continental. Why? Because the designer was more specific with the Bentley; it was purpose-built!
As Bentley states: “By bringing the Speed models 10mm closer to the ground, Bentley’s chassis engineering team laid the foundation for an even sportier driving experience. To do so they changed the springs, dampers, anti-roll bars and suspension bushes. The result is improved body control under hard cornering, together with greater agility“; one element influences the other, and the same applies to online shopping, which gets us back to Steve Ballmer. His quote to the BBC, “Through this agreement with Yahoo, we will create more innovation in search, better value for advertisers, and real consumer choice in a market currently dominated by a single company“: is that so? You see, in 2009 we already knew that non-Google algorithms were flawed. They weren’t bad, but there was the clear indication that the Google algorithms were much better; these algorithms were studied at universities around the world (also at the one I attended). The PageRank as Stanford University developed it was almost a generation ahead of the rest, and when the others realised that presentations and boasts didn’t get the consumer anywhere (I attended a few of those too), they lost the race. The other players were all about the corporations and getting them online, getting the ‘path built’ so that the people would buy. Yet Google did exactly the opposite: they wondered what the consumer needed and tended to that part, which won them the race, and it got transferred into the advertisement dimension as such. Here too we see the failing, and the BBC published it in 2009. So the second quote, “Microsoft and Yahoo know there’s so much more that search could be. This agreement gives us the scale and resources to create the future of search“, well, that sounds nice and all marketed, yet the shown truth was that at this point their formula was flawed, Yahoo was losing traction and market share on a daily basis, and what future?
The Bing system currently looks like a ripped-off copy (a not so great one) of the Google AdWords system, so how is there any consideration of ‘the ability to compete on a level playing field‘? In my view the three large players all had their own system, and the numbers two and three were not able to keep up. So is this a case (as the EU regulator calls it) of “by promoting its own comparison shopping service in its search results, and demoting those of competitors“, or is there a clear and growing case that the EU regulator does not comprehend that the algorithm is everything and the others never quite comprehended the extent of the superiority of the Google ranks? Is Google demoting others, or are the others negating elements that impact the conclusion? In car terms, it is as if the Google car is the only one using nitro, whilst the use of nitro is perfectly legal (in this case). In addition, we see in 2015 ‘Microsoft loses exclusivity in shaken up Yahoo search deal‘ as well as “Microsoft will continue to provide search results for Yahoo, but in a reduced capacity. The two have renegotiated the 2009 agreement that saw Redmond become the exclusive provider of search results for a company that was once known for its own search services. This came amid speculation that Yahoo would try to end the agreement entirely“. So not only are they on a flawed system, they cannot agree on how to proceed as friends. So why would anyone continue on a limited system that does not go everywhere? In addition, in April 2015 we learn “The other major change is that Microsoft will now become the exclusive salesforce for ads delivered by Microsoft’s Bing Ads platform, while Yahoo will do the same for its Gemini ads platform“. So the two are splitting their sales efforts, meaning that the customers now have to deal with two systems. In addition, they are now dealing with companies having to cope with a brain drain. Still, how related are these factors?

I personally see them as linked. One will influence the other; whilst changing the car chassis to something much faster will impact suspension and wheels, we see a generalised article (at no fault to the Guardian or the writer), yet I want to see the evidence the EU regulator has; I have been searching for the case notes and so far no luck. Yet in my mind, those involved on the EU regulator side do not really comprehend the technology. This can be gleaned from “According to an analysis of around 1.7bn search queries, Google’s search algorithm systematically was consistently giving prominent placement to its own comparison shopping service to the detriment of rival services“, where is that evidence? Analyses are the results of the applied algorithm (when it is done correctly) and in this the advertiser is still the element not accounted for. I have seen clients willing to bid through the roof for one keyword, whilst today I notice that some of the elements of the Bing Ads do not support certain parts, so that means that my results will be impacted by no less than 10%-20% on the same bidding. So is it ‘demoting results of competitors‘, or is the competitor system flawed, requiring bids that are 20% higher just to remain competitive? And if I can already state that there are dodgy findings based on the information shown, how valid are the EU regulator’s findings and, more important, where else did they lack ‘wisdom’?
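To make that bidding point concrete, here is a toy calculation (my own illustrative numbers, not real ad-platform data): if a platform fails to use targeting signals worth 10%-20% of a campaign's effectiveness, the advertiser must raise the bid just to keep the same effectiveness-weighted bid.

```python
# Toy illustration only: how much a bid must rise to compensate for a
# platform that drops some share of a campaign's targeting effectiveness.
def required_bid(current_bid, effectiveness_loss):
    """Bid needed to match the original effective bid after a loss."""
    return current_bid / (1.0 - effectiveness_loss)

base = 1.00  # hypothetical cost per click, in dollars
uplifts = {loss: required_bid(base, loss) for loss in (0.10, 0.20)}
# a 10% effectiveness loss needs a ~11% higher bid; a 20% loss needs 25% more
```

Under these assumed numbers, the advertiser on the weaker platform pays a built-in surcharge simply to stay level, which is the crux of the argument above.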

There are references to AdSense and, more important, the issue they have. Yet when we consider that the EU is all about corporations, these places want facilitation, and as they ignored AdSense, that solution started to get traction via bloggers and information providers. So when we see: “In a second investigation into AdSense, a Google service that allows websites to run targeted ads, the commission is concerned that Google has reduced choice by preventing sites from sourcing search ads from competitors“. Is that so? Larger publishing houses like VNU (well over 50 magazines and their related sites) embraced it, so in 2005 Google got new clients and as such grew a business. And that was just in the Netherlands. Now those who were just sulking in a corner, trying to present systems they did not have until 4 years later, are crying foul?

There are leagues of comparison sites. One quote I really liked was “Google is like the person that has it all together but is too conservative sometimes, and Bing is like the party friend who is open to anything but is a hot mess”. Another quote is from 2016: “With Bing Ads though, you can only show your ads on the Content Network if you’re targeting the entire US”. So an issue of targeting shown in 2016, an issue that Google AdWords did not have a year earlier. This is important because if you cannot target the right people, the right population, you cannot be competitive. This relates to the system and the EU regulators, because a seven-year ‘investigation’ shows that a year ago the other players were still lagging behind Google. In addition, when we read in the Guardian article: “the EU regulator is further investigating how else the company may have abused its position, specifically in its provision of maps, images and information on local services”, we need to realise that, in car terms, the other players are confined to technology of 1989 whilst Google has the Williams F1 FW40 of 2017. The difference is big and getting bigger. It is more than technology: whilst Microsoft is giving the people some PowerPoint-driven speech on retention of staff, something that IBM might have given the year before, Google is boosting mental powers and pushing the envelope of technology. Whilst Bing Maps exists, it merely shows why we needed to look at the map in Google. This is the game; Microsoft is merely showing most people why we prefer to watch them on Google, and it goes beyond maps, beyond shopping. As I personally see it, Microsoft is pushing whatever it can to boost the Azure cloud. IBM is pushing in every direction to get traction on Watson. Google is pushing every solution on its own merit; that basic difference is why the others cannot keep up (that’s just a personal speculative view).
I noticed a final piece of ‘evidence’ in a marketing-style picture, which I am adding below. Consider the quote ’51 million unique searchers on the Yahoo! Bing Network do not use GOOGLE’, then consider those trying to address those 51 million, whilst they could be addressing 3.5 billion searchers.

The business sector wants results, not proclaimed concepts of things to come. Microsoft is still showing that flaw with their new consoles and the upcoming Scorpio system (Xbox One X); users want storage, not streaming issues. They lost a gaming market that was almost on equal terms with Sony (Xbox 360 versus PlayStation 3) to a situation where they now hold a mere 16% of the Sony market, and that is about to drop further still as Nintendo is close to surpassing Microsoft too.

There is always a niche market (many people) who want to kick the biggest player in town; I get that. Yet given the issues shown at present, and as far as I understand the technology, I feel that the EU regulators are failing in a bad way. I might be wrong here, and if I get the entire commission papers and issues are found, I will update this article, as I am all about informing people as well and as correctly as possible. Yet the element that is most funny is that when I open up Internet Explorer and type in ‘Buy a Washing Machine‘, Bing gives me 8 options, 7 from David Jones and 1 from Snowys Outdoors, which is a portable one and looks like a cement mixer. So when was the last time you went to David Jones to look at a washing machine? In Google Chrome I get 6 models on the right side, with 3 from Harvey Norman, 2 from the Good Guys and one from Betta, and that is before I press the shopping tab. So can we initially conclude that Micro$oft has a few issues running at present? Oh, and the Google edition gives me models from $345 to $629; Bing prices were $70 for the portable one and the rest were $499-$1499.

This is not about how good one or the other is; this is about how valid the EU regulator findings were, and so far I have several questions in that regard. Now, I will be the last one keeping governments from getting large corporations to pay taxation, yet that part is set in the tax laws, not in EU antitrust. As with the searches mentioned before, I wonder whether the EU regulators are facilitating for players who seem more and more clueless in a field of technology that is passing them by on the left and the right side of the highway called the ‘Internet of Things’.

From my point of view Google is doing just fine!

The EU regulator? Well we have several questions for that EU department.


Filed under IT, Media, Science

Two sides of fruit

There are always issues when you get to the topic of fruits. One is the question whether it applies to the members of the US congress (the members of the US Senate are usually labelled as nuts). Is it an issue with actual nutritional products or are we talking about the device that Newton used for gravity? Yes, it is the third one as Newton discovered gravity with an apple.

Yet even here we see two sides at present. The first one is seen with ‘iMac Pro: Apple launches powerful new desktop – starting at $4,999‘ (at https://www.theguardian.com/technology/2017/jun/05/imac-pro-apple-launches-powerful-new-desktop-macbook-starting-at-4999). Here we see the quote “The new iMac Pro starts with an 8-core Intel Xeon processor, but can be configured with an 18-core processor variant, as well as up to 128GB of ECC RAM, 4TB of SSD storage and Radeon Vega discrete graphics cards with up to 16GB of memory“. You see, Apple, like Microsoft, IBM and more recently ASUS, has become an agent of iterations; true innovation has not been on their shores for too long a time, which is why my new device is for consideration with Huawei and Google alone. Only they have shown the continued race for actual innovation. There is also Samsung, but as I had a legal issue with them in 1991, I took them from the consideration list; I can hold a grudge like only the Olympian gods can. Still, in their defence, the question becomes: how can you make a computer truly innovative? It is a question that is not easily answered. There are a few options, yet some of the technology required is still in its infancy here.

In addition, in similar ways, iWork has been unable to grow due to the restrictions (read: limitations) that the suite offers. Instead of trying to persuade the Microsoft Office users (which is not a bad path), iWork has not grown in the directions it could, and they are now paying for it through reduced exposure. Still, there remains a valid opposition to my accusation of: ‘have become agents of iterations’. To see this, we cannot just state that there is a new iMac and as such they are merely iterating. There is in addition the issue of hardware versus software. So in my view, a true innovation would have been a Wi-Fi upgrade: not just a faster system, but a system that is keyed to the home and mobile devices. As we are now a little over a year from the first steps of 5G, and as we are all more and more connected via different devices, Apple left a huge sales opportunity out in the open: the option of having devices linked and interlocked. A missed opportunity. You see, as mobile bandwidth becomes more and more an issue, whilst we tend to have a home bandwidth that is 100 times larger, there is real value in an auto upgrade manager on your desktop device (iMac). So when you come home, apps and content will be distributed to the devices you want them placed on. So at home, ‘without even thinking’ (sorry Microsoft for using your Windows 95 slogan), the devices will do what needs to be done and you need not mind. You see, as people are trying to push blockchain into every financial corner, those people forgot how blockchains can also be the foundation for users on multiple devices. Now that is not always needed, because we get mail in the cloud, data in the cloud and via the cloud, but that is not for everyone. In addition, people forget about the photos they took and they do not always want those in some cloud. There are legions of options here, but at times we want some of this offline.
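The multi-device idea above can be sketched with a blockchain-style, tamper-evident log of content updates that a desktop 'upgrade manager' could replay onto each device. This is a minimal illustration of the concept only; the record fields and the manager itself are hypothetical, not any Apple or Google API:

```python
import hashlib
import json

# Each entry chains to the previous one via a SHA-256 hash, so any
# tampering with earlier sync records invalidates the whole log.
def add_entry(chain, record):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    chain.append({"record": record, "prev": prev_hash, "hash": entry_hash})

def verify(chain):
    prev_hash = "0" * 64
    for entry in chain:
        payload = json.dumps(entry["record"], sort_keys=True)
        if entry["prev"] != prev_hash:
            return False
        if entry["hash"] != hashlib.sha256((prev_hash + payload).encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

log = []
add_entry(log, {"device": "phone", "app": "photos", "version": 7})
add_entry(log, {"device": "tablet", "app": "mail", "version": 3})
assert verify(log)                      # untouched log checks out
log[0]["record"]["version"] = 8
assert not verify(log)                  # any tampering breaks the chain
```

The point of the design is that each device can independently confirm it has applied the same, unaltered sequence of updates, without needing a cloud intermediary.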
Finally, as we do specific tasks (for example on a train), we prefer not to lose too much bandwidth whilst travelling. Tablet and mobile bandwidth can be expensive. In equal measure we tend to forget how large some files are, and as such we could rush through our bandwidth in no time. This is just one of two options and we have seen very little development in that regard. Apple might want to let others develop it first, but that also leaves them with less when they need to have that additional step forward. It was a mistake Microsoft hid behind for the better part of 2 decades. In that same approach we see how consultancy and project software could benefit a different side in their designs. Now, that is not for Apple to side with, but it could have been an opportunity to grow in new directions. Anyway, this is not about starting a fight on 3rd party vs others; this is about iteration vs innovation, and Apple has been reluctantly innovative.

This gets us to the other side of it and here I am not siding with Apple, but I am wondering if Apple has been treated correctly. This we see in ‘Apple ‘error 53’ sting operation caught staff misleading customers, court documents allege‘ (at https://www.theguardian.com/technology/2017/jun/05/apple-error-53-sting-operation-caught-staff-misleading-customers-court-documents-allege). Now, first let’s take a look at the error 53 part. The issue is that “‘Error 53’ is a message that occurred after updating to iOS 9.0 on iPhones of people who had had their TouchID fingerprint sensor replaced by a repair shop not licensed by Apple. The phones were rendered useless because the operating system update detected a mismatch between the sensor and the phone, and locked the device, assuming unauthorised access was being attempted.“

Now here we see two sides.

In the first side we see “Knives damaged by misuse, improper maintenance, self-repair, or tampering are not covered.“, this is something Buck Knives has in play. By the way, this comes with a lifetime warranty, so that remains awesome. In addition, for decades TV warranties were voided if unauthorised repairs were made (or repairs by unqualified repairmen). With laptops there was Compaq, who would void any warranty if a non-Compaq technician had worked on it. They even created special Compaq screwdrivers to keep a handle on it all. So when we see ‘replaced by a repair shop not licensed by Apple‘, I am not certain if the ACCC has a case; they have not acted against Philips, Sony and a few others for the longest of times.

So when I read: “accuses Apple of wrongly telling customers they were not entitled to free replacements or repair if they had taken their devices to an unauthorised third-party repairer” I remain in doubt whether they have a case.

So when we see “Australian consumer law clearly protects the right of a customer to a replacement or free repair if the product is faulty or of unacceptable quality“, which I agree with, the owner did not go to Apple, did they? I have had my own issue with Apple in this regard (different device), yet can we agree that when we read: “It is however important to note that if a non-genuine part is fitted to your Toyota and that part’s failure or the incorrect fitment damages your vehicle, then that damage may not be covered by your Toyota Warranty“, how can something that applies and is valid for Toyota not be valid for Apple?

I believe that the ACCC acted with another agenda. The need to protect warranty by having repairs done by authorised service people has been central to repairs for decades. In addition, when we look at the facts, why would ANYONE go to a third party for warranty repair? That is just insane. So when we read “wrongly telling customers they were not entitled to free replacements or repair if they had taken their devices to an unauthorised third-party repairer“, I am actually wondering how they could come to the conclusion ‘wrongly‘. You see, when we read: “Australian consumer law clearly protects the right of a customer to a replacement or free repair if the product is faulty or of unacceptable quality”, we now wonder how true that is. Warranty is either valid (Apple fixes it for free), or it is beyond the warranty term and you have to pay for it; as it is no longer done for free, you might select a third party. Yet if this is not an Apple authorised dealer, don’t you have anyone but yourself to blame?

So this is the other side of the apple, what constitutes voided warranty.

You see, if Apple loses this part, I can start repairing Raytheon’s Griffin systems. You see, the upgrade (from C to C-ER) and equipment alignment costs are roughly $15,000 per day (excluding parts) if you do not have the proper Service Level Agreement. I can offer to do it for $5,000 a day. So if my work is shoddy (which they will not know until they fire the device; I can be very innovative towards my income), can they apply for warranty at Raytheon, or have they voided their options? You see, I will have an NDA with a ‘this repair has been completed to our highest corporate standards’ statement, so I am in the clear, and the way the world goes, with 225 upgrades I will have a decent Christmas this year. Yet at that point the ACCC will not go after Raytheon, it will go after me (what wusses). So how come that the rights of Raytheon are better than those of Apple?

It seems that people assume so much with their mobile devices nowadays, I need to wonder if people comprehend what they buy and what responsibilities come with it. In this the initial question ‘Why did you not take your device to Apple?‘ is one that is not addressed at present and as such I have little faith that the ACCC has a decent case at present (in the shape we saw presented today).

The second and first parts interact, as the upcoming shifts will in equal part see new frontiers in Service Level Agreements, customer responsibility and the comprehension of the elements covered in a warranty. Because what is included is likely to shift a fair bit over the next 2 years. In addition, innovation is also a shifting concept. Whilst it was “a new idea, device or method”, we (read: the corporate marketing departments) have often seen it as ‘the application of a solution that allows us to meet the new or altered requirements of the customer‘, which we get when we iterate with a more powerful processor, more storage, a larger screen. So going from 1080i to 5K screens might be accepted as truly innovative, because that took another level of screen and electronics. Yet at times the pass-through of merely upgraded speeds is also seen as innovation, yet at what level is that? When the device remains largely the same, is that not merely iteration?

So here we see the two sides of the other Apple. What we see, what the maker offers and how we both interpret the presented term of innovation.



Filed under IT, Law, Media, Military

In light of the evidence

We tend to accept facts and given situations whenever we have a reliable source and a decent level of evidence. The interesting side is that howling to the moon like a group of sheep hoping the lone wolf will not hear them is an equally weird revelation. The question becomes at that point: who is the lone wolf and who are the sheep, because neither position nor identity is a given. Now, for the first part, we have the Guardian article (at https://www.theguardian.com/politics/2017/may/27/eu-theresa-may-combat-terror-brexit-europol), with the expected title ‘We need deal with the EU to combat terror, experts tell Theresa May‘, which of course gets them the DGSE, yet the usefulness of the rest becomes a bit of an issue. For this part we need to look somewhere else, and we will do that after the given quote in the mentioned article: “Although our partnership with the US for intelligence sharing is extremely important, the fact is that the current terrorist threat is very much a European dimension issue. The Schengen database and knowing about who has moved where are all intimately dependent on European systems and we have got to try to remain in them“. This could be a valid and valued statement, yet is that truly the case? For this we need to take a little gander to another place of intelligence and Intel interest: the cyber monkeys, or is that the cyber-mercenaries? The difference is merely a moment when you WannaCry 1.4. You will have heard, or perhaps read, regarding the NHS as it was struck. Here again we see: “However, it instead appears to be down to organisations and individuals failing to keep Windows up to date“, which was actually voiced by NHS Digital: the failure of policies as they were not adhered to by IT staff, or at least those responsible for keeping those PCs up to date with patches.
The second quote given much earlier in the IT article is “To be abundantly clear, the recent speculation concerning WannaCry attributes the malware to the Lazarus Group, not to North Korea, and even those connections are premature and not wholly convincing,” wrote James Scott, a senior fellow at the Institute for Critical Infrastructure Technology (ICIT), which is where I have been all along. The one nation that has less computer and internet innovation than a Nintendo GameCube sets this level of hardship? It is just too whack for thought. It is the quote “At best, WannaCry either borrowed heavily from outdated Lazarus code and failed to change elements, such as calls to C2 servers, or WannaCry was a side campaign of a minuscule subcontractor or group within the massive cybercriminal Lazarus APT” that changes the game. In addition, we see: “The publication referred to “digital crumbs” that the cyber security firm had traced to previous attacks widely attributed to North Korea, like the Sony Pictures hack in late 2014″, we will exclude the quote “Shadow health secretary Jon Ashworth has said Labour would invest an extra £5 billion into new IT infrastructure for the NHS, after hospitals and services were affected by the widespread Ransomware attack on Friday“, especially as Labour had in the previous government wasted £11.2 billion on an IT system that never worked, so keeping them away from it all seems to be an essential first.

The issue is now in several phases. Who got hit? Those not updating their systems. According to some sources it affected thousands of systems, yet when it comes to backtracking to a point of origin, the cyber intelligence groups remain unclear. The IT article (at http://www.itpro.co.uk/security/28648/nhs-ransomware-north-korea-may-not-be-behind-wannacry) gives us a few things, yet the clear reference to the Guardians of Peace, the identity the hackers had given themselves in the Sony event, gives a few additional worries. Either this is clearly a mercenary group without identity, or we have a common new issue of identity when it comes to cyber criminals. You see, as we see more and more proclaiming the links between the Lazarus group and North Korea, we do not get to see a clear link of evidence. Many sources give us ‘could be linked‘ or ‘highly likely‘, which is an issue. It makes the evidence too shallow and circumstantial. The NY Times (at https://www.nytimes.com/2017/05/22/technology/north-korea-ransomware-attack.html) is basically stating what Symantec gave us and mentions that. My issue here is “But the hackers left behind a trail of digital crumbs that Mr Chien and his colleagues had traced to previous attacks by the Lazarus Group“; what if the crumbs were left intentionally? You see, the quote “another group of hackers that call themselves the Shadow Brokers published the details of National Security Agency hacking tools that the WannaCry hackers were able to use to add muscle to their attacks” gives a different light. The fact that there is a team reengineering tools and flaws to get somewhere fast is one. We have seen the lack of actual cyberpower of North Korea in the past; the fact that they are regarded on the same level as Chinese cyber forces is a bit silly.
You see, any country has its own level of savants, yet the fact that North Korea, a nation as isolated as it is, gets to be on par with China, an actual superpower that has Cyber infrastructures, experts at the University of Shanghai (the white paper on cracking AES-256, 2001), as well as a growing IT technology base is just a little too whack.

This now reflects back to the European need of Schengen. The UK needs quality intelligence, and with the US breaches of Manchester, the fact that no high-quality evidence was ever given regarding the Sony hack, and the growing list of all kinds of hacker names with no validity or confirmable way to identify these groups, we are left with a mess where pretty much anyone could have done this. In light of the NSA flaw finders, there is now more evidence in the open pointing at a speculative hacker with skills that equal and surpass people graduating with high honours at MIT, than anything North Korea could produce. It does not put North Korea in the clear (well, the fact that the generals there had no comprehension of a smartphone should be regarded as such), and as we see the entire Bitcoin side go forward, we need to take more critical looks at the given evidence and who is giving that evidence. We all agree that places like Symantec and Kaspersky should be highly regarded, yet I get the feeling that their own interns know more about hacking than the sum of the population of all North Koreans do, which is saying a lot. We see supportive evidence in the Business Insider (at http://www.businessinsider.com/wannacry-ransomware-attack-oddities-2017-5). Here we see IBM with “IBM Security’s Caleb Barlow, researchers are still unsure exactly how the malware spread in the first place. Most cybersecurity companies have blamed phishing emails — messages containing malicious attachments or links to files — that download the ransomware. That’s how most ransomware finds its way onto victims’ computers. The problem in the WannaCry case is that despite digging through the company’s database of more than 1 billion emails dating back to March 1, Barlow’s team could find none linked to the attack“. One billion emails! That is what we call actual evidence, and here IBM is claiming that the issue of HOW the malware spread remains a mystery.
Now, can you see that the entire North Korean issue is out of touch with the reality of Common Cyber Sense and Actual Cyber Security? Two elements, both are essential in all this. It is the lack of actual evidence that seems to be the issue, giving us the question, who wants the North Korea issue propagated? Any answer here is more likely to be political than anything else, which now gives us additional questions on where for Pete’s sake the need of European Intelligence remains as they fall short of providing answers. In light of the Schengen database. Why would that not be shared? If the US has access as a non-European, non-EC nation, why would the UK, a clear European nation be barred from access? With all the flawed acts by the US, having actual professionals look at Schengen data, seems to be an elemental first, would you not agree?

An additional question would be how these Bitcoins would be cashed; it is not like an isolated nation like North Korea ever had a thriving business in Bitcoins in the first place. It is actually (yes, I am shocked too) PwC that delivers quality information here, in this case Marin Ivezic, a cyber-security partner. He gives us “EternalBlue (the hacking tool) has now demonstrated the ROI (return on investment) of the right sort of worm and this will become the focus of research for cybercriminals“, which would be a clear focus for veteran cyber criminals, yet the entire re-engineering foundation gives another slice of circumstantial evidence that moves us away from North Korea. So in this we have two elements. As the FBI and CIA have been all about pointing towards North Korea, the question becomes: where do they not want us to look, and whatever else do they not have a handle on? These points are essential because we are shown an elemental flaw in intelligence. When the source is no longer reliable, why would it be around in the first place? We can agree that governments do not have the goods on cyber criminals, because getting anything of decent value tends to require inside knowledge, which is the hardest to get in any case, especially with a group as paranoid as cyber criminals. The second side is that China and Russia were on the list as two of the few abled parties to get through Sony, yet Russia has fallen off the map completely in the last case, that whilst they are actually strengthening ties with North Korea. That does not make them guilty, yet on the scale required, Russia was one of the few with such levels of cyber skills.
The fact that we see in the NY Times that it is too early to blame North Korea is equally some evidence; it gives vision to the fact that there are too many unknowns, and when IBM cannot give view of any mail that propagated the worm, it gives additional consideration that there are other places that cannot claim or show correctly how the worm got started, which is now an additional concern for anyone altering the worm for additional harm. As the point of infection is not known, stopping the infection becomes increasingly difficult; any GP can tell you that side of a virus. There is one more side I would like to raise. This comes from a source (at http://securityaffairs.co/wordpress/59458/breaking-news/wannacry-linguistic-analysis.html); it is not a journalistic or verified source, so please take into consideration that this news is unconfirmed. It is however compelling. The quote “The text uses certain terms that further narrow down a geographic location. One term, “礼拜” for “week,” is more common in South China, Hong Kong, Taiwan, or Singapore. The other “杀毒软件” for “anti-virus” is more common in the Chinese mainland.” continues the analysis: “Perhaps most compelling, the Chinese note contains substantial content not present in any other version of the note, is lengthier, and differs slightly in format.” The English note of the ransomware appears well written, but it contains a major grammar mistake that suggests its author is either not a native speaker or possibly someone poorly educated. That would make sense, yet how was that source acquired?
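The kind of vocabulary check described in that quote can be sketched crudely as counting region-indicative terms in the note. The two indicator terms below come straight from the quoted analysis; everything else is my own toy illustration, and real forensic linguistics is far more involved than this:

```python
# Crude sketch of region-indicative term counting in a text.
# The two terms are the ones named in the quoted Flashpoint analysis.
INDICATORS = {
    "south_china_hk_tw_sg": ["礼拜"],   # "week"
    "mainland_china": ["杀毒软件"],      # "anti-virus"
}

def score_regions(text):
    """Count occurrences of each region's indicator terms in the text."""
    return {region: sum(text.count(term) for term in terms)
            for region, terms in INDICATORS.items()}

# Hypothetical sample sentence containing both indicator terms
note = "请在一个礼拜内付款，否则杀毒软件也帮不了你。"
scores = score_regions(note)
```

On a text containing both terms, the scores are a mixed signal, which mirrors the article's point: the vocabulary narrows the geography somewhat, but it cannot by itself settle attribution.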

The second quote: “Given these facts, it is possible that Chinese is the author(s)’ native tongue, though other languages cannot be ruled out,” Flashpoint concluded. “It is also possible that the malware author(s) intentionally used a machine translation of their native tongue to mask their identity. It is worth noting that characteristics marking the Chinese note as authentic are subtle. It is thus possible, though unlikely, that they were intentionally included to mislead.” The Flashpoint analysis suggests attackers may have used the Lazarus code as a false flag to deceive investigators; a second scenario sees North Korean APT recruiting freelance Chinese hackers to conduct the campaign. This gives us a few elements: the element of misdirection, which I had noted from other sources, and the element that North Korea is still a consideration, yet only if this comes from a freelance hacker, or someone trying to get into the good graces of Pyongyang; both options are not out of the question, as the lack of cyber skills in North Korea is a little too well set from all kinds of sources. The writer Pierluigi Paganini is a cyber professional. Now, even as Symantec’s Eric Chien is from California, did they not have access to this part and did no one else correctly pick up on this? As I stated, I cannot vouch for the original source, but as I had questions before, I have a few additional questions now. So, exactly how needed is European intelligence for the UK? I think that data should be shared within reason. The question becomes: how is Schengen data not shared between governments? The Guardian gives us “After the Manchester attack, which killed 22 people and left dozens of others grievously injured, it was revealed that suicide bomber Salman Abedi had travelled back to England from Libya via Turkey and Dusseldorf four days before the attack“, so how reliable is Turkish intelligence in the first place? How could he have prepared the bomb and get the ingredients in 4 days?
There is an additional view on ISIS support active in the UK, yet as we now see that this drew attention to him, why on earth was the trip made? Also, was Libya or Mecca the starting point (source: claim from the father in earlier Guardian article)? How would sharing have resolved this?

Now look at this in light of the US leaks and the cyber intelligence of a dubious nature. There is a growing concern that the larger players, the NSA, DGSE and GCHQ, have flaws of their own to deal with. As they rely more and more on industry experts, whilst there is a lack of clear communication and reliable intelligence from such sources, the thought now becomes that the foundation of fighting terror is a quality intelligence system, and recognising the need for cyber expertise is becoming an increasing issue for the intelligence branch. Should you wonder, then reconsider the quote: ‘demonstrated the ROI (return on investment) of the right sort of worm and this will become the focus of research for cybercriminals‘; if you think that cyber jihadists are not considering the chaos that they could create with this, then think again. They will use any tool to create chaos and to inflict financial and structural damage. They might not have the skills, yet if there is any reliable truth to the fact that the Lazarus group is in fact a mercenary outfit, there would be enough critical danger that they will seek each other out, provided that ISIS could bring cash to that table. I have no way of telling how reliable or how certain such a union could be. What is known is that Sir Hugh Orde is not answering questions; he is creating them, as I personally see it. Consider the quote “UK membership of EU bodies such as Europol and Eurojust, which brokers judicial co-operation in criminal cases, not only allowed access to huge amounts of vital data, but also meant UK police could set up joint inquiries with German police or those from other national forces without delay“. You see, the UK remains part of Europe and Interpol existed before the EC, so as we now see the virtual creation of red tape, the question becomes why the EU has changed rules and regulations to the degree that the UK would fall out of the boat.
Is it not weird that the EU is now showing itself to be an organisation of exclusion? Even if we laugh at the ridiculous promises that Corbyn is making, the fact that they count at all shows that there is a larger problem in place. Why is there suddenly a need for 1,000 more intelligence staff? Can we not see that the current situation is causing more issues than it resolves? As such, is throwing money and staff at a non-viable situation anything less than creating additional worries?

The last part is seen in “The Schengen database and knowing about who has moved where are all intimately dependent on European systems and we have got to try to remain in them“, yet this does require all players to enter the data accurately; in addition, it only applies to people entering Schengen, and as has been shown in the past, keeping track of people’s locations after that point becomes an increasingly difficult problem. The fact that after the Paris attacks some people of interest were found to be in Belgium is one side; the fact that these people could have met up with all kinds of contacts on the road is another entirely. The truth is that the intelligence branch has no way of keeping track at such a level of detail. In addition, we have seen that the list of people of interest is growing way beyond normal means, and organising such data streams and finding new ways not just to find the guilty, but to shorten the list by excluding the innocent, is growing in complexity on a nearly daily basis. And that is before the cyber mess is added to the cauldron. There is at least a small upside: as the technology stream will soon be more and more about non-repudiation, there will be additional sources of information that aid the branches by pruning the list of people of interest. The extent of that pruning is not a given and time will tell how this is resolved.
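Since the paragraph leans on the notion of non-repudiation, a toy illustration may help: with an asymmetric signature, only the holder of the private key can produce a valid signature over a record, so the signer cannot later deny having produced it, while anyone holding the public key can verify it. The sketch below uses deliberately tiny textbook-RSA parameters (the primes, exponents, and message are all made up for illustration; real systems use vetted cryptographic libraries and keys of 2048 bits or more):

```python
import hashlib

# Toy textbook-RSA parameters (illustrative only, far too small to be secure).
p, q = 61, 53
n = p * q                            # modulus: 3233
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent: inverse of e mod phi(n)

def digest(message: bytes) -> int:
    # Reduce a SHA-256 digest into the tiny modulus range (a toy step;
    # real schemes use proper padding such as PSS instead).
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes, private_d: int) -> int:
    # Only the private-key holder can compute this value,
    # which is the basis of non-repudiation.
    return pow(digest(message), private_d, n)

def verify(message: bytes, signature: int, public_e: int) -> bool:
    # Anyone with the public key can check the signature.
    return pow(signature, public_e, n) == digest(message)

msg = b"border crossing record #1024"
sig = sign(msg, d)
print(verify(msg, sig, e))   # True: the record is bound to the key holder
print(verify(b"tampered", sig, e))
```

The second check fails (except for a negligible chance of a digest collision in this toy modulus), which is the property that would let such records prune a list of people of interest: a validly signed record can be attributed, and a tampered one rejected.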

It all affects the evidence that the parties hold and how it is applied; it remains a matter of time and of the proper application of intelligence.



Filed under Finance, IT, Law, Media, Military, Politics, Science