Tag Archives: IBM Statistics

Brotherhood of Heineken

As we push step by step towards 5G, we tend to think that it all stays the same; it will not. A few parts will change forever. Google has an enormous advantage, yet they too are now pushing for changes, changes that they had not seen coming a mere year ago. In this case there is no direct link to my IP, so I am happy to give you all the inns and outs of that part (pun intended).

To start this we need to consider a few sides, all with their own premise. The first is the focal point:

4G: Wherever I am
5G: Whenever I want it

That first premise is a large one. It is not simply about localisation; it is about getting access at a moment’s notice, yet what we need access to changes with the push we face. The initial part is the creation and the impact of awareness. As we redefine ‘awareness’, the metrics on awareness will also change, and for the first year (at the very least) market research companies on a global stage will be chasing the facts. They have become so reliant on dashboarding that Tableau, Q-view and Q Research Software will all have to re-engineer aspects of their software as they fall short. Even the larger players like SAS and IBM Statistics will require an overhaul in this market space. They have been ‘hiding’ behind the respondent, responses and their metrics for too long. The entire matter of the respondent becoming the passive part in awareness is new to them, and that is all it is: it will be new to them, and the constructs behind active and passive interactions will change the metrics, the view and the way we register things.

Google has the advantage, yet the stage for them will take a few turns too. Their initial revenue stream will change. Consider the amount of data we are passing now; that amount also links to the number of ads we see. Now consider that everything in 5G is ten times faster, yet ten times more ads is not an option, so they may face revenue from 10% of the ads compared to what we see now. In addition, as we adjust our focus to the amounts we face, more advertisement space is potentially lost to the larger players like Google, and this too impacts the stats for all involved. Google will adjust and change; in what way I cannot tell yet, but the opposition is starting to become clear, and in this example we see Heineken, a globally established brand that now has the option to take the lead in 5G awareness.

Introducing

Ladies and gentlemen, I hereby introduce to you the Brotherhood of Heineken. In this fraternity / maternity, we invite all the lords and ladies of the household to become awareness creators for their brand. In the Netherlands thousands are linked through a company like Havenstad and similar operations; this stretches through Europe and, all over the place, goes global. These lords and ladies can earn points through the simplest of things: by setting a stage for Heineken to spread its message, we see that the initial power lies with the consumer to support their brand. Awareness and clicks are converted to points, and that leads to exclusive offers and rewards. Consider the unique merchandise that Heineken has given its professional public, now for all to get, to buy and to earn: bags, coolers, clothing, accessories. For decades we saw these materials created and most of us were envious of anyone who had what others did not; now we could all earn it, and because Heineken (Coca-Cola too) has created such an arsenal, these players could take the lead in pushing their own awareness to new levels.

Now it is easy to say that Google is already doing this, and that is partially true, but that equation will change under 5G, and these really large brands could either pay a fortune to Google or take the lead and create their own powerhouse; in this day and age that powerhouse will become more and more an essential need. Anyone not looking at and preparing for this will hand over opinion and choice to Google and watch how that goes. Yet consider what some sources gave us a quarter ago: “Google will remain the largest digital ad seller in the world in 2019, accounting for 31.1% of worldwide ad spending, or $103.73 billion“; now consider that they need to grow 20% quarter on quarter, that in two years that metric has changed, and as such the ads could cost up to 30% more. Now do the math on how YOU will survive in that environment.
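To make that math concrete, here is a minimal sketch in Python; the 30% price rise is the figure from the paragraph above, whilst the budget and cost-per-impression numbers are purely hypothetical:

```python
# A back-of-the-envelope reading of the paragraph above: if ad prices
# rise by up to 30% while an advertising budget stays flat, how much
# visibility is lost? All inputs besides the 30% are hypothetical.

budget = 100_000.0     # assumed yearly ad budget in dollars
price_now = 1.00       # assumed cost per impression today
price_rise = 0.30      # the "up to 30% more" from the text

impressions_now = budget / price_now
impressions_then = budget / (price_now * (1 + price_rise))

loss = 1 - impressions_then / impressions_now
print(f"Impressions lost on a flat budget: {loss:.1%}")  # ~23.1%
```

In other words, a flat budget buys roughly a quarter less visibility, and that is exactly the squeeze that would push large brands towards building their own awareness channels.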

Samsung, Procter & Gamble, Coca-Cola, Nike, Heineken, Sony and Microsoft will all face that premise, and that is how it all changes. As the metrics lose reliability, the market research players will need time to adjust, and in that lull a player like Heineken can create its own future and set its digital course in another direction to exceed its required expectations. This step seems small now, but as the stage alters it becomes an essential one. Google may remain in denial and insist that this will never happen, but the data and metrics are already suggesting this path, and that is where we are now: the option to be first or pay the invoice. What would you do?

I believe that the visibility starts to become focal just before the 2020 games; it will be in full view before the 2022 Beijing Winter Olympics, and in full swing by the time the 2022 FIFA World Cup in Qatar starts. These two are close together, and people will pay through the nose for that visibility, especially the European parties in all this. I expect a more evolved 5G advertising stage via apps as well; watching ads to unlock premium views and data is likely to happen. All this is coming to us, and our view of advertisement will alter to a larger extent. We will be told that this will never happen, that it is not how they work, yet they are deceiving and lying to us. Consider the change in the last 25 years alone. In 1994 advertisement through printed media and TV was at an all-time high, and they all claimed it would remain that way; within 5 years that stage was already changing, with online ads growing to some extent and printed media slowing, whilst the international channels pushed into national advertisement. A mere 5 years after that (in 2004) it started to take off in earnest and would increase revenue by over 100% in the 4 years that followed. Between 2005 and 2017 it would push from $6 billion to $26 billion; do you really think that their words hold true? To keep that growth, and given their need for greed, the metrics and approach have to change; there is 0% chance that these players will accept a data-based impact of a mere 10% of what it was in 4G, there is too much riding on this.
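As a quick sanity check on that growth claim, the quoted figures themselves give the annual rate (a one-line computation; only the numbers quoted above are used):

```python
# Compound annual growth rate implied by the figures quoted above:
# $6bn in 2005 growing to $26bn by 2017, i.e. over 12 years.
start, end, years = 6.0, 26.0, 12
cagr = (end / start) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # ~13.0% per year
```

A market compounding at roughly 13% a year for over a decade is not one that its players will voluntarily let shrink to a tenth of its 4G impact.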

For the largest players there is an alternative, and it will not take long for them to set the stage for this and start finding their own solution to keep awareness as high as possible. If you have to pay through the nose to keep awareness, or you can create the environment that rewards achieved awareness, which path would you choose?

Let’s not forget that players like Heineken did not get to the top by merely offering a really good product; they offered a lot more: a view, an awareness that all embraced. Sony learned that lesson the hard way by losing with a superior product against inferior competition (Betamax versus VHS). 5G will set a similar yet new battleground, and for the most part the media is seemingly steering clear for now.

That is with the nice exception of Marketing Interactive, which gives us (at https://www.marketing-interactive.com/going-beyond-the-big-idea-creative-leads-on-5gs-impact-on-advertising/) “There is no denying that the rollout of 5G will change storytelling and the consumer journey“, a true and utterly correct view. They also give us: “We, as creatives, need to evolve from old habits, stop hiding behind “The Big Idea” and evolve our creative process and creative structures to be based on this new digital reality, to create content based on this new innovative context“; this is the view of Joao Flores, head of creative, dentsu X Singapore, and he is right. We also get “For agencies, the opportunity calls for unorthodox alliances to make sure our creativity is the beating heart of this quiet revolution“, which is true, but it ignores the alternative path where the largest players bring this expertise in house, and in light of the two revelations we see that during the last decades players like Heineken have been doing just that, which makes them ready to take on the 5G behemoth and push the others into second place or worse. There is a need for expertise and many do not have it, but in that regard Heineken has been different for the longest time. It is most likely due to the unique view that people like Freddy Heineken had of their market and consumers. You merely have to realise that they were the first to embrace ‘Geniet, maar drink met mate‘ (enjoy, but drink in moderation), a slogan that came into play around 1990, as well as ‘Drink verantwoord. Geniet meer‘ (drink responsibly, enjoy more). All pushes to set a better stage, and it is there that we see that a new push could be produced by players like Heineken.

We see so many more paths opening, but in all this the one overwhelming side is not which paths there are, but the stage of metrics that they all rely on; as such, having control over the expenses, as well as the foundation to create a reliable stage for their metrics, will be a first soon enough. Not merely ‘Who is your population?‘, but the stage where passive and active awareness can be differentiated; that too will push advertisements, the applied visibility through 5G apps and 5G advertising, and how the funds are spent. That will be the question that impacts players like Google Ads in the next 24 months, because if they do not address it, their quarter-on-quarter growth will suddenly take a very different spin, and they are not the only ones affected.

 


Deadlock removed

Forbes gave us news in several ways. It merely flared my nostrils for 0.337 seconds (roughly), and after that I saw opportunity knock. Microsoft has been short-sighted for the longest of times, and initially that case could be made in this instance too. Yet I acknowledge that there is a business case to be made. The news on Forbes with the title ‘Why Microsoft ‘Confirmed’ Windows 7 New Monthly Charges‘ (at https://www.forbes.com/sites/gordonkelly/2018/09/15/microsoft-windows-7-monthly-charge-windows-10-free-upgrade-cost-2) gives us a few parts. First there is “Using Windows 7 was meant to be free, but shortly after announcing new monthly charges for Windows 10, Microsoft confirmed it would also be introducing monthly fees for Windows 7 and “the price will increase each year”. Understandably, there has been a lot of anger“. There is also “News of the monthly fees was quietly announced near the bottom of a September 6th Microsoft blog post called “Helping customers shift to a modern desktop”“, so it is done in hush-hush style, quietly, like thieves in the night so to say. In addition there is “Jared Spataro, Corporate Vice President for Office and Windows Marketing, explained: “Today we are announcing that we will offer paid Windows 7 Extended Security Updates (ESU) through January 2023. The Windows 7 ESU will be sold on a per-device basis and the price will increase each year.” No pricing details were revealed“. This is not meant for home users; it targets the professional and enterprise editions, meant for volume licensing and large businesses. So they now get a new setting. Leaving pricing in the middle, in the air and unspoken, will only add stress in all kinds of places, but not to fret.

It is a good thing (perhaps not for Microsoft). You see, just like the ‘always online’ folly that Microsoft pushed with the Xbox, we now see that in the home sphere a push for change will be made, and that is a good thing. We all still have laptops and we all still have our Windows editions, but we forget that we have been lulled to sleep for many years, and it is time to wake up. This is a time for praise, glory, joy and all kinds of positive parts. You see, Google had the solution well over 5 years ago, and as we are pushed towards change, we get to have a new place for it all.

Introducing Google Chromebook

You might have seen it, you might have ignored it, but whatever the case: why did you not consider it? Now, off the bat, it is clear that if you have a specific program need, you might not have that option. In my case, I have no need for a lot of it on my laptop; yes on the desktop, but that is a different setting altogether.

So with a Chromebook, I get to work directly with Docs (Word), Sheets (Excel) and Slides (PowerPoint), and they read and export to the Microsoft formats (as well as PDF). There are Photos, Gmail, Contacts and Calendar, taking care of the Outlook part, even Keep (Notes), video calling and a host of other parts that Microsoft does not offer within the foundation of its Office range. More importantly, there is more than just the Google option. Asus has one with a card reader allowing you to keep your files on an SD card, and a battery that offers 7-10 hours, which in light of the Surface Go, which in one test merely gave 5 hours, is a lot better; the Chromebook is there for $399, a lot cheaper as well. In this it was Engadget that labelled it: ‘It’s not perfect, but it’s very close.‘

Asus has several models, some a little more expensive but with added features. In the bare minimum version it does over 90% of whatever a student needs to do under normal conditions. It is a market that Microsoft could lose, and in that setting it could lose a lot more than merely some users. These will be users looking for alternatives in the workplace; the potential loss that Microsoft was unable to cope with will now be at the forefront of its settings. In my view this is the direct consequence of iterative thinking.

And it is not merely Asus in the race. HP has a competitive Chromebook at almost the same price; they do have a slightly larger 14″ option (instead of 11.9″) for a mere $100 more, which also comes with a stronger battery, and there is also Acer. So the market is there. I get it: for many people, those with stronger database needs or accounting software needs, it is not an option, and we need to recognise that too. Yet the fact that in a mobile environment I have had no need for anything Microsoft-specific, and that the Surface Go is twice the price of a Chromebook whilst not offering anything I would need, makes me rethink my entire Microsoft needs. In addition, I can get much better performance out of my old laptop by switching to Linux, which has a whole range of software options. So whilst it has been my view that Microsoft merely pushed a technological arms race for the longest time, I merely ignored them, as my Windows 7 did what it needed to do and did it well; getting bullied onto another path was never my thing, hence I am vacating to another user realm, a book with a heart of Chrome. So whilst we look at one vendor, we also see the added ‘Microsoft Office 365 Home 1 Year Subscription‘ at $128. So what happens after that year? Another $128, whilst Google offers it for free? You do remember that students have really tight budgets, do you not? And after that, students, unless business-related changes happen, prefer a free solution as well. So whilst Microsoft is changing its premise, it seems to have found the setting of ‘free software’ offensive. You see, I would get it if we had never paid for it, but I bought almost every Office version since Office 95. For the longest time issues were not resolved, and the number of security patches still indicates that Windows NT version 4 was the best they ever got to. I get that security patches are needed, yet the fact that some users have gone through thousands of patches only to get charged extra now feels more like treason than customer care, and that is where they will lose the war and lose a lot.

So when you see subscription, you also need to consider the dark side of Microsoft. You partially see that with: “If you choose to let your subscription expire, the Office software applications enter read-only mode, which means that you can view or print documents, but you can’t create new documents or edit existing documents.” Now, we agree that they clearly stated ‘subscription’, yet they cannot give any assurances that it will still be $128 next year; it could be $199, or even $249. I do not know and they shall not tell, just like in Forbes, where we saw ‘News of the monthly fees was quietly announced‘.
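To see why that open-ended pricing matters, here is a small sketch of cumulative subscription cost under a few hypothetical yearly increases; the $128 starting price is the one quoted above, whilst the increase percentages are assumptions, since no pricing details were revealed:

```python
# How quickly do subscription fees add up? The $128 first-year price is
# quoted in the text; the yearly increases are hypothetical scenarios,
# since Microsoft gave no assurances about future pricing.

def cumulative_cost(first_year_price: float, yearly_increase: float, years: int) -> float:
    """Total paid over `years` if the fee grows by `yearly_increase` each year."""
    total, price = 0.0, first_year_price
    for _ in range(years):
        total += price
        price *= 1 + yearly_increase
    return total

for rise in (0.00, 0.10, 0.25):
    print(f"{rise:.0%} yearly rise, 5 years: ${cumulative_cost(128, rise, 5):.2f}")
# 0% -> $640.00, 10% -> $781.45, 25% -> $1050.50
```

Even the flat scenario costs a student $640 over a degree; the steeper (and undisclosed) paths are what make a free alternative so attractive.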

When we dig deeper and see ‘Predicting the success of premium Chromebooks‘, LaptopMag treats us to: “The million-dollar question is whether these new, more expensive Chrome OS laptops can find a foothold in a market dominated by Windows 10 and Mac OS devices. Analysts are bullish about Chromebook’s potential to make a dent in the laptop market share“, which was given to us yesterday. Yet the missing element there is that Windows will now come with subscriptions for some, and for more down the track, or they lose the security of Windows; now that picture takes a larger leap, and the more expensive Google Pixelbooks (much higher specs than the others mentioned) suddenly become a very interesting option. One review stated of the Pixelbook: “the Pixelbook is an insanely overpowered machine. And, lest we forget, overpriced“, which might be true, yet the slightly lower-specced Atlas Chromebook was $439. So yes, the big one might not be for all, and let’s face it, a 4K screen is for some overkill. That’s like needing to watch homemade porn in an IMAX theatre. The true need for 4K is gaming and high-end photography and film editing, two elements that were never really for the Chromebook. At that point a powerful MacBook or MacBook Pro becomes essential, setting you back $2,900-$11,400. So, loads of options and variations, at a price mind you.

As I see it, the Microsoft market is now close to officially dissolving. There is a whole host of people that cannot live without it, and that is fine. I am officially still happy with my Windows 7, always have been. Yet when I look at the future and my non-gaming life, Linux will be a great replacement, and when mobile, a Chromebook will allow me to do what I need to do. It is only in spreadsheets that I will miss out a little at times, I acknowledge that too, but in all this there is no comparison with the subscription form, and as it comes from my own pocket I see no issues with a full and complete switch to Google and its apps in the immediate future. I feel close to certain that my loss will be minimal at most. A path that not all will have, I see that too, but when thinking of the hundreds of thousands of students about to start university, they can for the most part make that switch with equal ease, and there we see the first crux. It was the position of strength that Microsoft held for the longest time: enabling students so that they arrive in the workplace already trained. Students will now grow up with Chromebooks able to do what they need, and they will transfer that to the workplace too. This gives us a workplace scattered with Chromebooks, with all kinds of SaaS solutions that connect to the Chromebook as well. The Chromebook then becomes a terminal to server apps, enabling more and more users towards cloud server software solutions. As these solutions are deployed, more and more niche players will move in, nibbling at the market share that Microsoft had, diminishing that once great company to history, then pushing it beyond that towards being forgotten and at some point becoming a myth, one that is no longer in the game. It is also the first step for IBM to bank on that setting and push for the old mainframe model, yet they will not call it a mainframe; they will call it the Watson cloud: performing, processing and storing, with data available on any Chromebook at the mere completion of a login.

It is not all there yet, but SPSS created its client-server edition a decade ago, so as the client becomes slimmer, the Chromebook could easily deal with it and become even more powerful; that is beside the optional dashboard evolutions in the SaaS market, and the same could be said for IBM Cloud and databases. That is the one part that should be embraced by third-party designers. As SaaS grows, the need to look into Chromebook, Android and iOS solutions will grow exponentially. All this with the most beautiful of starting signals ever given: ‘Windows 7 New Monthly Charges‘, the one step that Microsoft did not consider in any other direction, and with 5G growing in 2021-2023 that push will only increase. If only they had not stuffed up their mobile market to the degree they have (my personal view). I see Windows Mobile as a security risk, plain and simple. I could be wrong here, but there is too much chaff on Windows and I cannot see what the wheat is (or if there is any at all); Microsoft has been in the ‘quietly announcing‘ stage often enough, and that is not a good thing either.

Should you doubt my vision (always a valid consideration), consider that Veolia Environnement S.A. is already on this path. Announced less than two weeks ago, we see “So we propose a global migration program to Chromebooks and we propose to give [our employees] a collaborative workplace. “We want to enable new, modern ways of working”“, linked to the article ‘Veolia to be ‘data centre-less’ within two years‘ (at https://www.itnews.com.au/news/veolia-to-be-data-centre-less-within-two-years-499453), merely the first of many to follow. As the SaaS range for Chromebooks increases, they will end up with a powerful workforce, more secure data and better management of resources. Add to this the Google ID-key solution, and the range of secure connections will go up by a lot, diminishing a whole host of security issues (and security patches for that matter). All these options are available now and have been for a few years. So when we see the Chromebook market push forward, we should thank Microsoft for enabling exponential growth; it is my personal belief that the absence of a monthly fee would have slowed that process considerably in a whole range of markets.

So thanks, Microsoft! You alienated gamers for years, and now we see you repeating that same silly path with both starting students and businesses that are trying to grow.

I’ll ask Sundar Pichai to send you a fruit basket, it’s the least I can do (OK, the least I can do is nothing, but that seems so mean).

 


Who’s Promptly Promoted?

The Guardian is giving us the news that Moody’s is downgrading WPP (at https://www.theguardian.com/business/2018/apr/17/moodys-downgrades-wpp-martin-sorrell-departure-ratings-agency-negative). It is a weird situation! You see, some do not like Sir Martin Sorrell (I personally never knew him), some like the man, and some think he was a visionary. I would fall into the third category. There is no way that under normal circumstances the departure of a CEO, even a founder, would have had such a massive impact, and let’s be clear: when a departure sparks not just the downgrade of WPP but also “WPP has hired a New York-based recruitment firm as it begins the global search to replace founder and chief executive“, his impact has been a hell of a lot larger than anyone is willing to admit. There are however other parts. When I see “In Moody’s view, the high-profile departure of Sir Martin Sorrell raises concerns over the future strategy and shape of the group, increases client-retention risk and could hence hinder WPP’s ability to meet its 2018 guidance“, I feel a strong desire to disagree. When we consider that within WPP are Millward Brown, TNS and IMRB, we need to acknowledge that WPP already had problems. You see, I was a partial witness to the laziness and stupidity; I saw how executives looked at presentations and were unwilling to listen. It was their right to do so, but in the end part of their market got screwed over. You see, SPSS was the big analytics package, and as a program it is still the Bentley for analysing data. Yet beyond the program the corporation faltered. It fell to meetings and presented concepts, yet no delivery. I still have the presentations: 1994, parallel processing, never came to be. Yet the biggest bungle was seen in 1997, when SPSS acquired the Danish software company In2itive Technologies Corp. They had close to perfect software. The interface was intuitive and flawless. I was so looking forward to teaching people this software, and for a while I did. It was amazing to see dozens of people literally making a running start on their own designs within an hour; by the end of the day they did all kinds of things that most market researchers could not conceive. It was a jackpot acquisition. Yet SPSS had its own data entry solution called Data Entry, and apart from a few flaws regarding memory and larger data entry sheets, it worked really well; it was a workhorse, so internally we were happy to hear that it had become a Windows program. The backlash was Titanic in proportions. It was hard to work with, the initial versions weren’t even stable, and there were processing power issues, saving issues and a whole range of issues that were not solved, not even within the first year. It was all about the holy ‘Data Entry‘, and whilst the perfect In2itive was set to the side, and whilst internal corporate marketing decided that Data Entry was a ‘Form Design Program‘, the audience was left without quality data entry. So as I (and others) pleaded for In2Form and its suite to be evolved and brought to the users, we were told it was merely a 16-bit program, and SPSS was 32-bit and larger only (mainframes excluded). So there I was, watching the mess evolve for well over 3 years, whilst the redesign of a 32-bit In2itive suite could have been done in 160 days (rough estimate); no, at SPSS they really knew what they were doing. So they decided to up the ante: there was going to be a server edition of Data Entry, the SPSS Data Entry Enterprise Server.
I saw the confidence of users go down further and further. Yet the corporation did not sit still in all this, and we got to see the Dimensions 2000 part; now that blew us away, we saw software on a whole new level and it was amazing. The two programs mrPaper and mrInterview were both truly steps forward, with options to format webpages using XML so that the web interview could flawlessly fit in any corporate website. We saw the good days come back, and with mrPaper we saw paper interviews with options to link to ReadSoft’s scanning software, so that data entry was almost a thing of the past: scan the returned interviews and read the data with a scanner. It was not flawless, but it was really good to see a stage where government sites all over Europe could do quality interviews on many levels. Yet the program had issues, as any large program has, and there were more issues and they stacked up. Only then was I introduced to Surveycraft. It was an utter shock. Even though it was old, DOS-based and looking like the old Data Entry, Surveycraft was miles ahead of mrDimensions. It had working quotas and all kinds of options that were ahead of the Quancept software in the UK; it was a shock to find the old software a decade ahead and visionary. SPSS had acquired it, and after that the developers managed to get less than 60% of the functionality transferred. Even later, when I worked actively with it, I was finding issues that the new software never handled, or handled really badly. So when I, no longer part of SPSS, tried to emphasise the need for new software to be made (better software was essential, especially in market research), they decided not to listen and to believe the SPSS executives that better versions were coming soon. They never came! The entire market research industry was lucky, because other players like Tableau and Q Research Software were just like me; they never trusted the SPSS executives, and they now corner the market. In this, the market research agencies that had the option to push forward decided to wait, basically cutting their own fingers, and lost on two fronts. With the 2008 crash the markets changed, and they lost loads of customers who had to trim down massively, a mere effect of events. Yet Tableau and Q-Software were still at a small stage, and their software was for a much larger audience, so not only did the market research industry lose customers, the two software programs allowed mid-sized and larger corporations to do it all themselves, and that is what happened. Market research companies still get the larger projects, but they lost the smaller work, a group of revenue representing near 60% (a personal speculation), and as Tableau and Q-Software grow, the MR market is in more and more peril; that is where WPP, owning Millward Brown, TNS and IMRB, finds itself. It takes a visionary not merely to grow the market, but to spread the options of a market. That ship has now sailed, and beyond less than a dozen former SPSS people I worked with, I have merely seen a lack of vision. Some of these market research agencies are now all about ‘telling a story‘, setting up a presentation that can in most cases be done with SAP Dashboards and a karaoke system. In this, the only part that is still tacky is that when we want to buy the SAP solution (approximately $500) we get to see “Please contact your local SAP account executive for more information on how to buy and implement SAP BusinessObjects Dashboards“; was adding a price that much of a reach?

So as we see the pressures on one branch, we need to see that the overlap is large; even as some are in different territories, we know that they are intertwined. Yet this market is also as incestuous as it gets. Lightspeed Research acquires part of Forrester (Forrester’s Ultimate Consumer Panel business), Forrester is growing in different directions, and they are all connected to some degree. There is every chance that the higher echelons will have worked at any combination of SPSS, Forrester, Lightspeed, SPSSmr and ConfirmIT; likely they have already worked at three of the five players. Yet visionary growth has remained largely absent, and digital media is all about evolution and implementing new technologies and new solutions to drive consumer engagement, because the future here is consumer engagement; that alone will get you the data to work with and set the needs of the industry.

That is the part SPSS as a company ignored, and now that we see the shifts, especially at WPP, we see that both Tableau and Q-Software have a massive opportunity to grow their market segment even further. The moment they, or a third player, come with consumer engagement software, IBM will also feel the pinch. Even as it hides behind Watson, with options like IBM Statistics (formerly SPSS) and IBM SPSS Modeler (formerly Clementine, the SPSS data miner), it will get to realise that these two programs also brought new business, as the consultants were able to see the needs of the larger customers. When that diminishes, IBM will feel the loss of a lack of visionaries in a very real way, a loss only accelerated by the impacts on WPP and all its subsidiaries. This last part is speculative, but supported by data. As we saw ‘Paul Heath resigns as Ogilvy worldwide chief growth officer and non-executive director of AUNZ‘, we need to realise that the larger insightful players will be seeing more changes. Ogilvy & Mather might merely be the first, but these people all realise that changes will be different and market shares will change, not all in favour of WPP. We can see “Heath is resigning all his titles at WPP worldwide to return to Brazil to start a new streaming tech venture“; we can read this as a positive (‘he is going to try something new‘), or negatively (‘he knows who is on his level at WPP‘ and he has decided that he can grow a nice personal global market share by setting his view on a new player with a promising option for mucho growth). I believe that he is setting his view to become the larger player himself. This is good news, as it optionally invigorates the market research market, which WPP desperately needs; yet WPP is a lot more than merely market research. It is digital advertising, a field that SPSS (read: IBM) ignored until it was too late. Yet when we see some of the services (branding & identity, consumer insights, design, digital marketing, market research, media planning and buying, public relations, relationship marketing), all valid groups, there is a lack of options for consumer engagement, and several of the other groups are options that many offer, some in niches, some only to midrange players, but effective due to expertise. That should have been a massive red flag and reason for alarm at WPP, yet not too much was seen there. In all, a situation that does not merely warrant the downgrade by Moody’s; the fact that it was averted whilst Sir Martin Sorrell was there as CEO is an actual much larger issue than most identified.

So the problem is not merely who can replace him; who can alter the course of failed objectives will soon become a much larger issue for WPP, one that optionally pushes down the market value by a mere 5%, which, considering the 2017 revenue of £15.265 billion (5% of that being roughly £763 million), becomes an interesting amount.

 


Choosing an inability

This all started last night when a link flashed before my eyes. It had the magical word ‘NHS’ in there, and that word works on me like a red cloth on a bull. I believe that there is a lot wrong there, and even more needs fixing; it needs to be done. There is no disagreement from anyone. The way to do it, that is where the shoes start feeling tight. There are so many sides to fix, and the side to start with is not always a given. There will be agreement and disagreement, yet overall most paths, when leading to improvement, should be fine. There is however one almighty agreement. You see, the data analysis side of health care is not that high on the list. Most would agree that knowing certain stuff is nice, but when you have a primary shortage (nurses and doctors), the analyst does not rank that high in the equation. Although I am an analyst myself, I agree with that assessment of the NHS; my need is a lot lower than that for an extra nurse (at present). So when I see ‘Another NHS crisis looms – an inability to analyse data‘ (at https://www.theguardian.com/science/political-science/2017/feb/08/another-nhs-crisis-looms-an-inability-to-analyse-data), I start wondering what is actually going on. The first issue that arises is the author. Beth Simone Noveck is, as the Guardian states, “the former United States Deputy Chief Technology Officer and Director, White House Open Government Initiative. A professor at New York University“. You see, it is a given that Yanks always have an agenda. Is this about her book ‘Smart Citizens, Smarter State: The Technologies of Expertise and the Future of Governing‘? Just asking, because the by-line there is: “New tools—what Beth Simone Noveck calls technologies of expertise—are making it possible to match citizen expertise to the demand for it in government. She offers a vision of participatory democracy rooted not in voting or crowdsourcing but in people’s knowledge and know-how“, which seems to match the article. So, is this her sales pitch? You see, she must have missed the memo where the previous Labour government wasted £11.2 billion on something that never worked, and now, as the NHS has plenty of crisis moments, spending on something that limits the growth towards nurses and doctors is a really bad idea.

Then she sets the focus on the HQIP with: “The Healthcare Quality Improvement Partnership (HQIP) conducts forty annual audits comparing hospital and physician outcomes, and the implementation of National Institute of Clinical Excellence standards across England and Wales. But, as HQIP Director Dr Danny Keenan admits, although they have the expertise to do the analysis, “we are woefully inadequate at translating such analysis into improvements. What’s the takeaway for the hospital or community provider Board or the medical director? They cannot understand what they have to do.”“. From that I get that the existence of the HQIP is under discussion because they cannot communicate. This we see in ‘They cannot understand what they have to do‘, which means that the hospital or community provider boards or the medical directors are either incompetent, or there is a communication issue. I am willing to default to ‘the inability to communicate’. I admit that I would have to read those reports to get a better view, but it is clear that the HQIP has a few cogs missing, which is on them and not on the NHS as such. So if the NHS needs to cut further, that is where the cutting can start.

Am I against the HQIP? No, of course not, but the NHS has actual problems, and when a place is running low on gauze and staff, the priority over closing communication gaps seems pretty clear. I also accept that if this path is taken, restoration of the NHS will take longer; I get that, but I hope you can agree with me that once the ability to properly aid patients is restored, we can look at the next stage of fixing the NHS, because aiding patients needs to be the primary concern for all sides of the NHS.

A second element in the given sales pitch comes from Dr Geraldine Strathdee, where we see “National Mental Health Intelligence Network, together with partners, launched the Fingertips Mental Health data dashboard of common mental health conditions in every locality. Strathdee points out there is a tremendous need for such benchmarking data: to design services based on local need, build community assets, and improve NHS services“. I have stated at a few conferences (mid-90s) that there is an inherent need to document and create clear paths of internal knowledge retention, which includes healthcare, education and government departments. I literally stated “as you grow the knowhow with your own staff members, you will increase their value, they will be better motivated and you create a moment when you become less and less reliant on outside sources, which usually cost a fair amount“. I have been proven correct in more than one way, and the gravy train of those who benefited by being aligned with consultants is now at an end; those people tend to have no allegiance other than the need to grow their bank account. Creating internal knowledge points has always been a primary need, and as this opportunity was wasted, we now see the plea of ‘a tremendous need for such benchmarking data‘. They should have listened to some of their IT people a long time ago. The second opposition is seen in “Without it, NHS resourcing is just based on historical allocations, guesswork or the “loudest voice”“. This implies that there has been no proper data collection and reporting for well over 5 years, whilst a 10-year gap would sound a little more correct (an assumption from my side). When you look at the Netherlands, there is a long list of reports that psychiatrists and psychoanalysts need to adhere to and deliver to those paying for the services. That has been the case for the longest time. What happens afterwards? Are they not properly collated and reported on? In the Netherlands they were, and I think they still are (not verified at present). Yet what happens in the UK? The Yank might not know, but I reckon that if MPs ask these questions of Dr Geraldine Strathdee, we will get proper responses on what is done now, how it is recorded, reported on and considered for continued improvement. If all of that is absent, who should we talk to? Who needs to give an accountable response?

At that point the doctor becomes a little confusing to me; perhaps that is just me, because when I read “The data dictates investment in early intervention psychosis teams, which dramatically improves outcomes. Fifty per cent of patients get back to education, training or employment. However, there is a shortage of people able to draw these insights“, I just wonder what is set in the reports. It is confusing because psychosis is only one of many mental health issues in play. When someone is diagnosed as such, a treatment plan comes into focus, and as such data had no impact. The patient is either correctly treated or the patient is not. Data had no influence there; it is the carer’s report that is submitted, and on that basis this person will either get the resources needed, or not. A report on how many are treated for psychosis is required, but as the reports are handed upwards, those numbers would be known, and as such the required needs in medication, staff, treatment plans and of course the required funds to pay for all this would be known. If not, the question becomes: is Professor Noveck there to aid in obscuring events, or should we consider that the National Mental Health Intelligence Network has become redundant and is draining funds needlessly? If you think that this is an exaggerated notion, consider that when we look for the ‘National Mental Health Intelligence Network‘, we get a website (at http://mentalhealthpartnerships.com/network/mental-health-intelligence-network/) where the latest item is a meeting from September 2013; in addition there is something from Professor Chris Cutts on STORM Skills Training, and that is May 2014. So I think that the National Mental Health Intelligence Network got itself involved in a sales pitch, and a very poorly constructed one I might add. You see, when we go to Public Health England, we see that there are health intelligence networks, but the one they have is called ‘National Mental Health, Dementia and Neurology Intelligence Networks (NMHDNINs)‘; perhaps an oversight by the two sales people? You see, the Mental Health, Dementia and Neurology path gives us all kinds of information (shallow information, I admit), but I wonder if that is wrong or just not the proper place to find it. In addition, when I look at ‘Severe Mental Illness‘, I see some 2017 mentions (so it is up to date) with the Psychosis Care Pathway, where I see “The Psychosis Care Pathway provides a high level summary using 16 key indicators to help users assess and benchmark how they manage this important condition. This pathway is consistent with and linked to the Commissioning for Value Psychosis packs to be published by NHS England“. This is an interesting part, is it not? Does this mean that this is happening, not happening, or more importantly, what on earth does Dr Geraldine Strathdee think she is doing? Perhaps it is an ill-conceived hostile takeover using an outsider who was published and has a name, whilst the minimum requirements to be taken seriously are not even there (an up-to-date website, perhaps). This whilst the mention ‘based at Public Health England‘ is an issue, as Public Health England (at https://www.gov.uk/government/organisations/public-health-england) has no mention at all of the ‘National Mental Health Intelligence Network‘. Is that not odd? So what ill-conceived sales pitch are we reading in The Guardian?

Perhaps the quote ‘The NHS needs data analytical talent, which comes from a variety of disciplines‘ gives us that. And as the NHS has no immediate need to hire analysts, see there: the ‘National Mental Health Intelligence Network’ comes to the rescue and saves the moment. Perhaps the first thing they should consider is hiring a web designer and making sure that the latest intel is not 2+ years old (cautious advice from my side). In addition, as it seems that the NHS is likely to be pushed into a ‘we need analytics data‘ conversation (one it can do without at present), not taking the word of a professor and a doctor who dropped the ball might be a first notion to consider. Making a proper inventory of what data the NHS has, and having a conversation (a non-invoiced conversation) with someone from Q Research Software, is likely to be a hell of a lot more productive than talking to the two ‘sales’ people that the Guardian article touches on. I will be honest, I had a few issues with that program in the past (for specific reasons), but Q Software has never stopped improving, and it has grown to the extent that it is now chiselling away at the marginal groups IBM Statistics had, and IBM is losing those customers to Q Research, which is quite the accomplishment. In that I think it is Dr Danny Keenan who is likely to get the most out of such a meeting. From what the Guardian tells us, we get the implied understanding that he needs a solution to tell a better story. You see, translating statistical results into actions is done through stories. Not fabrications, mind you, but a story that helps the receiver understand which direction would be the best to take. The listener will get a few options, each with a plus and a minus side, and usually the one with the best track record tends to win. If that path includes successfully suppressing the negative elements even more, so much the better.

My main reason for opening this door is that there is enough low-level talent in the NHS, in several places, that might have the ability to do this on the side: a simple path that allows additional reporting whilst not draining essential resources. I call them ‘low-level’ not because of anything negative. When working with proper analytics you need someone at your beck and call with a degree in applied mathematics. Anyone claiming that this is not needed is usually lying to you. In the case of Q, a lot of the calculations have been automated, and the numbers reflected in the tables still need some level of statistics, but many with a tertiary business degree will have had exposure to a lot more stats than is needed here, so such a person would be low-level only in that regard. It is for all intents and purposes a reporting tool that goes a lot further than mere tabulation and significance levels. It could be the tool of choice for the NHS. Even when they start getting forward momentum, this tool would still be massively useful to them, and any change might be limited to getting a dedicated person for this goal, which with the current shortages all over the NHS is not that far a stretch anyway.
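For readers who have never seen what ‘mere tabulation and significance levels’ looks like in practice, here is a minimal sketch (Python with pandas and scipy; the survey responses are entirely made up for illustration):

```python
# The bread and butter of market-research reporting: a crosstab of two
# categorical variables plus a chi-square significance test.
# The responses below are fabricated purely for illustration.
import pandas as pd
from scipy.stats import chi2_contingency

responses = pd.DataFrame({
    "region":    ["North", "North", "South", "South", "North", "South"] * 20,
    "satisfied": ["yes",   "no",    "yes",   "yes",   "no",    "no"]    * 20,
})

table = pd.crosstab(responses["region"], responses["satisfied"])
chi2, p, dof, expected = chi2_contingency(table)

print(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

A tool like Q automates exactly this kind of table-plus-test output at scale, which is why a person with a tertiary business degree can run it without an applied mathematician at their elbow.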

So as we realise what one program can do, we see the questionable approach that the salesperson named Beth Noveck is taking. Consider the mentions “the NHS should expand efforts already underway to construct an NHS Data Lab“, “Improving public institutions with data also requires strong communications, design and visualisation skills. Digital designers are needed who know how to turn raw data into dashboards and other feedback mechanisms, to support managers’ decisions” and “So the NHS needs to be able to tap into a wide range of data analytic know-how, from computer scientists, statisticians, economists, ethicists and social scientists. It is impractical and expensive to meet all of these needs through more hiring. But there are other ways that the NHS can match its demand for data expertise to the supply of knowledgeable talent both within and outside the organization“.

Three distinct statements, none of them false. Yet the first is currently not feasible with the shortages that the NHS has; the second was debunked by me in merely 5 minutes as I introduced Q Research Software to you, the reader. Anyone stating that this is not the best solution has a case, but in the world of shortages the NHS lives in, with the cost of Q-Software against 93% of all other software solutions, it is the best value for money the NHS could ever lay its fingers on. And the third is even more worrying, because that expensive track of consultants is one of the ways that partially accounts for the £11.2 billion loss that the NHS already suffered. Should the esteemed professor come up with ‘additional considerations’, the NHS should become really scared, because there is a growing concern that some people want to get their fingers on NHS data, the one treasure the bulk of ALL American healthcare insurers and providers want, because that is one data warehouse they have never been able to properly build.

She ends the article with “Whether the NHS wants to know how to spot the most high-risk patients or where to allocate beds during a particularly cold winter, it can use online networks to find the talent hiding in plain sight, inside and outside the health and social care system“; so how does that work? Where to allocate a bed in a cold winter? Are beds moved by truck to another location (impeding nurses and doctors as more aid needs to be given at that location), or will it require the patient to move, which is actually simply done by finding out where a bed is available? The article is a worrying one, in the light that it was published at all, and I wonder if it was properly vetted, because there is a difference of many miles between a political science piece and an opinionated sales pitch. So my next step is to take a very critical look at “Smarter Health: Boosting Analytical Capacity at NHS England“, because my spidey sense is tingling and I might find more worrying ammunition in that piece.

 


The Dangerous Zuckerberg Classification

Even as Microsoft seems to be quiet and in denial about what is uploaded without consent, we have a second issue floating to the surface of our lives. Now, first of all, this link is not what we should consider a news site. It comes from Forward.com, also known as The Jewish Daily Forward, published by Samuel Norich with Jane Eisner as editor. Its origins go back to 1897, so it has been around for a while. They are not some new wannabe-on-the-block. It is an American newspaper published in New York City for a Jewish-American audience, and there are plenty of those around, so this is a valid niche publication. Yet no more than a day ago it did something dangerous. Perhaps it was unintentional, perhaps it is a sign of the times, but it remains a dangerous path to take.

This path all started when Mark Zuckerberg had an idea. He created this place called Facebook; you might have heard of it. Within it we get to ‘like’ things. Now, we can do this to compliment the poster, we can do this because the subject interests us, or, when we use the machine correctly, Facebook will send us more stuff from topics that we like. This already shows three different approaches to a ‘like’, and when Forward starts its article with “Canadian Mosque Shooter Suspect ‘Liked’ Israel Defense Forces, Marine LePen“, it basically shoots itself in the foot.

This is part of the problem we are all facing, because the world is changing and it has shifted the values we have given words over time, shifting them into concepts of what something might be. We see the same shift in the Business Intelligence industry, as tools like SPSS (read: IBM Statistics) are no longer used to get the significant statistics needed, and the ‘sellers’ of the story that the client wants told rely on tools like Q Software to tell the story that matches the need. The problem is that this story reflects what is offered, and from that there is more than one identifier (weighting being one) suggesting the reflection is less accurate and often warped to fit the need of the receiver of these data files. Meaning the actual meaning is unlikely to be there, making a correct assessment impossible, and any action based upon it without scrutiny will come at a hefty price for the decision makers down the track.
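To illustrate how a single weighting decision can warp that reflection, here is a minimal sketch (all numbers hypothetical) of the same five answers yielding two very different ‘findings’:

```python
# Hypothetical illustration: identical survey answers, two different
# stories, depending solely on the respondent weights applied.
scores  = [2, 2, 2, 9, 9]             # five respondents rating something 0-10
weights = [0.5, 0.5, 0.5, 3.0, 3.0]   # a scheme that favours the high scorers

unweighted = sum(scores) / len(scores)
weighted = sum(s * w for s, w in zip(scores, weights)) / sum(weights)

print(f"Unweighted mean: {unweighted:.2f}")  # 4.80
print(f"Weighted mean:   {weighted:.2f}")    # 7.60
```

Weighting is a legitimate technique when it corrects for a known sampling skew; the danger this paragraph points at is that the same lever can be pulled to fit the story the receiver wants.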

So when we see “Canadian Mosque Shooter Suspect ‘Liked’ Israel Defense Forces, Marine LePen“, we need to be cautious at best; at worst we are being told a fair bit of rubbish! Now we also get “Authorities claim that Alexander Bissonnette, a student at the city’s Laval University, perpetrated the attack, calling in from a bridge near the mosque to report himself“, which could be very true, but it also undermines the first signs we see of a ‘lone wolf‘, because a real lone wolf goes into the night, if he or she is lucky without a trace, and plans the next attack. This attacker seems to be seeking the limelight, as I personally see it. For what reason is at present unknown. Perhaps it is about fame; perhaps the evidence will point to mental health issues. Time and the proper people will need to assess this. We see this in the picture of a tweet by @Rita_Katz when she states ‘making Jihadi ties unlikely‘, which could be true; however, I got there via another route. What is interesting is that when we look at the Toronto Star we see “Rosalie Bussieres, 23, lives across the street. She told the Star her older brother was in school with Bissonnette. He was “very solitary” and “very antisocial,” said Bussieres. Bissonnette studied at the Université Laval, according to a statement released by the university late Monday. He was a student in the department of political science and anthropology, according to Jean-Claude Dufour, Dean of the Faculty of Agriculture and Food Sciences“.

This is interesting, as those in political science tend to be decently socially minded, so there is a lot more beneath the water than we think, and the fact that Forward only gave us the likes means that there is a part they either ignored or overlooked. You see, what else did his Facebook account have to say?

The Toronto Star gives us a lot more: “He was on both the Sainte-Foy and Université Laval chess club“. From Forward we got more on Rita Katz. “Rita Katz is the Executive Director and founder of the SITE Intelligence Group” is one part, and the next is the one we should consider: “the world’s leading non-governmental counterterrorism organization“, as well as “Ms. Katz has tracked and analyzed global terrorism and jihadi networks for nearly two decades, and is well-recognized as one of the most knowledgeable and reliable experts in the field“. Which makes me wonder why it is the Toronto Star that gives us the part I did not initially show: “with his twin brother, said Université Laval professor Jean Sévigny, who said he knew Bissonnette and his brother through the club“. So how come The Forward didn’t have the goods on that?

Yet they did give us “François Deschamps, member of Quebec’s Refugee Welcome Committee, told the La Presse newspaper that he recognized Bissonette because the man had often left hateful comments on the group’s page. “I flipped when I saw him,” he said. “We observe much of what the extreme right says and does. He’s made statements of that sort on our Facebook page. He also attacked women’s rights,” Deschamps recalled“. The full story is at http://forward.com/news/361614/canadian-mosque-shooter-suspect-liked-israel-defense-forces-marine-lepen/

So as we are invited to judge on likes, I see a hole in the intelligence. How many friends? How many clubs? Was he linked to chess groups? Was he linked to his twin brother, and was his twin brother on Facebook? No one mentions whether the twin brother was reached and what he had to say (if he had been willing to talk), which he might not be willing to do, and that is perfectly understandable. It is just such a weird experience to see a total lack of effort in that regard (especially by the press).

Forward is telling its readers a story, yet the Toronto Star (at https://www.thestar.com/news/canada/2017/01/30/six-dead-two-arrested-after-shooting-at-quebec-city-mosque.html) seems to offer a lot more. In that view, ABC News in Australia blunders (as I personally see it) even more when we see (at http://www.abc.net.au/news/2017-01-31/quebec-city-mosque-shooting-lone-wolf-attack-student-charged/8225294) ‘Police charge ‘lone wolf’ student suspected of terrorist attack‘. So what evidence is there? What is the definition of a lone wolf? Perhaps we need to agree on the shifting sands and make sure it is sand and not quicksand. They both might contain the same four letters, but the experience will be mind-bogglingly different.

So as we now see the US using this attack to justify its actions, we need to take heed of the dangers we invite. The first is like the attack at Martin Place in Sydney, Australia, on December 15-16, 2014. We again see a link to extremism that is incorrect and misleading. Yes, the act was extreme, but we have seen for decades how mental health patients are very able to act in extreme ways. You only need to see the footage from the Paris attacks to see how actions in places like Nairobi and Paris clearly differ from events in places like Martin Place and perhaps the Quebec mosque.

We can argue over how correct the FBI definition is, yet it is an important one: “Terrorism is the unlawful use of force and violence against persons or property to intimidate or coerce a government, the civilian population, or any segment thereof, in furtherance of political or social objectives”. So what were the social and political objectives of Alexandre Bissonnette?

There is a lot we don’t know and won’t know. Yet at present Forward is demonstrating the danger that social media rely on: quick and classifiable actions, labelled in the most general way possible. The danger we see in the Zuckerberg classification is that it relies on the quick acceptance of the ‘audience’, yet in the same way the ‘like’ itself becomes a problem. You see, too many elements are about specifics, and as we see less and less, people in general will start to rely on an aggregation of ‘reportable elements’, not even on an aggregation of facts.

Heavy.com, another place that is not really a news site, gives us a whole range of additional ‘facts’. They refer to Reuters, who reported (at http://www.reuters.com/article/us-canada-mosque-shooting-idUSKBN15E04S): “Initially, the mosque president said five people were killed and a witness said up to three gunmen had fired on about 40 people inside the Quebec City Islamic Cultural Centre. Police said only two people were involved in the attack”. In that part the lone wolf no longer applies and it is either ‘lone wolves’ or something else. Forward, however, gave us “Police investigating the shooting at a Quebec mosque that killed six have narrowed down their list of suspects to one man”. Yet 5 hours after the initial message Reuters (at http://www.reuters.com/article/us-canada-mosque-shooting-toll-idUSKBN15E0F6) gives us “Police declined to discuss possible motives for the shooting at the Centre Culturel Islamique de Québec. “They consider this a lone wolf situation,” a Canadian source familiar with the situation said”, which is a statement that should be under some scrutiny to say the least.

All this links to an event one year ago, covered in the Tech Times under ‘Sheryl Sandberg Sees Facebook Likes As Powerful Weapon Against ISIS, Other Extremists’, with the quote “Rather than scream and protest, they got 100,000 people to Like the page, who did not Like the page and put messages of tolerance on the page, so when you got to the page, it changed the content and what was a page filled with hatred and intolerance was then tolerance and messages of hope”. This is now a linked issue. Consider the part ‘they got 100,000 people to Like the page, who did not Like the page’: it implies that the data was interfered with, and if that is happening, how reliable was the ‘like’ part in Forward.com?

Papers all over the place are trying to ‘cash in’ on this by adding a page with ‘the latest facts’ or ‘what we know at present’, like The Globe and Mail, whilst showing an avalanche of news on the matter. Actually, the page The Globe and Mail brought was pretty good. Heavy.com does something similar, yet at that point they move into ‘5 things you need to know’ mode and give us a stream of links. Links to classmates and how they thought. Yet are these facts correct and complete? Heavy links to the Globe and Mail, and in addition gives us the part we needed to hear: “He also likes U.S. Senator John McCain, a moderate Republican who has opposed Trump on some issues, President George W. Bush, the Canadian New Democratic Party and late Canadian politician Jack Layton, who was a leader of the left-wing NDP, so the likes do not shed much light on Bissonnette’s beliefs”. Forward.com, and as such the linked SITE Intelligence Group, had nothing on any of that in the article, so anyone relying on Forward is now missing out on essential facts. In equal measure, the fact that many of these items are not voiced by other papers makes the statements of Heavy.com equally an issue until confirmed.

And finally there is the impact of how the like was obtained. Plenty of sources started with a few ‘like to win’ campaigns. How many people have clicked on a like and forgotten about doing so? Yet in this light, the ‘like’ is implied to have a much larger impact, much larger than the user considers or even comprehends. The places using those likes to tell a story have left that concept behind, giving us unclean and incorrect data, which implies that any conclusion based on it is pretty much useless.

Be aware, I am not stating that these posters are spreading fake news, or accusing them of it, yet there is the option that some will see it as such. As I stated at the beginning regarding Forward.com, their origin goes back to 1897, which means they have been around for some time. So why were so many facts missed, and why did Forward link this suspect to both the Israel Defense Forces and Marine Le Pen, especially in light of what others reported?

Not related to the Facebook side is the news that the initial report of two shooters (up to three) has now been reduced to just one. When a witness states up to three, it is to some degree reasonable to assume that there was more than one shooter (which is speculation on my side). So what happened to the second one? Just be aware that there might have been only one shooter, yet the documentation we are seeing implies more than one.

So how is this a Zuckerberg thing?

Well, apart from him inventing Facebook and bringing about the evolution of social media, his ‘like’ is almost like his ‘poke’: they are social media tools, yet the value users tend to give them differs, and it is even debatable whether the users at large could ever agree on their usage, making it a transient value. A shifting number whilst the contemplators cannot agree on how the value is to be used, so the usage of the ‘like’ in the way the press used it becomes a debate as well. Because what we like supposedly implies where we stand. That is not a given; worse, it is incomplete. You see, you can state your like, but as you cannot state a dislike, we end up having no real comparison. It is the old debate of Yes and No dichotomies: if you did not say ‘yes’, there is no validity in assuming you stated ‘no’, because it might have been overlooked, or it was the fourth option in a list of three. There is a decent abundance of reasons to take that point of view.

[Image: Fox News poll on the refugee ban]

Let me show this in another way, with the Fox poll on the refugee ban (see image). We see the full story at http://insider.foxnews.com/2017/01/29/poll-nearly-half-america-voters-support-trumps-immigration-order, but what we do not see are the specifics behind this value. We do not know the number of responses, where it was done or when it was done. It is at https://poll.qu.edu/ that we learn part of the facts: “From January 5 – 9, Quinnipiac University surveyed 899 voters nationwide with a margin of error of +/- 3.3 percentage points”. Can anyone explain to me how Fox was stupid enough to use a base of 899 to set a national value? Doesn’t the United States have around 320 million people? And as we realise that there are 50 states, how can roughly 18 people per state be significant for a view per state? And this is before we consider whether gender was normalised, because men and women tend to feel differently on emotional issues, and if there is one element in abundance on issues concerning refugees, it is emotion.
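To put those numbers in perspective, here is a minimal sketch in Python of the arithmetic involved, assuming the conventional 95% confidence level and a worst-case 50/50 split (the figures themselves come straight from the paragraph above):

```python
import math

sample_size = 899          # Quinnipiac's reported national sample
population = 320_000_000   # approximate US population
states = 50

# Margin of error at 95% confidence for a proportion near 0.5
moe = 1.96 * math.sqrt(0.5 * 0.5 / sample_size)
print(f"Margin of error: +/- {moe * 100:.1f} points")          # ~3.3, matching Quinnipiac

# Average respondents per state if the sample were spread evenly
print(f"Respondents per state: ~{sample_size / states:.0f}")   # ~18

# Share of the population actually polled
print(f"Share of population: {sample_size / population:.7%}")  # ~0.0002809%
```

The 3.3-point figure only holds for the national aggregate; slice it down to a single state and a sub-sample of roughly 18 respondents carries a margin of error north of 20 points, which is exactly why per-state readings from such a poll are meaningless.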


So in all this, we see recurring waves of generalisation and trivialisation. Mark Zuckerberg is not to blame, but he is a factor. In addition, there is an overwhelming lack of education of the customer base (by both Fox and Facebook), so we need to consider the dangers as well as the irrelevance of these ‘revelations’. It is in this scope, and in the application as seen used, that classification becomes a danger, because how will the people around a person react when they see that this person likes something they find offensive (and that is when we keep it to simple things like actors, actresses and politicians)? This will impact the like as there will be peer pressure, so how can this Zuckerberg element be undermined? That is the actual question!

Is it as simple as condemning the press for using the fact? Is it as simple as giving out complete information? The Zuckerberg classifications are here to stay; there is nothing against them and their existence is in no way negative, but the usage leaves a lot to be desired and is misleading. Beyond ‘this person clicked on the like button of this page, for reasons unknown’, giving it any more value is as meaningless as setting the national acceptance of a refugee ban based on 899 unquantifiable votes, which represent at best 0.00028% of the United States population. If any vote was incorrectly vetted, that number goes down fast, making the poll even more useless.



Filed under Media, Politics, Science

Room for Requirement

I looked at a few issues 3 days ago and voiced them in my blog ‘The Right Tone’ (at https://lawlordtobe.com/2016/09/21/the-right-tone/). One day later we see ‘MI6 to recruit hundreds more staff in response to digital technology’ (at https://www.theguardian.com/uk-news/2016/sep/21/mi6-recruit-digital-internet-social-media). What is interesting here is the quote “The information revolution fundamentally changes our operating environment. In five years’ time there will be two sorts of intelligence services: those that understand this fact and have prospered, and those that don’t and haven’t. And I’m determined that MI6 will be in the former category”. Now compare it to the statement I made one day earlier: “The intelligence community needs a new kind of technological solution that is set on a different premise. Not just who is possibly guilty, but the ability of aggregation of data flags, where not to waste resources”, which is just one of many sides needed. Alex Younger also said: “Our opponents, who are unconstrained by conditions of lawfulness or proportionality, can use these capabilities to gain increasing visibility of our activities which means that we have to completely change the way that we do stuff”. I reckon the American expression ‘He ain’t whistling Dixie’ applies.

You see, the issue goes deeper than mere approach; the issue at hand is technology. The technology needs to change and the way data is handled requires evolution. I have been in the data field since the late 80’s and this field hasn’t changed that much. Let’s face it, parsing data is not a field that has seen much evolution, for the mere reason that parsing is parsing and it is all about speed. So let me put it on a different vehicle: we are entering an age where the intelligence community is in the haulage of data, yet the container itself keeps growing while the haulage is en route. So we need to find alternative ways to deal with the container’s content whilst it is en route.
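As a minimal sketch of what ‘dealing with the content whilst en route’ could look like (in Python; the keyword filter is purely illustrative), consider a generator pipeline that processes records as they stream past instead of waiting for the whole container to arrive:

```python
def stream_records(source):
    """Yield records one at a time; the full set never sits in memory."""
    for raw in source:
        yield raw.strip().lower()

def flag_interesting(records, keywords):
    """Filter while the data is still in transit."""
    for rec in records:
        if any(k in rec for k in keywords):
            yield rec

# The 'container' keeps growing, but we only ever hold one record at a time
incoming = iter(["Routine traffic", "Mentions Inspire magazine", "Weather chat"])
for hit in flag_interesting(stream_records(incoming), ["inspire"]):
    print(hit)  # -> "mentions inspire magazine"
```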

Consider the data premise: ‘if the data that needs processing grows by 500 man-years of work on a daily basis’, we have to either process smarter, create more solutions to process, be smarter about what and how to process, or change the premise of time. Now let’s take another look, and for this let’s look at a game, ‘No Man’s Sky’. This is not about gaming, but about design. For decades, games were drawn and loaded: a map, with its data map (quite literally so), usually the largest part of the entire game. Then 11 people decided to use a formula to procedurally generate 18 quintillion planets. They created a formula to map a universe with planets, planet-sized. This has never been done before, and it is an important part: they turned it all around and, moreover, they are sitting on a solution that is worth millions, it could even be worth billions. The reason to use this example is that games are usually the first field where the edge of hardware options is surpassed, broken and redesigned (and there is more at the end of this article). Issues that require addressing in the data field too.
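A minimal sketch of the principle (not the actual No Man’s Sky engine, whose formula is proprietary): a deterministic function of a seed and a coordinate can stand in for stored data, so the ‘map’ is computed on demand rather than hauled around.

```python
import hashlib

def terrain_height(seed: int, x: int, y: int) -> int:
    """Derive a terrain value purely from (seed, x, y).

    Nothing is stored: the same inputs always regenerate the same
    output, so an entire world collapses into a single integer seed.
    """
    digest = hashlib.sha256(f"{seed}:{x}:{y}".encode()).digest()
    return digest[0]  # 0..255, a stand-in for elevation

# The "world" exists only as a formula; sampling it is generation
seed = 42
row = [terrain_height(seed, x, 0) for x in range(8)]
print(row)  # identical on every run and on every machine
```

The design choice is the whole trick: storage cost stays constant no matter how large the generated space is, which is exactly how 18 quintillion planets fit in a download measured in gigabytes.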

Yet what approach would work?

That is pretty much the £1 billion question. Consider the following situation: data is being collected non-stop, minute by minute, and set into all kinds of data repositories. Now let’s take a fictive case. The chatter indicates that in 72 hours an attack will take place, somewhere in the UK. It gives us the premise:

  1. Who
  2. Where
  3. How

Now consider the data. If we have all the phone records: who has been contacting whom, through what methods and when? You see, it isn’t about the data, it is about linking collections from different sources and finding the right needle, whilst the location, shape and size of the haystack are unknown. Now, let’s say the terrorist was really stupid and that number is known. So now we have to get a list of all the numbers that this phone has dialled. Then we get the task of linking the information on these people (when they are not on pre-paid or burner phones). Next is the task of getting a profile, contacts, places, and other information. The list goes on, and the complexity isn’t just the data; actual terrorists are not dumb and are usually massively paranoid, so there is a limit to the data available.
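A minimal sketch of that first linking step (in Python, with hypothetical numbers): building a contact graph from call records and expanding outward from the one known number.

```python
from collections import defaultdict

# Hypothetical call records: (caller, callee, timestamp)
calls = [
    ("+44700000001", "+44700000002", "2016-09-20T10:15"),
    ("+44700000002", "+44700000003", "2016-09-20T11:02"),
    ("+44700000001", "+44700000004", "2016-09-21T09:40"),
]

# Adjacency list: every number maps to the numbers it touched
graph = defaultdict(set)
for caller, callee, _ in calls:
    graph[caller].add(callee)
    graph[callee].add(caller)

# First-degree contacts of the one known number
known = "+44700000001"
print(graph[known])  # {'+44700000002', '+44700000004'}

# Second degree: contacts of contacts, the ring that explodes in size
second = set().union(*(graph[c] for c in graph[known])) - {known} - graph[known]
print(second)        # {'+44700000003'}
```

Each extra degree multiplies the candidate set, which is precisely where the profiling workload described above comes from.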

Now what if this was not reactive, but proactive?

What if the data from all the sources could be linked? Social media, e-mail, connections, forums, and that is just the directly stored data. When we add mobile devices, smartphones, tablets and laptops, there is a massive amount of additional data that becomes available, and the amount of data from those sources is growing at an alarming rate. The challenge is to correctly link the data from these sources, with added data sources that contain aggregated data. So, how do you connect these different sources? I am not talking about the usage; it is about pairing data that sits on different foundations, with no way to tell beforehand whether pairing leads to anything (a minimal sketch of that pairing problem follows the list below).

For this I need to head towards a 2012 article by Hsinchun Chen (attached at the end). Apart from the clarity that we see in the BI&A overview (Evolution, Application and Emerging Research), the interesting part is that even when we look at it from a pure BI point of view, we see two paths missing. That is, they seem to be missing now; looking back at 2010-2011, the fact that Google and Apple grew a market in excess of 100% quarter on quarter was not to be anticipated to that degree. The image on page 1167 has Big Data Analytics and Mobile Analytics, yet Predictive Interactivity and Mobile Predictive Analytics were not part of the map, even though the growth of Predictive Analytics has been part of BI from 2005 onwards. Just in case you were wondering, I did not change subject: the software that this part of the intelligence world needs comes from the business side. A company usually sees a lot more business from 23 million global companies than it gets from 23 intelligence agencies, and the BI part is often much easier to see and track whilst both needs are served. We see the shift when we look at the table on page 1169, where BI&A 3.0 gets us the Gartner Hype Cycle with the key characteristics:

  1. Location-aware analysis
  2. Person-centred analysis
  3. Context-relevant analysis
  4. Mobile visualization & HCI
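The pairing problem flagged above, in its smallest form (in Python, with hypothetical records): two sources with no shared key, linked only through fuzzy agreement on fields, and with no guarantee that a high score means a real match.

```python
from difflib import SequenceMatcher

# Two hypothetical sources with no shared identifier
phone_records = [{"name": "J. Smith", "suburb": "Croydon"}]
forum_accounts = [
    {"user": "john.smith77", "location": "croydon, london"},
    {"user": "jsmith",       "location": "glasgow"},
]

def similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1]; real systems use far more signals."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Score every cross-source pair; a threshold then decides what 'paired' means
for rec in phone_records:
    for acc in forum_accounts:
        score = (similarity(rec["name"], acc["user"])
                 + similarity(rec["suburb"], acc["location"])) / 2
        print(rec["name"], "<->", acc["user"], round(score, 2))
```

The point of the sketch is the weakness: the score is a heuristic, so every pairing is a hypothesis to be verified rather than a fact, which is exactly the foundation problem described above.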

This is where we see the jump when we relate to places like Palantir, which is now in the weeds prepping for war. Tech Crunch (at https://techcrunch.com/2016/06/24/why-a-palantir-ipo-might-not-be-far-off/) mentioned in June that it had taken certain steps and had been preparing for an IPO. I cannot say how deep that part was, yet when we line up a few parts we see an incomplete story. The headline in July was ‘Palantir sues investor Marc Abramowitz for allegedly stealing company secrets’; I think the story goes a little further than that. It is my personal belief that Palantir has figured something out. That part was seen 3 days ago (at http://www.defensenews.com/articles/dcgs-commentary). The two quotes that matter are “The Army’s Distributed Common Ground System (DCGS) is proof of this fact. For the better part of the last decade, the Army has struggled to build DCGS from the ground up as the primary intelligence tool for soldiers on the battlefield. As an overarching enterprise, DCGS is a legitimate and worthwhile endeavour, intended to compute and store massive amounts of data and deliver information in real time”, which gives us (actually just you, the reader) the background, whilst “What the Army has created, although well-intentioned, is a sluggish system that is difficult to use, layered with complications and unable to sustain the constant demands of intelligence analysts and soldiers in combat. The cost to taxpayers has been approximated at $4 billion” gives us the realistic scope, and that all links back to the intelligence community.

I think that someone at Palantir has worked out a few complications, making their product the one winning solution. When I started to look into the matter, some parts did not make sense, even if we take the third statement (which I was already aware of long before this year): “In legal testimony, an Army official acknowledged giving a reporter a “negative” and “not scientific” document about Palantir’s capabilities that was written by a staff member but formatted to appear like a report from the International Security Assistance Force. That same official stated that the document was not based on scientific data”. On its own it would not have added up. What does add up (remember, the next part is speculative) is that the data links required at the beginning of this article have to a larger extent been resolved by the Palantir engineers. In its foundation, what the journal refers to as BI&A 3.0 has been resolved by Palantir (to some extent). If true, we will get a massive market shift. To make a comparison: Google Analytics might be regarded as MS-DOS, and this new solution makes Palantir the new SE-Linux edition; the difference could be that big. And I can tell you that Google Analytics is big. Palantir got the puzzle piece, making its value go up by billions. They could raise their value from 20 billion to 60-80 billion, because IBM has never worked out that part of analytics (whatever they claim to have is utterly inferior) and Google does have a mobile analytics part, but a limited one, as it is for a very different market. There have always been issues with the DCGS-A system (apart from it being as cumbersome as a 1990 SAS mainframe edition), so it seems to me that Palantir could not make the deeper jump into government contracts until it got the proper references, and showing it was intentionally kept out of the loop is also evidence that could help. That part was recently confirmed by US Defense News.

In addition there is the acceptance of Palantir Gotham, which reportedly allowed 30% more work with the same staff levels, and Palantir apparently delivered, which is a massive point given the issue the intelligence groups are dealing with: the lack of resources. The job has allowed New York City to crack down on illegal Airbnb rentals, a task that requires connecting multiple systems and data that were never designed to link together. This now gets us to the part that matters: the implication is that the Gotham core would allow for dealing with the digital data groups like tablet, mobile and streaming data from internet sites.

When we combine the information (still making it highly speculative), the fact that one Congressman crossed the bridge (Duncan Hunter, R-CA) means many could follow. That part matters, as Palantir can only grow the solution if it is seen as the serious solution within the US government. The alleged false statements the army made (as seen in Defense News at http://www.defensenews.com/articles/dcgs-commentary), which I personally believe were made to keep in the shadows that DCGS-A was not the big success some claimed it to be, will impact it all.

And this now links to the mentions I made regarding the academic paper when we look at page 1174, on the Emerging Research for Mobile Analytics. The options:

  1. Mobile Pervasive Apps
  2. Mobile Sensing Apps
  3. Mobile Social Networking
  4. Mobile Visualization/HCI
  5. Personalization and Behavioural Modelling

These parts are a given, and the big players have some sort of top-line reporting, but if I am correct and it is indeed the case that Palantir has figured a few things out, they are now sitting on the mother lode, because there is currently nothing that can do any of it anywhere close to real time. Should this be true, Palantir would end up being the only player in town in that field, an advantage corporations haven’t had to this extent since the late 80’s. Consider the approach SPSS used to have before it decided to cater to the smallest iteration of ‘acceptable’; now, as IBM Statistics, it really hasn’t moved forward that much.

Now let’s face it, these are all consumer solutions, yet Palantir has a finance option, which is now interesting as Intelligence Online reported a little over a week ago: “The joint venture between Palantir and Credit Suisse has hired a number of former interception and financial intelligence officials”, meaning that the financial intelligence industry is getting its own hunters to deal with. If any of those greedy jackals have been making their deals via their iPhones, they will be lighting up like a Christmas tree in those data sets. So in 2017, the finance/business section of the newspapers should be fun to watch!

The fact that those other players now face a new threat with actual working solutions should hurt plenty too, especially in the lost-revenue section of their spreadsheets.

Finally, why did I make the No Man’s Sky reference? You see, that is part of it all. As stated earlier, it used a formula to create planet-sized planets, which is one side of the equation. Yet the algorithm could be reversed: there is nothing stopping the makers from scanning a map and getting us a formula that creates that map. For the gaming industry it would be worth a fortune. However, that application could go a lot further. What if the geospatial data is not a fictive map, but an actual one? What if one type of tree is not a tree but a mobile user, and another type is a networking node? It would be the first move towards setting geospatial data in a framework of personalised behavioural modelling against a predictive framework. Now, there is no way we would know where a person will go, yet this would be a massive first step in answering ‘who not to look for’ and ‘where not to look’, diminishing a resource drain to say the least.
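A minimal sketch of that reversal, reusing the hash-based generator sketched earlier (brute-force seed search; a real geospatial fit would use regression or machine learning rather than exhaustion): given an observed slice of ‘map’, search for the formula that reproduces it, at which point the formula replaces the data.

```python
import hashlib

def terrain_height(seed: int, x: int, y: int) -> int:
    digest = hashlib.sha256(f"{seed}:{x}:{y}".encode()).digest()
    return digest[0]

# An observed strip of map, produced by some unknown seed
observed = [terrain_height(42, x, 0) for x in range(8)]

# Reverse step: find a seed whose formula regenerates the observation
for candidate in range(100_000):
    if all(terrain_height(candidate, x, 0) == observed[x] for x in range(8)):
        print("The map compresses to seed:", candidate)  # prints 42
        break
```

Swap the synthetic strip for real movement or network data and the same idea becomes a compact behavioural model: whatever the fitted formula predicts well is exactly the ‘where not to look’ that frees up resources.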

It would be a game changer for non-gamers!

[Attachment: special_issue_business_intelligence_rese — the Hsinchun Chen (2012) article referenced above]



Filed under Finance, IT, Military, Politics, Science

The Right Tone

Today we do not look at Ahmad Khan Rahami; we look at the engine behind him. First of all, let’s get ugly for a second. If you are an American and you think that Edward Snowden was a ‘righteous dude’, then you are just as guilty as Ahmad Khan Rahami of injuring 29 people. Let’s explain that to those who did not get through life through logic. You see, the US (read: NSA) needed to find ways to find extremists. This is because 9/11 taught them the hard way that certain support mechanisms were already in place for these people in the United States. The US government needed a much better warning system, and PRISM might have been one of those systems. You see, that part is seen in the Guardian (at https://www.theguardian.com/us-news/2016/sep/20/ahmad-khan-rahami-father-fbi-terrorism-bombing); the quote that is important here is “Some investigators believe the bombs resemble designs released on to the internet by al-Qaida’s Yemeni affiliate through its Inspire publication”. PRISM would be the expert tool to scan for anyone opening or accessing those files, those who get certain messages and attachments from the uploading locations.

To state it differently: “the NSA can use these PRISM requests to target communications that were encrypted when they travelled across the internet backbone, to focus on stored data that telecommunication filtering systems discarded earlier”. So when a package is sent through the internet and delivered, it gets ‘dropped’, meaning the file is no longer required. The important part is that it is not deleted; it is, if we use the old term, ‘erased’, and this is not the same! When it is deleted it is removed; when it is erased, that space is set as ‘available’, and until something else gets placed there, it is still there. An example you will understand is ‘temporary internet files’: when you use your browser, things get saved on your computer, smartphone, you name it. Until this is cleaned out, the system has that history and it can be recalled with the right tool at any given moment.

PRISM makes it possible to find the paths and the access, so this now relates to the bomber, because if correct, PRISM could see whether he had actually gotten the information from Inspire magazine. If so, a possible lone wolf would have been found. Now, the system is more complex than that, so there are other paths, but with PRISM in the open, criminals (especially terrorists) have gotten smarter, and because PRISM is less effective, other means need to be found to find these people, which is a problem all by itself! This is why Edward Snowden is a traitor, plain and simple! And every casualty is blood on his hands and on the hands of his supporters!
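A minimal sketch of that ‘erased, not deleted’ distinction (a toy block store in Python, not any real filesystem): erasing only flips an availability flag, so the bytes survive until something actually overwrites them.

```python
# Toy block store: data blocks plus a free-list, standing in for a disk
blocks = {0: b"cached attachment bytes...", 1: b"browser history..."}
free = set()

def erase(block_id: int) -> None:
    """Mark the block as available; the bytes themselves are untouched."""
    free.add(block_id)

def write(block_id: int, data: bytes) -> None:
    """Only an actual overwrite destroys the old content."""
    blocks[block_id] = data
    free.discard(block_id)

erase(0)
print(blocks[0])          # still recoverable: the old bytes are intact
write(0, b"new content")
print(blocks[0])          # now the original data is truly gone
```

This is the window a tool with the right access can exploit: everything between the ‘erase’ and the eventual overwrite is history waiting to be recalled.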

The right tone is about more than this; it is also about Ahmad Khan Rahami. You see, he would be a likely recruit for Islamic State or Al-Qaida, but the issue is that his profile is not clean; he is not the target recruit. Apart from his dad dobbing him in in 2014, he stands out too much. Lone wolves are like cutthroats: until the deed is done, they tend to remain invisible (and often remain invisible after the deed too). There is still a chance he allowed himself to be used as a tool, but the man could in effect be a slightly radicalised mental health case. You see, this person resembles the Australian Martin Place extremist more than the actual terrorists we saw in Paris. I reckon that this is why he has not been charged with terrorism at present; for now he is charged with attempted murder (as of 3 hours ago), yet not all answers have been found. You see, regarding the quote “they had linked Rahami to Saturday’s bombing in Chelsea, another unexploded device found nearby, both constructed in pressure cookers packed with metallic fragmentation material. They also said he was believed to be linked to a pipe bomb that blew up in Seaside Park, New Jersey, on Saturday and explosive devices found in the town of Elizabeth on Sunday”, the proper people need to ascertain whether he is just the set-up, or a loser with two left hands. The FBI cannot work from the premise that they got lucky with a possibly radicalised person with a 60% fail rate. If he is the start of actual lone wolves, PRISM should have been at the centre of finding these people, that is, if Snowden had not betrayed his nation. Now there is the real danger of additional casualties. I have always believed, and still do, that a lot about Snowden did not add up in many ways; most people with actual SE-Linux knowledge would know that the amount of data did not make sense, unless the NSA totally screwed up its own security (on multiple levels), and that is just the server and monitoring architecture, yet I digress (again).

The big picture is not just the US; it is a global problem, as France found out the hard way, and new methods are needed to find people like that. The right tone is about keeping the innocent safe and potential victims protected from harm. The truth here is that eggs will be broken, because an omelette like this needs a multitude of ingredients, not to mention a fair amount of eggs. The right tone is, however, a lot harder than many would guess. You see, even if Man Haron Monis (Martin Place, Sydney) and Ahmad Khan Rahami could both be regarded as mental health cases (Man more than Ahmad), the issue of lone wolf support does not go away. Ahmad got to Inspire magazine in some way. Can that be tracked by the FBI cyber division? It might be a little easier after the fact, so it becomes about backtracking, but wouldn’t it have been great to do this proactively? It will be a while until this is resolved to the satisfaction of law enforcement, and then still the question becomes: was he alone? Did he have support? You see, a lone wolf, a radicalised person, does not grow from within. Such a person requires coaching and ‘guidance’. Answers need to be found, and a multitude of people will need to play the right tune, to the right rhythm. The right tone is not a mere consideration; in matters like these it is like a red wire through it all. It is about interconnectivity and it is always messy. There is no clear package of events, with cash receipts and fingerprints. It is not even a legal question of what was more likely than not.

The right tone is also, in growing measure, an issue of resources. It isn’t just prioritisation; it is the danger that mental health cases drain the resources required to go after the actual direct threats. With the pressures between Russia and the US growing, the stalemate of a new cold war front works in favour of Islamic State and the lone wolves, who are linked to someone but usually do not know whom. The workload on this surpasses the power of a Google centre, and those peanut places tend to be really expensive, so resource requirements cannot be met; it becomes, for us, about a commonwealth partnership of availability, which now brings local culture into play. The intelligence community needs a new kind of technological solution that is set on a different premise. Not just who is possibly guilty, but the ability of aggregation of data flags: where not to waste resources. For example, I have seen a copy of Inspire in the past, and I have seen radicalised video (for the articles). I don’t mind being looked at, yet I hope they do not waste their time on me. I am not alone; there are thousands who, through no intentional act, become persons of investigative interest. You see, that is where pro-activity always had to be: who is possibly a threat to the lives of others? The technical ability to scrap possible threats from the list at the earliest opportunity.

Consider something like Missing Value Analysis, a technique for considering patterns. SPSS (now IBM Statistics) wrote this in its manual: “The Missing Value Analysis option extends this power by giving you tools for discovering patterns of missing data that occur frequently in survey and other types of data and for dealing with data that contain missing values. Often in survey data, patterns become evident that will affect analysis. For example, you might find that people living in certain areas are reluctant to give their annual incomes, thus creating missing values in your data. If you leave these values out, are your statistical conclusions valid?” (Source: M.A. Hill, ‘SPSS Missing Value Analysis 7.5’, 1997). This is more to the point than you think. Consider that premise, where we replace ‘people living in certain areas are reluctant to give their annual incomes’ with ‘people reading certain magazines are reluctant to admit they read them’. It sounds innocent enough when it is Playboy or Penthouse (denied to have been read by roughly 87.4% of the male teenage population), but what happens when it is a magazine like Inspire, or Stormfront? It is not just about the radicalised; long term, it must be about the facilitators and the guides to that, because in the long term the flock is not the problem, the herder is, and data and intelligence will get us to that person. The method of getting us there is, however, a lot less clear, and because a few people did not comprehend what they were doing with their short-sightedness, the image only became more complex. You see, the complexity is not just the ‘missing data’; it is that this data is set in a path, and the entire equation becomes a lot more unclear (not complex) when the data is the result of omission and evasion. How the data became missing is a core attribute here. Statisticians like Heckman and Allison might have looked at it for the methods of Business Intelligence, yet consider the following: “What if our data is missing but not at random? We must specify a model for the probability of missing data, which can be pretty challenging as it requires a good understanding of the data generating process. The Sample Selection Bias Model, by James Heckman, is a widely used method that you can apply in SAS using PROC QLIM (Heckman et al., 1998)”. This is not a regression where we look at missing income. We need to find the people who are tiptoeing on the net in ways that do not get logged, or get logged as someone else. That is the tough cookie that requires solutions which are currently incomplete or no longer working. And yes, all these issues would need to be addressed for lone wolves and mental health cases alike. A massive task that is growing at a speculated 500 work-years each day, so as you can imagine, a guaranteed billion-dollar future for whoever gets to solve it; I reckon massive wealth awaits the person who can design the solution that shrinks the resource requirements by a mere 20%, so the market is lucrative to say the least.
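A minimal sketch of that missing-pattern idea (in Python with pandas, on hypothetical survey data; this is not the SPSS MVA procedure itself): tabulate which variables go missing together, because the pattern of the holes is itself the signal.

```python
import numpy as np
import pandas as pd

# Hypothetical survey: income tends to go missing in one area,
# mirroring the SPSS manual's example quoted above
df = pd.DataFrame({
    "area":   ["north", "north", "south", "south", "south"],
    "income": [52000,   np.nan,  48000,   np.nan,  np.nan],
    "age":    [34,      41,      29,      55,      38],
})

# Encode each row's missingness as a pattern of 0/1 flags
patterns = df.isna().astype(int)
print(patterns.value_counts())  # which gaps co-occur, and how often

# Is missingness associated with another variable? A hint that the data
# is missing *not* at random, which is Heckman's territory
print(df.groupby("area")["income"].apply(lambda s: s.isna().mean()))
```

Swap ‘area’ for a behavioural trace and ‘income’ for an admission people avoid making, and the same tabulation points at the omission-and-evasion pattern described above, rather than at any individual record.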

The right tone can be achieved when the right people are handed the right tools for the job.


Filed under IT, Media, Military, Politics, Science