Tag Archives: Data

One bowl of speculation please

Yup, we all do it, we all like to taste from the bowl of speculation. I am no different; in my case that bowl can be as yummy as a leek and potato soup, on other days it is like a thick soup of peas and potato with beef sausages. It tends to depend on the side of the speculation (science, engineering or Business Intelligence). Today it is Business Intelligence, which tends to be a deep tomato soup with croutons, almost like a thick minestra di pomodoro. I saw two articles today. The first one (at https://www.bbc.co.uk/news/technology-64917397) comes from the BBC giving us ‘Meta exploring plans for Twitter rival’, no matter that we are given “It could rival both Twitter and its decentralised competitor, Mastodon. A spokesperson told the BBC: “We’re exploring a standalone decentralised social network for sharing text updates. “We believe there’s an opportunity for a separate space where creators and public figures can share timely updates about their interests.”” Whatever they are spinning here, make no mistake. This is about DATA, this is about AGGREGATION and about linking people, links that Twitter too often has and LinkedIn and Facebook do not. A stage where the people need clustering to see how two profiles can be linked with minimum connectivity. It is what SPSS used to call PLANCARDS (the conjoint module). By keeping the links as simple as possible, their deeper machine learning will learn new stages of connectivity. That is my speculated view. You see, this is the age where those without exceptional deeper machine learning need new models to catch up with players like Google and Amazon, so the larger speculation is that somehow Microsoft is involved, but I tell you now that this speculation is based on very thin and very slippery ice; it merely makes sense that these two will find some kind of partnership. The speculation is not based on pure logic; if it were, Microsoft would not be a factor at all.
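To make the minimum-connectivity point a little more concrete, here is a minimal sketch (entirely my own illustration; the profile attributes, names and threshold are invented, not anything Meta or SPSS publishes) of linking profiles on the fewest shared attributes:

```python
from itertools import combinations

def shared_attributes(a: dict, b: dict) -> set:
    """Attributes on which two profiles agree."""
    return {k for k in a.keys() & b.keys() if a[k] == b[k]}

def minimal_links(profiles: dict, threshold: int = 2) -> list:
    """Link two profiles when they agree on at least `threshold`
    attributes -- the 'minimum connectivity' idea: the fewer
    attributes needed for a link, the cheaper aggregation becomes."""
    links = []
    for (name_a, a), (name_b, b) in combinations(profiles.items(), 2):
        common = shared_attributes(a, b)
        if len(common) >= threshold:
            links.append((name_a, name_b, sorted(common)))
    return links

# Invented example profiles from three platforms.
profiles = {
    "twitter_u1":  {"city": "Leiden", "employer": "ACME", "topic": "5G"},
    "facebook_u7": {"city": "Leiden", "employer": "ACME", "topic": "soup"},
    "linkedin_u3": {"city": "Sydney", "employer": "ACME", "topic": "5G"},
}
print(minimal_links(profiles))
```

The threshold is the whole point: the simpler the link that deeper machine learning needs before it clusters two profiles, the faster the connectivity graph grows.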

But the second article (from a less reliable source) gives us (at https://newsroomodisha.com/meta-to-begin-laying-off-another-11k-employees-in-multiple-waves-next-week/) that they are investigating a new technology all whilst shedding 11% of their workforce. A workforce that is already strained to say the least, and this new project will not rely on a dozen people; that project will involve a lot more people, especially if my PLANCARDS speculation is correct. That being said, if Microsoft is indeed a factor, the double stump might make more sense, hence the larger speculative side. Even as the second source gives us ““We’re continuing to look across the company, across both Family of Apps and Reality Labs, and really evaluate whether we are deploying our resources toward the highest leverage opportunities,” Meta Chief Financial Officer Susan Li said at a Morgan Stanley conference on Thursday. “This is going to result in us making some tough decisions to wind down projects in some places, to shift resources away from some teams,” Li added.” Now when we consider the words of Susan Li, the combination does not make too much sense. The chance of shedding the wrong people would give the game away. Yes, Twitter is in a bind, but it will add full steam in this case and they will find their own solutions (not sure where they will look), a stage that is coming, and the two messages make very little sense together. Another option might be that Meta is shedding jobs to desperately reduce cost, which is possible. I cannot tell at present; their CFO is not handing me their books for some weird reason.

Still, the speculation is real as the setting seems unnatural, but in IT that is nothing new, we have seen enough examples of that. So, enjoy your Saturday and feel free to speculate yourself; we all need that at times to give our own egos some TLC.



Filed under Finance, IT, Science

They just won’t learn

That happens, people incapable of learning. IT people listening to salespeople because these salespeople know what buttons to push. Board members pushing for changes so that their peers will see that they are up to speed on the inter-nest of things (no typo), and there are all other kinds of variation, and pretty much every company has them. Even as Australia is still reeling from the Optus debacle, Telstra joins the stupid range (at https://www.abc.net.au/news/2022-10-04/telstra-staff-have-details-hacked/101499920). So explain to me why an HR system needs to be online? OK, you will get away with that and there is a need for some to access it, but in what universe does this need to be so open that EVERYONE can get to it? That is the question we see raised with ‘Telstra data breach sees names and email addresses of staff uploaded online’, a blunder of unimaginable proportions. On the other hand, Telstra will be bleeding staff members left, right and forward pretty soon. You see, this list is well desired by over a dozen telecoms in Europe, North America, the Middle East and Asia. They all need staff all over the place and now their headhunters know EXACTLY where to dig. The article gives us two parts. The first part is “a third party which was offering a rewards program for staff had the data breach in 2017” as well as “Telstra has not used the rewards program since 2017, the spokesperson said”. In all this, the questions that matter are not asked. We get Bill Shorten trying to change the conversation back to Optus with: “get the information so I can stop hackers from hacking into government data and further compromising people’s privacy”. The massive part is “Why was a reward program not used for 5 years still linked to HR data?” It seems that ABC does not ask this and the others do not either.
So even if we get “Attorney-General Mark Dreyfus has said he will review Australia’s privacy laws and tighter protections could be brought in by the end of the year”, the larger question remains unanswered. How do we protect these systems from STUPID people? A reward system that has a direct link to the HR data and was not used for 5 years is stupid, plain and simple stupid. As such this affects their IT and their HR department. Yet the people (politicians and media) are not asking these questions, are they? They let Labor loser Shorten change the conversation. Oh, do not worry, we are not even close to done with Optus, but the setting that the conversation is pushed away from Telstra allegedly implies that Telstra has too large a hold on media and politicians. So whilst the media allowed Telstra to hide behind “while the data is of minimal risk to former employees” they fail to see the larger picture. In an age of brain drains these people are worth their weight in lithium (more valuable than gold) and it seems to me that an employment database of 30,000 telecom people will be eagerly mined in the three earlier mentioned regions. These hackers were smart, they can get a million easily (over 10-15 customers) and these customers will not care where that data comes from; they need personnel and they need them now. So it seems that certain people just will not learn and there is no hiding behind “in an attempt to profit from the Optus breach”. Telstra claims to be so superior; if that is so, either the hack would not have affected them, or these systems are in a worse shape than ever before, and that is also missing from the article. Two competitors successfully hit by the same flaw? It seems that too many people are asleep at the wheel. And no one is asking the right questions, not even the media; why is that?


Filed under IT, Media, Politics

As banks cut corners

There was news on ABC News; it was not really news, this was a stage that I saw coming a mile away, and that was 5 years ago, yet the speed at which this is proliferating is cause for concern. The article ‘Protecting yourself from phone porting and SIM card scams’ (at https://www.abc.net.au/everyday/protecting-yourself-from-phone-porting-and-sim-card-scams/100421586) is not just this; the entire COVID registration issues are making things worse. When we take notice of ““At 5:55pm, I got a text message from my telco. It said, ‘Hi, received your port out request for this service,'” he says. “By the time I tried to call them, my phone already went to SOS only. Before I could even react, my number was gone.””, you might think that this is an isolated case, but it is not. When we add ““They had my customer ID [for online banking], and you can do a password reset if you have the customer ID and mobile number,” he explains. “It was really professional. I had daily limit of $10,000, so they sent $10,000. They bypassed that limit by opening another account inside my account, which you can do online, and then they transferred another $10,000.”” There is a massive flaw. The banks refer to this as being customer friendly; I personally see it as criminal friendly. All kinds of levels of checks and balances are left out of the equation, and for now we see banking party-lines that these matters are seldom, that the people are protected and that it can be reversed. Yet in 5G, within the next 2-3 years, the costs will go beyond what the banks find reasonable and we are left with the costs, we are left with the impact and we are left outside in the cold. That is an almost given and matters are merely getting worse.

The banks (to cut corners) are setting up more and more to be done online, all whilst proper security is lagging and there is a whole range of actions that will not and should not be allowed. I had to check and make sure that online banking was DISABLED; it makes a few issues a bit more hassle, but compared to the damage I could face 2-5 times a year it is a no-brainer. This is a mere beginning when we consider “If I want to change providers, before the [new] standard was put in place, I just had to give my name, my date of birth and my address,”, all whilst the alternative merely means “scammers ask a victim’s existing telco to switch the number to a new SIM”; the effect is the same, and because some players are cutting corners the consumer is left with the hardship. There is no easy way here and I get that, yet there is a larger stage of checks and balances missing all whilst cost-cutting parties create ‘customer friendly’ shortcuts, whilst layers of verification need to be at the centre of this all, and it is getting worse.
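To show what layered verification could look like, here is a toy rule set (the factor names and the threshold are my assumptions, not any bank's actual policy) where a customer ID plus an SMS code is never enough on its own:

```python
REQUIRED_FACTORS = 3  # assumption: demand more than ID + SIM-bound SMS

def reset_allowed(presented: set) -> bool:
    """Allow a password reset only when enough independent factors
    are presented, at least one of which cannot be stolen by porting
    a SIM. 'customer_id' + 'sms_code' is exactly the scam path from
    the quote above, so it must fail on its own."""
    independent = {"branch_visit", "card_reader", "app_biometric",
                   "sms_code", "customer_id"}
    strong = presented & {"branch_visit", "card_reader", "app_biometric"}
    return len(presented & independent) >= REQUIRED_FACTORS and len(strong) >= 1

print(reset_allowed({"customer_id", "sms_code"}))                    # the scam path
print(reset_allowed({"customer_id", "sms_code", "app_biometric"}))   # an extra layer
```

The design point is simply that no combination of factors a SIM-porting scammer can harvest remotely should clear the bar by itself.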

Why is it getting worse?
Well, there were 5 attempts to scam me in the last 8 weeks; 2 of them were so good that I could not find anything wrong with the information and sources given. More importantly, in one case I had to make a separate call to PayPal to make sure; they had become that good and I know what to look for. Yet I have an ace up my sleeve (which I will not reveal here); it stopped numerous scams from being completed.

The first rule is that YOU NEVER EVER USE A LINK GIVEN! You find the number, the generic number of, for example, PayPal, and you reference the numbers that you write down; they were ready to tell me that no such activity exists. If you click on any link you are causing damage to yourself. But the two (including PayPal) were so well done that finding the differences was close to impossible, and I know what to look for. A consumer will have little to no chance at all.

And matters are getting worse, because 5G will enable the scammers to approach well over 500% more people in the same time; their revenue goes up and at some point it will cost us. Insurances will soon stop paying out and then it will become a much larger problem. You either pay an annual fee, or lose your money. I feel that this is where it is going.

So whilst we see “to enable to SIM port or swap, scammers will need personal information, like your name, address, and date of birth”, COVID registration gives them the name and phone number, the phone number can in some cases link to an address, and then only the date of birth is missing, and with all these breached databases that is not much of a hurdle. Now consider all these places that got hacked; which have a birthdate? Which have a phone number? And the image below completes the picture.

We see three sources required to get all the data they need, and they keep on adding data: data you freely give away in apps, data they captured, data from hacks on the dark web, and it is BIG BUSINESS. In the example it is one person with the $10,000 target; now consider 750,000 in the UK alone, 500,000 in Australia, 35,000,000 in the US, and consider that $10,000 was a small jab. Even smaller would work for them, like a mere $500; with these numbers these criminals become billionaires within a month, and these actions need to be done fast. They have per nation 3-4 days at the most, so within 2 weeks they are looking at millions, and with 5G they can get more and they can get there faster. Do you still think I am kidding? Take a good look at what data you entered in ANY app or any website, and now consider that these people are doing nothing more than adding data as much as they can; at some point (within a dozen sources) they have enough data to port you, to capture your bank accounts and to make changes to your life. They merely needed some time, a $2,500 computer and a decent internet connection; the payoff would be a 7 figure number, and with the speed at which they are tracked they would be living large in another country with nothing attached to them. That is the current reality, and the level of checks and balances that is missing is just too unbelievable for words.
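A hedged sketch of the aggregation itself (the source names and field values below are invented for illustration): three partial records are enough to complete the profile a porting scam needs:

```python
def merge_sources(*sources: dict) -> dict:
    """Union of partial records: each breached or harvested source
    fills in the fields the previous ones were missing."""
    profile = {}
    for src in sources:
        for field_name, value in src.items():
            profile.setdefault(field_name, value)
    return profile

# The four fields the article says a SIM port or swap requires.
REQUIRED = {"name", "phone", "address", "dob"}

# Three invented partial sources, per the 'three sources' picture.
covid_checkin = {"name": "J. Doe", "phone": "0400 000 000"}
retail_hack   = {"name": "J. Doe", "address": "1 Example St"}
forum_dump    = {"dob": "01-01-1980", "phone": "0400 000 000"}

profile = merge_sources(covid_checkin, retail_hack, forum_dump)
print(REQUIRED <= profile.keys())  # True: three sources complete the picture
```

Each individual source looks harmless; it is the union that crosses the porting threshold, which is the whole argument above.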

Enjoy your bank account (for as long as you still have it)


Filed under Finance, IT

In retrospect

I (for the most part) react to facts, as I do now, but the results are not anticipated new facts; what comes next is pure speculation, and no matter how correct I think I am, it is speculation and that needs to be said up front. Even as I start now, my mind is racing through speculative ideas and options in other realms (science realms no less), but I digress. The thoughts started with a Reuters article called ‘Analysis: Biden’s COVID-19 strategy thwarted by anti-vaxxers, Delta variant’. The article (at https://www.reuters.com/world/us/bidens-covid-19-strategy-thwarted-by-anti-vaxxers-delta-variant-2021-07-29/) gives us “Dr. Peter Hotez, a vaccinologist and dean of the National School of Tropical Medicine at Baylor College of Medicine, said the Biden administration’s acknowledgement of the “terrible impact” of the anti-vaccine movement was important, but he said the government could do more. “Anti-science is arguably one of the leading killers of the American people, and yet we don’t … treat it as such. We don’t give it the same stature as global terrorism and nuclear proliferation and cyber attacks,” he said”. It might be a mere quote, it might be the paraphrasing of the article writer, which is not a negative view, but it got me thinking. When we see the anti-vaxxer movements in the US and EU, they are uncannily effective, they are almost too effective. For the most part, and proven since the 90’s, the anti-vaxxers are either religiously inclined, like the Dutch people in Giethoorn (their ‘sort of’ version of the Amish), or loons (often people who are one shade away from being absolutely bug-nuts). The first group are driven and they are also self-isolationists; it is merely about them and their community, which makes them a danger to themselves, not to others. The second group is a danger to all, but often so stupid they merely hit other stupid people.
These anti-vaxxers are driven, not merely by intelligent people; no, they are driven like they are terrorist tools, like biological DOS agents, and they are growing. These people are not accepting any scientific evidence, they forward non-scientific papers as ‘their’ evidence, and they are not merely more effective, they are almost centrally driven by a similar source.

In the UK the Guardian is giving visibility to Kate Shemirani, in the USA we see Alabama’s Curt Carpenter, and the list grows. Someone is somehow fuelling this; yes, this is speculative, and this is not merely the power of social media. Someone had months to prepare the weaker minded and target them in a direction, limelight-seeking nobodies all wanting their limelight with as large an audience as possible. The evidence is not clear and as such this is speculation, yet consider the timelines of each of these anti-vaxxers, what their audience was a year ago and each month after that. This goes beyond buying likes on places like Facebook. Some people are fuelling these ‘bright’ illumination spots and they are not done; even as they are retracting their ‘assistance’ there is still a digital footprint and it is now diminishing. Yes, I admit upfront that my view is speculative, but my speculation fits the profile: are the US and the EU under attack from bio-terrorists? You might think that they are not the same, but there you would be wrong. In this I grasp back to a writing from 2012 called ‘A Proposed Universal Medical and Public Health Definition of Terrorism’. Here we see “We propose the following universal medical and public definition of terrorism: The intentional use of violence — real or threatened — against one or more non-combatants and/or those services essential for or protective of their health, resulting in adverse health effects in those immediately affected and their community, ranging from a loss of well-being or security to injury, illness, or death”. In this, if even one of my speculations is proven, these anti-vaxxers become complicit in acts of terrorism. Did you even consider that? Now, there is a dangerous fence. I am not debating THEIR right to be anti-vaccination. If they die, they only have themselves to thank, just like Curt Carpenter.
Yet by attacking science with non-science and debunked non-facts, the setting changes and that is where we are now. What should have been a straight path to recovery is now a much larger issue. The delay is not on President Biden, and now that we can optionally see that the US is yet again under terrorist attack, his priorities need to change. Attacking big tech is futile and counterproductive; the laws around free speech need adjusting, they need to be validated by accountability.

And for the love of god, can some well trained data analyst please take a look at the timeline of these anti-vaxxers? I think it is time to look at timelines here, and that is when my brain went into some sort of overdrive. It goes back to when I designed an intrusion system that stayed one hop away from a router table between two points and infected one of the routers to duplicate packages from that router on that path; one infection tended to not be enough, 2-3 infections needed to be made so that the traffic on that route between two points could be intercepted. I called it the Hop+1 solution; I came up with it whilst considering the non-Korean Sony hack. That thought drove me to think of an approach to find the links. In the first we most likely need to find where and when they accessed the dark web; then we see another part, because if we can find their access, we can optionally see others too, and when we have that list and we can correlate it to other anti-vaxxers we have an optional pattern for action. No matter how this is seen it will be staged towards my speculation, something that needs proof; proof is required to give validity to actions that follow. I believe that I am correct, but I admit that it is a speculative push along a path towards thinking something is what I personally think it is, not a path towards evidence. Evidence needs to be found, and evidence that is made to fit the solution is no evidence; it is like stating that there is a linear relationship when you only have two plot points. A pattern of evidence is required; it is always about the patterns.
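Purely as an illustration of the timeline idea (the accounts and audience figures below are made up), finding accounts whose audience jumps in the same month is a one-screen exercise:

```python
def spike_month(series: list) -> int:
    """Return the 1-based month index with the largest jump
    in a monthly audience series."""
    jumps = [later - earlier for earlier, later in zip(series, series[1:])]
    return jumps.index(max(jumps)) + 1

# Hypothetical monthly audience figures for three accounts.
audiences = {
    "acct_a": [100, 120, 150, 9000, 9500],
    "acct_b": [300, 310, 320, 12000, 12500],
    "acct_c": [50, 2000, 2100, 2200, 2300],
}

spikes = {name: spike_month(series) for name, series in audiences.items()}
# Accounts spiking in the same month hint at a common driver.
common = [name for name, month in spikes.items() if month == 3]
print(common)  # ['acct_a', 'acct_b']
```

A shared spike month is of course only a pattern, not proof; it tells an analyst where to dig, nothing more, which is exactly the two-plot-points caveat above.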

So when I look at the ‘in retrospect’ part, I am wondering when the connections were there in the early stages, and I also wonder why the others are not on that path yet (or seemingly yet). The media is only partly to blame; yes, they give limelight, but that was their job from the early days. Like the people exploiting Google cookies, the media can be exploited too. Seeking the limelight is not a crime, but in conjunction with a terrorist agenda we are on new shaky grounds, and that is the problem: any law created in over-quick eagerness is pointless, whilst inaction is useless. We are caught between two rocks whilst the floor is not lava but the ever-exploiting media, exploiting for clicks, for visibility and circulation, whilst calling it ‘the people have a right to know’. This has the option of heading into a really bad direction soon enough. Will it? I have absolutely no idea.


Filed under IT, Military, Politics, Science

Consider the question

We always have questions, we all do. Some are based upon curiosity, some are based on acquisition and some on compilation. The people tend to have questions in the range of one and three, businesses in two and three, with an optional need for the first group to see if a creation towards awareness is required. And in this we need to see ‘Facebook v Apple: The ad tracking row heats up’. The article (at https://www.bbc.com/news/technology-56831241) gives us “The IDFA can also be paired with other tech, such as Facebook’s tracking pixels or tracking cookies, which follow users around the web, to learn even more about you”, yet the question no one seems to be asking is how much an advertiser is entitled to get. I have no issue that Facebook, within Facebook, measures and ‘collects’; it is the price of a free service, but did we sign up for a larger stake (or is that steak) at the expense of the consumer? Even as we tend to agree with and accept “Apple co-founder Steve Jobs acknowledged that some people didn’t care about how much data they shared, but said they should always be informed of how it was being used”, in this the question takes a few steps and has a few exits on where to go next, and we tend to remain in the dark about our needs and what we are comfortable with. This is not new, but digital marketing is new; we have never faced it before. Even as we accept the quote by Tim Cook, the setting given with “If a business is built on misleading users, on data exploitation, on choices that are no choices at all, it does not deserve our praise. It deserves reform”, we forget that this is not merely misusing, it is a much larger stake. Some time ago I refused to play a game because it collected my religion. Since when is my religion a game’s requirement?
So (it’s Catholic by the way), even as we decide to not use an application, consider the price we pay. And it goes further, as apps and their advertisement strategy on nearly EVERY device are set to showing us advertisements (to further the financial setting of the maker). In this I have no real problem, but what information is collected by the advertiser? And we all like the steps Apple seems to be making, and as we ‘revere’ “Apple is baking privacy into its systems. Its browser Safari already blocks third-party cookies by default, and last year Apple forced app providers in iOS to spell out in the App Store listings what data they collect”, we are forgetting what all advertisers are collecting, and no less the issue becomes what happens when 5-7 games are collecting collectively; for the most part we have no idea where this will end and it is important to keep that in mind. It is there where Facebook is getting the largest negative wave. With “And it argues that sharing data with advertisers is key to giving users “better experiences””, precisely what is that ‘better experience’? And in what setting should ANY data be shared with an advertiser? We get that the advertiser wants to segment WHO gets to see their advertisement; we get that and I reckon no one will object. Yet why share our details? How is that priced and why are we not informed? OK, we are not told that Facebook is making money off us, it is after all a free service, and as Mark Zuckerberg told the senate in a hearing “We sell ads”, yet he did not say “We sell ads and user data”. You all do understand that there is a fundamental difference between the two, you do get that, do you? And we see that given in the BBC article when we are given “Facebook appeared to accept the changes and promised “new advertiser experiences and measurement protocols”.
It admitted that the ways digital advertisers collect and use information needed to “evolve” to one that will rely on “less data””, but that now gives us a much larger problem (optionally). When we see ‘new advertiser experiences’ we should be concerned about what it will cost, in pricing, in experience and in data segments. It does not make Facebook evil or bad, but when we are given “Technology consultant Max Kalmykov wrote in Medium that advertisers had to “prepare for the next, privacy-focused era of digital advertising”” we accept change, we accept evolution, but in the stage of digital marketing most can be achieved WITHOUT sharing data at any individual level with the advertiser. The setting we see coming might be good, yet I am concerned with their view of ‘new advertiser experiences and measurement protocols’, a setting for sales, not for the consumers and optional victims, because to some degree that matters. Do I care when I see another advertisement by MWAVE.com.au? No, I do not, and for the most part I do not care about that part, it is basically the cost of a free service, but no one accepted sharing data, and that is what Apple is bringing to the surface, even more than Cambridge Analytica did.
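To illustrate that segmentation does not require sharing individual data, here is a minimal sketch (the user records and criteria are invented): the advertiser supplies targeting criteria and the platform returns only an aggregate count, so no individual record ever leaves the platform:

```python
def campaign_reach(users: list, criteria: dict) -> int:
    """The platform matches the segment internally and returns
    only an aggregate count -- the advertiser learns WHO-many,
    never who."""
    def matches(user: dict) -> bool:
        return all(user.get(k) == v for k, v in criteria.items())
    return sum(1 for user in users if matches(user))

# Invented platform-side user records.
users = [
    {"id": 1, "country": "AU", "interest": "hardware"},
    {"id": 2, "country": "AU", "interest": "cooking"},
    {"id": 3, "country": "NL", "interest": "hardware"},
]

print(campaign_reach(users, {"country": "AU", "interest": "hardware"}))  # 1
```

The advertiser still gets to segment who sees the advertisement; what it does not get is the underlying details, which is the line the post argues should never be crossed.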

There is a larger setting in all this and we optionally see that with “Device fingerprinting combines certain attributes of a device – such as the operating system it uses, the type and version of web browser and the device’s IP address to identify it uniquely. It is an imperfect art, but one that is gaining traction in the advertising world”. You see, I made the personal choice not to link devices, not to link services of any kind; it will not stop aggregation, it will merely slow it down, yet most of the people did not have the foresight I had a decade ago. As such, the apps that have an identifier of hardware will get a lot more information on non-Apple devices in the near future. When the people realise this, all others will take a backstage; it is a powerful advantage that Apple is creating. I wonder what Google will do next, because their market is in the middle of Apple and Facebook; they need to side one way or the other and it will have deeper repercussions in the long game. As such we see that Apple made its choice, and it is one the consumers will embrace; some will accept the scenario that Facebook offers, and laughingly they oppose the data governments have yet give it to whomever else wants it. In this Google has an opportunity (or a burden), but only if they change the game they are playing. When the consumers see this, they will wonder where to go next, and they are all about flames and biased options through the media.
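The fingerprinting quote can be sketched in a few lines (the choice of attributes and the truncation are my assumptions; real implementations combine far more signals):

```python
import hashlib

def device_fingerprint(os_name: str, browser: str, ip: str) -> str:
    """Combine a handful of device attributes into one stable
    identifier -- the 'imperfect art' the article describes.
    Same attributes in, same fingerprint out."""
    raw = "|".join([os_name, browser, ip])
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

fp = device_fingerprint("Android 12", "Chrome 98.0", "203.0.113.7")
print(fp)
```

Change any single attribute and the fingerprint changes completely, which is both why it works for tracking and why it degrades when users rotate IP addresses or update browsers.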

It started last year and got to be serious in December 2020 when we were given (at https://www.theverge.com/2020/12/17/22180102/facebook-new-newspaper-ad-apple-ios-14-privacy-prompt) ‘Facebook hits back at Apple with second critical newspaper ad’. In one form we are given “Forty-four percent of small to medium businesses started or increased their usage of personalised ads on social media during the pandemic, according to a new Deloitte study. Without personalised ads, Facebook data shows that the average small business advertiser stands to see a cut of over 60% in their sales for every dollar they spend”. Is that true? When you pick up the newspaper, how much is personalised? There will remain a level of personalised ads within Facebook, but the following outside of Facebook (within Apple products) stops, and that might be a relief to a lot of consumers. As such I have a much larger issue with “the average small business advertiser stands to see a cut of over 60% in their sales for every dollar they spend”; I would be interested to investigate the data that brought that statement about, and I have some reservations on the application of the data used. We could optionally say that the digital marketing that relies on such a 100% application is also to some degree unfair on printed media, but that is a very different conversation.

And in all this the question will soon become “What should you (be allowed to) collect from me?” And now with the upgrades Apple has created a massive advantage. Google will need time to define an answer and direction, because Google will need to make a choice, and this is not a simple one; their business profile will alter accordingly. And as Facebook is setting its premise, we see a larger stage, one with the option where Google Plus might be re-introduced in a much larger application of personal and non-personal data. You see, they are all about the personal data, all whilst the hardware fingerprints in 5G will be a much larger setting than it ever was, and there a much larger gain could be made by the proper makers in all this.

Did you see the new world where your mobile, tablets, laptop and domotics are linked? I can see it, and the application on one of my mobile devices, yet the stage that it offers (or not) is still open to a lot of the players. So as I see it, the next year will see a rapid evolution of digital marketing. Those who adjust will see 2023; those who do not: goodbye!


Filed under IT, Media, Science

If not, then; else, return;

How cryptic is that? It was a sentence that I used in the 80’s; I sounded clever and cryptic at the same time. Yet it was not for that; it was the stage where some had no idea how some things worked in IT programming (Clipper). The use of Boolean variables wasn’t alien to them, but it was close to the unknown, and just now the idea hit that in all these stages of ‘showing’ things, I wonder how many have shown the stage of choices, Boolean choices?

A stage overlooked for such a long time, and why was it overlooked? The people who need it are in a stage of working things out; now, for the most part it does not matter, but what happens when the dataset you are looking at is a few million cases?

As such, as you look at this small triangle, can you answer the 4 results? And this is a setting with merely 3 variables, and merely 2 Booleans used. When that list grows in variables and Booleans, it becomes a larger scene of people wondering if they missed anything; wouldn’t it be nice to see an answer there? In an age of dashboard people whose Business Intelligence setting is absent of a degree in advanced mathematics, statistics is the best we can hope for; at least in this setting someone can give them a better tool? What do you think?
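Since the triangle itself is not reproduced here, a generic sketch will have to do: enumerating every Boolean combination so that nothing is missed is exactly the tool being asked for (the expression below is an assumed stand-in, not the triangle's actual rule):

```python
from itertools import product

def truth_table(expr, names: list) -> list:
    """Enumerate every Boolean combination and evaluate `expr`
    on each -- the check that becomes unmanageable by hand once
    the variables and Booleans multiply."""
    rows = []
    for values in product([False, True], repeat=len(names)):
        env = dict(zip(names, values))
        rows.append((values, expr(**env)))
    return rows

# Two Booleans give the 4 results mentioned above.
for values, result in truth_table(lambda a, b: a and not b, ["a", "b"]):
    print(values, "->", result)
```

With 2 Booleans there are 4 rows; with 20 there are over a million, which is why a dashboard that shows the answer beats working it out by hand.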

When we look at the stage of larger datasets, do you think such a tool is less needed or more needed? And when IT makes these people the 14th export, will they agree with the assessment?

I will leave it up to you. Gee, another day, another set of ideas added, in an age where marketing hands over iteration and calls it innovation. I wonder how many software solutions have this option at present.


Filed under IT, Science

Is it real?

Yes, that is the question we all ask at times; in my case it is something my mind is working out, or at least trying to work out. The idea that my mind is forming is “Is it the image of a vision, or is it a vision of an image”; one is highly useful, the other a little less so. The mind is using all kinds of ideas to collaborate in this, and as such I wonder which it is. The first is a jigsaw. Consider a jigsaw: even as the image is different, the pieces are often much less so; one could argue that hundreds of jigsaws have interchangeable pieces, we merely do not consider them as the image is different, and for the most part, how many jigsaws have you ever owned? With this in the back of the mind, what happens when we have data snippets, a data template with several connectors, the specific id of the data, and then the connector which indicates where the data comes from, both with date and time stamps? But like any jigsaw, what if we have hundreds of jigsaws and the pieces are interchangeable? What if the data system is a loom that holds all the data, but the loom reflects on the image of the tapestry? What happens when we see all the looms, all the tapestries, and we identify the fibres as the individual users? What happens when we create new tapestries that are founded on the users? We think it is meaningless and useless, but is it? What if data centres have the ability to make new frameworks, to stage a setting that identifies the user and their actions? We talk about doing this, we claim to make such efforts, but are we?
You see, as IBM completed its first quantum computer, and now has a grasp on shallow circuits, the stage comes closer to having an actual AI in play. Not the one that IT marketing claims to have and salespeople state is in play, but an actual AI that can look into the matter. As this comes into play we will need a new foundation of data and a new setting to store and retrieve data. Everything we have now was done for the convenience of revenue, a hierarchic system decades old, even if the carriers of such systems are in denial. The thinking requires us to thwart their silliness and think of the data of tomorrow, because the data of today will not suffice; no matter how blue Microsoft Italy claims it is, it just won’t do. We need tomorrow’s thinking cap on, and we need to start considering that an actual new data system requires us to go back to square one and throw out all we have; it is the only way.

In this, we need to see data as blood cells: billions of individual snippets of data, each with a shell, connectors and a core. All that data in veins (computers), and it needs to be able to move from place to place, to be used by the body where the specific need is. And if biotech goes to places we have not considered, data will move too, and for now the systems are not ready; they are nowhere near ready, and as such my mind was spinning in silence as it considered a new data setup. A stage we will all need to address in the next 3-5 years. And if the energy stage evolves, we need to set a different path on a few levels, and there we will need a new data setup as well; it is merely part of a larger system and data is at the centre of it. As such, if we want smaller systems, some might listen to Microsoft and their blue (Azure) system, but a smurf like that will only serve what Microsoft wants it to smurf. We need to look beyond that, beyond what makers consider of use, and consider what the user actually needs.
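The blood-cell picture, sketched loosely: a shell (who may touch it), connectors (links to other cells), a core (the payload), and veins that let cells move to wherever the need is. All class and field names here are my own invention, a sketch and nothing more:

```python
from collections import defaultdict

class Cell:
    """A data 'blood cell': a shell (access policy), connectors
    (links to other cells), and a core (the actual payload)."""
    def __init__(self, core, connectors=(), shell="public"):
        self.core = core
        self.connectors = list(connectors)
        self.shell = shell

class Vein:
    """A minimal transport: cells flow to the node that needs them,
    rather than living in one fixed hierarchy."""
    def __init__(self):
        self.nodes = defaultdict(list)

    def move(self, cell, to_node):
        # The shell, not the storage layout, gates the movement.
        if cell.shell == "public":
            self.nodes[to_node].append(cell)
            return True
        return False

vein = Vein()
ok = vein.move(Cell({"reading": 42}), "analytics-node")
print(ok)  # True: the cell travelled to where the need is
```

The design point is the contrast with today's systems: here the body (the consuming node) pulls the cell, instead of the cell being locked into one vendor's hierarchy.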

Consider an app, a really useful app when you are in real estate: there is Trulia. It is great for all the right reasons, but it made connections, as it has. So what happens when the user of this app wants another view of the apartment or house, one that is not defined by Yelp? What happens when we want another voice? For now we need to take a collection of steps hoping it will show results, but in the new setting, with the new snippets, there is a larger option to see a loom of connections in that location, around the place we investigate. More important, there is a lot more than Trulia envisioned. Why? Because it was not their mission statement to look at sports bars, grocery stores and so on; they rely on the Yelp link, yet some want a local link, some want the local link that the local newspapers give. That level of freedom requires new thinking on data; it requires a completely new form of data model, and in 5G and later in 6G it will be everything. Because in 4G it was ‘Wherever I am’, in 5G it becomes ‘Whenever I want it’, and the user always wants it now. In that place some blue data system by laundry detergent Soft with Micro just does not cut it. It needs actual nextgen data, and such a system is not here yet. So if I speculate on 6G (pure speculation, mind you), it will become ‘However I need it’, and when you consider that, the data systems of today, and those claiming to have the data system of tomorrow, are nowhere near ready. And that is fine. It is not their fault (optionally we can blame their boards of directors); we are looking at a new edge of technology and that is not always a clear stage. As such my mind was mulling a few things over, and this is the initial setting my mind is looking at.

So, as such, we need to think about what we will actually need in 5 years, because if the apps we create are our future, pondering which data we embrace decides whether we have any future at all.

Well, have a great Easter and plenty of chocolate eggs.

Leave a comment

Filed under IT, Science

Data, Mind setting and Intent

It has always been the case that data allows for more. Cambridge Analytica might have brought it to the surface, but it was there; it always was. I have been involved with data since 1992, so I see no surprises here. Even as some are ‘befuddled’ or ‘baffled’, I, and many others, were not. So when I see the BBC article (at https://www.bbc.com/news/technology-54915779), I merely shrug my shoulders and go ‘Meh’. Yet the larger part is not seen; it is partially hidden by “buying someone’s name can lead to making guesses about their income, number of children and ethnicity – which is then used to tailor a political message for them”. When I see ‘making guesses about their income’, I wonder who was setting that strange event. When I have a name, I do not need to do any of that. When we combine the electoral roll data, when we set the stage via social media, and when we add the real estate data that some have (Equifax, TransUnion, Thomson Reuters, Experian, Dun & Bradstreet), we can start to combine information. I have done this for well over a decade. So when I see the statement from Lucy Purdon, I merely wonder if she is intentionally stupid. You see, it is not about “Data collection is out of control and we need to put limits on what is collected”, it is about “Data collection is out of control and we need to put limits on what is connected”. The shift is two letters, which is a huge stage. I have been combining real estate data, past connections, as well as location information. There are really good programs out there, and in some cases I can combine the details of close to a dozen sources, as long as I can create a unique key, and that is often possible (not always). Privacy is what you had before there was an internet.
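The collected-versus-connected point is easy to illustrate: three individually bland lists become one revealing profile the moment a shared key lets you join them. The sources, fields and names below are invented for the example; real matching is far fuzzier:

```python
# Hypothetical extracts from three separate, individually harmless sources.
electoral_roll = [{"name": "j smith", "postcode": "2000", "age": 44}]
real_estate    = [{"name": "j smith", "postcode": "2000", "value": 910_000}]
social         = [{"name": "j smith", "postcode": "2000", "interests": ["golf"]}]

def key(rec):
    # A crude composite key: name plus postcode. The risk is not
    # the collection of each list, but the connection between them.
    return (rec["name"], rec["postcode"])

profiles = {}
for source in (electoral_roll, real_estate, social):
    for rec in source:
        profiles.setdefault(key(rec), {}).update(rec)

print(profiles[("j smith", "2000")])
# one merged profile: age, property value and interests combined
```

No guessing about income is needed once the join succeeds; the merge itself is the whole trick.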
When we got to the combinations of Chamber of Commerce data (Dutch: Kamer van Koophandel), I had the details of well over a million people, a million more if multiple connections were made, and that was in 1994, well over 25 years ago, and that world did not stop; it never stopped running. Over 10 years ago Oracle introduced array tables; the manual states “Unbounded means that, theoretically, there is no limit to the number of elements in the collection. Actually, there are limits, but they are very high—for details, see Referencing Collection Elements”. It was a game changer; as I saw it, it was the first real instance where we could create many-to-many relationships as well as tie that data to a single person. In IBM Statistics I had to be clever and make a workaround, which was per person and a little time consuming; Oracle gave the setting where the computer did all the work. The more powerful the computer, the more data, and the quicker we saw results. This was over 10 years ago, and a person like Lucy Purdon should know this, making her either super stupid, or she has an agenda. I do not think that she is stupid, so I am going to make the agenda assumption. There is a stage on what is collected and what is connected; she should know this. Financial institutions are ahead of that curve, because it gives them additional mitigated risk; this is one reason why players like Google need to keep a Chinese wall between their data and their financial institutions. I gave that view somewhere two weeks ago in ‘A fair call’ (at https://lawlordtobe.com/2020/11/09/a-fair-call/), so when we see the events all clinging together, what are we chastising Google for when the stage is a lot worse?
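What Oracle's collection types made easy — many records, per person, per source, under one key — looks roughly like this in any modern language. This is a sketch of the shape of the structure, not Oracle syntax, and the keys and records are invented:

```python
from collections import defaultdict

# person_key -> source -> list of records: the many-to-many shape
# that older flat, hierarchic schemas forced into painful workarounds.
person_data = defaultdict(lambda: defaultdict(list))

person_data["p-001"]["merchant_house"].append({"company": "Acme BV"})
person_data["p-001"]["merchant_house"].append({"company": "Beta BV"})
person_data["p-001"]["location"].append({"city": "Amsterdam"})

# One person, two sources, three records: no fixed limit on any axis,
# which is exactly what made the 'unbounded' collections a game changer.
print(sum(len(recs) for recs in person_data["p-001"].values()))  # 3
```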
And when the BBC gives us ‘So how do the parties get my data in the first place?’ with the added “The electoral register forms “the spine” of data sources, according to PI, but beyond that it is surprisingly difficult to work out what the parties use”, well, I think I have just given you the rundown on the way I did it for over a quarter of a century. As such, the gap the BBC claims exists is weird, especially when they do not give us “We think that they get it from A, through B, C, D and E, through to the result, we merely cannot prove it at present”. But they didn’t give us that, did they?

Several players have the data, and they have the mindset to make the connections in their need to set an advantage, but the stage of the intent cannot be proven; it remains alleged, also in light of optional data (if others can acquire that data). It was never about collections, it was about connections, and enough players know this to set some serious question marks against this article.

Leave a comment

Filed under Finance, IT, Media, Politics, Science

To knowingly intentionally ignore

There is a state in any person’s mind that ignores anything which does not fit the need of the receiver. This is not a bad thing (at times), and we can ignore all we like, yet to deceive ourselves that it does not exist is another matter.

To look at the station we need to look at two settings. The first is ‘an organized effort to gather information about target markets or customers’; this is the foundation of market research. After this we consider the second part, which is ‘the process or set of processes that links the producers, customers, and end users to the marketer through information used to identify and define marketing opportunities and problems’. As I personally see it, some do not see the difference (or ignore that there is one), or, as I would imply, knowingly use one as the other. The first difference is the population. In market research we investigate a population and we set our hypothesis based on the station of it. We dabble, we slice and dice this population, and we draw conclusions. The problem is that some hide behind the slicing and dicing, calling it an arbitrary process. For the most part I have no issue with it, or better stated, I do not care one hoot about some of these analysts. Yet lately I see the impact of decisions and business processes, and I wonder whether the people accepting the marketing stories are in the dark, do not care, or are clueless.

It started with Microsoft, then Ubisoft; after that there was a stage at Apple, as well as a stint in the US administration. All acts based on what some would call ‘market research into the people and the impact of view’, yet it seems like marketing research, passing a bitter pill to the extent of surviving the action. That is clearly how it feels, and the first act on my side was remembering a previous conversation, one I had roughly 20 years ago. The premise was that a board was cutting expenses, setting the stage where they stopped getting a 90% approval on their product and settled for an 80% approval. It is a dangerous and slippery slide. Yes, it seems cheaper, and it might in the beginning be cheaper, yet the station as we see it is dangerous as the degrees of freedom diminish (intended pun). As a product drills down in different areas, the 10% shift implies that across three fields the danger grows that the overall approval rate is optionally down to 60% or less, especially if the 20% miss rate hits any consumer 3 times. This is where Ubisoft is at present, and that is where Microsoft was in the last few years. In that stage we see “to develop technology that will enable them to stream games to whatever piece of tech a person is holding – be that a smartphone, console, or something yet to be invented”. It is the Ubisoft statement and that is fine, yet the testing and inadequate versions over the last two years alone give the consumer (the player) a much larger lag, especially when these players are only relatively happy and get hit again and again with downloads that tend to exceed 20 GB. How long until the player has had enough?
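The 90-versus-80 percent point is simple compounding. Under a naive independence assumption (my simplification, not a claim about any real product), a consumer who touches three areas of a product has an unbroken good experience only if every area satisfies them, and the drop is steep:

```python
def overall_approval(per_area: float, areas: int) -> float:
    """Probability a consumer is satisfied in every one of `areas`
    independent areas, given a per-area approval rate."""
    return per_area ** areas

# Board settles for 80% instead of 90% per area, consumer hits 3 areas:
print(round(overall_approval(0.90, 3), 3))  # 0.729
print(round(overall_approval(0.80, 3), 3))  # 0.512
```

So the seemingly modest 10-point cut per area roughly halves the share of consumers who never hit a miss, in line with the slippery slide described above.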

It is nice to drill down into a group of satisfied players, yet the larger issue is that the non-players are too often disregarded, making the story told one that is largely built on a shallow base of shifting sands. My view is supported by one small detail: as the PS5 was unveiled, we saw an absence of Ubisoft games. The station of that 20% is now growing, is it not? One of the largest software makers in history had no business not being present at this Sony show, whether they are going forward on both systems or not. It seems to me that this is not a small part, this is a much larger part, and it seems to me that the predictions I gave last year are slowly coming to fruition.

Could I be wrong?

Absolutely! I cannot state that there is certainty; that would be short-sighted on my side, but the symptoms are there: the lack of excellent gaming, dozens of updates several GB in size, a lack of testing, and a lack of listening to the gamers, while the ones setting the stage of listening are rolling dice which they optionally loaded themselves. They look better that way, yet the consequences for Ubisoft only seem better until the gamers move away; when you set the stage of a non-Assassin’s Creed game, to call it that, something they did once before, the stage changes. Even as we were given last year “Ubisoft didn’t provide numbers, but said that it had made a “sharp downward revision” in the revenues expected from both games, which it blamed on a failure to differentiate Breakpoint from its predecessor, an overall lack of interest in sequels to live games, and excess bugs for the game’s failure.” (source: PC Gamer), there are two parts in this. The first is that the game sucks; the basic failures seen only yesterday by myself give a larger rise to the notion that Ubisoft has much larger issues at present. The fact that I ended up with the game at 20% of the full price only 6 months ago, and the fact that to start the game I needed a 38GB patch, shows that the issues are close to massive. It is not ‘to differentiate’, but to ‘properly code’ a game that is at the core of it all. The difference between market research, where Ubisoft investigates their game against ALL gamers, and marketing research, where Ubisoft merely investigates the Ubisoft players, is part of that optional setting. When anyone hides behind the message and not behind the quality of product, we have a much larger issue, and in the next console war it would optionally set the deck to a very different stage.

This is not about Ubisoft; Apple and Microsoft have shown similar failings for too long, and the stage the US administration is in shows similar flaws. Even as it is not a product, trust is, and it is faltering on several levels in the US. We can blame several stages in this, but it is not about blame; the investigation into the analysts and the conclusions drawn seems to be a much larger stage of marketing research, not market research. One optional stage is the way evidence is rejected and optionally completely ignored. We might look at the Coronavirus; it is not the point, that element merely brought it to the surface faster. Huawei is the first one that matters: no evidence was ever brought to light. It shows a stage where the US is close to economic collapse, and at that stage we see greed-driven marketing research where the actions are at the disposal of US assets, not US citizens. This matters because it shows that the slicing and dicing of data is not getting the attention it is due. It is happening on corporate and political levels, and elements like ‘How Austin Tech Is Democratizing Data’ might seem nice in theory, yet the larger issue is that some views are now seemingly solely supported by the topline makers, not actual academics with the education required to draw conclusions based on data, rather than the presented views that those in charge of governments and corporations would like. So when we learn that “Imagine business analysts, marketing teams and even the C-suite having the power to interpret data without the help of the entire IT department”, consider that we are now in a phase where those who have are about to decide the fate of those who have not, and those people are in for a massive rough ride, or so they believe.
When we see corporate players like Ubisoft and Microsoft folding on strategies as they lose larger and larger market shares, we will see destabilisation of a much larger degree, and there the game is up for grabs. Even as some resorted to terms like data democratisation, it is a much older principle: it is the discrimination between those who matter to some and those who do not. Corporations will find out the hard way what their choice brings them; in politics it is a different story, and the impact there is nowhere to be seen. We cannot predict it until it is too late, and there, I expect (or is it dare I expect?), the stage is larger. Even as places like YouTube are flooded in some positive light, the negative impact is much larger. The US riots are merely a consequence of part of what happens in data; it is not the cause, but it will be much larger and much more defining than we ever expected, and the problem there is that after the fact, repairing damage is close to impossible. You see, it is not ownership of data, it is the fact that decisions are made at a level where too much data is disregarded. Hiding behind entrepreneurial action is close to a farce. The largest danger is misinterpretation as the sources become less and less trustworthy, and that is disregarding any ethical consideration. Or, to make it slightly more simple: as data democratisation moves forward, the essential part of comprehensive information will be filtered and optionally disregarded too often. As such, a full view is not available, implying that the decision makers are merely looking at a limited scope; and consider that action when it is done by a billion-euro company.

We are only seeing this because the surrounding scope was pushed to the forefront; as such, those reacting are doing it too late, having to disregard increasingly larger consumer markets. When was the last time that such an action was an actual benefit to that company?


Leave a comment

Filed under Finance, IT, Politics, Science

A handjob at twice the price

It started 8 hours ago, the stage that we have been watching on hydroxychloroquine, an anti-malaria drug. The article ‘Influential study on hydroxychloroquine withdrawn’ leaves me with a lot of questions. The quote “An influential article that found hydroxychloroquine increases the risk to death in coronavirus patients” should leave us all with a lot of questions, and that is even before we get to the data concerns. Consider that the coronavirus had its initial cases last December (optionally a little earlier), so in January we knew that there was a problem; we also knew that there was NO vaccine at this stage. This was 5 months ago. Now we see “Research for the article, published last month in medical journal the Lancet, involved 96,000 coronavirus patients across 671 hospitals worldwide. Nearly 15,000 were given hydroxychloroquine – or a related form”. In this light, we need to consider that there were enough patients in April, around 3 million, yet as we realise that the reporting of corona cases has been all over the field, getting 671 hospitals to set up treatments, test patients and report to a single source takes time. The incentive for a vaccine started in January/February, and even if they were on top of their game, the entire setting would require time. As far as I can tell, the situation does not add up. Consider for a moment that there are 4,008 forms of approved medication (to name a mere small fact); someone decided to set the stage where hydroxychloroquine was an optional solution. I will not fault that reasoning (as I never studied medicine). So the medication is ‘offered’ as an optional partial solution, there is no vaccine, so still we are all OK. Consider that this started in January, so any negative feedback would not be there until February at the earliest. As such, it takes time for possible patterns to form, so February/March is the start.
Now consider that within a period of 60 days, a report was filed with the foundation of ‘hydroxychloroquine increases the risk to deaths in coronavirus patients’, and keep in mind the ‘increases risk’ part; it matters.

You see, the timeline to assess and identify ‘increased risk’ is not done in 90 days; the entire path would require all kinds of data on multiple levels and under larger scrutiny, and the entire matter should be up for debate in many places.

Now we are in a stage where, in under 90 days, 96,000 patients are measured, 15,000 are documented on the effects of hydroxychloroquine, and the effect and evidence of death due to medication is recorded. The timeline does not make sense, so personally I would state: Yes! I very much want to test and scrutinise that data. I would in addition make a memorandum with critical questions to Surgisphere; the timeline leaves me with questions, and the data and evidence path would require investigation (in multiple ways). As such, when I see this article, I am left with several questions. I also have questions in the direction of Harvard professor Mandeep Mehra. Not in a hostile way, but the entire setting leaves me with a bad taste in my mouth, and the professor could end up answering questions.

So, in all: 96,000 patients over 90 days at the most gives us well over 1,050 patients a day; after that we have the stage of roughly 166 patients starting on the drug per day, over a period of 90 days. Not all will have been properly tested, and then there is the stage of data gathering and data collection, with tests, and setting the proper stage of analysis, verification and reporting. I see a whole range of issues from a distance. Oh, and with the lockdown, how many resources would have been available? We see nothing of this entire field in the BBC article, or anywhere else.
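The back-of-envelope numbers above are straight division, assuming the full 90-day window (the study's own timeline suggests it may have been even shorter, which only sharpens the point):

```python
total_patients = 96_000   # patients in the Lancet article
on_drug        = 15_000   # patients given hydroxychloroquine or a related form
days           = 90       # generous upper bound on the data-gathering window

print(total_patients / days)  # just over 1066 patients enrolled per day
print(on_drug / days)         # roughly 166 patients started on the drug per day
```

Every one of those daily patients needs treatment, testing, documentation and reporting across 671 hospitals in mid-lockdown, which is why the timeline is the first thing to scrutinise.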

Did someone look into the matter on an empty stomach? 

These managers chasing quick wins are shown to be lacking in a few ways. I hope that the professor has a good explanation, he most likely does, and perhaps Surgisphere too, but the entire data matter is not, as I personally see it, some ‘client agreement’ issue. I see it as something a lot more serious, and if it were up to me at this stage, unless Surgisphere can answer all questions to the satisfaction of all, they should never ever be allowed near medical data again. I am not alone in this; some people have been asking serious questions for days, some have had question marks on a few items that I mentioned, and most include issues of data collection. It is time for serious organisations to step in. We would ask the WHO, but it seems that America is not paying that bill, so who would properly vet data of this magnitude?


Leave a comment

Filed under Finance, IT, Science