Tethered to the bottom of the ocean

Perhaps you remember a 1997 movie about a ship that decided to take a fast trip to America, the RMS Titanic. We all have our moments, and what you might not know is that there is a deleted scene that only a few limited editions included. The captain (played by Bernard Hill) was asked a question by one of the passengers: ‘Is land far away?’ The response was: ‘No, it is only 3900 yards to the nearest land………straight down.’ OK, that did not really happen, but it does sound funny. You see, the image of a place can be anything we need it to be; dimensionality is everything, and that is where we see the larger problem.

This is directly linked to the article I wrote on September 18th; the article ‘The Lie of AI’ gets another chapter, one that I actually saw coming, the factors at least, but not to the degree the Guardian now exposes. In that article (at https://lawlordtobe.com/2019/09/18/the-lie-of-ai/) I gave you: “more importantly it will be performing for the wrong reasons on wrong data making the learning process faulty and flawed to a larger degree”, and now we see (at https://www.theguardian.com/society/2019/sep/19/thousands-of-reports-inaccurately-recorded-by-police), a mere 8 hours ago, ‘Thousands of rape reports inaccurately recorded by police’. So we are not talking about a few wrong reports, because that will always happen; no, we are talking about THOUSANDS of reports that lack almost every level of accuracy. When we consider the hornets’ nest the Guardian gives us with: “Thousands of reports of rape allegations have been inaccurately recorded by the police over the past three years and in some cases never appeared in official figures”, Sajid Javid is now facing more than a tough crowd; there is now the implied level of stupid regarding technology pushes whilst the foundations of what is required cannot be met, and yes, I know that he is now the Chancellor of the Exchequer. It is not that simple. The simplicity is not seen in the quote: “More than one in 10 audited rape reports were found to be incorrect”; the underlying data is therefore more than unreliable, it has basically become useless.

This is a larger IT problem. It is not merely that the police cannot do their job; anything linked to this was wrongfully examined, optionally innocent people were investigated (which is not the worst part), the worst part is that the police force has a resource issue and there is now the consideration that those scarce resources have also been going in the wrong direction. The failing becomes a larger issue when we see: “The data also found that a number of forces failed to improve in subsequent inspections, with some getting worse”; the failing has pushed on from operational to systemic. Now consider IT, the laughably hilarious step towards AI, even the upgrades to existing systems that cannot be met in any way because the data is flawed on several levels. It is a larger issue that of the national police forces examined in this regard only Cumbria, Sussex and Staffordshire passed the bar, a mere 3 out of 36 forces did their job (above a certain level), and it gets worse when you consider that this is merely the investigation into the sexual assault section; the matter could actually be a lot worse. Consider the Guardian article in July, ‘Police trials of facial recognition backed by home secretary’ (at https://www.theguardian.com/uk-news/2019/jul/12/police-trials-facial-recognition-home-secretary-sajid-javid-technology-human-rights), as well as ‘UK police use of facial recognition technology a failure, says report’ from May 2018 (at https://www.theguardian.com/uk-news/2018/may/15/uk-police-use-of-facial-recognition-technology-failure); you might not have made the link, but I certainly did.
When you take the quote: “Police attempts to use cameras linked to databases to recognise people from their face are failing, with the wrong person picked out nine times out 10, a report claims”, now consider that a victim reports an assault, a report is made, at some point the evidence is reviewed, the information is linked to CCTV data and now we are off to the races. Whilst only 3 out of 36 forces did it right, there is now a stage where 91% of forces are looking at the wrong, inaccurate information, and add to that the danger that only 10% get properly identified; even if the right person was picked out, there is still a well over 75% chance that the investigation is going in the wrong direction and optionally an innocent person gets investigated and screened, whilst in the meantime the criminal is safe to do what he wanted all along.
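To make the compounding explicit, here is a rough back-of-the-envelope sketch; this is my own illustration, not a figure from the Guardian or from HMICFRS, and it assumes (generously and simplistically) that the recording failure and the facial recognition failure are independent of one another.

```python
# Back-of-the-envelope sketch of how independent error rates compound.
# The two rates below come from the figures quoted in the article;
# treating them as independent probabilities is my own simplifying assumption.

p_force_records_correctly = 3 / 36   # only 3 of 36 forces passed the recording bar
p_face_match_correct = 1 / 10        # "wrong person picked out nine times out 10"

p_both_correct = p_force_records_correctly * p_face_match_correct
p_wrong_direction = 1 - p_both_correct

print(f"Chance the record and the face match are both correct: {p_both_correct:.1%}")
print(f"Chance the investigation starts off in the wrong direction: {p_wrong_direction:.1%}")
# Roughly 0.8% correct versus 99.2% wrong under these crude assumptions,
# which is comfortably "well over 75%".
```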

Now we get the good stuff: in 2018 home secretary Sajid Javid gave his approval, and now that he is the Chancellor of the Exchequer he approves the invoice and also sets the stage for handing out £30 million to a system that cannot function, in a setting that is based on cogs that were not accurate and are transposing the wrong data. Even then we see “the BBC reported that Javid supported the trials at the launch of computer technology aimed at helping police fight online child abuse”; a system this inaccurate, not merely because of its flawed technology but because the data offered to it is not accurate either, simply implies that until the systemic failure is fixed the new system can never function, and it will take well over a year to fix that systemic failure. So tell me, what do you normally do to a person who is knowingly and willingly handing over £30 million to a plan that has no chance of success?

We need to stop politicians from wasting this level of resources and funds merely to look good in the eyes of big business. I also feel that it is appropriate that Sajid Javid be held personally accountable for spending funds that would never be deployed correctly.

The reasoning here is seen in the quote “Recorded rape has more than doubled since 2013-14 to 58,657 cases in 2018-19. However, police are referring fewer cases for prosecution and the CPS is charging, prosecuting and winning fewer cases. The number of cases resulting in a conviction is lower than it was more than a decade ago”. The stage is twofold: we see a doubling over five years whilst convictions are down from more than a decade ago, and in the end it all links to conviction rates computed on data, whilst the data numbers are not reliable. The quotes “the case was not recorded as a crime”, as well as “noting it as an incident”, show that in both cases a rape was registered as something else, and no conviction is required on an ‘incident’; the underlying question is whether this lack is optionally intentional, to skew the statistics. You might not agree and it might not be true, but when we see a 91% failing across the police forces there is something really wrong. The problem intensifies when we see the Guardian statement that “West Midlands was found to be ‘of concern’ and had ‘not improved’ rape recording upon re-inspection in 2018”; this implies that the work of Her Majesty’s Inspectorate of Constabulary and Fire and Rescue Services (HMICFRS) is either not taken seriously or is intentionally ignored, you tell me which of the two it is. Connected to this is Sajid Javid, ready to ‘upgrade’ to AI (that remains funny) and spend over £30 million on that system, as well as the funds wasted on the current CCTV facial recognition solution, which is not cheap either.
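To illustrate the mechanism behind ‘noting it as an incident’, here is a minimal sketch with made-up numbers; only the ‘one in 10’ ratio echoes the audit finding, everything else is hypothetical, and the point is simply that a misrecorded report never reaches the official crime figures that every statistic, and every future ‘AI’, would be built on.

```python
# Hypothetical illustration of how logging a report as an "incident"
# keeps it out of the recorded-crime figures that downstream metrics
# (and any model trained on them) depend on.
# All numbers below are made up purely to show the mechanism.

reports_received = 1_000                                   # hypothetical reports reaching a force
logged_as_crime = 890                                      # correctly recorded as a crime
logged_as_incident = reports_received - logged_as_crime    # roughly the "one in 10" misrecorded

# Official figures only ever see the crime count.
official_recorded_rape = logged_as_crime

print(f"Reports received:            {reports_received}")
print(f"Appearing in official stats: {official_recorded_rape}")
print(f"Silently dropped:            {logged_as_incident} "
      f"({logged_as_incident / reports_received:.0%} of the real picture)")
```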

I wonder who the CCTV will point the police to arrest for allegedly having sex on the desk of Terry Walker, Lord Mayor of North East Lincolnshire. Images show that the local police might be seeing Noel Gallagher as a person of interest at present.

I wonder how that data was acquired?

In opposition

There is however the other side, and even if I did not give it the illumination it deserves, there was no intent to ignore it. The option of ‘AI to reduce the burden on child abuse Investigators’ is not to be ignored; it must be the kind of task that burns a person out a lot faster than transporting bottles of nitro-glycerin by hand through a busy marketplace would. I am not insensitive to this, yet the Police Professional gives us: “The development will cost £1.76 million from a total investment in the CAID from the Home Office of £8.2 million this year”, which is different from the £30 million given earlier, and as I see it additional questions come to the foreground now. Yet there are other issues that are not part of this. There is the danger of misreading (and incorrectly acting on) seeded data. In SIGINT we see data fields being used to misrepresent information (such as camera model, owner, or serial number); when we start looking in the wrong direction, even if some of the data might be correct, we are in a different phase, and the problem is that no AI can tell you whether a camera serial number is wrong or right. There are larger data concerns, yet I do understand that some tasks can alleviate stress for the police; still, when we link this to the lack of accuracy in police data, the task remains equal to mopping the floor whilst the tap is still running and spilling water onto it. None of these steps make sense until the operational procedures are cleared, tested and upgraded. A failing rate of 91% (33 out of 36) makes that an absolute given.
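As a small illustration of why seeded metadata is so easy to plant and so hard to verify, here is a minimal sketch using the Pillow imaging library; the file name ‘evidence.jpg’ and the choice of fields are my own assumptions, not anything from CAID or the Home Office.

```python
# Minimal sketch: EXIF fields such as camera make and model are just
# writer-supplied text in the file header; nothing is verified against
# anything, which is exactly why seeded metadata is dangerous.
# Assumes the Pillow library (pip install Pillow); "evidence.jpg" is hypothetical.

from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("evidence.jpg")
exif = img.getexif()

for tag_id, value in exif.items():
    name = TAGS.get(tag_id, tag_id)   # map numeric tag IDs to readable names
    if name in ("Make", "Model", "Artist", "Software", "DateTime"):
        print(f"{name}: {value}")

# Serial numbers, lens data and GPS coordinates live in sub-IFDs, but they are
# no more trustworthy: all of these values can be rewritten with freely
# available tools before an image ever reaches an investigator, and no model
# can tell a genuine entry from a planted one just by reading the field.
```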

And for those who missed the Gallagher joke, please feel free to watch the movie The Brothers Grimsby. There are actually two additional paths that are an issue; it is not about presentation, it is about interpretation, as well as the insight into sliced data. They interact, and as such a lot of metrics will go wrong and remain incorrect and inaccurate for some time to come. Data will get interpreted and optionally acted on, which becomes a non-option when accuracy is below a certain value. So feel free to be anchored to the ground in the approach to data surveillance employing AI (I am still laughing about that part), yet when you are tethered to the bottom of the ocean, how will you get a moment to catch your breath?

Precisely, you won’t!

 
