I was amazed by a story in the Guardian (at https://www.theguardian.com/society/2020/feb/27/phones-that-may-hold-child-abuse-images-returned-to-suspects). We all have moments where we just do not get that something is happening (or not happening), yet the issue here is a much larger setting, and we see this with “Police are giving back to suspected paedophiles phones and computers that possibly hold child abuse images because they do not have the time or technology to search the devices“. So the police ran out of time (or options), hand back the evidence that could be used against these people, and let them go?
Then there is “the technology that helps officers quickly scan devices to determine the likelihood of indecent images being present is not consistently available across forces”. It is important that we take notice of ‘quickly‘; how determining is that factor? As I see it, with the range of mobiles coming in the next two years, the hardship of the police will increase by a factor of 16 at the very least (on average a factor of 32 applies). There is a larger setting where the police have a duty, but so do the tech firms. I am not the person to blame all the tech firms, yet there is a larger setting where certain tools need to become available with the next stage of transportable drives and hardware. And we need to look beyond the normal FAT (or NTFS) stage of scans, where only allocated space is scanned, making the hardship for the police increase by a factor of 64 at the least.
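The ‘quickly scan‘ tools mentioned above generally work by hashing every file on the device and comparing the result against a database of hashes of known illegal images. A minimal sketch (the hash list, paths and function names here are hypothetical illustrations, not any force’s actual tooling) shows both the idea and its blind spot: only allocated files are seen, so deleted data in unallocated space, and images that exist only as links to a remote server, never show up:

```python
import hashlib
import os

# Hypothetical database of hashes of known illegal images; real triage
# tools match against curated hash sets (e.g. CAID-style databases).
KNOWN_HASHES = {
    # sha256 of the bytes b"test", used here purely for illustration
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path, chunk_size=65536):
    """Hash a file in fixed-size chunks so large drives do not exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def triage_scan(root):
    """Walk the mounted (allocated) filesystem and flag known files.

    Note the limitation discussed above: this only sees allocated files.
    Deleted data in unallocated space, and images that exist only as
    encrypted links to a remote server, are invisible to this kind of scan.
    """
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if sha256_of_file(path) in KNOWN_HASHES:
                    hits.append(path)
            except OSError:
                continue  # unreadable file: skip it; a real tool would log this
    return hits
```

Even this naive version makes the time problem visible: every byte of every file must be read and hashed, which is why scan time grows with drive capacity and why larger devices multiply the workload.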
Then we see “limited capacity of forces to conduct many costly and time-consuming digital forensic examinations is also hampering investigations into suspects who have downloaded indecent images of children”, and that is when we see the impact of people saving images on their own drives. There is a group with dark web links, in a sort of 4chan setting (not blaming 4chan here), that allows these people to look at such images at their own ‘leisure’ in any free Wi-Fi situation, as the images stay encrypted until the endpoint, with the decrypting part in the app itself. And as the hardship of the police is merely to scan for images, a solution to find these people is unlikely ever to materialise.
So when we see “restore 20,000 police to the streets of England and Wales will not be enough to match the increasing demand placed on officers to protect children”, we need to consider very different solutions, and the adaptation of the law to protect children becomes a much larger need. It is seen in “In one case inspectors found that 100 days had passed since police were notified that a 10-year-old girl had been receiving indecent images from three older men via social media. During that time there was no effort to identify and trace the perpetrators“, which is interesting because they were apparently able to identify that these were ‘three older men‘. Is it just me, or is there a larger failing in the making? The second failure is seen in “Safeguarding planning for children linked to a suspected perpetrator is routinely deferred until a criminal investigation has begun“. As such there are actually three failings: we overlooked social media, which plays a role too. There should be a clear path for a younger person to press an alarm button alerting the social media platform to any indecent picture sent via social media if the account holder is under 18; this could have been arranged years ago. This is not a stage of freedom of expression, this is not free speech; it is optionally criminal speech, and evidence must be gathered at that point.
There is no defence in ‘someone had my password!‘; the owner of the social media account has responsibilities too. As such, when we see “The delay is worsened by the lack of technology available to officers to search devices for child abuse images“, the statement cuts on both sides, as the images might not have been on the device at all. Other means of tracking usage must be found, and we need to do more to keep the children safe.
In all this there is a much larger failing. Yes, there are criminal prosecution needs, yet it is almost indecent to push the blame onto the police. I believe that whatever enlargement places like GCHQ are getting, they need to get off the horse of blaming players like Huawei over events that come from alleged, unproven sources like the US State Department, and put those resources into finding true solutions to aid the police. Consider the need for solutions over unfounded allegations; that is close to 15% of GCHQ resources freed overnight. I call that a decent solution, do you not?
Yet I am not blaming GCHQ. The issue is that we need to adjust the laws on digital prosecution and where we are presently allowed to go; that is not a given in the stage we see. We need to adjust the track we can walk and who can walk it for us. It is the only solution that remains at present, and too many people think in call centre cubicle terms and refuse to see the larger pasture that we need to canvass.
In all this, tech firms and governments need to find common ground. We are in a space where we can blatantly blame tech firms, yet it is not that simple. The tech firms offered a solution and someone found another use for it. We cannot blame Sony for people using their PS3 as a powerful Ubuntu Linux station, and that is basically what is happening here. This is not some tech firm problem; it is the situation where a generic piece of hardware can run another app, be used as its owner sees fit, and be adjusted and implemented for other purposes, leaving the police with little to no hope of solving the issue they face. Tech firms need to come out and play with governments and stay nice.
Yet the issue is much larger than anyone thinks. We saw part of this last year in the Crime Report with ‘Tech Firms’ Neglect Lets Pedophiles Run Rampant Online‘. The fact that ‘freedom of expression’ is used in a way none are willing to agree to also means acknowledging that sometimes an aerosol is used not to do what it was intended to do, but to assassinate a politician. See here the object (at https://www.amazon.com.au/Aluminum-Pneumatic-Refillable-Pressure-Compressed/dp/B00JKED4MS/ref=sr_1_3?keywords=aerosol&qid=1582859473&sr=8-3); if I add the right arsenic mix and switch the bottle, the user kills himself. Is the bottle maker to blame? (Or, if I am even more devious and add the mix to their own bottle, was the victim in the end to blame for their suicide?)
So the entire ‘rampant’ part is (as I personally see it) intentional miscommunication; there is a larger stage and both sides need an actual point of reference. There is a system in place, and we see “YouTube removed this video, and many others, after WIRED alerted it to the problem” (source: Wired), yet we forget that this is a massive engine and Google is not in a place to stop the engine being used by criminals to make a few quick bucks. We need to accept and understand that. Even as several people hide behind “on a test account created to investigate the network of paedophiles on YouTube, the platform’s algorithm continues to suggest similar videos of children that have been commented on by sexual predators“, the engine did exactly what it was supposed to do; in this case it happens to be servicing the criminals, and the short-sighted people shout and blame the tech company, just as they blame the police, and neither is at fault: the criminals are. We can look at the T91 assault rifle and claim it is used to kill, which is true, yet we forget that the person using it can kill criminals and police officers alike; blaming the makers for that is just short-sighted and stupid.
We need a much better solution and we need to rely on tech makers to hand the tools to us, all whilst we know that those making the request (see hidden images) have no clue what to look for or how to look for it. It is maddening on several levels. The people on the sidelines have no clue that the referee is looking for an orange jersey whilst the All Blacks are playing Australia, so he sees green, yellow, black and white (the fern). It is a stage where we look at the players whilst the field has several other members who are validly there, and we overlook them, just like the ‘hidden pictures’ are sought in a game where the pictures are optionally not even on the mobile device; merely the link to them is.
That part is overlooked, and as we go from one item to the other, we forget that links can be added in several ways and the police will run out of time long before they run out of options to check. In all this, the law failed the children long before the tech firms did. So whilst Wired gives us “To date, Disney, Nestlé, Epic Games, Dr. Oetker and a number of other companies have halted advertising on YouTube after it emerged that the platform was monetising videos being uploaded and viewed by paedophiles“, I merely see one sanctimonious firm and three short-sighted ones (it could be two for two, but I leave you to decide on that). An automated system was designed and put into place, the criminals were hiding in the woodwork, and there are close to a dozen ways to hide all this from an AI for years. We need to realise that YouTube became so much more than it was ever intended to be. When we take notice of ‘300 hours of video are uploaded to YouTube every minute!‘, that means 18,000 hours of video are uploaded every hour, and we get a first sense of just how difficult the entire setting is; those 18,000 hours will include videos of no more than 5 minutes each, making the issue in item count perhaps 20 times larger again. We also forget that this is a global thing, that cross-border criminal activity is even harder to get a grip on than anything else, and that there is no actual number on the total uploads. Consider that ten minutes out of every 18,000 hours is illegal and that 30 seconds out of those 10 minutes relates to paedophiles. At that point you get a first inkling of how large the problem is.
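The back-of-the-envelope arithmetic above can be written out explicitly. The 300 hours per minute figure is the one quoted; the ten-minute and 30-second shares are illustrative assumptions from the text, not measured numbers:

```python
# Quoted figure: 300 hours of video uploaded to YouTube every minute.
hours_per_minute = 300
hours_per_hour = hours_per_minute * 60         # 18,000 hours uploaded per hour

# Illustrative assumptions from the text, not measured statistics:
illegal_minutes_per_hour = 10                  # illegal material per 18,000 hours
abuse_seconds_per_hour = 30                    # of which relates to paedophiles

total_seconds = hours_per_hour * 3600          # 64,800,000 seconds of video per hour
illegal_fraction = (illegal_minutes_per_hour * 60) / total_seconds
abuse_fraction = abuse_seconds_per_hour / total_seconds

print(hours_per_hour)      # 18000
print(illegal_fraction)    # ~9.26e-06, roughly 1 second in every 108,000
print(abuse_fraction)      # ~4.63e-07, roughly 1 second in every 2,160,000
```

Under those assumptions the needle-in-a-haystack scale becomes concrete: the target material is on the order of one second in every couple of million seconds uploaded, every hour, with no pause.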
And that is merely YouTube. There are channels that have no monitoring at all, there are channels that have encrypted images and video solutions, and there are solutions out there with an adapted DB2 virus header, and the police have no clue how to go about it (not their fault either). In all this, places like the DGSE and GCHQ are much better solution providers, and it is time the tech firms talked to them. Yet whenever that discussion starts, we get some stupid politician who conveniently adds a few items to the agenda, because to that person it made sense, and as such no solution is designed. It has been a situation of non-action and non-solutions for a few years now, and I see the same discussion come up and go round again, all whilst I already know the outcome (it is as simple as using an abacus).
We have larger tech needs and we have better law needs. And whilst we see people like Andy Burrows, NSPCC associate head of child safety online, go on about “extremely disturbing“, a person like that should realise that the system designed is generic and that far less than 0.03% of the population abuses it; why that escapes him is beyond me. I would go further and say that a person like Andy Burrows should not be in the position he is in when he has little to no regard for the designed system; more precisely, he should remove the ‘online‘ part from his title altogether.
And whilst Wired ends with “During our investigation into his claims, we found comments from users declaring their “love” for the children and exchanging phone numbers with one another to share more videos via WhatsApp“, I merely wonder how the police are investigating these phone numbers and WhatsApp references. In all this the absence of WhatsApp (Facebook) is also a little weird; it seems that these social media predators are all over the place, and the open abuse of one system is singled out whilst we get no real feel of how the abuse statistics stand against the total statistics. Consider that Windows has a 2.3% error-to-abuse rate by non-users; for Google to get a system that is close to 99.4% decent is almost unheard of. Most people seem to forget that Google gets pushed into a corner by media and media mediators over transgressions involving IP-protected events (publishing a movie online); there is the abuse of video, there are personal videos that are disallowed, and there is terrorism via YouTube. In all this, harsh or not, the paedophile issue is a blip on the radar. YouTube gets $4 billion out of a system that costs $6 billion to maintain, and it pays off in other ways, yet the reality of the total is ignored by a lot of players, some of them intentionally averting their eyes from the total image, and no one asks why that is happening.
So whilst we look at the Wired byline ‘Legislation to force major tech platforms to better tackle child sexual abuse on their networks is also “forthcoming”, a Home Office spokesperson has confirmed‘, we need to seriously ask whether these legislation people have any idea of what they are doing. The moment these people move to another nation, the entire setting changes and they have to start from scratch again, all whilst there is no solution and none will be coming any day soon. You might think that relocating solves anything, but it does not, because the facilitators of these images can pick up their server and move from place to place whilst they make millions, all whilst the payers are still out of reach of criminal prosecution. And whilst we go round the magic roundabout, we get from point to point never having a clue of the stage we are at; we are merely going in circles, and that is the problem we face. Until the short-sighted blaming stops and governments truly sit down with tech firms to find a solution, we are left in the middle without any solution, merely with the continued realisation that we failed our children.
We have dire tech needs and we need to make a cold list of what we need, and the first thing we need to do is stop blaming tech firms for a situation they did not create. Consider blaming Johannes Gutenberg for the creation of the printing press: he created it around 1440, basically to make the Bible available to all (before that, only rich people could afford a Bible), yet he would be the one accused of aiding the spread of Mein Kampf by Adolf Hitler. That is what we face: we blame YouTube and Google for something they never did and optionally never considered facing. In 1826 Joseph Nicéphore Niépce made the first photograph (the ancestor of the cameras we know today), and within a few decades Julien Vallou de Villeneuve was using the medium to photograph naked women; should Joseph Nicéphore Niépce be held accountable? We all seem to say yes and blame Google, but it had little to no control at all. A system like the one Google made was not meant for the tiny fraction abusing it, yet that is what is happening right now, and we need to take a step back and consider what we are doing. I am not claiming that Google is a saint, yet we refuse to hold Microsoft to account for their 97.5% operating system, whilst we go to all lengths to prosecute Google over the 0.00000000925% of materials produced (actually it is up to 1/24th of that, if not smaller) by others abusing the YouTube system, all whilst the problem is a lot larger and is beyond almost any tech firm. So why are we doing that?
It becomes clear when we add last year’s CNN article to the process. They gave us “Frustrated that those regulators are moving too slowly, Congress, with support from Democrats and Republicans, will use its investigative power for a top-to-bottom review of the tech industry, and focus on the biggest companies. Congress cannot break up companies under existing laws, but it could cook up new ones — and Sen. Elizabeth Warren of Massachusetts, who’s established herself as Democrats’ big ideas leader in 2020, already has a plan to break up the largest tech monopolies.” (at https://edition.cnn.com/2019/06/04/politics/washington-turn-against-tech-companies/index.html). I believe that this is not about the materials; it is about getting a handle on the company, and flaming conversations bring emotional responses, the quickest way to push voters into the area where they are most useful. Google is still too big for politicians, so they push and push until something gives, hoping that the people will be malleable to a much larger extent than the tech companies ever were.
Let’s face it: how many companies are actually interested in fixing a problem that covers 0.00000000925% of their materials? That is the actual question! The police cannot go after it, and these politicians are unwilling to adjust the laws so that paedophiles are actually prosecuted. As such the entire situation does not make sense, and tech firms are suiting up for their defence; that is all the politicians have enabled. Now the politicians, through the media, hope for enough outrage, and we see the fallout: those politicians are willing to endanger the lives of children by not seeking an actual solution, but a solution that fits their needs, and those two do not align. In this, both sides of the aisle on a global scale are guilty; both the elected and unelected (this term) parties are equally guilty of setting a stage that suits them, not one that solves the problem.
We seemingly forget about that part of the equation; I wonder why that is.