Tag Archives: Getty

Choices of representation

This happens, we all make choices and that is fine, but when the BBC makes choices of representation there is a larger catch and we need to look at that. In this, I feel largely uneasy regarding the choices the BBC made, but I can agree that we are all allowed to make choices, so let's take a look and see whether their choices were really wrong.

First there is the title ‘UK video games market value dipped by 5.6% in 2022’ (at https://www.bbc.com/news/entertainment-arts-65175394) and that is perfectly fine. Yet is the representation of a PS5 image correct? To see this we need to consider “The PlayStation 5 has sold 33.54 million units in 28 months, while the Xbox Series X|S sold 20.86 million units. The PlayStation 5 has a 61.7 percent marketshare (+2.0% year-over-year), compared to 38.3 percent for the Xbox Series X|S (-2.0% year-over-year)” (source: VGChartz), yet in a similar setting we always see this. You see, the comparison (from my point of view) would need to be the PS5 against the Xbox Series X, whilst it is the Xbox Series S that compares to the PS4 Pro. But Microsoft is spinning the numbers to the extent that we can never do that, because that would show just HOW BAD Microsoft is actually doing. You see, the Xbox Series S is powerful enough to make next-gen games look great, albeit at a lower resolution, and that is what Microsoft was toiling with: it was more powerful than the PS4 Pro (not by much) but it was aggressively priced to do so, and the Series S misses the drive and can only work with digital products (not a real issue in today's market). But that alone sets a different stage and it makes Microsoft less than a winner in this.

The second tier (completely unmentioned) is Nintendo, who is the massive winner here. In 2021 they had (according to released numbers) $15,990,000,000 in revenue, whilst they ‘only’ had $14,011,000,000 in revenue in 2022, a drop to around 87.6% of the previous year. There is your 5% market fall, and I reckon that the fall will be the largest representation in the UK as well. No matter how great Nintendo is doing, losing out on nearly $2 billion globally will do that, and the numbers I did not look at (as I do not give a hoot) are the mobile game revenues, which I expect show a somewhat similar drop. Yet the article does not show any of that, does it?
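As a quick sanity check on the arithmetic above (the two revenue figures are the released numbers quoted in the text; the calculation itself is mine, a minimal sketch):

```python
# Sanity check of the Nintendo year-over-year revenue drop cited above.
# The two revenue figures come from the released numbers quoted in the text;
# everything else is illustrative.
revenue_2021 = 15_990_000_000  # USD, Nintendo revenue in 2021
revenue_2022 = 14_011_000_000  # USD, Nintendo revenue in 2022

retained = revenue_2022 / revenue_2021   # fraction of 2021 revenue kept
drop_abs = revenue_2021 - revenue_2022   # absolute year-over-year decline

print(f"retained: {retained:.1%}")                    # -> retained: 87.6%
print(f"decline: ${drop_abs / 1e9:.2f} billion")      # -> decline: $1.98 billion
```

Which confirms the ~87.6% figure, and also shows the global decline is closer to $2 billion than $1.8 billion.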
So what was Emma Saunders doing? And why did she use a Getty image of Sony? The last one was the cherry on the cake, but it matters to me. The losses are clearly seen, even if unexpected, in the Nintendo and mobile software setting, and that is before I look at the funny money element of mobile gaming. If anything, Sony was a clear winner in 2022, and then we get “Pokémon merchandise was the top performer. According to the ERA Yearbook 2023, top performing titles in the UK in 2022 were Lego Star Wars: The Skywalker Saga, PlayStation exclusives God of War Ragnarök and Horizon Forbidden West, Pokémon Legends: Arceus on Switch and Elden Ring” which is fine. But that implies that Nintendo and Sony did amazingly well amid these losses, so why did we not see a Getty image of Microsoft there? Interesting how the BBC is shielding Microsoft from the more established players that do bring home the bacon. The top performers do not include a single Microsoft exclusive, why is that? It is perfectly fine that they failed to perform, but in that setting the 5% drop would be in the Microsoft realm, even as we see that Nintendo did a little less (and still was ahead of everyone else).

So why did the BBC take to the streets with choices of representation by setting the image of Sony, whilst Sony has been making the numbers and growing market share? Is a stakeholder at the BBC shielding someone from a bad news reflection? Just how neutral is the BBC at present?

Just asking.


Filed under Gaming, Media

Late to the party

It happens, we are all late to a party at times, and I am no exception. I was busy looking at the stupidity of law, the stupidity of plaintiffs, waiting on the courts, all whilst the approach to common sense was kept at bay. As such I did not read the BBC article ‘MP Maria Miller wants AI ‘nudifying’ tool banned’ until today. The article (at https://www.bbc.com/news/technology-57996910) shows the stage of what is called ‘a nudifying tool’. We can argue that the tool does not exist, because AI does not exist. You see, what some call AI is nothing but deeper learning. Someone took a deeper learning approach, accepted and acknowledged the setting that ‘sex sells’ and used that to create a stage. It is a lot worse than it sounds. You see, when we consider ‘Nudifier APK for Android Free Download’ and the hundreds of millions of horny guys (of all ages), we see a market, a market for organised crime to exploit as an injector of backdoors (no pun intended), and in this Maria Miller loses from stage one. You see, the app gives us “allows you to pixelate parts of your photo so that the illusion that your body is naked is also there”; they know how it will be used, but this statement washes their hands. Nothing is ever free and this free download comes at a price, I reckon. And with “By the way, packed with decorative elements for the picture, for example, the magazine cover is also paid separately. And that’s $ 0.99, which is the same price as the app.” we see a maker looking to become a multimillionaire by September 30th.

So when we get to see some Getty image of a despairing woman with the text “Currently nudification tools only work for creating naked women”, we wonder what on earth is going on, even as we also get ‘one developer acknowledged was sexist’. Just one? How many developers were asked? It becomes a larger stage with “similar services remain on the market, many using the DeepNude source code, which was made publicly available by the original developers. While many often produce clumsy, sometimes laughable results, the new website uses a proprietary algorithm which one analyst described as “putting it years ahead of the competition”.” Is no one pausing at the one small part, ‘a proprietary algorithm’? Proprietary algorithms are never handed over; there is a larger stage coming, and even as we see the actions of Maria Miller, I wonder how far it will go. The moment someone attaches the word ‘art’ whilst not taking responsibility for the images used (the user does that), where will it end?

And I am right, the end of the article gives us “The goal is to find what kind of uses we can give to this technology within the legal and ethical [framework].” But in the meantime we will see hundreds or even thousands of senior high school girls see nudified versions of their images, all whilst the stage was known for the better part of 2 years. I reckon that within 5 years the glossy magazines will all have nudified versions of every celebrity; that's how the money flows and that is how the station will go. Is it right? Of course it is not, but the larger stage of ‘sex sells’ has been out there for decades and the law never did anything to stop it. Some MPs listened to silly old clerics and merely attacked porn; now we see that the larger station is evolving and the involved parties are all wondering what to do next. And in this no one takes notice of ‘one developer acknowledged was sexist’. Just the one? The UK has approximately 408,000 software development professionals. The US has almost 4 million developers. And we see one developer who acknowledged it was sexist? I have not even included the EU and Asian developers. So in all this, I reckon we have a much larger problem; optionally the writer Jane Wakefield needs to take another look at the article. So whilst millions of 14-23 year old boys are looking to find DeepSukebe’s website, hoping to reveal a slightly more interesting view of Olivia Wilde, Laura Vandervoort, Leslie Bibb, Emma Watson, Paris Hilton and the cast of Baywatch, we need to consider that this was always going to happen. There are shady sides to deeper learning, and whilst the enterprising and greed-driven people are pushing for others to take a look, so are the members of organised crime, and so are the enterprising people who considered an IT solution to push millions of paparazzi out of work. Do you really think that some glossy magazine will ignore images when people cannot tell whether they are real or fake?
Consider the image below: as the technology becomes so good that we can no longer tell the difference in a face we have seen in dozens of movies, do you think we would be able to tell whether the boobies and shrubberies we never saw were real or deepfake? And when the images achieve 2400dpi, do you think the glossy magazines and gossip providers will ignore them when circulation and clicks grow?


Filed under IT, Media, Science