Mariana Mazzucato, in her book The Value of Everything:
Yet in presenting themselves as modern-day heroes, and justifying their record profits and cash mountains, Apple and other companies conveniently ignore the pioneering role of government in new technologies. Apple has unashamedly declared that its contribution to society should not be sought through tax but through recognition of its great gizmos. But where did the smart tech behind those gizmos come from? Public funds. The Internet, GPS, touchscreen, SIRI and the algorithm behind Google – all were funded by public institutions. Shouldn’t the taxpayer thus get something back, beyond a series of undoubtedly brilliant gadgets? Simply to pose this question, however, underlines how we need a radically different type of narrative as to who created the wealth in the first place – and who has subsequently extracted it.
Mazzucato argues we need new stories and new ideas to shape how we think about value, capitalism, and economics. I’ve only just started her book but she’s building a strong case.»
Time is weird right now. Feilding Cage found out why:
Think back to when you were asked to stay home to prevent the spread of the new coronavirus. Did that period go by quickly or slowly?
Most New Yorkers, for example, were asked to stay home beginning some time in March, and many seemed to note that April flew by despite the repetitive days.
Craig Callender, a professor of philosophy at the University of California San Diego, explains that we’re making this judgment based on an event recalled from our longer-term memory.
“If you think of every salient event as ticks of the clock, there weren’t that many ticks in April so it feels like time went by really fast,” he said.
The full piece has five simple tests to illustrate how people perceive time. Worth a throw if you’ve found yourself baffled by how slippery the days and hours have seemed lately.»
Patrick Wintour, writing about the brewing cold war between China and the US:
China has been lucky in its enemy. Just as China has courted its allies, Trump has insulted his. Mira Rapp-Hooper, in her new book Shields of the Republic, documents both how Trump has gloried in the destruction of alliances, and the price the US is paying. She concludes: “Trump does not need legally to sever treaty alliances – by treating them as protection rackets for which the protected parties can never pay enough, he obviates them. By embracing adversaries, he challenges the very notion that his allies share threats.” Not surprisingly some Chinese diplomats would welcome Trump’s re-election, and another swing of the wrecking ball he brings to the western alliance.
Yet at the 11th hour there may be a reversal of fortunes, largely caused by China behaving as foolishly as Trump.
The Telegraph received leaked documents about the “dystopian censorship machine” that is Douyin, the Chinese version of TikTok.
Laurence Dodds explains:
One system can use facial recognition to scan live streamers’ broadcasts and guess their age, reporting them to a human moderator if they appear under 16.
Another checks whether users’ faces match their state ID cards before letting them stream, automatically excluding foreigners and people from Hong Kong.
Another system assigns streamers, who are expected to uphold “public order and good customs”, a “safety rating”, similar to a “credit score”. If the score dips below a certain level, they are punished automatically.
That’s quite the surveillance system, if it works as described. It’s not hard to imagine how repressive it could be, or how effectively it could enforce cultural hegemony. That’s especially important, given that the platform has an overwhelmingly young audience.
The article goes on to mention how a live stream was shut down because a British man appeared on camera for a few minutes. “Foreigners without government ‘permission’” aren’t allowed on the platform.
Dodds asked TikTok how much, if any, of this tech is part of their app. The company wouldn’t answer. They’ve been trying to separate themselves from Douyin for a while now. Dodging these questions isn’t the way to do it.
Meanwhile, speech and text recognition is used to ferret out sins such as “feudal superstition”, defamation of the Communist Party and even ASMR, which is banned because it has become too “pornographic”.
Even a broken clock is right twice a day.»
Facebook hired two civil rights experts – Laura Murphy and Megan Cacace – to write a report on the company’s practices. It went as well as you’d expect.
Judd Legum covered the report for Popular Information:
The report zeroes in on what’s troubling about Facebook’s policies, as articulated by Clegg and Zuckerberg. They do not represent a commitment to free expression. The policies privilege expression of the powerful over all other people. This is not just unfair — it makes it even more challenging to protect civil rights on the platform.
Elevating free expression is a good thing, but it should apply to everyone. When it means that powerful politicians do not have to abide by the same rules that everyone else does, a hierarchy of speech is created that privileges certain voices over less powerful voices. The prioritization of free expression over all other values, such as equality and non-discrimination, is deeply troubling to the Auditors. Mark Zuckerberg’s speech and Nick Clegg’s announcements deeply impacted our civil rights work and added new challenges to reining in voter suppression.
The report recommends Facebook reverse these policies, but it’s a recommendation that Facebook will almost certainly ignore. And with two sets of policies around speech — one for the powerful and one for everyone else — can Facebook ever effectively protect civil rights?
Free speech without safeguards isn’t free speech: it’s the status quo, with all the impediments to free speech (for some) and all the failures the status quo entails.
Prioritising free expression above all else feels like a simple way to ensure that everyone can speak freely. But it doesn’t work. Not in a world where things like discrimination, bigotry and marginalisation exist. They all, explicitly and implicitly, create an environment where only certain people can speak and even fewer people will be heard.»
A tech exec by the name of Chris Larsen wants to install high-def security cameras all around San Francisco to help battle the city’s crime. He thinks that’s a good idea.
Nellie Bowles spoke to him for the New York Times:
In San Francisco, where many locals push for this kind of police reform, those same locals are tired of the break-ins. So how do they reconcile “defund the police” with “stop the smash and grabs”?
Mr. Larsen believes he has the answer: Put security cameras in the hands of neighborhood groups. Put them everywhere. He’s happy to pay for it.
There are countless reasons to have a problem with this. Pointing out that Nextdoor – a social media platform for local neighbourhoods – is a cesspit of racism, fear mongering, and profiling is only the most glib of them.
There’s no reason to think that pervasive surveillance in the hands of “neighbourhood groups” won’t be the same.
Unfortunately, Bowles’s article doesn’t engage with any of the potential pitfalls in Larsen’s plan outside of privacy concerns. The whole thing reads more like a puff piece or an ad than a measured assessment of something that would affect the lives of everyone in the city, if Larsen follows through with his idea.
He argued that trust [with law enforcement] will come in the form of full city camera coverage, so police can play a smaller, more subtle role. Individual vigilantism will not work, he argued, but strong neighborhoods with continuous video feeds on every corner will.
“That’s the winning formula,” Mr. Larsen said. “Pure coverage.”
Police do need to play a smaller role in people’s lives. But they need to be replaced with well-funded support structures, like mental health facilities, safe injecting rooms and drug rehabilitation groups, and all the other things that make up a robust safety net. That’s how you restore trust in local communities.
Pervasive surveillance isn’t a sign of trust. Nor is it a path to it. We won’t build a healthy community by watching each other all of the time.»
It may soon be illegal to make discriminatory, racially biased 911 calls in San Francisco.
The “CAREN Act” (Caution Against Racially Exploitative Non-Emergencies) was introduced on Tuesday at a San Francisco Board of Supervisors meeting by Supervisor Shamann Walton.
The ordinance’s name is a twist on “Karen,” the name social media gives people making racially biased 911 calls.
Beautiful. I’ve never wanted to be the person who comes up with acronyms for things more.»
Facebook and Google are weighing up their options after a new security law out of Beijing that “mandates police censorship and covert digital surveillance” in Hong Kong came into force.
They seem to have three options:
- Comply, giving up on their ideals of free speech
- Refuse and face fines and potentially jail time
- Pull out of Hong Kong
TikTok have taken path three. What would it mean if Facebook and Google did the same?
Karen Chiu explains how popular the platforms are for the South China Morning Post (which is owned by Alibaba, a multibillion dollar Chinese tech company):
Facebook, Hong Kong’s most popular social network, has a penetration rate of over 80 per cent, according to the latest available data from Statista. WhatsApp, the top messaging app, trails not far behind at just under 80 per cent.
Instagram comes in at around 60 per cent. On the other hand, the mainland’s unrivalled social king, WeChat, is used by just 54 per cent in Hong Kong.
Tsao said Google, which pulled its search engine out of the mainland in 2010 after the company suffered a major hack, is ubiquitous in Hong Kong.
People in Hong Kong would face a terrible situation if Facebook, Google, and Twitter left the region. There are few viable options for information sharing if China’s Great Firewall arrives in earnest. VPNs would only be so effective: the law applies regardless of where the platform or server is located.
This, of course, is the point. The law put in place by Beijing is about stifling dissent. It’s about controlling information, one way or the other.»
Samantha Floreani, arguing that consternation over TikTok ignores the reality that other, US-based social media platforms are doing the same things TikTok is being criticised for:
This is not to defend TikTok – their data collection, use and disclosure practices are undoubtedly invasive, and present acute privacy and security risks. We should absolutely be thinking critically about how any app handles information, especially those targeted specifically at users under 18. Of course, the context of an app such as this and its links to China should also be considered – we all know China has a deeply concerning record when it comes to respecting people’s privacy or human rights more broadly.
But the data habits of big tech companies should not be framed as an “over there” problem.
Some of those who recently expressed concern about data harvesting are likely to be fair-weather friends. What we’re seeing is international politics thinly veiled as a data ethics issue, without any gumption to actually address the underlying problems that are much, much bigger than TikTok.
If you’re going to use something as a proxy to get after China, pick a better target than TikTok.»
Fuji-Q Highland near Tokyo re-opened last month after its virus shutdown.
It asked riders to avoid screaming when they go on its rollercoasters, to minimise spreading droplets, and instead “scream inside your heart”.
Aren’t we all?»