'Is my phone listening?': Facebook mystery
If our social media apps aren't listening to our offline chats, how do tech companies explain all those bizarrely specific ads?
That was one of the curlier questions thrown to the panel on ABC's Q&A on Monday night as it discussed the perils and cultural implications of social media.
Many of the topics raised in Monday's program were inspired by the hugely successful Netflix documentary/drama The Social Dilemma, which explored the impact of social media platforms on everyday life as well as wider issues of election interference, fake news and the global culture wars.
But it was the last, simple question posed to the panel that drew the loudest knowing chuckle from the audience and panellists.
"How do you explain a recent visit I made to a car dealership to have my car serviced where I was asked if I wanted windshield wipers and I declined," the question began.
"Within an hour I was getting requests for them and I never searched and it is not in my daily conversation. If my phone is not listening, how did I get the ads?"
Tech reporter and author Marc Fennell said that was a subject he had covered many times on his Download This Show podcast.
"They are adamant they are not using your microphone - and it is very hard to believe them sometimes, but they are adamant about it," Fennell said.
"Without knowing the complete ins and outs of (Facebook's) algorithm - which we should know, because it's marketed to us - I would say they probably gathered your location data because you were at a place and they probably concluded you didn't buy a car, so you were probably interested in accessories but that is at a guess."
Over on Twitter, however, many punters weren't buying it.
if they're not listening to us how do they always hear when we say "hey siri"? #qanda — WWJD (@sassandra_d), October 19, 2020
I've turned off Siri and microphone access to all apps, no more targeted ads #qanda they're definitely listening — Lisa Remato (@lisaremato), October 19, 2020
#qanda they're not listening because they collect so much data they don't need to. — Dan Stinton (@danstinton), October 19, 2020
it's not the phones that are listening, unless they're self aware. #qanda — jansant (@Jansant), October 19, 2020
My phone would be bored to death if it was listening to me. #qanda — Jennifer Brown (@urallagirl), October 19, 2020
Fennell said the confusion from the public about Facebook's algorithms, and what the app was really up to, pointed to a larger issue: the need for more transparency from tech companies about why they fed us certain information.
"Part of my issue, and why I would like a thing underneath saying, 'This is why you are being advertised this' is so it doesn't feel creepy," Fennell said.
"I think it's better for the platforms if you take the creepiness out of it, and you have some transparency about why they're selling you stuff. I think it's better for trust."
Also on the panel was former Google design ethicist Tristan Harris, one of the stars of The Social Dilemma, who lifted the lid on techniques social media apps used to keep us engaged, and the algorithms that drove specific content.
Another member of the audience said he was compelled to delete all his social media apps after watching the documentary, but wondered if the lack of connection to social groups and relevant information meant he was missing out.
"The message of the film is not expecting that everyone will delete social media apps," Mr Harris replied.
"But that's what makes these technology platforms inhumane: we are actually forced to use platforms that are contaminated or toxic for the public sphere and our mental health. We don't really have a choice … you'll be socially excluded if you leave by yourself."
Mr Harris said "non-extractive, non-privately held companies" could be the answer, like Wikipedia - a collaborative project built on "collective intelligence for bottom-up sense making".
On the other hand, apps like Facebook, YouTube and Instagram were solely interested in keeping users consuming content so they could make their profits.
"Ultimately the business model of unchecked morality and attention harvesting is what has to change fundamentally but when you have that DNA it is hard to change it mid-flight," Mr Harris said.
"They say you have to fire 70 per cent of employees to restart the culture of the company and until they change that, everything else is like band aids on broken elbows."
Many on the panel, which also included Australia's eSafety Commissioner Julie Inman Grant and cyberpsychologist Jocelyn Brewer, agreed the responsibility fell on tech companies to protect users from misinformation and online harassment.
Ms Inman Grant said while the internet had been an essential resource for information, entertainment and social connection during COVID-19 in particular, it was a double-edged sword.
"It has become a much more hostile, much more toxic place because I think people can hurl abuse and it's mostly targeted abuse, targeting women, Aboriginal and Torres Strait Islander people, those who identify as LGBTIQ and other vulnerable groups, and they can do so with relative impunity," she said.
"Just having your Twitter or Facebook account taken down is really not a deterrent.
"I also don't think we're going to regulate or arrest our way out of the problem. Social media holds a magnifying glass up to society, so the societal ills of misogyny and prejudice and racism are surfaced and amplified. I absolutely agree with Tristan, that we have to shift the responsibility back onto platforms."
Another question came from writer and activist Carly Findlay, who said she had endured an "enormous amount of online abuse, from disability and race hate speech to death threats".
She said that while the response from the eSafety Commissioner and police had been compassionate when she reported abuse, she wanted to know when they would have more power to act.
Ms Inman Grant explained the Federal Government was currently reforming the Online Safety Act so the commissioner would have the power to compel take-downs of adult cyber abuse, as her office already does, with success, in areas of youth cyber bullying and image-based abuse.
Ms Inman Grant also discussed eSafety's Safety by Design initiative, which encourages companies to put user safety and rights at the forefront.
"The whole idea of Safety by Design is that if you build the digital roads, you also need to erect the guard rails, and you need to police the roads for dangerous drivers so other users don't end up being the casualties," Ms Inman Grant said.
"But I strongly believe it takes cultural change (from) moving fast and breaking things and profits at all cost, to actually building platforms and designing them with safety and human dignity at the core.
"I think that is an easier proposition, something they can do, and a journey we have been going on with the companies through Safety by Design, versus what Tristan is suggesting: changing wildly successful revenue models.
"What government is going to take the economic engine of the US economy during the worst depression and tell them you've got to change the way you make money? They are businesses looking for profit and they answer to shareholders."
Originally published as 'Creepy' Facebook move has us stumped