<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Search Results for &#8220;facial recognition&#8221; &#8211; Australian Privacy Foundation</title>
	<atom:link href="https://privacy.org.au/search/facial+recognition/feed/rss2/" rel="self" type="application/rss+xml" />
	<link>https://privacy.org.au</link>
	<description>Defending your right to be free from intrusion</description>
	<lastBuildDate>Wed, 06 Mar 2024 08:46:00 +0000</lastBuildDate>
	<language>en-AU</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	

<image>
	<url>https://privacy.org.au/wp-content/uploads/2021/04/cropped-logo_horizontal2-32x32.png</url>
	<title>Search Results for &#8220;facial recognition&#8221; &#8211; Australian Privacy Foundation</title>
	<link>https://privacy.org.au</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Your face for sale: anyone can legally gather and market your facial data without explicit consent</title>
		<link>https://privacy.org.au/2024/03/06/your-face-for-sale-anyone-can-legally-gather-and-market-your-facial-data-without-explicit-consent/</link>
		
		<dc:creator><![CDATA[Margarita Vladimirova]]></dc:creator>
		<pubDate>Wed, 06 Mar 2024 08:46:00 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5341</guid>

					<description><![CDATA[Margarita Vladimirova, PhD in Privacy Law and Facial Recognition Technology, Deakin University The morning started with a message from a friend: “I used your photos to train my local version of Midjourney. I hope you don’t mind”, followed up with generated pictures of me wearing a flirty steampunk costume. I did in fact mind. I&#8230; <span class="excerpt-more"><a href="https://privacy.org.au/2024/03/06/your-face-for-sale-anyone-can-legally-gather-and-market-your-facial-data-without-explicit-consent/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<figure>
    <figure style="width: 744px" class="wp-caption aligncenter"><a href="https://www.shutterstock.com/image-photo/futuristic-technological-scanning-face-beautiful-woman-1554013514"><img fetchpriority="high" decoding="async" src="https://images.theconversation.com/files/579102/original/file-20240301-28-tzp738.jpg?ixlib=rb-1.1.0&#038;rect=956%2C85%2C6119%2C4218&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" width="754" height="520" alt="" /></a><figcaption class="wp-caption-text">Kitreel/Shutterstock</figcaption></figure>
</figure>

<p><span><a href="https://theconversation.com/profiles/margarita-vladimirova-1514577">Margarita Vladimirova</a>, PhD in Privacy Law and Facial Recognition Technology, <em><a href="https://theconversation.com/institutions/deakin-university-757">Deakin University</a></em></span></p>

<p>The morning started with a message from a friend: “I used your photos to train my local version of Midjourney. I hope you don’t mind”, followed up with generated pictures of me wearing a flirty steampunk costume.</p>

<p>I did in fact mind. I felt violated. Wouldn’t you? I bet Taylor Swift did when <a href="https://theconversation.com/taylor-swift-deepfakes-new-technologies-have-long-been-weaponised-against-women-the-solution-involves-us-all-222268">deepfakes of her hit the internet</a>. But is the legal status of my face different from the face of a celebrity?</p>

<p>Your facial information is a unique form of personal sensitive information. It can identify you. Intense profiling and mass government surveillance <a href="https://www.forbes.com/sites/kalevleetaru/2019/05/06/as-orwells-1984-turns-70-it-predicted-much-of-todays-surveillance-society/?sh=38a97b4e11de">receives much attention</a>. But businesses and individuals are also using tools that <a href="https://www.sbs.com.au/news/article/creepy-and-invasive-kmart-bunnings-and-the-good-guys-accused-of-using-facial-recognition-technology/h08q8evb1">collect</a>, <a href="https://www.afr.com/technology/how-clearview-ai-unleashed-a-global-dystopia-20230929-p5e8lc">store</a> and modify facial information, and we’re facing an unexpected wave of <a href="https://deepai.org/machine-learning-model/text2img">photos</a> and <a href="https://theconversation.com/what-is-sora-a-new-generative-ai-tool-could-transform-video-production-and-amplify-disinformation-risks-223850">videos</a> generated with artificial intelligence (AI) tools.</p>

<p>The development of legal regulation for these uses is lagging. At what levels and in what ways should our facial information be protected?</p>

<h2>Is implied consent enough?</h2>

<p>The Australian <a href="https://www.legislation.gov.au/C2004A03712/latest/text">Privacy Act</a> considers biometric information (which would include your face) to be a part of our personal sensitive information. However, the act doesn’t <em>define</em> biometric information.</p>

<p>Despite its drawbacks, the act is currently the main legislation in Australia aimed at facial information protection. It states biometric information cannot be collected without a person’s consent.</p>

<p>But the law doesn’t specify whether it should be <a href="https://www.ipc.nsw.gov.au/fact-sheet-consent">express or implied consent</a>. Express consent is given explicitly, either orally or in writing. Implied consent means consent may reasonably be inferred from the individual’s actions in a given context. For example, if you walk into a store that has a sign “facial recognition camera on the premises”, your consent is implied.</p>

<figure class="align-right zoomable">
            <figure style="width: 590px" class="wp-caption alignright"><a href="https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=1000&#038;fit=clip"><img decoding="async" alt="A poster at a supermarket that says camera technology trial in progress, partially obscured by a couple of bins." src="https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=237&#038;fit=clip" srcset="https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=1067&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=1067&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=1067&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=1340&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=1340&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=1340&#038;fit=crop&#038;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" width="600" height="1067" /></a><figcaption class="wp-caption-text">An inconspicuous sign that flags camera technology trial is in progress counts as implied consent. &#8211; Margarita Vladimirova</figcaption></figure>
        </figure>

<p>But using implied consent opens our facial data up to potential exploitation. <a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/how-your-data-is-used/articles/kmart-bunnings-and-the-good-guys-using-facial-recognition-technology-in-store">Bunnings, Kmart</a> and <a href="https://www.theguardian.com/business/2023/feb/19/woolworths-expands-self-checkout-ai-that-critics-say-treats-every-customer-as-a-suspect">Woolworths</a> have all used easy-to-miss signage that facial recognition or camera technology is used in their stores.</p>

<h2>Valuable and unprotected</h2>

<p>Our facial information has become so valuable, <a href="https://www.theguardian.com/australia-news/2023/oct/24/australian-federal-police-afp-pimeyes-facial-recognition-facecheck-id-search-engine-platform">data companies such as Clearview AI and PimEyes</a> are mercilessly hunting it down on the internet <a href="https://onezero.medium.com/i-got-my-file-from-clearview-ai-and-it-freaked-me-out-33ca28b5d6d4">without our consent</a>.</p>

<p>These companies put together databases for sale, used not only by the police in various countries, <a href="https://www.theguardian.com/australia-news/2023/oct/24/australian-federal-police-afp-pimeyes-facial-recognition-facecheck-id-search-engine-platform">including Australia</a>, but also by <a href="https://www.clearview.ai/developer-api">private companies</a>.</p>

<p>Even if you deleted all your facial data from the internet, you could easily be captured in public and appear in some database anyway. Being in someone’s TikTok video <a href="https://www.abc.net.au/news/2022-07-14/tiktok-video-maree-melbourne-flowers/101228418">without your consent</a> is a prime example – in Australia this is legal.</p>



<p>Furthermore, we’re now also contending with generative AI programs such as Midjourney, DALL-E 3, Stable Diffusion and others. Anyone can easily not only collect our facial information, but modify it too.</p>

<p>Our faces are unique to us; they’re part of what we perceive as ourselves. Yet they have no special legal status or special legal protection.</p>

<p>The only action you can take to protect your facial information from aggressive collection by a store or private entity <a href="https://www.oaic.gov.au/privacy/privacy-complaints/lodge-a-privacy-complaint-with-us">is to complain</a> to the office of the Australian Information Commissioner, which may or may not result in an investigation.</p>

<p>The same applies to deepfakes. The Australian Competition and Consumer Commission will consider only activity that applies to trade and commerce, for example if a <a href="https://www.theguardian.com/technology/2022/mar/18/accc-takes-meta-to-court-over-facebook-scam-ads-depicting-australian-identities">deepfake is used for false advertising</a>.</p>

<p>And the Privacy Act doesn’t protect us from other people’s actions. I didn’t consent to someone training an AI with my facial information and producing made-up images. But there is no oversight of such use of generative AI tools, either.</p>

<p>There are currently no laws that <em>prevent</em> other people from collecting or modifying your facial information.</p>



<h2>Helping the law catch up</h2>

<p>We need a range of regulations on the collection and modification of facial information, as well as a stronger legal status for facial information itself. Thankfully, some developments in this area look promising.</p>

<p>Experts at the University of Technology Sydney have proposed a comprehensive legal framework for <a href="https://www.uts.edu.au/human-technology-institute/projects/facial-recognition-technology-towards-model-law">regulating the use of facial recognition technology</a> under Australian law.</p>

<p>It contains proposals for regulating the first stage of non-consensual activity: the collection of personal information. That may help in the development of new laws.</p>

<p>Regarding photo modification using AI, we’ll have to wait for announcements from the newly established government <a href="https://www.minister.industry.gov.au/ministers/husic/media-releases/new-artificial-intelligence-expert-group">AI expert group</a> working to develop “safe and responsible AI practices”.</p>

<p>There are no specific discussions about a higher level of protection for our facial information in general. However, the government’s recent <a href="https://www.ag.gov.au/rights-and-protections/publications/government-response-privacy-act-review-report">response to the Attorney-General’s Privacy Act review</a> has some promising provisions.</p>

<p>The government has agreed further consideration should be given to enhanced risk assessment requirements in the context of facial recognition technology and other uses of biometric information. This work should be coordinated with the government’s ongoing work on Digital ID and the National Strategy for Identity Resilience.</p>

<p>As for consent, the government has agreed in principle that the definition of consent required for biometric information collection should be amended to specify it must be voluntary, informed, current, specific and unambiguous.</p>

<p>As facial information is increasingly exploited, we’re all waiting to see whether these discussions do become law – hopefully sooner rather than later.</p>

<hr />

<p><em>Correction: we have amended a sentence to clarify Woolworths use camera technology but not necessarily facial recognition technology.</em></p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/your-face-for-sale-anyone-can-legally-gather-and-market-your-facial-data-without-explicit-consent-224643">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Not Big Brother, but close: a surveillance expert explains some of the ways we’re all being watched, all the time</title>
		<link>https://privacy.org.au/2022/12/19/not-big-brother-but-close-a-surveillance-expert-explains-some-of-the-ways-were-all-being-watched-all-the-time/</link>
		
		<dc:creator><![CDATA[Ausma Bernot]]></dc:creator>
		<pubDate>Mon, 19 Dec 2022 01:23:03 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5255</guid>

					<description><![CDATA[Ausma Bernot, PhD Candidate, School of Criminology and Criminal Justice, Griffith University A group of researchers studied 15 months of human mobility movement data taken from 1.5 million people and concluded that just four points in space and time were sufficient to identify 95% of them, even when the data weren’t of excellent quality. That&#8230; <span class="excerpt-more"><a href="https://privacy.org.au/2022/12/19/not-big-brother-but-close-a-surveillance-expert-explains-some-of-the-ways-were-all-being-watched-all-the-time/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[

<figure>
    <figure style="width: 744px" class="wp-caption aligncenter"><img loading="lazy" decoding="async" src="https://images.theconversation.com/files/499955/original/file-20221209-20279-c0jq3z.jpeg?ixlib=rb-1.1.0&#038;rect=95%2C107%2C7893%2C4383&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" alt="" width="754" height="419" /><figcaption class="wp-caption-text">Image from Shutterstock</figcaption></figure>
</figure>
<p><span><a href="https://theconversation.com/profiles/ausma-bernot-963292">Ausma Bernot</a>, PhD Candidate, School of Criminology and Criminal Justice, <em><a href="https://theconversation.com/institutions/griffith-university-828">Griffith University</a></em></span></p>

<p>A group of <a href="https://www.nature.com/articles/srep01376;">researchers studied</a> 15 months of human mobility movement data taken from 1.5 million people and concluded that just four points in space and time were sufficient to identify 95% of them, even when the data weren’t of excellent quality.</p>

<p>That was back in 2013.</p>

<p>Nearly ten years on, surveillance technologies permeate all aspects of our lives. They collect swathes of data from us in various forms, and often without us knowing.</p>

<p>I’m a surveillance researcher with a focus on technology governance. Here’s my round-up of widespread surveillance systems I think everyone should know about.</p>

<h2>CCTV and open-access cameras</h2>

<p>Although China has more than 50% of <a href="https://www.comparitech.com/vpn-privacy/the-worlds-most-surveilled-cities/">all surveillance cameras installed</a> in the world (about 34 cameras per 1,000 people), Australian cities are <a href="https://www.comparitech.com/vpn-privacy/the-worlds-most-surveilled-cities/">catching up</a>. In 2021, Sydney had 4.67 cameras per 1,000 people and Melbourne had 2.13.</p>

<p>While CCTV cameras can be used for legitimate purposes, such as promoting safety in cities and assisting police with criminal investigations, their use also poses serious concerns.</p>

<p>In 2021, New South Wales police <a href="https://www.innovationaus.com/facial-recognition-and-the-nsw-protest-crowds/">were suspected of</a> having used CCTV footage paired with facial recognition to find people attending anti-lockdown protests. When questioned, they didn’t confirm or deny whether they had (or whether they would in the future).</p>

<p>In August 2022, the United Nations confirmed CCTV is <a href="https://www.ohchr.org/en/documents/country-reports/ohchr-assessment-human-rights-concerns-xinjiang-uyghur-autonomous-region">being used to</a> carry out “serious human rights violations” against Uyghur and other predominantly Muslim ethnic minorities in the Xinjiang region of Northwest China.</p>

<p>The CCTV cameras in China don’t just record real-time footage. Many are equipped with facial recognition to <a href="https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html">keep tabs on</a> the movements of minorities. And some have reportedly been trialled to <a href="https://www.bbc.com/news/technology-57101248">detect emotions</a>.</p>

<p>The US also has a long history of using CCTV cameras to support racist policing practices. In 2021, Amnesty International <a href="https://www.amnesty.org/en/latest/news/2021/06/scale-new-york-police-facial-recognition-revealed/">reported</a> areas with a higher proportion of non-white residents had more CCTV cameras.</p>

<p>Another issue with CCTV is security. Many of these cameras are open-access, which means they don’t have password protection and can often be easily accessed online. So I could spend all day watching a livestream of someone’s porch, as long as there was an open camera nearby.</p>

<p>Surveillance artist Dries Depoorter’s recent project <a href="https://driesdepoorter.be/thefollower/">The Follower</a> aptly showcases the vulnerabilities of open cameras. By coupling open camera footage with AI and Instagram photos, Depoorter was able to match people’s photos with the footage of where and when they were taken.</p>

<p>There was pushback, with one of the <a href="https://www.inverse.com/input/culture/dries-depoorters-ai-surveillance-art-the-follower-instagram-influencers-photos">identified people saying</a>:</p>

<blockquote>
<p>It’s a crime to use the image of a person without permission.</p>
</blockquote>

<p>Whether or not it is illegal will depend on the specific circumstances and where you live. Either way, the issue here is that Depoorter was able to do this in the first place.</p>

<h2>IoT devices</h2>

<p>An IoT (“Internet of Things”) device is any device that connects to a wireless network to function – think smart home speakers such as Amazon Echo or Google Nest, a baby monitor, or even smart traffic lights.</p>

<p>It’s estimated global spending on IoT devices will <a href="https://acola.org/hs5-internet-of-things-australia/">reach</a> US$1.2 trillion this year. Around 18 billion connected devices form the IoT network. Like unsecured CCTV cameras, IoT devices are easy to hack into if they use default passwords or passwords that have <a href="https://haveibeenpwned.com/">been leaked</a>.</p>

<p>In some examples, hackers have hijacked baby monitor cameras to <a href="https://www.npr.org/sections/thetwo-way/2018/06/05/617196788/s-c-mom-says-baby-monitor-was-hacked-experts-say-many-devices-are-vulnerable/">stalk</a> breastfeeding mums, <a href="https://www.npr.org/sections/thetwo-way/2018/06/05/617196788/s-c-mom-says-baby-monitor-was-hacked-experts-say-many-devices-are-vulnerable/">threaten</a> parents that their baby was being kidnapped, and say creepy things like “<a href="https://www.nbcnews.com/news/us-news/stranger-hacks-baby-monitor-tells-child-i-love-you-n1090046">I love you</a>” to children.</p>

<figure>
            <iframe loading="lazy" width="440" height="260" src="https://www.youtube.com/embed/xbk3OdYBLHA?wmode=transparent&#038;start=0" frameborder="0" allowfullscreen="allowfullscreen"></iframe>
</figure>

<p>Beyond hacking, businesses can also use data collected through IoT devices to further target customers with products and services.</p>

<p>Privacy experts raised the alarm in September over Amazon’s merger agreement with robot vacuum company iRobot. <a href="https://www.fightforthefuture.org/news/2022-09-09-letter-to-the-ftc-challenge-amazon-irobot-deal">A letter</a> to the US Federal Trade Commission signed by 26 civil rights and privacy advocacy groups said:</p>

<blockquote>
<p>Linking iRobot devices to the already intrusive Amazon home system incentivizes more data collection from more connected home devices, potentially including private details about our habits and our health that would endanger human rights and safety.</p>
</blockquote>

<p>IoT-collected data can also change hands with third parties through data partnerships (which are very common), often without customers’ explicit consent.</p>

<figure class="align-center zoomable">
            <figure style="width: 744px" class="wp-caption alignnone"><a href="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=1000&#038;fit=clip"><img loading="lazy" decoding="async" alt="" src="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" srcset="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=338&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=338&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=338&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=424&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=424&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=424&#038;fit=crop&#038;dpr=3 2262w" sizes="auto, (min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" width="754" height="424" /></a><figcaption class="wp-caption-text">Smart speakers with digital assistants consistently raise data privacy concerns among experts.</figcaption></figure>
</figure>

<h2>Big tech and big data</h2>

<p>In 2017, the <a href="https://www.economist.com/leaders/2017/05/06/the-worlds-most-valuable-resource-is-no-longer-oil-but-data">value of big data exceeded</a> that of oil. Private companies have driven the majority of that growth.</p>

<p>For tech platforms, the expansive collection of users’ personal information is business as usual, literally, because more data mean more precise analytics, more effective targeted ads <a href="https://www.facebook.com/business/help/716180208457684?id=1792465934137726">and more revenue</a>.</p>

<p>This logic of profit-making through targeted advertising has been <a href="https://journals.sagepub.com/doi/full/10.1177/1095796018819461">dubbed</a> “surveillance capitalism”. As <a href="https://quoteinvestigator.com/2017/07/16/product/">the old saying</a> goes, if you’re not paying for it, then you’re the product.</p>

<p>Meta (which owns both Facebook and Instagram) <a href="https://www.forbes.com/sites/bradadgate/2022/11/03/revenue-of-alphabet-and-meta-the-digital-duopoly-have-been-slipping/?sh=2ebf3dad2fed">generated</a> almost US$23 billion in advertising revenue in the third quarter of this year.</p>

<p>The vast machinery behind this is well illustrated, if in a dramatised way, in the 2020 documentary The Social Dilemma. It <a href="https://theconversation.com/netflixs-the-social-dilemma-highlights-the-problem-with-social-media-but-whats-the-solution-147351">showed us how</a> social media platforms rely on our psychological weaknesses to keep us online for as long as possible, measuring our actions down to the seconds we spend hovering over an ad.</p>

<figure class="align-center ">
            <figure style="width: 744px" class="wp-caption alignnone"><img loading="lazy" decoding="async" alt="" src="https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" srcset="https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=247&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=247&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=247&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=310&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=310&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=310&#038;fit=crop&#038;dpr=3 2262w" sizes="auto, (min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" width="754" height="310" /><figcaption class="wp-caption-text">A graphic excerpt from Social Dilemma.</figcaption></figure>
</figure>

<h2>Loyalty programs</h2>

<p>Although many people don’t realise it, loyalty programs are one of the biggest personal data collection gimmicks out there.</p>

<p>In a particularly intrusive example, in 2012 one <a href="https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/?sh=706b0cd96668">US retailer</a> sent a teenage girl a catalogue dotted with pictures of smiling infants and nursery furniture. The girl’s angry father went to confront managers at the local store, only to learn that the company’s predictive analytics knew more about his daughter than he did.</p>

<p>It’s estimated 88% of Australian consumers <a href="https://www.oaic.gov.au/privacy/privacy-assessments/loyalty-program-assessment-woolworths-rewards-woolworths-limited">over age 16 are members</a> of a loyalty program. These schemes build your consumer profile to sell you more stuff. Some might even charge you <a href="https://www.abc.net.au/everyday/making-loyalty-cards-worth-your-time-and-money/10998806">sneaky fees</a>, or lure you in with future perks while selling to you at steep prices.</p>

<p>As technology journalist <a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/who-has-your-data/articles/loyalty-program-data-collection">Ros Page notes</a>:</p>

<blockquote>
<p>[T]he data you hand over at the checkout can be shared and sold to businesses you’ve never dealt with.</p>
</blockquote>

<p>As a cheeky sidestep, you could find a buddy to swap your loyalty cards with. Predictive analytics is only strong when it can recognise behavioural patterns. When the patterns are disrupted, the data turn into noise.</p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/not-big-brother-but-close-a-surveillance-expert-explains-some-of-the-ways-were-all-being-watched-all-the-time-194917">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>What do TikTok, Bunnings, eBay and Netflix have in common? They’re all hyper-collectors</title>
		<link>https://privacy.org.au/2022/07/23/what-do-tiktok-bunnings-ebay-and-netflix-have-in-common-theyre-all-hyper-collectors/</link>
		
		<dc:creator><![CDATA[Brendan Walker-Munro]]></dc:creator>
		<pubDate>Sat, 23 Jul 2022 00:16:40 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5080</guid>

					<description><![CDATA[You walk into a shopping centre to buy some groceries. Without your knowledge, an electronic scan of your face is taken by in-store surveillance cameras and stored in an online database. Each time you return to that store, your “faceprint” is compared with those of people wanted for shoplifting or violence. This might sound like science fiction but it’s the reality for many of us. By failing to take our digital privacy seriously – as former human rights commissioner Ed Santow has warned – Australia is “sleepwalking” its way into mass surveillance. <span class="excerpt-more"><a href="https://privacy.org.au/2022/07/23/what-do-tiktok-bunnings-ebay-and-netflix-have-in-common-theyre-all-hyper-collectors/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<figure style="width: 744px" class="wp-caption aligncenter"><img loading="lazy" decoding="async" src="https://images.theconversation.com/files/474987/original/file-20220719-6978-2qdmfk.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" alt="" width="754" height="503" /><figcaption class="wp-caption-text">Image from Shutterstock</figcaption></figure>

<p><span><a href="https://theconversation.com/profiles/brendan-walker-munro-1326958">Brendan Walker-Munro</a>, Senior Research Fellow, <em><a href="https://theconversation.com/institutions/the-university-of-queensland-805">The University of Queensland</a></em></span></p>

<p>You walk into a shopping centre to buy some groceries. Without your knowledge, an electronic scan of your face is taken by in-store surveillance cameras and stored in an online database. Each time you return to that store, your “faceprint” is compared with those of people wanted for shoplifting or violence.</p>

<p>This might sound like science fiction but it’s the reality for many of us. By failing to take our digital privacy seriously – as former human rights commissioner Ed Santow has warned – Australia is “<a href="https://www.theage.com.au/national/we-must-not-sleepwalk-into-mass-surveillance-20220630-p5ay0q.html">sleepwalking</a>” its way into mass surveillance.</p>

<h2>Privacy and the digital environment</h2>

<p>Of course, companies have been collecting personal information for decades. If you’ve ever signed up to a loyalty program like FlyBuys then you’ve performed what marketing agencies call a “<a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/who-has-your-data/articles/loyalty-program-data-collection">value exchange</a>”. In return for benefits from the company (like discounted prices or special offers), you’ve handed over details of who you are, what you buy, and how often you buy it.</p>

<p>Consumer data is big business. In 2019, a <a href="https://www.webfx.com/blog/internet/what-are-data-brokers-and-what-is-your-data-worth-infographic/">report</a> from digital marketers WebFX showed that data from around 1,400 loyalty programs was routinely being traded across the globe as part of an industry <a href="https://clearcode.cc/blog/what-is-data-broker/">worth around US$200 billion</a>. That same year, the Australian Competition and Consumer Commission’s <a href="https://www.accc.gov.au/publications/customer-loyalty-schemes-final-report">review of loyalty schemes</a> revealed that many of these schemes lacked data transparency and even discriminated against vulnerable customers.</p>

<p>But the digital environment is making data collection even easier. When you <a href="https://onlinemasters.ohio.edu/blog/netflix-data/">watch Netflix</a>, for example, the company knows what you watch, when you watch it, and how long you watch it for. But they go further, also <a href="https://seleritysas.com/blog/2019/04/05/how-netflix-used-big-data-and-analytics-to-generate-billions/">capturing data</a> on which scenes or episodes you watch repeatedly, the ratings of your content, the number of searches you perform and what you search for.</p>



<h2>Hyper-collection: a new challenge to privacy</h2>

<p>Late last year, the controversial tech company Clearview AI was <a href="https://www.oaic.gov.au/updates/news-and-media/clearview-ai-breached-australians-privacy">ordered</a> by the Australian information commissioner to stop “scraping” social media for the pictures it was collecting in its massive facial recognition database. Just this month, the commissioner began investigating several retailers for <a href="https://www.abc.net.au/news/2022-07-13/bunnings-kmart-investigated-over-facial-recognition-technology/101233372">creating facial profiles</a> of customers in their stores.</p>

<p>This new phenomenon – “hyper-collection” – represents a growing trend by large companies to collect, sort, analyse and use more information than they need, usually in covert or passive ways. In many cases, hyper-collection is not supported by a truly legitimate commercial or legal purpose.</p>

<h2>Digital privacy laws and hyper-collection</h2>

<p>Hyper-collection is a major problem in Australia for three reasons.</p>

<p>First, Australia’s privacy law wasn’t prepared for the likes of Netflix and TikTok. Despite <a href="https://www.oaic.gov.au/privacy/the-privacy-act/history-of-the-privacy-act">numerous amendments</a>, the <a href="https://www.oaic.gov.au/privacy/the-privacy-act">Privacy Act</a> dates back to the late 1980s. Although former Attorney-General Christian Porter <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">announced a review</a> of the Act in late 2019, it has been held up by the recent change of government.</p>

<p>Second, Australian privacy laws are unlikely on their own to threaten the profit base of foreign companies, especially those located in China. The Information Commissioner has the power to order companies to take certain actions – like it <a href="https://www.afr.com/policy/foreign-affairs/australia-s-tiktok-data-vulnerable-to-access-by-china-staff-20220712-p5b10f">did with Uber in 2021</a> – and can enforce these through court orders. But the penalties aren’t large enough to deter companies with profits in the billions of dollars.</p>



<p>Third, hyper-collection is often enabled by the vague consents we give to get access to the services these companies provide. Bunnings, for example, argued that its collection of your faceprint was allowed because <a href="https://ia.acs.org.au/article/2022/bunnings-doubles-down-on-facial-recognition.html">signs at the entry to its stores</a> told customers facial recognition might be used. Online marketplaces like eBay, Amazon, Kogan and Catch, meanwhile, supply “<a href="https://www.accc.gov.au/media-release/concerning-issues-for-consumers-and-sellers-on-online-marketplaces">bundled consents</a>” – basically, you have to consent to their privacy policies as a condition of using their services. No consent, no access.</p>

<h2>TikTok and hyper-collection</h2>

<p>TikTok (owned by Chinese company ByteDance) has largely replaced YouTube as a way of creating and sharing online videos. The app is powered by an algorithm that has already drawn <a href="https://theconversation.com/tiktoks-secret-algorithm-is-its-greatest-strength-and-could-also-be-its-undoing-176605">criticism</a> for routinely collecting data about users, as well as for ByteDance’s secretive approach to <a href="https://www.lowyinstitute.org/the-interpreter/unique-power-tiktok-s-algorithm">content moderation and censorship</a>.</p>

<p>For years, TikTok executives have been telling governments that <a href="https://www.aspistrategist.org.au/its-time-tiktok-australia-came-clean/">data isn’t stored in servers on the Chinese mainland</a>. But these promises might be hollow in the wake of recent allegations.</p>



<p>Cybersecurity experts now claim that not only does the TikTok app <a href="https://www.smartcompany.com.au/technology/tiktok-chinese-servers-aussie-cybersecurity/">routinely connect to Chinese servers</a>, but that users’ data is accessible to ByteDance employees, including the mysterious Beijing-based “Master Admin”, who has <a href="https://www.buzzfeednews.com/article/emilybakerwhite/tiktok-tapes-us-user-data-china-bytedance-access">access to every user’s personal information</a>.</p>

<p>Then, just this week, it was alleged that TikTok can also access <a href="https://www.abc.net.au/news/2022-07-18/tiktok-users-warned-the-platform-is-harvesting-personal-data/13977370">almost all the data</a> contained on the phone it is installed on – including photos, calendars and emails.</p>

<p>Under China’s national security laws, the government can order tech companies to <a href="https://www.sbs.com.au/news/article/so-what-if-china-can-access-your-tiktok-data/mr1anx97k">pass on that information</a> to police or intelligence agencies.</p>

<h2>What options do we have?</h2>

<p>Unlike in a physical store, we don’t get a lot of choice about consenting to digital companies’ privacy policies and how they collect our information.</p>

<p>One option – supported by encryption expert Vanessa Teague at ANU – is for consumers simply to delete offending apps until their creators are <a href="https://www.sbs.com.au/news/article/so-what-if-china-can-access-your-tiktok-data/mr1anx97k">willing to submit to greater data transparency</a>. Of course, this means locking ourselves out of those services, and it will only make an impact on these companies if enough Australians join in.</p>



<p>Another option is “opting out” of intrusive data collection. We’ve done this before – when My Health Record shifted to an opt-out model in 2019, a record number of us <a href="https://www.yourlifechoices.com.au/health/my-health-record-an-expensive-white-elephant-critics-say/">opted out</a>. Though these opt-outs reduced the usefulness of that <a href="https://www.theguardian.com/commentisfree/2018/jul/20/there-is-no-social-license-for-my-health-record-australians-should-reject-it">digital health record program</a>, they did demonstrate that Australians can take their data privacy seriously.</p>

<p>But how exactly can Australians opt out of a massive social app like TikTok? Right now, they can’t – perhaps the government needs to explore a solution as part of its review.</p>

<p>A further option being explored by the Privacy Act review is whether to create new laws that would allow individuals to <a href="https://www.ag.gov.au/system/files/2020-10/privacy-act-review-terms-of-reference.pdf">sue companies for damages for breaches of privacy</a>. While lawsuits are expensive and time-consuming, they might just deliver the kind of financial damage to big companies that could change their behaviour.</p>

<p>No matter which option we take, Australians need to start getting more savvy with their data privacy. This might mean actually reading those terms and conditions before agreeing, and being prepared to “vote with our feet” if companies won’t be honest about what they’re doing with our personal information.</p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/what-do-tiktok-bunnings-ebay-and-netflix-have-in-common-theyre-all-hyper-collectors-187274">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Facial recognition is on the rise – but the law is lagging a long way behind</title>
		<link>https://privacy.org.au/2022/07/04/facial-recognition-is-on-the-rise-but-the-law-is-lagging-a-long-way-behind/</link>
		
		<dc:creator><![CDATA[Mark Andrejevic]]></dc:creator>
		<pubDate>Mon, 04 Jul 2022 00:18:05 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5070</guid>

					<description><![CDATA[Private companies and public authorities are quietly using facial recognition systems around Australia. Despite the growing use of this controversial technology, there is little in the way of specific regulations and guidelines to govern its use. <span class="excerpt-more"><a href="https://privacy.org.au/2022/07/04/facial-recognition-is-on-the-rise-but-the-law-is-lagging-a-long-way-behind/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<figure><figure style="width: 744px" class="wp-caption aligncenter"><img loading="lazy" decoding="async" src="https://images.theconversation.com/files/471008/original/file-20220627-14-q7vf1z.jpg?ixlib=rb-1.1.0&#038;rect=0%2C0%2C4481%2C3216&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" alt="" width="754" height="541" /><figcaption class="wp-caption-text"><a href="https://www.shutterstock.com/image-photo/iot-machine-learning-human-object-recognition-794528230">Image from Shutterstock</a></figcaption></figure></figure><p><span><a href="https://theconversation.com/profiles/mark-andrejevic-567958">Mark Andrejevic</a>, Professor, School of Media, Film, and Journalism, Monash University, <em><a href="https://theconversation.com/institutions/monash-university-1065">Monash University</a></em> and <a href="https://theconversation.com/profiles/gavin-jd-smith-195220">Gavin JD Smith</a>, Associate Professor in Sociology, <em><a href="https://theconversation.com/institutions/australian-national-university-877">Australian National University</a></em></span></p><p>Private companies and public authorities are quietly using facial recognition systems around Australia.</p><p>Despite the growing use of this controversial technology, there is little in the way of specific regulations and guidelines to govern its use.</p><h2>Spying on shoppers</h2><p>We were reminded of this fact recently when consumer advocates at CHOICE <a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/how-your-data-is-used/articles/kmart-bunnings-and-the-good-guys-using-facial-recognition-technology-in-store">revealed</a> that major retailers in Australia are using the technology to identify people claimed to be thieves and troublemakers.</p><p>There is no dispute about the goal of reducing harm and theft. 
But there is also little transparency about how this technology is being used.</p><p>CHOICE found that most people have no idea their faces are being scanned and matched to stored images in a database. Nor do they know how these databases are created, how accurate they are, and how secure the data they collect is.</p><p>As CHOICE discovered, the notification to customers is inadequate. It comes in the form of small, hard-to-notice signs in some cases. In others, the use of the technology is announced in online notices rarely read by customers.</p><p>The companies clearly don’t want to draw attention to their use of the technology or to account for how it is being deployed.</p><h2>Police are eager</h2><p>Something similar is happening with the use of the technology by Australian police. Police in New South Wales, for example, have embarked on a “low-volume” <a href="https://www.theguardian.com/australia-news/2021/jul/01/calls-to-stop-nsw-police-trial-of-national-facial-recognition-system-over-lack-of-legal-safeguards">trial</a> of a nationwide face-recognition database. This trial took place despite the fact that the enabling legislation for the national database has not yet been passed.</p><p>In South Australia, controversy over Adelaide’s plans to upgrade its CCTV system with face-recognition capability led the city council to <a href="https://www.abc.net.au/news/2022-06-22/adelaide-city-council-votes-no-to-facial-recognition-in-cctv/101172924?utm_source=pocket_mylist">vote</a> not to purchase the necessary software. 
The council has also asked South Australia Police not to use face-recognition technology until legislation is in place to govern its use.</p><p>However, SA Police have <a href="https://www.abc.net.au/news/2022-06-22/adelaide-city-council-votes-no-to-facial-recognition-in-cctv/101172924?utm_source=pocket_mylist">indicated</a> an interest in using the technology.</p><p>In a public <a href="https://www.itnews.com.au/news/sa-police-ignore-adelaide-council-plea-for-facial-recognition-ban-on-cctv-581559">statement</a>, the police described the technology as a potentially useful tool for criminal investigations. The statement also noted:</p><blockquote><p>There is no legislative restriction on the use of facial recognition technology in South Australia for investigations.</p></blockquote><h2>A controversial tool</h2><p>Adelaide City Council’s call for regulation is a necessary response to the expanding use of automated facial recognition.</p><p>This is a powerful technology that promises to fundamentally change our experience of privacy and anonymity. There is already a large gap between the amount of personal information collected about us every day and our own knowledge of how this information is being used, and facial recognition will only make the gap bigger.</p><p>Recent events suggest a reluctance on the part of retail outlets and public authorities alike to publicise their use of the technology.</p><p>Although it is seen as a potentially useful tool, it can be a controversial one. A world in which remote cameras can identify and track people as they move through public space seems alarmingly Orwellian.</p><p>The technology has also been criticised for being invasive and, in some cases, <a href="https://www.marketplace.org/shows/marketplace-tech/bias-in-facial-recognition-isnt-hard-to-discover-but-its-hard-to-get-rid-of/">biased</a> and inaccurate. 
In the US, for example, people have already been <a href="https://www.wired.com/story/wrongful-arrests-ai-derailed-3-mens-lives/">wrongly arrested</a> based on matches made by face-recognition systems.</p><h2>Public pushback</h2><p>There has also been widespread public opposition to the use of the technology in some cities and states in the US, which have gone so far as to impose <a href="https://www.wired.com/story/face-recognition-banned-but-everywhere/">bans</a> on its use.</p><p>Surveys show the Australian public have <a href="https://securitybrief.com.au/story/australians-uneasy-about-facial-recognition-tech-report">concerns</a> about the invasiveness of the technology, but that there is also support for its potential use to increase public safety and security.</p><p>Facial-recognition technology isn’t going away. It’s likely to become less expensive and more accurate and powerful in the near future. Instead of implementing it piecemeal, under the radar, we need to directly confront both the potential harms and benefits of the technology, and to provide clear rules for its use.</p><h2>What would regulations look like?</h2><p>Last year, then human rights commissioner Ed Santow called for <a href="https://www.itnews.com.au/news/human-rights-commission-calls-for-temporary-ban-on-high-risk-govt-facial-recognition-565173">a partial ban</a> on the use of facial-recognition technology. He is now developing model legislation for how it might be regulated in Australia.</p><p>Any regulation of the technology will need to consider both the potential benefits of its use and the risks to privacy rights and civic life.</p><p>It will also need to consider enforceable standards for its proper use. 
These could include the right to correct inaccurate information, the need to provide human confirmation for automated forms of identification, and the setting of minimum standards of accuracy.</p><p>They could also entail improving public consultation and consent around the use of the technology, and a requirement for the performance of systems to be accountable to an independent authority and to those researching the technology.</p><p>As the reach of facial recognition expands, we need more public and parliamentary debate to develop appropriate regulations for governing its use.</p><hr /><p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/facial-recognition-is-on-the-rise-but-the-law-is-lagging-a-long-way-behind-185510">original article</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Insurance firms can skim your online data to price your insurance — and there’s little in the law to stop this</title>
		<link>https://privacy.org.au/2022/06/20/insurance-firms-can-skim-your-online-data-to-price-your-insurance-and-theres-little-in-the-law-to-stop-this/</link>
		
		<dc:creator><![CDATA[Zofia Bednarz]]></dc:creator>
		<pubDate>Mon, 20 Jun 2022 06:13:06 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5063</guid>

					<description><![CDATA[What if your insurer was tracking your online data to price your car insurance? Seems far-fetched, right? Yet there is predictive value in the digital traces we leave online. And insurers may use data collection and analytics tools to find our data and use it to price insurance services. Looking at several examples from customer loyalty schemes and social media, we found insurers can access vast amounts of consumer data under Australia’s weak privacy laws. <span class="excerpt-more"><a href="https://privacy.org.au/2022/06/20/insurance-firms-can-skim-your-online-data-to-price-your-insurance-and-theres-little-in-the-law-to-stop-this/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<figure style="width: 744px" class="wp-caption alignnone"><img loading="lazy" decoding="async" src="https://images.theconversation.com/files/469391/original/file-20220617-24-txo2j0.jpeg?ixlib=rb-1.1.0&#038;rect=58%2C69%2C7684%2C5084&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" alt="" width="754" height="499" /><figcaption class="wp-caption-text">Image from Shutterstock</figcaption></figure>

<p><span><a href="https://theconversation.com/profiles/zofia-bednarz-1348045">Zofia Bednarz</a>, Lecturer in Commercial Law, <em><a href="https://theconversation.com/institutions/university-of-sydney-841">University of Sydney</a></em>; <a href="https://theconversation.com/profiles/kayleen-manwaring-8735">Kayleen Manwaring</a>, Senior Research Fellow, UNSW Allens Hub for Technology, Law &amp; Innovation and Senior Lecturer, School of Private &amp; Commercial Law, UNSW Law &amp; Justice, <em><a href="https://theconversation.com/institutions/unsw-sydney-1414">UNSW Sydney</a></em>, and <a href="https://theconversation.com/profiles/kimberlee-weatherall-3524">Kimberlee Weatherall</a>, Professor of Law, <em><a href="https://theconversation.com/institutions/university-of-sydney-841">University of Sydney</a></em></span></p>

<p>What if your insurer was tracking your online data to price your car insurance? Seems far-fetched, right?</p>

<p>Yet there is predictive value in the digital traces we leave online. And insurers may use data collection and analytics tools to find our data and use it to price insurance services.</p>

<p>For instance, <a href="https://pubmed.ncbi.nlm.nih.gov/27849366/">some</a> <a href="https://www.researchgate.net/publication/350525424_Smartphone_Operating_System_Preference_Based_On_Different_Personality_Lifestyle_Traits_Of_The_Consumer">studies</a> <a href="https://www.nber.org/papers/w24771#fromrss">have</a> found a correlation between whether an individual uses an Apple or Android phone and their likelihood of exhibiting certain personality traits.</p>

<p>In one example, US insurance broker Jerry analysed the driving behaviour of some 20,000 people to conclude Android users are <a href="https://getjerry.com/studies/sorry-iphone-fans-android-users-are-safer-drivers">safer drivers</a> than iPhone users. What’s stopping insurers from referring to such reports to price their insurance?</p>

<p>Our latest <a href="https://www.sciencedirect.com/science/article/abs/pii/S0267364922000152">research</a> shows Australian consumers have no real control over how data about them, and posted by them, might be collected and used by insurers.</p>

<p>Looking at several examples from customer loyalty schemes and social media, we found insurers can access vast amounts of consumer data under Australia’s <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">weak privacy laws</a>.</p>

<figure class="align-center zoomable">
            <figure style="width: 744px" class="wp-caption alignnone"><a href="https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=1000&#038;fit=clip"><img loading="lazy" decoding="async" alt="A person's hands are visible holding an Apple phone on the left (screen facing forward), and a generic Android on the right." src="https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" srcset="https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=450&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=450&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=450&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=566&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=566&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/469394/original/file-20220617-21-k84rhp.jpeg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=566&#038;fit=crop&#038;dpr=3 2262w" sizes="auto, (min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" width="754" height="566" /></a><figcaption class="wp-caption-text">How would you feel if a detail as menial as the brand of your phone was used to price your car insurance? &#8211; Shutterstock</figcaption></figure>
            
        </figure>

<h2>Your data is already out there</h2>

<p>Insurers are already using big data to price consumer insurance through personalised pricing, according to evidence gathered by industry regulators in the <a href="https://www.fca.org.uk/publication/feedback/fs16-05.pdf">United Kingdom</a>, <a href="https://register.eiopa.europa.eu/Publications/EIOPA_BigDataAnalytics_ThematicReview_April2019.pdf">European Union</a> and <a href="https://www.dfs.ny.gov/industry_guidance/circular_letters/cl2019_01">United States</a>.</p>

<p>Consumers often “agree” to all kinds of data collection and privacy policies, such as those used in loyalty schemes (who doesn’t like freebies?) and by social media companies. But they have no control over how their data is used once it’s handed over.</p>

<p>There are far-reaching inferences that can be drawn from data collected through loyalty programs and social media platforms – and these may be uncomfortable, or even highly sensitive.</p>

<p>Researchers using data analytics and machine learning have claimed to build models that can guess a person’s sexual orientation from pictures of <a href="https://osf.io/zn79k/">their face</a>, or their suicidal tendencies from <a href="https://www.sciencedirect.com/science/article/pii/S2214782915000160">posts on Twitter</a>.</p>

<p>Think about all the details revealed from a grocery shopping history alone: diet, household size, addictions, health conditions and social background, among others. In the case of social media, a user’s posts, pictures, likes, and links to various groups can be used to draw a precise picture of that individual.</p>

<p>What’s more, Australia has a <a href="https://www.cdr.gov.au/">Consumer Data Right</a>, which already requires banks to share consumers’ banking data (at the consumer’s request) with another bank or app, such as to access a new service or offer.</p>

<p>The regime is actively being expanded to other parts of the economy including the energy sector, with the idea being competitors could use information on energy usage to make competitive offers.</p>

<p>The Consumer Data Right is advertised as <a href="https://www.cdr.gov.au/">empowering</a> for consumers – enabling access to new services and offers, and providing people with choice, convenience and control over their data.</p>

<p>In practice, however, it means insurance firms accredited under the program can require you to share your banking data in exchange for insurance services.</p>

<p>The previous Coalition government also <a href="https://ministers.treasury.gov.au/ministers/jane-hume-2020/media-releases/more-power-compare-and-switch-telco-providers-and-share">proposed “open finance”</a>, which would expand the Consumer Data Right to include access to your insurance and superannuation data. This hasn’t happened yet, but it’s likely the new Albanese government will look into it.</p>



<h2>Why more data in insurers’ hands may be bad news</h2>

<p>There are plenty of reasons to be concerned about insurers collecting and using increasingly detailed data about people for insurance pricing and claims management.</p>

<p>For one, large-scale data collection provides incentives for cyber attacks. Even if data is held in anonymised form, it can be <a href="https://techcrunch.com/2019/07/24/researchers-spotlight-the-lie-of-anonymous-data/">re-identified</a> with the right tools.</p>
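<p>The re-identification risk usually works as a linkage attack: records stripped of names are joined to a public dataset on quasi-identifiers. A minimal sketch, with all records invented for illustration:</p>

```python
# Linkage (re-identification) attack sketch: an "anonymised" dataset with
# names removed is joined to a public register on quasi-identifiers
# (postcode, birth year, sex). All data below is invented for illustration.

anonymised_claims = [
    {"postcode": "2000", "birth_year": 1984, "sex": "F", "diagnosis": "asthma"},
    {"postcode": "3052", "birth_year": 1990, "sex": "M", "diagnosis": "diabetes"},
]

public_register = [
    {"name": "J. Citizen", "postcode": "3052", "birth_year": 1990, "sex": "M"},
    {"name": "A. Example", "postcode": "2000", "birth_year": 1984, "sex": "F"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def key(record):
    """Build a join key from the quasi-identifier fields."""
    return tuple(record[k] for k in QUASI_IDENTIFIERS)

# Index the public register by quasi-identifiers, then join the two datasets.
by_key = {key(r): r["name"] for r in public_register}
reidentified = [
    {"name": by_key[key(r)], "diagnosis": r["diagnosis"]}
    for r in anonymised_claims
    if key(r) in by_key
]
print(reidentified)
```

<p>Because combinations of postcode, birth date and sex are close to unique for most people, even a short quasi-identifier key is often enough to put names back on “anonymised” rows.</p>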

<p>Also, insurers may be able to infer (or at least think they can infer) facts about an individual which they want to keep private, such as their sexual orientation, <a href="https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/?sh=794d21176668">pregnancy</a> status or religious beliefs.</p>

<p>There’s plenty of evidence the outputs of artificial intelligence tools employed in mass data analytics can be inaccurate and discriminatory. Insurers’ decisions may then be based on misleading or untrue data. And these tools are so complex it’s often difficult to work out if, or where, errors or bias are present.</p>

<figure class="align-center zoomable">
            <figure style="width: 744px" class="wp-caption alignnone"><a href="https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=1000&#038;fit=clip"><img loading="lazy" decoding="async" alt="A magnifying glass hovers over a Facebook post's likes" src="https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" srcset="https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=400&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=400&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=400&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=503&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=503&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/469396/original/file-20220617-13-58ptct.jpeg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=503&#038;fit=crop&#038;dpr=3 2262w" sizes="auto, (min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" width="754" height="503" /></a><figcaption class="wp-caption-text">Each day, people post personal information online. And much of it can be easily accessed by others. &#8211; Shutterstock</figcaption></figure>
            
        </figure>

<p>Although insurers are meant to pool risk and compensate the unlucky, some might use data to only offer affordable insurance to very low-risk people. Vulnerable consumers may face <a href="https://actuaries.logicaldoc.cloud/download-ticket?ticketId=09c77750-aa90-4ba9-835e-280ae347487b">exclusion</a>.</p>

<p>A more widespread use of data, especially via the Consumer Data Right, will especially disadvantage those who are unable or unwilling to share data with insurers. These people may be low risk, but if they can’t or won’t prove this, they’ll have to pay more than a fair price for their insurance cover.</p>

<p>They may even pay more than what they would have in a pre-Consumer Data Right world. So insurance may move <em>further</em> from a fair price when more personal data are available to insurance firms.</p>

<h2>We need immediate action</h2>

<p>Our <a href="http://www5.austlii.edu.au/au/journals/SydLawRw/2021/20.html">previous research</a> demonstrated that apart from anti-discrimination laws, there are inadequate constraints on how insurers are allowed to use consumers’ data, such as those taken from online sources.</p>

<p>The more insurers base their assessments on data a consumer didn’t directly provide, the harder it will be for that person to understand how their “riskiness” is being assessed. If an insurer requests your transaction history from the last five years, would you know what they are looking for? Such problems will be exacerbated by the expansion of the Consumer Data Right.</p>

<p>Interestingly, insurance firms themselves might <a href="https://www.nature.com/news/can-we-open-the-black-box-of-ai-1.20731">not know</a> how collected data translates into risk for a specific consumer. If their approach is to simply feed data into a complex and opaque artificial intelligence system, all they’ll know is they’re getting a supposedly “better” risk assessment with more data.</p>

<p>Recent <a href="https://theconversation.com/bunnings-kmart-and-the-good-guys-say-they-use-facial-recognition-for-loss-prevention-an-expert-explains-what-it-might-mean-for-you-185126">reports</a> of retailers collecting shopper data for facial recognition have highlighted how important it is for the Albanese government to urgently reform <a href="https://www.ag.gov.au/integrity/consultations/review-privacy-act-1988">our privacy laws</a>, and take a close look at other data laws, including proposals to <a href="https://treasury.gov.au/review/statutory-review-consumer-data-right">expand the Consumer Data Right</a>.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/185038/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" width="1" height="1" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/insurance-firms-can-skim-your-online-data-to-price-your-insurance-and-theres-little-in-the-law-to-stop-this-185038">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Govt ‘steamrolling’ ahead with facial recognition plan</title>
		<link>https://www.innovationaus.com/govt-steamrolling-ahead-with-facial-recognition-plan/</link>
		
		<dc:creator><![CDATA[Denham Sadler]]></dc:creator>
		<pubDate>Mon, 31 Jan 2022 22:17:40 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=4960</guid>

					<description><![CDATA[Despite the necessary legislation not having been passed, the federal government is “steamrolling” ahead with its plan to launch a national facial recognition database and is already on the hunt for a private provider to build out the new services. The Department of Home Affairs last week approached the market for a provider to deliver identity matching services, including to host the new National Driver Licence Facial Recognition Solution (NDLFRS). This database, incorporating drivers licence photos from state and territory governments, has been launched but is not yet in operation because the necessary legislation has still not passed through Parliament. The move to issue a tender this week, before the legislation has been passed, has been criticised by privacy advocates as putting the cart before the horse, with calls for a moratorium on facial recognition technology until better privacy precautions are in place. <span class="excerpt-more"><a href="https://www.innovationaus.com/govt-steamrolling-ahead-with-facial-recognition-plan/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[Despite the necessary legislation not having been passed, the federal government is “steamrolling” ahead with its plan to launch a national facial recognition database and is already on the hunt for a private provider to build out the new services. The Department of Home Affairs last week approached the market for a provider to deliver identity matching services, including to host the new National Driver Licence Facial Recognition Solution (NDLFRS). This database, incorporating drivers licence photos from state and territory governments, has been launched but is not yet in operation because the necessary legislation has still not passed through Parliament. The move to issue a tender this week, before the legislation has been passed, has been criticised by privacy advocates as putting the cart before the horse, with calls for a moratorium on facial recognition technology until better privacy precautions are in place. <span class="excerpt-more"><a href="https://www.innovationaus.com/govt-steamrolling-ahead-with-facial-recognition-plan/">Read More</a></span>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Information Commissioner issues determination into 7-Eleven Stores for APP breaches through use of facial recognition technology of unsuspecting customers</title>
		<link>http://www.peteraclarke.com.au/2021/10/19/information-commissioner-issues-determination-into-7-eleven-stores-pty-ltd-2021-aicmr-50-29-september-2021-for-breaches-of-australian-privacy-principles-3-and-5-through-use-of-facial-recognition/#new_tab</link>
		
		<dc:creator><![CDATA[Peter Clarke]]></dc:creator>
		<pubDate>Tue, 19 Oct 2021 07:20:30 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=4849</guid>

					<description><![CDATA[The Australian Information Commissioner has issued a very significant determination resulting from a Commissioner-initiated investigation into 7-Eleven, where she found that the company had breached Australian Privacy Principles (APPs) 3 and 5 of the Privacy Act 1988. <span class="excerpt-more"><a href="http://www.peteraclarke.com.au/2021/10/19/information-commissioner-issues-determination-into-7-eleven-stores-pty-ltd-2021-aicmr-50-29-september-2021-for-breaches-of-australian-privacy-principles-3-and-5-through-use-of-facial-recognition/#new_tab">Read More</a></span>]]></description>
										<content:encoded><![CDATA[The Australian Information Commissioner has issued a very significant determination resulting from a Commissioner-initiated investigation into 7-Eleven, where she found that the company had breached Australian Privacy Principles (APPs) 3 and 5 of the Privacy Act 1988. <span class="excerpt-more"><a href="http://www.peteraclarke.com.au/2021/10/19/information-commissioner-issues-determination-into-7-eleven-stores-pty-ltd-2021-aicmr-50-29-september-2021-for-breaches-of-australian-privacy-principles-3-and-5-through-use-of-facial-recognition/#new_tab">Read More</a></span>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>I’d prefer an ankle tag: why home quarantine apps are a bad idea</title>
		<link>https://privacy.org.au/2021/09/19/id-prefer-an-ankle-tag-why-home-quarantine-apps-are-a-bad-idea/</link>
		
		<dc:creator><![CDATA[Toby Walsh]]></dc:creator>
		<pubDate>Sat, 18 Sep 2021 22:38:04 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=4826</guid>

					<description><![CDATA[South Australia has begun a trial of a new COVID app to monitor arrivals into the state. SA Premier Steven Marshall claimed “every South Australian should feel pretty proud that we are the national pilot for the home-based quarantine app”. But why are we developing such home-quarantine apps in the first place, when we already have a cheap technology to do this? If we want to monitor that people are at home (and that’s a big if), wouldn’t one of the ankle tags already used by our corrective services for home detention be much simpler, safer and more robust? There are many reasons to be concerned about home-quarantine apps. <span class="excerpt-more"><a href="https://privacy.org.au/2021/09/19/id-prefer-an-ankle-tag-why-home-quarantine-apps-are-a-bad-idea/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<p><span><a href="https://theconversation.com/profiles/toby-walsh-51">Toby Walsh</a>, Professor of AI at UNSW, Research Group Leader, <em><a href="https://theconversation.com/institutions/unsw-1414">UNSW</a></em></span></p>

<p>South Australia has begun <a href="https://www.abc.net.au/news/2021-08-23/how-will-south-australias-home-quarantine-trial-work/100398878">a trial</a> of a new COVID app to monitor arrivals into the state. SA Premier Steven Marshall claimed “every South Australian should feel pretty proud that we are the national pilot for the home-based quarantine app”.</p>

<p>He then doubled down with the boast that he was “pretty sure the technology that we have developed within the South Australia government will become the national standard and will be rolled out across the country.”</p>

<p>Victoria too has announced impending “<a href="https://www.abc.net.au/news/2021-09-09/victoria-border-area-exemptions-allow-home-quarantine/100444532">technologically supported</a>” home quarantine, though details remain unclear. Home quarantine will also eventually be <a href="https://www.abc.net.au/news/2021-09-09/scott-morrison-message-australians-overseas-covid19-quarantine/100445680">available for international arrivals</a>, according to Prime Minister Scott Morrison.</p>

<p>The South Australian app has received little attention in Australia, but in the US the left-leaning Atlantic magazine called it “<a href="https://www.theatlantic.com/ideas/archive/2021/09/pandemic-australia-still-liberal-democracy/619940/">as Orwellian as any in the free world</a>”. Right-wing outlets such as <a href="https://www.theguardian.com/australia-news/2021/sep/04/south-australia-facial-recognition-trial-covid-app-blasted-by-fox-and-breitbart-criticised-over-lack-of-safeguards">Fox News and Breitbart</a> also joined the attack, and for once I find myself in agreement with them.</p>

<h2>Location tracking and facial recognition</h2>

<figure class="align-right zoomable">
            <figure style="width: 240px" class="wp-caption alignright"><a href="https://images.theconversation.com/files/420161/original/file-20210909-24-zwjrys.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=1000&#038;fit=clip"><img loading="lazy" decoding="async" alt="" src="https://images.theconversation.com/files/420161/original/file-20210909-24-zwjrys.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=237&#038;fit=clip" srcset="https://images.theconversation.com/files/420161/original/file-20210909-24-zwjrys.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=1176&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/420161/original/file-20210909-24-zwjrys.jpeg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=1176&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/420161/original/file-20210909-24-zwjrys.jpeg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=1176&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/420161/original/file-20210909-24-zwjrys.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=1477&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/420161/original/file-20210909-24-zwjrys.jpeg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=1477&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/420161/original/file-20210909-24-zwjrys.jpeg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=1477&#038;fit=crop&#038;dpr=3 2262w" sizes="auto, (min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" width="401" height="786" /></a><figcaption class="wp-caption-text">The South Australian home quarantine app uses facial recognition software to identify users. &#8211; <a class="source" href="https://apps.apple.com/au/app/home-quarantine-sa/id1567354245#?platform=iphone">Government of South Australia</a></figcaption></figure>
        </figure>

<p>Despite the SA Premier’s claims, this isn’t the first such app to be used in Australia. A similar <a href="https://www.wa.gov.au/government/publications/covid-19-coronavirus-g2g-now">home-quarantine app</a> is already in use for arrivals into WA, and <a href="https://www.g2gnow.com.au/about">in some cases the Northern Territory</a>.</p>

<p>Both apps use geolocation and facial recognition software to track and identify those in quarantine. Users are required to prove they are at home when randomly prompted by the application.</p>

<p>In SA, you have 15 minutes to get the face recognition software to verify you’re still at home. In WA, it is more of a race. You have just 5 minutes before you risk a knock on the door from the police.</p>

<p>Another difference is that the SA app is opt-in. Currently. The WA app is already mandatory for arrivals from high risk areas like Victoria. For extreme risk areas like NSW, it’s straight into a quarantine hotel.</p>

<h2>Reasons for concern</h2>

<p>But why are we developing such home-quarantine apps in the first place, when we already have a cheap technology to do this? If we want to monitor that people are at home (and that’s a big <em>if</em>), wouldn’t one of the ankle tags already used by our corrective services for home detention be much simpler, safer and more robust?</p>

<p>There are many reasons to be concerned about home-quarantine apps.</p>

<p>First, they’ll likely be much easier to hack than ankle tags. How many of us have hacked geo-blocks to access Netflix in the US, or to watch other digital content from another country? <a href="https://www.lifewire.com/fake-gps-location-4165524">Faking GPS location on a smartphone</a> is not much more difficult.</p>

<p>Second, facial recognition software is often flawed, and is frequently biased against people of colour and against women. The documentary <a href="https://www.netflix.com/title/81328723">Coded Bias</a> does a great job unpicking these biases.</p>

<figure>
            <iframe loading="lazy" width="440" height="260" src="https://www.youtube.com/embed/jZl55PsfZJQ?wmode=transparent&#038;start=0" frameborder="0" allowfullscreen="allowfullscreen"></iframe>
        </figure>

<p>Despite years of effort, even the big tech giants like Google and Amazon have been <a href="https://venturebeat.com/2021/09/03/bias-persists-in-face-detection-systems-from-amazon-microsoft-and-google/">unable to eliminate these biases from their software</a>. I have little hope the SA government or the WA company GenVis, the developers of the two Australian home-quarantine apps, will have done better.</p>

<p>Indeed, the Australian Human Rights Commission has called for <a href="https://tech.humanrights.gov.au/overview/summary">a moratorium on the use of facial recognition software</a> in high-risk settings such as policing until better regulation is in place to protect human rights and privacy.</p>

<p>Third, there needs to be a much more detailed and public debate around issues like privacy, and safeguards put in place based on this discussion, in advance of the technology being used.</p>

<p>With COVID check-in apps, we were promised the data would only be used for public health purposes. But police forces around Australia have <a href="https://theconversation.com/police-access-to-covid-check-in-data-is-an-affront-to-our-privacy-we-need-stronger-and-more-consistent-rules-in-place-167360">accessed this information for other ends on at least six occasions</a>. This severely undermines the public’s confidence and use of such apps.</p>

<p>Before it was launched, the Commonwealth’s COVIDSafe app had legislative prohibitions put in place on the use of the data collected for anything but contact tracing. This perhaps gave us a false sense of security as the state-produced COVID check-in apps did not have any such legal safeguards. Only some states have retrospectively introduced legislation to provide such protections.</p>

<p>Fourth, we have to worry about how software like this legitimises technologies like facial recognition that ultimately erode fundamental rights such as the right to privacy.</p>

<p>If home-quarantine apps work successfully, will they open the door to facial recognition being used in other settings? To identify shoplifters? To provide access to welfare? Or to healthcare? What Orwellian world will this take us to?</p>

<h2>The perils of facial recognition</h2>

<p>In China, we have already seen <a href="https://www.theguardian.com/news/2019/apr/11/china-hi-tech-war-on-muslim-minority-xinjiang-uighurs-surveillance-face-recognition">facial recognition software used to monitor and persecute the Uighur minority</a>. In the US, at least three Black people have already wrongly ended up in jail due to <a href="https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html">facial recognition errors</a>.</p>

<p>Facial recognition is a technology that is dangerous if it doesn’t work (as is often the case). And dangerous if it does. It changes the speed, scale and cost of surveillance.</p>

<p>With facial recognition software behind the CCTV cameras found on many street corners, you can be tracked 24/7. You are no longer anonymous when you go out to the shops. Or when you protest about Black lives mattering or the climate emergency.</p>

<h2>High technology is not the solution</h2>

<p>High tech software like facial recognition isn’t a fix for the problems that have plagued Australia’s response to the pandemic. It can’t remedy the failure to buy enough vaccines, the failure to build dedicated quarantine facilities, or the in-fighting and point-scoring between states and with the Commonwealth.</p>

<p>I never thought I’d say this but, all in all, I think I’d prefer an ankle tag. And if the image of the ankle tag seems too unsettling for you, we could do what Hong Kong has done and <a href="https://www.nytimes.com/2020/04/08/world/asia/hong-kong-coronavirus-quarantine-wristband.html">make it a wristband</a>.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/167533/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important; text-shadow: none !important;" width="1" height="1" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/id-prefer-an-ankle-tag-why-home-quarantine-apps-are-a-bad-idea-167533">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Large-scale facial recognition is incompatible with a free society</title>
		<link>https://privacy.org.au/2020/07/17/large-scale-facial-recognition-is-incompatible-with-a-free-society/</link>
		
		<dc:creator><![CDATA[Seth Lazar]]></dc:creator>
		<pubDate>Fri, 17 Jul 2020 11:13:15 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=4343</guid>

					<description><![CDATA[In the US, tireless opposition to state use of facial recognition algorithms has recently won some victories. Outside the US, however, the tide is heading in the other direction.

Here in Australia, despite pushback from the Human Rights Commission, the trend is towards greater use. The government proposed an ambitious plan for a national face database (including wacky trial balloons about age-verification on porn sites). Some local councils are adding facial recognition into their existing surveillance systems. Police officers have tried out the dystopian services of Clearview AI. Should Australia be using this technology? To decide, we need to answer fundamental questions about the kind of people, and the kind of society, we want to be. <span class="excerpt-more"><a href="https://privacy.org.au/2020/07/17/large-scale-facial-recognition-is-incompatible-with-a-free-society/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<span><a href="https://theconversation.com/profiles/seth-lazar-580623">Seth Lazar</a>, <em><a href="https://theconversation.com/institutions/australian-national-university-877">Australian National University</a></em>; <a href="https://theconversation.com/profiles/claire-benn-963008">Claire Benn</a>, <em><a href="https://theconversation.com/institutions/australian-national-university-877">Australian National University</a></em>, and <a href="https://theconversation.com/profiles/mario-gunther-1134417">Mario Günther</a>, <em><a href="https://theconversation.com/institutions/australian-national-university-877">Australian National University</a></em></span>

<p>In the US, tireless <a href="https://www.technologyreview.com/2020/06/12/1003482/amazon-stopped-selling-police-face-recognition-fight/">opposition</a> to state use of facial recognition algorithms has recently won some victories.</p>

<p>Some progressive cities have <a href="https://edition.cnn.com/2019/07/17/tech/cities-ban-facial-recognition/index.html">banned</a> some uses of the technology. <a href="https://techcrunch.com/2020/06/08/ibm-ends-all-facial-recognition-work-as-ceo-calls-out-bias-and-inequality/">Three</a> <a href="https://www.usatoday.com/story/news/nation/2020/06/10/george-floyd-protests-amazon-police-use-facial-recognition/5338536002/">tech</a> <a href="https://www.washingtonpost.com/technology/2020/06/11/microsoft-facial-recognition/">companies</a> have pulled facial recognition products from the market. <a href="https://edition.cnn.com/2020/06/25/tech/facial-recognition-legislation-markey/index.html">Democrats have advanced a bill</a> for a moratorium on facial recognition. The Association for Computing Machinery (ACM), a leading computer science organisation, <a href="https://www.acm.org/binaries/content/assets/public-policy/ustpc-facial-recognition-tech-statement.pdf">has also come out against the technology</a>.</p>

<p>Outside the US, however, the tide is heading in the other direction. China is deploying <a href="https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html">facial recognition on a vast scale</a> in its social credit experiments, policing, and suppressing the Uighur population. It is also exporting facial recognition technology (and norms) to partner countries in the <a href="https://www.lowyinstitute.org/the-interpreter/belt-and-road-means-big-data-facial-recognition-too">Belt and Road initiative</a>. The UK High Court ruled its use by South Wales Police <a href="https://www.bbc.com/news/uk-wales-49565287">lawful</a> last September (though the decision is being appealed).</p>

<p>Here in Australia, despite <a href="https://humanrights.gov.au/about/news/media-releases/commission-calls-accountable-ai">pushback from the Human Rights Commission</a>, the trend is also towards greater use. The government proposed an ambitious plan for a <a href="https://www.itnews.com.au/news/three-states-complete-national-face-matching-database-upload-535352">national face database</a> (including wacky trial balloons about <a href="https://www.nytimes.com/2019/10/29/world/australia/pornography-facial-recognition.html">age-verification on porn sites</a>). Some local councils are <a href="https://www.abc.net.au/news/2020-06-17/facial-surveillance-slowly-being-trialled-around-the-country/12308282">adding facial recognition</a> into their existing surveillance systems. Police officers have <a href="https://www.abc.net.au/news/science/2020-04-14/clearview-ai-facial-recognition-tech-australian-federal-police/12146894">tried out the dystopian services of Clearview AI</a>.</p>

<p>Should Australia be using this technology? To decide, we need to answer fundamental questions about the kind of people, and the kind of society, we want to be.</p>

<h2>From facial recognition to face surveillance</h2>

<p>Facial recognition has <a href="https://global-uploads.webflow.com/5e027ca188c99e3515b404b7/5ed1002058516c11edc66a14_FRTsPrimerMay2020.pdf">many uses</a>.</p>

<p>It can verify individual identity by comparing a target image with data held on file to confirm a match – this is “one-to-one” facial recognition. It can also compare a target image with a database of subjects of interest. That’s “one-to-many”. The most ambitious form is “all-to-all” matching. This would mean matching every image to a comprehensive database of every person in a given polity.</p>

<p>Each approach can be carried out asynchronously (on demand, after images are captured) or in real time. And they can be applied to separate (disaggregated) data streams, or used to bring together massive surveillance datasets.</p>
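<p>The taxonomy above can be made concrete with a minimal, hypothetical sketch (not from the article): one-to-one verification compares a probe against a single enrolled template, while one-to-many identification searches a whole gallery. The embedding vectors, names and threshold below are invented for illustration; real systems derive embeddings from a trained neural network.</p>

```python
# Hypothetical sketch of one-to-one vs one-to-many facial matching.
# Embeddings, names and the threshold are toy values for illustration only.
import numpy as np

THRESHOLD = 0.8  # arbitrary decision threshold for this sketch

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, enrolled):
    """One-to-one: does the probe match a single enrolled template?"""
    return cosine_similarity(probe, enrolled) >= THRESHOLD

def identify(probe, gallery):
    """One-to-many: best match (if any) across a database of templates."""
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    best = max(scores, key=scores.get)
    return (best, scores[best]) if scores[best] >= THRESHOLD else (None, scores[best])

gallery = {
    "alice": np.array([0.9, 0.1, 0.2]),
    "bob":   np.array([0.1, 0.95, 0.3]),
}
probe = np.array([0.85, 0.15, 0.25])  # toy embedding close to "alice"

print(verify(probe, gallery["alice"]))  # one-to-one check
print(identify(probe, gallery))         # one-to-many search
```

<p>Note that the threshold embodies the false-positive trade-off the article discusses: lowering it produces more matches, and therefore more incorrect ones.</p>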

<p>Facial recognition occurring at one end of each of these scales – one-to-one, asynchronous, disaggregated – has well-documented benefits. One-to-one real-time facial recognition can be convenient and relatively safe, like unlocking your phone, or proving your identity at an automated passport barrier. Asynchronous disaggregated one-to-many facial recognition can be useful for law enforcement – analysing CCTV footage to identify a suspect, for example, or finding victims and perpetrators in <a href="https://www.nytimes.com/2020/02/07/business/clearview-facial-recognition-child-sexual-abuse.html">child abuse videos</a>.</p>

<p>However, facial recognition at the other end of these scales – one-to-many or all-to-all, real-time, integrated – amounts to face surveillance, which has less obvious benefits. Several police forces in the UK have trialled real-time one-to-many facial recognition to seek persons of interest, <a href="https://www.ft.com/content/f4779de6-b1e0-11e9-bec9-fdcab53d6959">with mixed results</a>. The benefits of integrated real-time all-to-all face surveillance in China are yet to be seen.</p>

<p>And while the benefits of face surveillance are dubious, it risks fundamentally changing the kind of society we live in.</p>

<figure class="align-center ">
            <img decoding="async" alt="" src="https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" srcset="https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=403&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=403&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=403&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=506&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=506&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/346488/original/file-20200709-87071-4dqoia.jpg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=506&#038;fit=crop&#038;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" />
            <figcaption>
             <span class="caption">Real-time facial recognition applied to crowds amounts to face surveillance.</span>
             <span class="attribution"><span class="source">(Image source: Shutterstock)</span></span>
            </figcaption>
        </figure>

<h2>Face surveillance often goes wrong, but it’s bad even when it works</h2>

<p>Most facial recognition algorithms are accurate with head-on, well-lit portraits, but underperform with “faces in the wild”. They are also <a href="https://dam-prod.media.mit.edu/x/2019/01/24/AIES-19_paper_223.pdf">worse at identifying black faces</a>, and <a href="http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf">especially the faces of black women</a>.</p>

<p>The errors tend to be false positives – making incorrect matches, rather than missing correct ones. If face surveillance were used to dole out cash prizes, this would be fine. But a match is almost always used to target interventions (such as arrests) that harm those identified.</p>

<p>More false positives for minority populations means they bear the costs of face surveillance, while any benefits are likely to accrue to majority populations. So using these systems will <a href="https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist/">amplify the structural injustices</a> of the societies that produce them.</p>

<p>Even when it works, face surveillance is still harmful. Knowing where people are and what they are doing enables you to predict and control their behaviour.</p>

<p>You might believe the Australian government wouldn’t use this power against us, but the very fact they have it makes us less free. Freedom isn’t only about making it <em>unlikely</em> others will interfere with you. It’s about making it <a href="https://www.cambridge.org/core/books/on-the-peoples-terms/219DF8F7F166B305318CD9D51FAC45DE">impossible</a> for them to do so.</p>

<h2>Face surveillance is intrinsically wrong</h2>

<p>Face surveillance relies on the idea that others are entitled to extract biometric data from you without your consent when you are in public.</p>

<p>This is false. We have a right to control our own biometric data. This is what is called an underived right, like the right to control your own body.</p>

<p>Of course, rights have limits. You can lose the protection of a right – someone who robs a servo may lose their right to anonymity – or the right may be overridden when necessary for a good enough cause.</p>

<p>But the great majority of us have committed no crime that would make us lose the right to control our biometric data. And the possible benefits of using face surveillance on any particular occasion must be discounted by their probability of occurring. Certain rights violations are unlikely to be overridden by hypothetical benefits.</p>

<p><a href="https://openreview.net/forum?id=s-e2zaAlG3I">Many prominent algorithms</a> used for face surveillance were also developed in morally compromised ways. They relied on datasets containing images used without the permission of their rightful owners, as well as harmful images and deeply objectionable labels.</p>

<h2>Arguments for face surveillance don’t hold up</h2>

<p>There will of course be counterarguments, but none of them hold up.</p>

<p><em>You’ve already given up your privacy to Apple or Google – why begrudge police the same kind of information?</em> Just because we have sleepwalked into a surveillance society doesn’t mean we should refuse to wake up.</p>

<p><em>Human surveillance is more biased and error-prone than algorithmic surveillance.</em> Human surveillance is indeed morally problematic. Vast networks of CCTV cameras already compromise our civil liberties. Weaponising them with software that enables people to be tracked across multiple sites only makes them worse.</p>

<p><em>We can always keep a human in the loop.</em> False positive rates can be reduced by human oversight, but human oversight of automated systems is itself <a href="https://doi.org/10.1016/0005-1098(83)90046-8">flawed</a> and <a href="https://arstechnica.com/tech-policy/2019/09/algorithms-should-have-made-courts-more-fair-what-went-wrong/">biased</a>, and this doesn’t address the other objections against face surveillance.</p>

<p><em>Technology is neither good nor bad in itself; it’s just a tool that can be used for good or bad ends.</em> Every tool makes <a href="https://mitpress.mit.edu/books/how-artifacts-afford">some things easier and some things harder</a>. Facial recognition makes it easier to oppress vulnerable populations and violate everyone’s basic rights.</p>

<h2>It’s time for a moratorium</h2>

<p>Face surveillance is based on morally compromised research, violates our rights, is harmful, and exacerbates structural injustice, both when it works and when it fails. Its adoption harms individuals, and makes our society as a whole more unjust, and less free.</p>

<p>A moratorium on its use in Australia is the least we should demand.

<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/126282/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important; text-shadow: none !important;" width="1" height="1" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>
<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/large-scale-facial-recognition-is-incompatible-with-a-free-society-126282">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>APF Newsletter 8 July 2020</title>
		<link>https://privacy.org.au/2020/07/02/apf-newsletter-8-july-2020/</link>
		
		<dc:creator><![CDATA[Roger Clarke]]></dc:creator>
		<pubDate>Thu, 02 Jul 2020 07:48:58 +0000</pubDate>
				<category><![CDATA[Newsletter]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=4319</guid>

					<description><![CDATA[We regret the delay since the last Newsletter.

The primary reason has again been busyness on policy issues, but to some extent the COVID-19 epidemic has also played a role.
The lockdown hasn't greatly affected the workings of an organisation that has operated mostly virtually for decades already. However, it's had a substantial impact on many of our most active volunteers. <span class="excerpt-more"><a href="https://privacy.org.au/2020/07/02/apf-newsletter-8-july-2020/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<p>We regret the delay since the last Newsletter.

</p><p>The primary reason has again been busyness on policy issues, but to some extent the COVID-19 epidemic has also played a role.<br />
The lockdown hasn&#8217;t greatly affected the workings of an organisation that has operated mostly virtually for decades already. However, it&#8217;s had a substantial impact on many of our most active volunteers.<br />
The drastic economic slowdown is disproportionately affecting the self-employed. Meanwhile, the government&#8217;s structuring of its support programs to actively exclude the tertiary education sector has harmed the interests even of employed academics. And it&#8217;s doing far more damage to the large proportion of university teaching staff who have been &#8216;casualised&#8217; as a result of the invasion of chancelries by profit-oriented CEOs and their inevitable focus on large marketing overheads.

</p><p>The <b>COVID-19 epidemic</b> has been used as an excuse by elements within the public service to further their desires for social control. Rather than being abashed by <b>the court&#8217;s demolition of the Robo-Debt program</b>, they have turned their attention to establishing <b>the misleadingly-named &#8216;COVIDsafe&#8217; contact &#8216;tracing&#8217; tool</b> that was quite evidently designed to fail, both technically and operationally, <b>opening up the possibility of mass &#8216;tracking&#8217; surveillance</b>.

</p><p>Considerable efforts continue to be invested in a wide range of other privacy matters.

</p><p>The desire among economists to impose a right for corporations to access consumers&#8217; data, dressed up as a <b>‘Consumer Data Right’ (CDR)</b>, is coming to fruition, with the privacy needs largely ignored.

</p><p>The ABS held a nominal consultation process in relation to <b>Census 2021</b>, and then ignored the submissions made to it. It may be necessary for the APF to be more forthright than it has been in the past, and declare the ABS to have not just breached public trust, but to have destroyed it, such that the public can be expected to increasingly treat the agency and the survey with disdain.

</p><p><b>Applications of biometrics</b> continue to be wildly inappropriate and largely uncontrolled.

</p><p>The <b>OAIC</b> continues to protect government agencies and business, not privacy.

</p><p>Among the issues where we&#8217;re still in the ring, influencing policy decisions, are:
</p><ul>
<li><b>Google’s proposed acquisition of Fitbit</b> – a global collection of health, location and other data
</li><li><b>TOLA</b> (the Telecommunications Legislation Amendment (International Production Orders) Bill) – weakly-accountable overseas access to a wide range of Australian data in the cloud
</li><li><b>extensions of the CDR</b> into such areas as electricity and gas
</li></ul><p>

</p><p>On the positive side:
</p><ul>
<li>re <b>RoboDebt</b>, the submissions of APF and so many other advocacy organisations about the iniquitous behaviour of (Achtung: newspeak:) &#8216;Services Australia&#8217; have been vindicated, but far too late to prevent a vast amount of harm among the least well-off. Yet the harm, and the waste of over a billion dollars, have given rise not to sackings, demotions and prosecutions, but rather to promotions
</li><li>aspects of <b>the ACCC&#8217;s Report on Digital Platforms</b> have been highly privacy-positive, and may lead to some actual benefits for privacy
</li><li>even technology suppliers are realising that the potential of <b>facial recognition</b> has been massively over-sold, and governments need to pull back on their schemes until and unless they are subjected to effective regulation
</li></ul><p>

</p><p>Here is <a href="https://privacy.org.au/publications/by-date/">the index of policy papers</a>.
</p><p>They are also accessible <a href="https://privacy.org.au/publications/by-policy-area/">by topic-area</a>.

</p><p>We&#8217;re working on a wiki page to carry details of all current opportunities to influence policy relevant to privacy. It&#8217;s intended to be maintainable by any contributor. Here&#8217;s <a href="http://rogerclarke.com/wiki/PPP">a pilot implementation</a>.

</p><p>The APF continues to value your support for its work for privacy, and against privacy-abusive initiatives in business and government.

</p><p><b>Membership Renewals</b>: With the turnover of the new financial year, Renewal Notices are due to go out shortly, except of course to Life Members and recent new Members. Here is <a href="https://privacy.org.au/about/members/">Membership-related information</a>.</p>]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
