<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Commentary &#8211; Australian Privacy Foundation</title>
	<atom:link href="https://privacy.org.au/category/commentary/feed/" rel="self" type="application/rss+xml" />
	<link>https://privacy.org.au</link>
	<description>Defending your right to be free from intrusion</description>
	<lastBuildDate>Wed, 17 Apr 2024 03:07:17 +0000</lastBuildDate>
	<language>en-AU</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	

<image>
	<url>https://privacy.org.au/wp-content/uploads/2021/04/cropped-logo_horizontal2-32x32.png</url>
	<title>Commentary &#8211; Australian Privacy Foundation</title>
	<link>https://privacy.org.au</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Digital ‘death knocks’: is it fair game for journalists to mine social media profiles of victims and their families?</title>
		<link>https://privacy.org.au/2024/04/17/digital-death-knocks-is-it-fair-game-for-journalists-to-mine-social-media-profiles-of-victims-and-their-families/</link>
		
		<dc:creator><![CDATA[Alysson Watson]]></dc:creator>
		<pubDate>Wed, 17 Apr 2024 03:07:17 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5346</guid>

					<description><![CDATA[Alysson Watson, Associate lecturer in journalism, University of Newcastle The family of Ash Good, one of the Bondi stabbing victims and the mother of the nine-month-old baby who was also stabbed, issued a plea overnight for media to stop reproducing photos of Ash, her partner and their baby without consent. Good, 38, was an osteopath who&#8230; <span class="excerpt-more"><a href="https://privacy.org.au/2024/04/17/digital-death-knocks-is-it-fair-game-for-journalists-to-mine-social-media-profiles-of-victims-and-their-families/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<div class="theconversation-article-body"><p><span><a href="https://theconversation.com/profiles/alysson-watson-1514581">Alysson Watson</a>, Associate lecturer in journalism, <em><a href="https://theconversation.com/institutions/university-of-newcastle-1060">University of Newcastle</a></em></span></p>

<p>The family of Ash Good, one of the Bondi stabbing victims and the mother of the nine-month-old baby who was also stabbed, issued a plea overnight for media to stop reproducing photos of Ash, her partner and their baby without consent.</p>

<p>Good, 38, was an osteopath who liked to exercise, post photographs of her young family and share thoughts on new motherhood: the endless nights and blurry days, the joy, the anxiety, <a href="https://www.smh.com.au/national/nsw/indescribable-love-new-mother-ash-good-among-the-bondi-junction-victims-20240414-p5fjmk.html">the “indescribable love”</a>.</p>

<p>Journalists discovered this from “mining” her (and her friends’ and family’s) social media accounts.</p>

<p>As well as Good’s family, federal politician Allegra Spender, whose electorate of Wentworth covers Westfield Bondi Junction, posted on social media a plea for “the media and everyone” to respect the wishes of those affected by the “tragedy at Bondi Junction”.</p>

<p>She wrote: “I have been contacted by Ash’s family. They have asked the media not to publish personal images from social media. I ask the media and everyone to respect their wishes.”</p>

<p>But will the victims’ privacy be respected? <a href="https://intellectdiscover.com/content/journals/10.1386/ajr_00106_7">My research</a> indicates that is unlikely.</p>



<h2>What can, and can’t, journalists do?</h2>

<p>The practice of journalists taking photos from social media, both with and without consent, is now commonplace, and is sanctioned in Australia by law and by professional codes, with some caveats.</p>

<p>Journalists <a href="https://www.oaic.gov.au/engage-with-us/submissions/privacy-act-review-issues-paper-submission/part-4-exemptions">are exempted</a> from the Privacy Act “in the course of journalism”, and while advice from professional bodies such as the <a href="https://presscouncil.org.au/standards/statement-of-principles">Australian Press Council</a> and the Australian Communications and Media Authority (<a href="https://www.acma.gov.au/sites/default/files/2019-12/Privacy%20guidelines%20for%20broadcasters.pdf">ACMA</a>) is to tread with caution when reproducing images from social media, they do permit publication “in the public interest”. So do the guidelines of media companies, including <a href="https://about.abc.net.au/wp-content/uploads/2012/06/EditorialPOL2011.pdf">the ABC</a>.</p>

<p>The ethical code that binds member journalists in Australia, <a href="https://www.meaa.org/meaa-media/code-of-ethics/">the MEAA Code of Ethics</a>, also advises journalists to respect privacy and grief. It gives them the right not to intrude, but tempers this advice with a “guidance clause” about their capacity to override standards if publication is in the public interest.</p>

<p>The “<a href="https://theconversation.com/whose-interests-why-defining-the-public-interest-is-such-a-challenge-84278">public interest</a>” is a nebulous concept that increasingly <a href="https://the-media-leader.com/in-the-public-interest/">extends</a> to “what the public is interested in”.</p>

<h2>The modern-day ‘death knock’</h2>

<p>As citizens and news consumers, we want information about everyone who is impacted, and it is the job of news reporters to feed the hungry beast that is digital news. How can they resist the intensely personal content that is shared on “public” social media accounts which gives such a human face to tragedy? Is it reasonable to expect them to?</p>

<p>These are questions I am exploring through <a href="https://intellectdiscover.com/content/journals/10.1386/ajms_00134_1">my PhD research</a> into the practice journalists informally (and perhaps unpalatably) call <a href="https://www.editorandpublisher.com/stories/death-knocks,204472">the “death knock”</a>.</p>

<p>On hearing of a newsworthy death (or crime or major incident), journalists will do whatever they can to get information about the people impacted – the perpetrators, victims, and witnesses.</p>

<p>The job of gathering news is to find the most credible sources: expert voices (such as police, ambulance, health authorities and politicians), plus those who know something about the event or the people impacted.</p>

<h2>Should journalists ask for permission?</h2>

<p>Increasingly, in the digital age, newsgathering starts (and sometimes ends) with journalists mining social media.</p>

<p>Journalists use social media as a tool to find people they want to interview, but also as a source of information, images and tributes.</p>

<p>If people’s accounts are set to “public”, there is nothing to stop journalists using the photos and comments they find there in their stories.</p>

<p>Some journalists will pause and ask for permission, but not all will, and most do not feel compelled to.</p>

<p>However, my research indicates that journalists, by and large, are not thoughtless when it comes to what some view as stealing images from social media. But they face enormous pressure to use such images, from colleagues, editors and competitors.</p>

<p>Many argue that if images are in the public domain, they are fair game. And if everyone else is doing it, why wouldn’t they? They may ask themselves “how can I tell my boss I’m not going to do it when our competitors have already done it? If I pause to ask for permission, will I be scooped? What if I don’t hear back? What if permission is refused?”</p>

<p>In the UK, where <a href="https://www.ipso.co.uk/">protections</a> from media harassment are arguably stronger, people impacted by tragedy are <a href="https://www.gov.uk/government/publications/handling-media-attention/handling-media-attention-after-a-major-incident">advised</a> to check their social media privacy settings or delete material altogether.</p>

<p>This, though, assumes users are social media-literate, <a href="https://www.routledge.com/Journalism-Ethics-at-the-Crossroads-Democracy-Fake-News-and-the-News-Crisis/Patching-Hirst/p/book/9780367197285">but journalists</a> are “very adept at finding ways around privacy settings, and won’t hesitate to do so in pursuit of a story or photo”.</p>



<h2>A better path forward?</h2>

<p>Reporting on tragedy is routine work for many journalists, but it can <a href="https://www.journoresources.org.uk/mental-toll-reporting-tragedy-journalist/">take its toll</a>, sometimes in the form of <a href="https://americanpressinstitute.org/how-moral-injury-is-impacting-the-news-industry-and-what-you-can-do-about-it/#:%7E:text=For%20journalists%2C%20moral%20injury%20can,contravene%20an%20employee's%20moral%20code.">moral injury</a>, when they feel compelled to break their own moral code.</p>

<p>My research indicates journalists want better preparation, guidance, and support from their employers in reporting tragedy, and they want to be listened to about the impacts of this work on them.</p>

<p>However, in the realm of the “<a href="https://intellectdiscover.com/content/journals/10.1386/ajr_00106_7">digital death knock</a>” – the use of social media to report on tragedy – some argue that an ethical approach alone cannot stem what some believe is egregious behaviour, and that legislative (citizen privacy protections) and normative (stronger advice from professional bodies) approaches may be needed to protect journalists from themselves – and better protect people who fall victim to tragedy.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img decoding="async" src="https://counter.theconversation.com/content/227784/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" referrerpolicy="no-referrer-when-downgrade" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/digital-death-knocks-is-it-fair-game-for-journalists-to-mine-social-media-profiles-of-victims-and-their-families-227784">original article</a>.</p></div>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Your face for sale: anyone can legally gather and market your facial data without explicit consent</title>
		<link>https://privacy.org.au/2024/03/06/your-face-for-sale-anyone-can-legally-gather-and-market-your-facial-data-without-explicit-consent/</link>
		
		<dc:creator><![CDATA[Margarita Vladimirova]]></dc:creator>
		<pubDate>Wed, 06 Mar 2024 08:46:00 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5341</guid>

					<description><![CDATA[Margarita Vladimirova, PhD in Privacy Law and Facial Recognition Technology, Deakin University The morning started with a message from a friend: “I used your photos to train my local version of Midjourney. I hope you don’t mind”, followed up with generated pictures of me wearing a flirty steampunk costume. I did in fact mind. I&#8230; <span class="excerpt-more"><a href="https://privacy.org.au/2024/03/06/your-face-for-sale-anyone-can-legally-gather-and-market-your-facial-data-without-explicit-consent/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<figure>
    <figure style="width: 744px" class="wp-caption aligncenter"><a href="https://www.shutterstock.com/image-photo/futuristic-technological-scanning-face-beautiful-woman-1554013514"><img fetchpriority="high" decoding="async" src="https://images.theconversation.com/files/579102/original/file-20240301-28-tzp738.jpg?ixlib=rb-1.1.0&#038;rect=956%2C85%2C6119%2C4218&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" width="754" height="520" alt="" /></a><figcaption class="wp-caption-text">Kitreel/Shutterstock</figcaption></figure>
</figure>

<p><span><a href="https://theconversation.com/profiles/margarita-vladimirova-1514577">Margarita Vladimirova</a>, PhD in Privacy Law and Facial Recognition Technology, <em><a href="https://theconversation.com/institutions/deakin-university-757">Deakin University</a></em></span></p>

<p>The morning started with a message from a friend: “I used your photos to train my local version of Midjourney. I hope you don’t mind”, followed up with generated pictures of me wearing a flirty steampunk costume.</p>

<p>I did in fact mind. I felt violated. Wouldn’t you? I bet Taylor Swift did when <a href="https://theconversation.com/taylor-swift-deepfakes-new-technologies-have-long-been-weaponised-against-women-the-solution-involves-us-all-222268">deepfakes of her hit the internet</a>. But is the legal status of my face different from the face of a celebrity?</p>

<p>Your facial information is a unique form of personal sensitive information. It can identify you. Intense profiling and mass government surveillance <a href="https://www.forbes.com/sites/kalevleetaru/2019/05/06/as-orwells-1984-turns-70-it-predicted-much-of-todays-surveillance-society/?sh=38a97b4e11de">receives much attention</a>. But businesses and individuals are also using tools that <a href="https://www.sbs.com.au/news/article/creepy-and-invasive-kmart-bunnings-and-the-good-guys-accused-of-using-facial-recognition-technology/h08q8evb1">collect</a>, <a href="https://www.afr.com/technology/how-clearview-ai-unleashed-a-global-dystopia-20230929-p5e8lc">store</a> and modify facial information, and we’re facing an unexpected wave of <a href="https://deepai.org/machine-learning-model/text2img">photos</a> and <a href="https://theconversation.com/what-is-sora-a-new-generative-ai-tool-could-transform-video-production-and-amplify-disinformation-risks-223850">videos</a> generated with artificial intelligence (AI) tools.</p>

<p>The development of legal regulation for these uses is lagging. At what levels and in what ways should our facial information be protected?</p>

<h2>Is implied consent enough?</h2>

<p>The Australian <a href="https://www.legislation.gov.au/C2004A03712/latest/text">Privacy Act</a> considers biometric information (which would include your face) to be a part of our personal sensitive information. However, the act doesn’t <em>define</em> biometric information.</p>

<p>Despite its drawbacks, the act is currently the main legislation in Australia aimed at facial information protection. It states biometric information cannot be collected without a person’s consent.</p>

<p>But the law doesn’t specify whether it should be <a href="https://www.ipc.nsw.gov.au/fact-sheet-consent">express or implied consent</a>. Express consent is given explicitly, either orally or in writing. Implied consent means consent may reasonably be inferred from the individual’s actions in a given context. For example, if you walk into a store that has a sign “facial recognition camera on the premises”, your consent is implied.</p>

<figure class="align-right zoomable">
            <figure style="width: 590px" class="wp-caption alignright"><a href="https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=1000&#038;fit=clip"><img decoding="async" alt="A poster at a supermarket that says camera technology trial in progress, partially obscured by a couple of bins." src="https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=237&#038;fit=clip" srcset="https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=1067&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=1067&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=1067&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=1340&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=1340&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/578587/original/file-20240228-28-ns24xh.jpg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=1340&#038;fit=crop&#038;dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" width="600" height="1067" /></a><figcaption class="wp-caption-text">An inconspicuous sign that flags camera technology trial is in progress counts as implied consent. &#8211; Margarita Vladimirova</figcaption></figure>
        </figure>

<p>But using implied consent opens our facial data up to potential exploitation. <a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/how-your-data-is-used/articles/kmart-bunnings-and-the-good-guys-using-facial-recognition-technology-in-store">Bunnings, Kmart</a> and <a href="https://www.theguardian.com/business/2023/feb/19/woolworths-expands-self-checkout-ai-that-critics-say-treats-every-customer-as-a-suspect">Woolworths</a> have all used easy-to-miss signage to indicate that facial recognition or camera technology is in use in their stores.</p>

<h2>Valuable and unprotected</h2>

<p>Our facial information has become so valuable, <a href="https://www.theguardian.com/australia-news/2023/oct/24/australian-federal-police-afp-pimeyes-facial-recognition-facecheck-id-search-engine-platform">data companies such as Clearview AI and PimEye</a> are mercilessly hunting it down on the internet <a href="https://onezero.medium.com/i-got-my-file-from-clearview-ai-and-it-freaked-me-out-33ca28b5d6d4">without our consent</a>.</p>

<p>These companies put together databases for sale, used not only by the police in various countries, <a href="https://www.theguardian.com/australia-news/2023/oct/24/australian-federal-police-afp-pimeyes-facial-recognition-facecheck-id-search-engine-platform">including Australia</a>, but also by <a href="https://www.clearview.ai/developer-api">private companies</a>.</p>

<p>Even if you deleted all your facial data from the internet, you could easily be captured in public and appear in some database anyway. Being in someone’s TikTok video <a href="https://www.abc.net.au/news/2022-07-14/tiktok-video-maree-melbourne-flowers/101228418">without your consent</a> is a prime example – in Australia this is legal.</p>



<p>Furthermore, we’re now contending with generative AI programs such as Midjourney, DALL-E 3, Stable Diffusion and others. Not only the collection but also the modification of our facial information can easily be performed by anyone.</p>

<p>Our faces are unique to us; they’re part of what we perceive as ourselves. But they don’t have special legal status or special legal protection.</p>

<p>The only action you can take to protect your facial information from aggressive collection by a store or private entity <a href="https://www.oaic.gov.au/privacy/privacy-complaints/lodge-a-privacy-complaint-with-us">is to complain</a> to the office of the Australian Information Commissioner, which may or may not result in an investigation.</p>

<p>The same applies to deepfakes. The Australian Competition and Consumer Commission will consider only activity that applies to trade and commerce, for example if a <a href="https://www.theguardian.com/technology/2022/mar/18/accc-takes-meta-to-court-over-facebook-scam-ads-depicting-australian-identities">deepfake is used for false advertising</a>.</p>

<p>And the Privacy Act doesn’t protect us from other people’s actions. I didn’t consent to have someone train an AI with my facial information and produce made-up images. But there is no oversight on such use of generative AI tools, either.</p>

<p>There are currently no laws that <em>prevent</em> other people from collecting or modifying your facial information.</p>



<h2>Helping the law catch up</h2>

<p>We need a range of regulations on the collection and modification of facial information. We also need a stricter status of facial information itself. Thankfully, some developments in this area are looking promising.</p>

<p>Experts at the University of Technology Sydney have proposed a comprehensive legal framework for <a href="https://www.uts.edu.au/human-technology-institute/projects/facial-recognition-technology-towards-model-law">regulating the use of facial recognition technology</a> under Australian law.</p>

<p>It contains proposals for regulating the first stage of non-consensual activity: the collection of personal information. That may help in the development of new laws.</p>

<p>Regarding photo modification using AI, we’ll have to wait for announcements from the newly established government <a href="https://www.minister.industry.gov.au/ministers/husic/media-releases/new-artificial-intelligence-expert-group">AI expert group</a> working to develop “safe and responsible AI practices”.</p>

<p>There are no specific discussions about a higher level of protection for our facial information in general. However, the government’s recent <a href="https://www.ag.gov.au/rights-and-protections/publications/government-response-privacy-act-review-report">response to the Attorney-General’s Privacy Act review</a> has some promising provisions.</p>

<p>The government has agreed further consideration should be given to enhanced risk assessment requirements in the context of facial recognition technology and other uses of biometric information. This work should be coordinated with the government’s ongoing work on Digital ID and the National Strategy for Identity Resilience.</p>

<p>As for consent, the government has agreed in principle that the definition of consent required for biometric information collection should be amended to specify it must be voluntary, informed, current, specific and unambiguous.</p>

<p>As facial information is increasingly exploited, we’re all waiting to see whether these discussions do become law – hopefully sooner rather than later.</p>

<hr />

<p><em>Correction: we have amended a sentence to clarify Woolworths use camera technology but not necessarily facial recognition technology.</em><!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/224643/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" referrerpolicy="no-referrer-when-downgrade" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/your-face-for-sale-anyone-can-legally-gather-and-market-your-facial-data-without-explicit-consent-224643">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>70% of Australians don’t feel in control of their data as companies hide behind meaningless privacy terms</title>
		<link>https://privacy.org.au/2024/03/06/70-of-australians-dont-feel-in-control-of-their-data-as-companies-hide-behind-meaningless-privacy-terms/</link>
		
		<dc:creator><![CDATA[Katharine Kemp]]></dc:creator>
		<pubDate>Wed, 06 Mar 2024 08:41:33 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5338</guid>

					<description><![CDATA[Katharine Kemp, Associate Professor, Faculty of Law &#38; Justice, UNSW Sydney Australian consumers don’t understand how companies – including data brokers – track, target and profile them. This is revealed in new research on consumer understanding of privacy terms, released by the non-profit Consumer Policy Research Centre and UNSW Sydney today. Our report also reveals&#8230; <span class="excerpt-more"><a href="https://privacy.org.au/2024/03/06/70-of-australians-dont-feel-in-control-of-their-data-as-companies-hide-behind-meaningless-privacy-terms/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<figure>
    <figure style="width: 744px" class="wp-caption aligncenter"><a href="https://www.shutterstock.com/image-photo/smart-technologies-your-smartphone-collection-analysis-1490310101"><img loading="lazy" decoding="async" src="https://images.theconversation.com/files/577785/original/file-20240226-26-ihj4ej.jpg?ixlib=rb-1.1.0&#038;rect=697%2C117%2C4296%2C3034&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" width="754" height="533" alt="" /></a><figcaption class="wp-caption-text">Trismegist san/Shutterstock</figcaption></figure>
</figure>

<p><span><a href="https://theconversation.com/profiles/katharine-kemp-402096">Katharine Kemp</a>, Associate Professor, Faculty of Law &amp; Justice, <em><a href="https://theconversation.com/institutions/unsw-sydney-1414">UNSW Sydney</a></em></span></p>

<p>Australian consumers don’t understand how companies – including data brokers – track, target and profile them. This is revealed in new research on consumer understanding of privacy terms, released by the non-profit <a href="https://cprc.org.au/">Consumer Policy Research Centre</a> and UNSW Sydney today.</p>

<p><a href="https://cprc.org.au/report/singled-out">Our report</a> also reveals 70% of Australians feel they have little or no control over how their data is disclosed between companies. Many expressed anger, frustration and distrust.</p>

<p>These findings are particularly important as the government considers <a href="https://www.ag.gov.au/rights-and-protections/publications/government-response-privacy-act-review-report">long-overdue reforms to our privacy legislation</a>, and the consumer watchdog finalises its <a href="https://www.accc.gov.au/inquiries-and-consultations/digital-platform-services-inquiry-2020-25/march-2024-interim-report">upcoming report on data brokers</a>.</p>

<p>If Australians are to have any hope of fair and trustworthy data handling, the government must stop companies from <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3432769">hiding their practices</a> behind confusing and misleading privacy terms and mandate fairness in data handling.</p>

<h2>We are all being tracked</h2>

<p>Our activities online and offline are constantly tracked by various companies, including <a href="https://cprc.org.au/wp-content/uploads/2023/09/CPRC-Submission-Data-brokers-ACCC-August-2023.pdf">data brokers</a> that trade in our personal information.</p>

<p>This includes data about our activity and purchases on websites and apps, relationship status, children, financial circumstances, life events, health concerns, search history and location.</p>

<p>Many businesses focus their efforts on finding new ways to track and profile us, despite repeated evidence that consumers view this as <a href="https://www.accc.gov.au/about-us/publications/digital-platforms-inquiry-final-report">misuse of their personal information</a>.</p>

<p>Companies describe the data they collect in confusing and unfamiliar terms. Much of this wording seems designed to prevent us from understanding or objecting to the use and disclosure of our personal information, often collected in surreptitious ways.</p>

<p>Businesses can use your data <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3432769">to make more profit at your expense</a>. This includes</p>

<ul>
<li>charging you a higher price</li>
<li>preventing you from seeing better offers</li>
<li>micro-targeting political messages or ads based on your health information</li>
<li>reducing the priority you’re given in customer service</li>
<li>creating a profile (which you’ll never see) to share with a prospective employer, insurer or landlord.</li>
</ul>



<h2>Anonymised, pseudonymised, hashed</h2>

<p>Businesses commonly try to argue this information is “<a href="https://www5.austlii.edu.au/au/legis/cth/consol_act/pa1988108/s6.html#de-identified">de-identified</a>” or not “<a href="https://www5.austlii.edu.au/au/legis/cth/consol_act/pa1988108/s6.html#personal_information">personal</a>”, to avoid running afoul of the federal Privacy Act in which these terms are defined.</p>

<p>But many privacy policies muddy the waters by using other, undefined terms. They create the impression data can’t be used to single out the consumer or influence what they’re shown online – even when it can.</p>



<p>Privacy policies commonly refer to:</p>

<ul>
<li>anonymised data</li>
<li>pseudonymised information</li>
<li>hashed emails</li>
<li>audience data</li>
<li>aggregated information.</li>
</ul>

<p>These terms have no legal definition and no fixed meaning in practice.</p>

<p>Data brokers and other companies may use “pseudonymised information” or “hashed email addresses” (addresses scrambled into fixed codes by a one-way function) to create detailed profiles. These will be shared with other businesses without our knowledge. They do this by matching the information collected about us by various companies in different parts of our lives.</p>
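<p>The matching described above works because a hash function is deterministic: the same normalised email address always produces the same output, so two companies that hash their customer lists independently end up holding identical identifiers for the same person. A minimal Python sketch (the function name and the sample address are purely illustrative):</p>

```python
import hashlib

def hashed_email(address: str) -> str:
    # Normalise first so the same inbox always yields the same hash,
    # regardless of capitalisation or stray whitespace.
    normalised = address.strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Two unrelated companies hashing the same customer's address
# independently arrive at the identical identifier -- which is
# exactly what lets a data broker join their records together.
company_a = hashed_email("Jane.Doe@example.com ")
company_b = hashed_email("jane.doe@example.com")
print(company_a == company_b)  # True: the two records can be matched
```

<p>Note that no party needs to share the raw address: comparing the resulting codes is enough to link the two records, which is why a “hashed” email offers far less anonymity than the term suggests.</p>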

<p>“Anonymised information” – not a legal term in Australia – may sound like it wouldn’t reveal anything about an individual consumer. Some companies use it when only a person’s name and email have been removed, but we can still be identified by other unique or rare characteristics.</p>

<h2>What did our survey find?</h2>

<p>Our survey showed Australians do not feel in control of their personal information. More than 70% of consumers believe they have very little or no control over what personal information online businesses share with other companies.</p>

<p><iframe width="100%" style="border: none;" id="mJYfr" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/mJYfr/" frameborder="0"></iframe></p>

<p>Only a third of consumers feel they have at least moderate control over whether businesses use their personal information to create a profile about them.</p>

<p>Most consumers have no understanding of common terms in privacy notices, such as “hashed email address” or “advertising ID” (a unique ID usually assigned to one’s device).</p>

<p><iframe loading="lazy" width="100%" height="400px" style="border: none;" id="52P8q" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/52P8q/" frameborder="0"></iframe></p>

<p>And it’s likely to be worse than these statistics suggest, since some consumers may overestimate their knowledge.</p>

<p>The terms refer to data widely used to track and influence us without our knowledge. However, when consumers don’t recognise descriptions of personal information, they’re less likely to know whether that data could be used to single them out for tracking, influencing, profiling, discrimination or exclusion.</p>

<p>Most consumers either don’t know, or think it unlikely, that “pseudonymised information”, a “hashed email address” or “advertising ID” can be used to single them out from the crowd. They can.</p>

<p>Most consumers think it’s unacceptable for businesses they have no direct relationship with to use their email address, IP address, device information, search history or location data. However, data brokers and other “data partners” not in direct contact with consumers commonly use such data.</p>

<p><iframe width="100%" style="border: none;" id="AgFEP" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/AgFEP/" frameborder="0"></iframe></p>

<p>Consumers are understandably frustrated, anxious and angry about the unfair and untrustworthy ways organisations make use of their personal information and expose them to increased risk of data misuse.</p>

<h2>Fairness, not ‘education’</h2>

<p>Simply educating consumers about the terms used by companies and the ways their data is shared may seem an obvious solution.</p>

<p>However, we don’t recommend this for three reasons. Firstly, we can’t be sure of the meaning of undefined terms. Companies will likely keep coming up with new ones.</p>

<p>Secondly, it’s unreasonable to place the burden of understanding complex data ecosystems on consumers who naturally lack expertise in these areas.</p>

<p>Thirdly, “education” is pointless when consumers are not given real choices about the use of their data.</p>

<p>Urgent law reform is needed to make Australian privacy protections fit for the digital era. This should include clarifying that information that <a href="https://brusselsprivacyhub.eu/publications/BPH-Working-Paper-VOL6-N24.pdf">singles an individual out from the crowd</a> is “personal information”.</p>

<p>We also need a “fair and reasonable” test for data handling, instead of take-it-or-leave-it privacy “consents”.</p>

<p>Most of us can’t avoid participating in the digital economy. These changes would help ensure that instead of confusing privacy terms, there are substantial, meaningful legal requirements for how our personal information is handled.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/224072/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" referrerpolicy="no-referrer-when-downgrade" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/70-of-australians-dont-feel-in-control-of-their-data-as-companies-hide-behind-meaningless-privacy-terms-224072">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>For domestic violence victim-survivors, a data or privacy breach can be extraordinarily dangerous</title>
		<link>https://privacy.org.au/2023/12/05/for-domestic-violence-victim-survivors-a-data-or-privacy-breach-can-be-extraordinarily-dangerous/</link>
		
		<dc:creator><![CDATA[Catherine Fitzpatrick]]></dc:creator>
		<pubDate>Tue, 05 Dec 2023 12:32:23 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5332</guid>

					<description><![CDATA[A suite of recent cybersecurity data breaches highlights an urgent need to overhaul how companies and government agencies handle our data. But these incidents pose particular risks to victim-survivors of domestic violence. The onus is on service providers – such as utilities, telcos, internet companies and government agencies – to ensure they don’t risk the safety of their most vulnerable customers by being careless with their data. <span class="excerpt-more"><a href="https://privacy.org.au/2023/12/05/for-domestic-violence-victim-survivors-a-data-or-privacy-breach-can-be-extraordinarily-dangerous/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<p><span><a href="https://theconversation.com/profiles/catherine-fitzpatrick-1450440">Catherine Fitzpatrick</a>, Adjunct Associate Professor, School of Social Sciences, <em><a href="https://theconversation.com/institutions/unsw-sydney-1414">UNSW Sydney</a></em></span></p>

<p>A suite of recent cybersecurity data breaches highlights an urgent need to overhaul how companies and government agencies handle our data. But these incidents pose particular risks to victim-survivors of domestic violence.</p>

<p>In fact, authorities across Australia and the United Kingdom are raising concerns about how privacy breaches have endangered these customers.</p>

<p>The onus is on service providers – such as utilities, telcos, internet companies and government agencies – to ensure they don’t risk the safety of their most vulnerable customers by being careless with their data.</p>



<h2>A suite of incidents</h2>

<p>Earlier this year, the UK Information Commissioner reported it had <a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2023/09/data-breaches-put-domestic-abuse-victims-lives-at-risk-uk-information-commissioner-warns/">reprimanded</a> seven organisations since June 2022 for privacy breaches affecting victims of domestic abuse.</p>

<p>These included organisations revealing the safe addresses of the victims to their alleged abuser. In one case, a family had to be moved immediately to emergency accommodation.</p>

<p>In another case, an organisation disclosed the home address of two children to their birth father (who was in prison for raping their mother).</p>

<p>The UK Information Commissioner has called for better training and processes. This includes regular verification of contact information and securing data against unauthorised access.</p>

<p>In 2021, the Australian Information Commissioner and Privacy Commissioner <a href="https://www.austlii.edu.au/cgi-bin/viewdoc/au/cases/cth/AICmr/2021/12.html">took action against Services Australia</a> for disclosing a victim-survivor’s new address to her former partner.</p>

<p>The commissioner ordered a written apology and an A$19,980 compensation payment. It also ordered an independent audit of how Services Australia updates contact details for separating couples with shared records.</p>

<p>An <a href="https://www.oaic.gov.au/about-the-OAIC/our-corporate-information/oaic-annual-reports/annual-report-201819/part-2-performance">earlier case</a> involved a telecommunications company and the publisher of a public directory.</p>

<p>The commissioner ordered them each to pay $20,000 to a victim of domestic violence whose details were made public, which jeopardised her safety.</p>

<p>More recently, the Energy and Water Ombudsman Victoria reported a <a href="https://www.ewov.com.au/reports/annual-report-2023">case</a> where an electricity provider inadvertently provided a woman’s new address to her ex-partner. The woman had to buy security cameras for protection. The company has since revised its procedures.</p>

<p>The Energy and Water Ombudsman Victoria has also <a href="https://www.ewov.com.au/reports/annual-report-2023">reviewed complaints</a> received in 2022-23 related to domestic violence. These include failing to flag accounts of victims who disclosed abuse, as well as potentially unsafe consumer automation and data governance processes.</p>

<p>The Victorian Essential Services Commission <a href="https://www.esc.vic.gov.au/water/sector-performance-and-reporting/compliance-and-enforcement-water-sector/south-east-water-corporation-enforceable-undertaking-2023">accepted a court-enforceable undertaking</a> from a water company that it would improve processes after allegations its actions put customers affected by family violence at risk.</p>

<p>The commission found the company failed to adequately protect the personal information of two separate customers in 2021 and 2022, by sending correspondence with their personal information to the wrong addresses.</p>

<p>In both cases, the customer had not disclosed their experience of domestic violence. Nevertheless, the regulator noted these “erroneous information disclosures put these customers at risk of harm”.</p>

<p>Australia’s Telecommunications Industry Ombudsman received about <a href="https://www.tio.com.au/news/better-consumer-protection-rules-are-needed-telco-consumers-suffering-family-violence">300 complaints</a> involving domestic violence in 2022-23, with almost two-thirds relating to mobile phones.</p>

<p>Complaints included instances of telcos disclosing the addresses of victim-survivors to perpetrators, or of frontline staff not believing victim-survivors. There were also cases of telcos insisting a consumer experiencing family violence contact the perpetrator. The report noted:</p>

<blockquote>
<p>For example, one person was asked by her telco to bring her abusive ex-partner into a store to change her number to her new account.</p>

<p>We’ve also had complaints about telcos disconnecting the services of a consumer experiencing family violence – sometimes at the request of the account holder who is the perpetrator of the violence – despite access to those services being critical to the consumer staying safe.</p>
</blockquote>

<p>The Australian Financial Complaints Authority <a href="https://www.afca.org.au/news/information-for-consumer-advocates/supporting-people-impacted-by-domestic-violence">resolved more than 500 complaints</a> from people experiencing domestic and family violence in 2021-22, including those related to privacy breaches.</p>



<h2>Change is slowly under way</h2>

<p>In May, <a href="https://www.aemc.gov.au/sites/default/files/2022-09/RRC0042%20-%20Protecting%20customers%20affected%20by%20family%20violence%20-%20Final%20Determination_clean.pdf">new national rules</a> came into force to provide better protection and support to energy customers experiencing domestic violence.</p>

<p>These rules mandate retailers prioritise customer safety and protect their personal information. This includes account security measures to prevent perpetrators from accessing victim-survivors’ sensitive data.</p>

<p>They also prohibit the disclosure of information without consent. In issuing its rules, the Australian Energy Markets Commission noted the heightened risk of partner homicides following separations.</p>

<p>The Telecommunications Industry Ombudsman has called for <a href="https://www.tio.com.au/news/better-consumer-protection-rules-are-needed-telco-consumers-suffering-family-violence">mandatory, uniform and enforceable rules</a>. The current voluntary industry code and guidelines fall short in protecting phone and internet customers experiencing domestic violence.</p>

<p>New rules should include training, policies and recognition of violence as a cause of payment difficulties. They should also factor in how service suspension or disconnection affects victim-survivors.</p>

<p>The Australian Information and Privacy Commissioner <a href="https://www.oaic.gov.au/newsroom/communications-law-bulletin-interview-with-commissioner-falk">said</a> last year:</p>

<blockquote>
<p>Sadly, we continue to receive cases of improper disclosure of personal information off line by businesses to ex partners who target women in family disputes and domestic violence. All of these issues reinforce the need for privacy by design.</p>
</blockquote>

<p>In its response to a <a href="https://www.ag.gov.au/sites/default/files/2023-09/government-response-privacy-act-review-report.PDF">review of the Privacy Act</a>, the government has agreed the Office of the Australian Information Commissioner should help develop guidance to reduce risk to customers.</p>

<p>We must work harder to ensure data and privacy breaches do not leave victim-survivors of domestic violence at greater risk from perpetrators.</p>

<p><em>The National Sexual Assault, Family and Domestic Violence Counselling Line – 1800 RESPECT (1800 737 732) – is available 24 hours a day, seven days a week for any Australian who has experienced, or is at risk of, family and domestic violence and/or sexual assault.</em><!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/216630/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" width="1" height="1" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" referrerpolicy="no-referrer-when-downgrade" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/for-domestic-violence-victim-survivors-a-data-or-privacy-breach-can-be-extraordinarily-dangerous-216630">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Popular fertility apps are engaging in widespread misuse of data, including on sex, periods and pregnancy</title>
		<link>https://privacy.org.au/2023/03/22/popular-fertility-apps-are-engaging-in-widespread-misuse-of-data-including-on-sex-periods-and-pregnancy/</link>
		
		<dc:creator><![CDATA[Katharine Kemp]]></dc:creator>
		<pubDate>Wed, 22 Mar 2023 11:24:05 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5285</guid>

					<description><![CDATA[Fertility apps collect deeply sensitive data about consumers’ sex lives, health, emotional states and menstrual cycles. And many of them are intended for use by children as young as 13. An analysis by UNSW's Katharine Kemp has uncovered a number of concerning practices by these apps including: confusing and misleading privacy messages, a lack of choice in how data are used, inadequate de-identification measures when data are shared with other organisations, and retention of data for years even after a consumer stops using the app, exposing them to unnecessary risk from potential data breaches. <span class="excerpt-more"><a href="https://privacy.org.au/2023/03/22/popular-fertility-apps-are-engaging-in-widespread-misuse-of-data-including-on-sex-periods-and-pregnancy/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<figure>
    <img decoding="async" src="https://images.theconversation.com/files/516601/original/file-20230321-690-se9b8m.jpeg?ixlib=rb-1.1.0&#038;rect=24%2C58%2C3210%2C2095&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" class="aligncenter" />
</figure>

<p><span><a href="https://theconversation.com/profiles/katharine-kemp-402096">Katharine Kemp</a>, Senior Lecturer, Faculty of Law &amp; Justice, <em><a href="https://theconversation.com/institutions/unsw-sydney-1414">UNSW Sydney</a></em></span></p>

<p>New research reveals serious privacy flaws in fertility apps used by Australian consumers – emphasising the need for urgent reform of the Privacy Act.</p>

<p>Fertility apps provide a number of features. For instance, they may help users track their periods, identify a “fertile window” if they’re trying to conceive, track different stages and symptoms of pregnancy, and prepare for parenthood up until the baby’s birth.</p>

<p>These apps collect deeply sensitive data about consumers’ sex lives, health, emotional states and menstrual cycles. And many of them are intended for use by children as young as 13.</p>

<p>My report <a href="https://allenshub.unsw.edu.au/sites/default/files/2023-03/KKemp%20Your%20Body%20Our%20Data%2022.03.23.pdf">published today</a> analysed the privacy policies, messages and settings of 12 of the most popular fertility apps used by Australian consumers (excluding apps that require a connection with a wearable device).</p>

<p>This analysis uncovered a number of concerning practices by these apps including:</p>

<ul>
<li>confusing and misleading privacy messages</li>
<li>a lack of choice in how data are used</li>
<li>inadequate de-identification measures when data are shared with other organisations</li>
<li>retention of data for years even after a consumer stops using the app, exposing them to unnecessary risk from potential data breaches.</li>
</ul>



<h2>The data collected</h2>

<p>The apps in this study collect intimate data from consumers, such as:</p>

<ul>
<li>their pregnancy test results</li>
<li>when they have sex and whether they had an orgasm</li>
<li>whether they used a condom or “withdrawal” method</li>
<li>when they have their period</li>
<li>how their moods change (including anxiety, panic and depression)</li>
<li>and if they have health conditions such as polycystic ovary syndrome, endometriosis or uterine fibroids.</li>
</ul>

<p>Some ask for unnecessary details, such as whether a user smokes or drinks alcohol, their education level, whether they struggle to pay their bills, if they feel safe at home, and whether they have stable housing.</p>

<p>They also track which support groups you join, what you add to your “to-do list” or “questions for doctor”, and which articles you read. All of this creates a more detailed picture of your health, family situation and intentions.</p>

<h2>Confusing or misleading privacy messages</h2>

<p>Consumers should expect the clearest information about how such data are collected, used and disclosed. Yet we found some of the messaging is highly confusing or misleading.</p>

<p>Some apps say “we will never sell your data”. But the fine print of the privacy policy contains a term that allows them to sell all your data as part of the sale of the app or database to another company.</p>

<p>This possibility is not just theoretical. Of the 12 apps included in the study, one was previously taken over by a drug development company, and another two by a digital media company.</p>

<p>Other apps explain privacy settings using language that makes it almost impossible for a consumer to understand what they are choosing, or obscure the privacy settings by placing them numerous clicks and scrolls away from the home screen.</p>

<h2>Keeping sensitive data for too long</h2>

<p>The <a href="https://www.abc.net.au/news/2022-10-21/medibank-optus-data-hack/101558932">major data breaches</a> of the past six months highlight the risks of companies holding onto personal data longer than necessary.</p>

<p>Breaches of highly sensitive information about health and sexual activities could lead to <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4387341">discrimination, exploitation, humiliation or blackmail</a>.</p>

<p>Most of the apps we analysed keep user data for at least three years after the user quits the app – or seven years in the case of one brand. Some apps give no indication of when user data will be deleted.</p>

<h2>Can’t count on ‘de-identification’</h2>

<p>Some apps also give consumers no choice regarding whether their “de-identified” health data will be sold or transferred to other companies for research or business. Or they opt consumers in to these extra uses by default, putting the onus on users to opt out.</p>

<p>Moreover, some of these data are not truly de-identified. For example, removing your name and email address and replacing them with a unique number is not de-identification for legal purposes. Anyone who works out the link between your name and that number can connect your entire record back to you.</p>
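<p>A toy Python illustration of this point, using invented records: replacing a name with a persistent number only hides identity until someone obtains – or reconstructs from rare attribute combinations – the mapping between numbers and people.</p>

```python
# Hypothetical "de-identified" dataset: names replaced by a number.
records = {
    4821: {"postcode": "2031", "birth_year": 1987, "condition": "endometriosis"},
}

# Whoever holds (or can rebuild) the number-to-identity mapping
# re-identifies every record in a single step.
id_map = {4821: "Jane Citizen"}

reidentified = {id_map[num]: data for num, data in records.items()}
print(reidentified)
```

<p>Because the number stays constant across the whole dataset, one successful match exposes not a single field but the person’s entire record.</p>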

<p>When supposedly de-identified Medicare records were published in 2016, <a href="https://www.unimelb.edu.au/newsroom/news/2017/december/research-reveals-de-identified-patient-data-can-be-re-identified">University of Melbourne researchers</a> showed how just a few data points can connect a de-identified record to a unique individual.</p>



<h2>Need for reform</h2>

<p>This research highlights the unfair and unsafe data practices consumers are subjected to when they use fertility apps. And these findings reinforce the need for Australia’s privacy laws to be updated.</p>

<p>We need improvements in what data are covered by the Privacy Act, what choices consumers can make about their data, what data uses are prohibited, and what security systems companies must have in place.</p>

<p>The government is seeking <a href="https://www.ag.gov.au/rights-and-protections/publications/privacy-act-review-report">submissions</a> on potential privacy law reforms until March 31.</p>

<p>In the meantime, if you’re using a fertility app, there are some steps you can take to help reduce some of the privacy risks: <!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/202127/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" referrerpolicy="no-referrer-when-downgrade" width="1" height="1" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>

<ol>
<li>when launching the app for the first time, don’t agree to tracking of your data; you can also limit ad tracking in your device settings (on iPhone, for example)</li>
<li>don’t log in via a social media account</li>
<li>don’t answer questions or add data you don’t need to for your own purposes</li>
<li>don’t share your Apple Health or FitBit data</li>
<li>if the app provides privacy choices, opt out of tracking and having your data sold or used for research, and delete your data when you stop using the app</li>
<li>bear in mind that every article you read (and how long you spend on it), and every group you join and comment you make there, may be added to a profile about you.</li>
</ol>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/popular-fertility-apps-are-engaging-in-widespread-misuse-of-data-including-on-sex-periods-and-pregnancy-202127">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Proposed privacy reforms could help Australia play catch-up with other nations. But they fail to tackle targeted ads</title>
		<link>https://privacy.org.au/2023/02/21/proposed-privacy-reforms-could-help-australia-play-catch-up-with-other-nations-but-they-fail-to-tackle-targeted-ads/</link>
		
		<dc:creator><![CDATA[Katharine Kemp]]></dc:creator>
		<pubDate>Tue, 21 Feb 2023 03:36:25 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5276</guid>

					<description><![CDATA[In the recently released Privacy Act Review Report, the Attorney-General’s Department makes numerous important proposals that could see the legislation, enacted in 1988, begin to catch up to leading privacy laws globally. However, the report’s proposals on targeted advertising don’t properly address the power imbalance between companies and consumers. Instead, they largely accept a status quo that sacrifices consumer privacy to the demands of online targeted ad businesses. <span class="excerpt-more"><a href="https://privacy.org.au/2023/02/21/proposed-privacy-reforms-could-help-australia-play-catch-up-with-other-nations-but-they-fail-to-tackle-targeted-ads/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<figure><figure style="width: 744px" class="wp-caption alignright"><img loading="lazy" decoding="async" src="https://images.theconversation.com/files/511077/original/file-20230220-19-p8vr96.jpeg?ixlib=rb-1.1.0&#038;rect=123%2C6%2C4461%2C3052&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" alt="" width="754" height="516" /><figcaption class="wp-caption-text">Image from Shutterstock</figcaption></figure><figcaption></figcaption><p>In the recently released <a href="https://www.ag.gov.au/sites/default/files/2023-02/privacy-act-review-report.pdf">Privacy Act Review Report</a>, the Attorney-General’s Department makes numerous important proposals that could see the legislation, enacted in 1988, begin to catch up to leading privacy laws globally.</p></figure><p>Among the positive proposed changes are: more realistic definitions of personal information and consent, tighter limits on data retention, a right to erasure, and a requirement for data practices to be fair and reasonable.</p><p>However, the report’s proposals on targeted advertising don’t properly address the power imbalance between companies and consumers. 
Instead, they largely accept a status quo that sacrifices consumer privacy to the demands of online targeted ad businesses.</p><h2>Capturing personal information used to track and profile</h2><p>Obligations under the existing Privacy Act only apply to “personal information”, but there has been legal uncertainty about what exactly constitutes “personal information”.</p><p>Currently, companies can track an individual’s online behaviour across different websites and connect it with their offline movements by matching their data with data collected from third parties, such as retailers or <a href="https://www.oracle.com/au/cx/advertising/data-enrichment-measurement/#data-enrichment">data brokers</a>.</p><p>Some of these companies claim they’re not dealing in “personal information” since they don’t use the individual’s name or email address. Instead, the matching is done based on a unique identifier allocated to that person – such as a <a href="https://help.abc.net.au/hc/en-us/articles/4402890310671">hashed email</a>, for example.</p><p>The report proposes an expanded definition of “personal information” that clearly includes the various technical and online identifiers being used to track and profile consumers. Under this definition, companies could no longer claim such data collection and sharing are outside the scope of the Privacy Act.</p><h2>Improved consent (when required)</h2><p>The report also proposes higher standards for how consent is sought, in cases where the act requires it. This would require voluntary, informed, current, specific and unambiguous consent.</p><p>This would work against organisations claiming consumers have consented to unexpected data uses just because they used a website or an app with a link to a broadly worded privacy policy with take-it-or-leave-it terms.</p><p>For example, companies would need to demonstrate the higher standard of consent to collect sensitive information about someone’s mental health or sexual orientation. 
The report also proposes that some further data practices, such as precise geolocation tracking, should require consent.</p><p>However, it specifically states consent should not be required for some targeted ad practices. Yet <a href="https://www.accc.gov.au/system/files/Digital%20platforms%20inquiry%20-%20final%20report.pdf">surveys</a> show most consumers regard these as misuses of their personal information.</p><h2>‘Fair and reasonable’ data practices</h2><p>The report proposes a “fair and reasonable” test for dealings with personal information in general.</p><p>This recognises that consumers are saddled with too much of the responsibility for managing how their personal information is collected and used, while they lack the information, resources, expertise and control to do this effectively.</p><p>Instead, organisations covered by the Privacy Act should ensure their data handling practices are “fair and reasonable”, regardless of whether they have consumer consent. This would include considering whether a reasonable person would expect the data to be collected, used or disclosed in that way, and whether any dealing with children’s information is in the best interests of the child.</p><h2>Prohibiting targeted ads based on sensitive information</h2><p>The report proposes the prohibition of targeting based on sensitive information and traits. However, it’s not always easy to draw the line between “sensitive” information or traits, and other personal information.</p><p>For instance, is having an interest in “cosmetic procedures” or “rapid weight loss” a sensitive trait, or a general reading interest? Companies may exploit such grey areas. So while prohibiting targeting based on sensitive information is appropriate, it’s not enough in itself.</p><p>Another loophole arises in the report’s proposal that consumer consent should be necessary before an organisation trades in their personal information. 
The report leaves open an exception to this consent requirement where the “trading” is reasonably necessary for an organisation’s functions or activities.</p><p>This may be a substantial exception: data brokers, for example, might argue their trade in personal information (without consumers’ knowledge or consent) is necessary.</p><h2>Opt out only, not opt in</h2><p>Both the <a href="https://www.accc.gov.au/system/files/Digital%20platforms%20inquiry%20-%20final%20report.pdf">ACCC</a> and the <a href="https://assets.publishing.service.gov.uk/media/5fa557668fa8f5788db46efc/Final_report_Digital_ALT_TEXT.pdf">UK Competition &amp; Markets Authority</a> have recommended consumers should opt <em>in</em> to the use of their personal information for targeted advertising if they wish to see this content.</p><p>But the report proposes individuals should only be allowed to opt <em>out</em> of “seeing” targeted ads. This still wouldn’t stop companies from collecting, using and disclosing a user’s personal information for broader targeting purposes.</p><p>Even if a consumer opts out of seeing targeted ads, a business may continue to collect their personal information to create “lookalike audiences” and target other people with similar attributes.</p><p>Although having the option to opt out of seeing targeted ads gives consumers some limited control, companies still control the “<a href="https://www.accc.gov.au/system/files/DPB%20-%20DPSI%20-%20September%202021%20-%20Full%20Report%20-%2030%20September%202021%20%283%29_1.pdf">choice architecture</a>” of such settings. 
They can use their control to make opting out <a href="https://cprc.org.au/dupedbydesign/">confusing and difficult</a> for users, by forcing them to navigate through multiple pages or websites with obscurely labelled settings.</p><h2>Are targeted ads necessary to support online services?</h2><p>This limitation of consumers’ choices was partly explained by the view of the Attorney-General’s Department that targeted ads are necessary to fund “free” services. This refers to services where consumers “pay” with their attention and data (which companies use to make revenue from targeted advertising).</p><p>However, many companies using customers’ personal information for targeted ad businesses aren’t providing free services. Consider online marketplaces such as Amazon or eBay, or subscription-based products of media companies such as NewsCorp and Nine.</p><p>Meta (Facebook) and the Interactive Advertising Bureau Australia argued that if consumers opt out of targeted ads, a company should be able to stop offering them the service in question. This proposal was rejected on the basis that a platform can still show non-targeted ads to such consumers.</p><p>Inconsistently, the report failed to question broader claims that targeted advertising – as opposed to less intrusive forms of advertising – must be protected for online services to be viable.</p><h2>Real change is needed</h2><p>The reform of our privacy laws is long overdue. The government should avoid watering down potential improvements by attempting to preserve the status quo dictated by large businesses.</p><p>The government is seeking <a href="https://ministers.ag.gov.au/media-centre/landmark-privacy-act-review-report-released-16-02-2023">feedback on the report</a> until March 31. It will then decide on the final form of the reforms it proposes, before these are debated in Parliament. <!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. 
--><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/200166/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" referrerpolicy="no-referrer-when-downgrade" width="1" height="1" /></p><p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/proposed-privacy-reforms-could-help-australia-play-catch-up-with-other-nations-but-they-fail-to-tackle-targeted-ads-200166">original article</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Government’s privacy review has some strong recommendations – now we really need action</title>
		<link>https://privacy.org.au/2023/02/17/governments-privacy-review-has-some-strong-recommendations-now-we-really-need-action/</link>
		
		<dc:creator><![CDATA[Bruce Baer Arnold]]></dc:creator>
		<pubDate>Fri, 17 Feb 2023 05:51:50 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5272</guid>

					<description><![CDATA[Attorney-General Mark Dreyfus yesterday released a report with 30 proposals for updating Australia’s privacy regime. The proposals are practical, necessary and overdue. However, they are just proposals, which have been made several times in the past before disappearing into the “too hard basket” of the Australian, state and territory governments.
We can expect to see lots of noise about specific proposals and hope the Albanese government (copied by state/territory counterparts) gives us the legislation we need. <span class="excerpt-more"><a href="https://privacy.org.au/2023/02/17/governments-privacy-review-has-some-strong-recommendations-now-we-really-need-action/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<p>Attorney-General Mark Dreyfus yesterday <a href="https://www.ag.gov.au/rights-and-protections/publications/privacy-act-review-report">released a report</a> with 30 proposals for updating Australia’s privacy regime. The proposals are practical, necessary and overdue. However, they are just proposals, which have been made several times in the past before disappearing into the “too hard basket” of the Australian, state and territory governments.</p><p>We can expect to see lots of noise about specific proposals and hope the Albanese government (copied by state/territory counterparts) gives us the legislation we need.</p><h2>Making sense of the report</h2><p>At a superficial level, the report gives effect to an election commitment – a promise to do something about federal privacy law, which is centred on public/private data collection and use (often online), rather than <a href="https://www.oaic.gov.au/privacy/privacy-in-your-state">state/territory</a> law dealing with activity such as strip searches, public hospital records, hidden cameras in toilets or senior figures distributing nude <a href="https://www.theguardian.com/australia-news/2023/feb/15/nsw-premier-stands-by-mp-peter-poulos-who-leaked-explicit-photos-of-female-rival">photos</a> of rivals.</p><p>More deeply, it is a recognition that, as part of the global economy where data and investment flow across borders, Australia continues to limp behind law and administration where protecting privacy is concerned. 
Updating the <a href="https://www.oaic.gov.au/privacy/the-privacy-act">Privacy Act</a> also reflects recognition of challenges facing business and government in the world of ransomware, big data and artificial intelligence.</p><p>Unhappiness with the “she’ll be right, mate” approach of some large organisations and the failure of the key national privacy regulator (under-resourced, under-skilled and slow to act) was evident in the recent Optus and Medibank data breaches.</p><p>The proposals are not new. They have been voiced in detailed law reform commission reports, national and state parliamentary committee reports, statements by independent bodies such as the Law Council and academics over the past 20 years. The lack of action to date means Australians might be sceptical about what will happen once the government is lobbied by those whose interests are served by keeping things as they are, and it is again tempted to kick the can down the road.</p><h2>What do the proposals cover?</h2><p>It is important to remember that states and territories have significant responsibilities regarding privacy. The proposal to set up a working party involving those governments provokes thought about why that hasn’t been done already.</p><p>The initial proposal calls for changing the <a href="https://www.oaic.gov.au/privacy/the-privacy-act">1988 Privacy Act</a> to explicitly recognise that privacy is in the public interest, something that shouldn’t be controversial and offsets the absence of a human rights framework in the national constitution. After that, we are into some positive steps forward. However, these are tempered by a lot of “let’s wait and see the administration” before starting to celebrate.</p><p>The report retains the overall structure of the 1988 Act but, crucially, extends its coverage, in particular on what is “personal information”. 
It calls for consultation about criminal penalties and for prohibiting some of the ways organisations have got around restrictions.</p><p>It proposes consultation about removing the exemption for small businesses (those under A$3 million) and about the handling of employee records. The major <a href="https://www.alrc.gov.au/publication/for-your-information-australian-privacy-law-and-practice-alrc-report-108/41-political-exemption/exemption-for-registered-political-parties-political-acts-and-practices/">exclusion</a> of political parties – a common source of unhappiness – would be modified. Journalists would be expected to behave better.</p><p>The report emphasises meaningful consent. In the collection of personal information, consent must be</p><blockquote><p>voluntary, informed, current, specific and unambiguous.</p></blockquote><p>This would bring Australia into line with Europe and indeed with much of our existing law, such as that administered by the Australian Competition and Consumer Commission.</p><p>We can expect controversy about a proposed right of “erasure” and about “de-indexing”. This is referred to as the “right to obscurity” in Europe, and means some personal information stays online but is not highlighted in search engine results. Individuals would need to ask for that obscurity, and it would not be granted for serious criminal offences.</p><p>There have been recurrent proposals for a “privacy tort”: this means people whose privacy has been seriously invaded could take action in a court to stop the invasion and/or gain compensation.</p><p>The report endorses <a href="https://www.alrc.gov.au/publication/serious-invasions-of-privacy-in-the-digital-era-alrc-report-123/4-a-new-tort-in-a-new-commonwealth-act-2/">this</a> recommendation by the Australian Law Reform Commission. It also proposes a “direct right of action” under the current act. 
This implicitly offsets the weakness of the Office of the Australian Information Commissioner (OAIC), one of the two national information privacy watchdogs.</p><p>The report grapples with data breaches such as the recent Optus and Medibank incidents. Proposals regarding mandatory reporting of such breaches tweak the current regime.</p><p>There is likely to be more push-back from business and public sector organisations regarding a proposed requirement for those bodies to “identify, mitigate and redress actual and reasonably foreseeable loss”. This is a first step towards persuading organisations to meaningfully lift their game and compensate for harms.</p><h2>It’s too soon to cheer</h2><p>On the surface, the report is a major step forward, something that business and the community should strongly endorse. In practice, we need to look beyond the headlines and see the details of how the proposals would be written into law, and whether the attorney-general can harness support in the face of the usual strong lobbying.</p><p>Proposals that there will be discussion, yet again, don’t provide much comfort. More worryingly, the proposals centre on the development and implementation of guidelines and standards by the OAIC.</p><p>In practice, the report proposes to perpetuate existing problems involving a regulator with a <a href="https://www.sciencedirect.com/science/article/abs/pii/S0167739X20329940">timid</a> corporate culture and a commitment to interpreting the legislation through the eyes of the bodies it is meant to <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4083468">regulate</a>. Change is better than good intentions.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. 
--><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/200079/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" referrerpolicy="no-referrer-when-downgrade" width="1" height="1" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p><p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/governments-privacy-review-has-some-strong-recommendations-now-we-really-need-action-200079">original article</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>ChatGPT is a data privacy nightmare. If you’ve ever posted online, you ought to be concerned</title>
		<link>https://privacy.org.au/2023/02/08/chatgpt-is-a-data-privacy-nightmare-if-youve-ever-posted-online-you-ought-to-be-concerned/</link>
		
		<dc:creator><![CDATA[Uri Gal]]></dc:creator>
		<pubDate>Wed, 08 Feb 2023 03:19:51 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5266</guid>

					<description><![CDATA[Within two months of its release, ChatGPT reached 100 million active users, making it the fastest-growing consumer application ever launched. ChatGPT is underpinned by a large language model that requires massive amounts of data to function and improve, posing a privacy risk to each and every one of us who has ever posted online. <span class="excerpt-more"><a href="https://privacy.org.au/2023/02/08/chatgpt-is-a-data-privacy-nightmare-if-youve-ever-posted-online-you-ought-to-be-concerned/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[
<figure>
    <figure style="width: 744px" class="wp-caption alignnone"><img loading="lazy" decoding="async" src="https://images.theconversation.com/files/508567/original/file-20230207-13-uu7jfn.jpeg?ixlib=rb-1.1.0&#038;rect=35%2C0%2C5955%2C3988&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" alt="" width="754" height="505" /><figcaption class="wp-caption-text">(Image from Shutterstock)</figcaption></figure>
    
</figure>

<p>ChatGPT has taken the world by storm. Within two months of its release it reached 100 million <a href="https://news.yahoo.com/chatgpt-100-million-users-january-130619073.html">active users</a>, making it the fastest-growing consumer <a href="https://www.reuters.com/technology/chatgpt-sets-record-fastest-growing-user-base-analyst-note-2023-02-01/">application ever launched</a>. Users are attracted to the tool’s <a href="https://oneusefulthing.substack.com/p/chatgtp-is-my-co-founder">advanced capabilities</a> – and concerned by its potential to cause disruption in <a href="https://theconversation.com/chatgpt-students-could-use-ai-to-cheat-but-its-a-chance-to-rethink-assessment-altogether-198019">various sectors</a>.</p>

<p>A much less discussed implication is the privacy risks ChatGPT poses to each and every one of us. Just yesterday, <a href="https://blog.google/technology/ai/bard-google-ai-search-updates/">Google unveiled</a> its own conversational AI called Bard, and others will surely follow. Technology companies working on AI have well and truly entered an arms race.</p>

<p>The problem is it’s fuelled by our personal data.</p>

<h2>300 billion words. How many are yours?</h2>

<p>ChatGPT is underpinned by a large language model that requires massive amounts of data to function and improve. The more data the model is trained on, the better it gets at detecting patterns, anticipating what will come next and generating plausible text.</p>

<p>OpenAI, the company behind ChatGPT, fed the tool some <a href="https://www.sciencefocus.com/future-technology/gpt-3/">300 billion words</a> systematically scraped from the internet: books, articles, websites and posts – including personal information obtained without consent.</p>

<p>If you’ve ever written a blog post or product review, or commented on an article online, there’s a good chance this information was consumed by ChatGPT.</p>

<h2>So why is that an issue?</h2>

<p>The data collection used to train ChatGPT is problematic for several reasons.</p>

<p>First, none of us were asked whether OpenAI could use our data. This is a clear violation of privacy, especially when data are sensitive and can be used to identify us, our family members, or our location.</p>

<p>Even when data are publicly available, their use can breach what we call <a href="https://digitalcommons.law.uw.edu/wlr/vol79/iss1/10/">contextual integrity</a>. This is a fundamental principle in legal discussions of privacy. It requires that individuals’ information is not revealed outside of the context in which it was originally produced.</p>

<p>Also, OpenAI offers no procedures for individuals to check whether the company stores their personal information, or to request it be deleted. This is a guaranteed right in accordance with the European General Data Protection Regulation (<a href="https://gdpr-info.eu/art-17-gdpr/">GDPR</a>) – although it’s still under debate whether ChatGPT is compliant <a href="https://blog.avast.com/chatgpt-data-use-legal">with GDPR requirements</a>.</p>

<p>This “right to be forgotten” is particularly important in cases where the information is inaccurate or misleading, which seems to be a <a href="https://www.fastcompany.com/90833017/openai-chatgpt-accuracy-gpt-4">regular occurrence</a> with ChatGPT.</p>

<p>Moreover, the scraped data ChatGPT was trained on can be proprietary or copyrighted. For instance, when I prompted it, the tool produced the first few paragraphs of Peter Carey’s novel “True History of the Kelly Gang” – a copyrighted text.</p>

<figure class="align-center zoomable">
            <figure style="width: 744px" class="wp-caption alignnone"><a href="https://images.theconversation.com/files/508517/original/file-20230206-23-t7mwbt.png?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=1000&#038;fit=clip"><img loading="lazy" decoding="async" alt="" src="https://images.theconversation.com/files/508517/original/file-20230206-23-t7mwbt.png?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" srcset="https://images.theconversation.com/files/508517/original/file-20230206-23-t7mwbt.png?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=465&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/508517/original/file-20230206-23-t7mwbt.png?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=465&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/508517/original/file-20230206-23-t7mwbt.png?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=465&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/508517/original/file-20230206-23-t7mwbt.png?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=584&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/508517/original/file-20230206-23-t7mwbt.png?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=584&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/508517/original/file-20230206-23-t7mwbt.png?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=584&#038;fit=crop&#038;dpr=3 2262w" sizes="auto, (min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" width="754" height="584" /></a><figcaption class="wp-caption-text">ChatGPT doesn’t consider copyright protection when generating outputs. Anyone using the outputs elsewhere could be inadvertently plagiarising. (ChatGPT, Author provided)</figcaption></figure>
            
        </figure>

<p>Finally, OpenAI did not pay for the data it scraped from the internet. The individuals, website owners and companies that produced it were not compensated. This is particularly noteworthy considering OpenAI was recently <a href="https://www.nasdaq.com/articles/microsofts-%2410-billion-investment-in-openai%3A-how-it-could-impact-the-ai-industry-and-stock">valued at US$29 billion</a>, more than double its <a href="https://www.forbes.com/sites/nicholasreimann/2023/01/05/chatgpt-creator-openai-discussing-offer-valuing-company-at-29-billion-report-says/?sh=f2ca73b11e04">value in 2021</a>.</p>

<p>OpenAI has also just <a href="https://openai.com/blog/chatgpt-plus/">announced ChatGPT Plus</a>, a paid subscription plan that will offer customers ongoing access to the tool, faster response times and priority access to new features. This plan will contribute to expected <a href="https://www.reuters.com/business/chatgpt-owner-openai-projects-1-billion-revenue-by-2024-sources-2022-12-15/">revenue of $1 billion by 2024</a>.</p>

<p>None of this would have been possible without data – our data – collected and used without our permission.</p>

<h2>A flimsy privacy policy</h2>

<p>Another privacy risk involves the data provided to ChatGPT in the form of user prompts. When we ask the tool to answer questions or perform tasks, we may inadvertently hand over <a href="https://www.forbes.com/sites/lanceeliot/2023/01/27/generative-ai-chatgpt-can-disturbingly-gobble-up-your-private-and-confidential-data-forewarns-ai-ethics-and-ai-law/?sh=5d7dd7ce7fdb">sensitive information</a> and put it in the public domain.</p>

<p>For instance, an attorney may prompt the tool to review a draft divorce agreement, or a programmer may ask it to check a piece of code. The agreement and the code, along with any output generated, are now part of ChatGPT’s database. This means they can be used to further train the tool, and be included in responses to other people’s prompts.</p>

<p>Beyond this, OpenAI gathers a broad scope of other user information. According to the company’s <a href="https://openai.com/privacy/">privacy policy</a>, it collects users’ IP address, browser type and settings, and data on users’ interactions with the site – including the type of content users engage with, features they use and actions they take.</p>

<p>It also collects information about users’ browsing activities over time and across websites. Alarmingly, OpenAI states it may <a href="https://openai.com/privacy/">share users’ personal information</a> with unspecified third parties, without informing them, to meet their business objectives.</p>



<h2>Time to rein it in?</h2>

<p>Some experts believe ChatGPT is <a href="https://hbr.org/2022/12/chatgpt-is-a-tipping-point-for-ai">a tipping point for AI</a> – a realisation of technological development that can revolutionise the way we work, learn, write and even think. Its potential benefits notwithstanding, we must remember OpenAI is a private, for-profit company whose interests and commercial imperatives do not necessarily align with greater societal needs.</p>

<p>The privacy risks that come attached to ChatGPT should sound a warning. And as consumers of a growing number of AI technologies, we should be extremely careful about what information we share with such tools.</p>

<hr />

<p><em>The Conversation reached out to OpenAI for comment, but they didn’t respond by deadline.</em><!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/199283/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" referrerpolicy="no-referrer-when-downgrade" width="1" height="1" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>


<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/chatgpt-is-a-data-privacy-nightmare-if-youve-ever-posted-online-you-ought-to-be-concerned-199283">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Not Big Brother, but close: a surveillance expert explains some of the ways we’re all being watched, all the time</title>
		<link>https://privacy.org.au/2022/12/19/not-big-brother-but-close-a-surveillance-expert-explains-some-of-the-ways-were-all-being-watched-all-the-time/</link>
		
		<dc:creator><![CDATA[Ausma Bernot]]></dc:creator>
		<pubDate>Mon, 19 Dec 2022 01:23:03 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5255</guid>

					<description><![CDATA[Ausma Bernot, PhD Candidate, School of Criminology and Criminal Justice, Griffith University A group of researchers studied 15 months of human mobility data taken from 1.5 million people and concluded that just four points in space and time were sufficient to identify 95% of them, even when the data weren’t of excellent quality. That&#8230; <span class="excerpt-more"><a href="https://privacy.org.au/2022/12/19/not-big-brother-but-close-a-surveillance-expert-explains-some-of-the-ways-were-all-being-watched-all-the-time/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[

<figure>
    <figure style="width: 744px" class="wp-caption aligncenter"><img loading="lazy" decoding="async" src="https://images.theconversation.com/files/499955/original/file-20221209-20279-c0jq3z.jpeg?ixlib=rb-1.1.0&#038;rect=95%2C107%2C7893%2C4383&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" alt="" width="754" height="419" /><figcaption class="wp-caption-text">Image from Shutterstock</figcaption></figure>
</figure>
<p><span><a href="https://theconversation.com/profiles/ausma-bernot-963292">Ausma Bernot</a>, PhD Candidate, School of Criminology and Criminal Justice, <em><a href="https://theconversation.com/institutions/griffith-university-828">Griffith University</a></em></span></p>

<p>A group of <a href="https://www.nature.com/articles/srep01376">researchers studied</a> 15 months of human mobility data taken from 1.5 million people and concluded that just four points in space and time were sufficient to identify 95% of them, even when the data weren’t of excellent quality.</p>

<p>That was back in 2013.</p>

<p>Nearly ten years on, surveillance technologies permeate all aspects of our lives. They collect swathes of data from us in various forms, and often without us knowing.</p>

<p>I’m a surveillance researcher with a focus on technology governance. Here’s my round-up of widespread surveillance systems I think everyone should know about.</p>

<h2>CCTV and open-access cameras</h2>

<p>Although China has more than 50% of <a href="https://www.comparitech.com/vpn-privacy/the-worlds-most-surveilled-cities/">all surveillance cameras installed</a> in the world (about 34 cameras per 1,000 people), Australian cities are <a href="https://www.comparitech.com/vpn-privacy/the-worlds-most-surveilled-cities/">catching up</a>. In 2021, Sydney had 4.67 cameras per 1,000 people and Melbourne had 2.13.</p>

<p>While CCTV cameras can be used for legitimate purposes, such as promoting safety in cities and assisting police with criminal investigations, their use also poses serious concerns.</p>

<p>In 2021, New South Wales police <a href="https://www.innovationaus.com/facial-recognition-and-the-nsw-protest-crowds/">were suspected of</a> having used CCTV footage paired with facial recognition to find people attending anti-lockdown protests. When questioned, they didn’t confirm or deny if they had (or if they would in the future).</p>

<p>In August 2022, the United Nations confirmed CCTV is <a href="https://www.ohchr.org/en/documents/country-reports/ohchr-assessment-human-rights-concerns-xinjiang-uyghur-autonomous-region">being used to</a> carry out “serious human rights violations” against Uyghur and other predominantly Muslim ethnic minorities in the Xinjiang region of Northwest China.</p>

<p>The CCTV cameras in China don’t just record real-time footage. Many are equipped with facial recognition to <a href="https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html">keep tabs on</a> the movements of minorities. And some have reportedly been trialled to <a href="https://www.bbc.com/news/technology-57101248">detect emotions</a>.</p>

<p>The US also has a long history of using CCTV cameras to support racist policing practices. In 2021, Amnesty International <a href="https://www.amnesty.org/en/latest/news/2021/06/scale-new-york-police-facial-recognition-revealed/">reported</a> areas with a higher proportion of non-white residents had more CCTV cameras.</p>

<p>Another issue with CCTV is security. Many of these cameras are open-access, which means they don’t have password protection and can often be easily accessed online. So I could spend all day watching a livestream of someone’s porch, as long as there was an open camera nearby.</p>

<p>Surveillance artist Dries Depoorter’s recent project <a href="https://driesdepoorter.be/thefollower/">The Follower</a> aptly showcases the vulnerabilities of open cameras. By coupling open camera footage with AI and Instagram photos, Depoorter was able to match people’s photos with the footage of where and when they were taken.</p>

<p>There was pushback, with one of the <a href="https://www.inverse.com/input/culture/dries-depoorters-ai-surveillance-art-the-follower-instagram-influencers-photos">identified people saying</a>:</p>

<blockquote>
<p>It’s a crime to use the image of a person without permission.</p>
</blockquote>

<p>Whether or not it is illegal will depend on the specific circumstances and where you live. Either way, the issue here is that Depoorter was able to do this in the first place.</p>

<h2>IoT devices</h2>

<p>An IoT (“Internet of Things”) device is any device that connects to a wireless network to function – so think smart home devices such as Amazon Echo or Google Nest, a baby monitor, or even smart traffic lights.</p>

<p>Global spending on IoT devices is estimated to <a href="https://acola.org/hs5-internet-of-things-australia/">reach</a> US$1.2 trillion this year. Around 18 billion connected devices form the IoT network. Like unsecured CCTV cameras, IoT devices are easy to hack into if they use default passwords or passwords that have <a href="https://haveibeenpwned.com/">been leaked</a>.</p>

<p>In some examples, hackers have hijacked baby monitor cameras to <a href="https://www.npr.org/sections/thetwo-way/2018/06/05/617196788/s-c-mom-says-baby-monitor-was-hacked-experts-say-many-devices-are-vulnerable/">stalk</a> breastfeeding mums, <a href="https://www.npr.org/sections/thetwo-way/2018/06/05/617196788/s-c-mom-says-baby-monitor-was-hacked-experts-say-many-devices-are-vulnerable/">threaten</a> parents that their baby was being kidnapped, and say creepy things like “<a href="https://www.nbcnews.com/news/us-news/stranger-hacks-baby-monitor-tells-child-i-love-you-n1090046">I love you</a>” to children.</p>

<figure>
            <iframe loading="lazy" width="440" height="260" src="https://www.youtube.com/embed/xbk3OdYBLHA?wmode=transparent&#038;start=0" frameborder="0" allowfullscreen="allowfullscreen"></iframe>
</figure>

<p>Beyond hacking, businesses can also use data collected through IoT devices to further target customers with products and services.</p>

<p>Privacy experts raised the alarm in September over Amazon’s merger agreement with robot vacuum company iRobot. <a href="https://www.fightforthefuture.org/news/2022-09-09-letter-to-the-ftc-challenge-amazon-irobot-deal">A letter</a> to the US Federal Trade Commission signed by 26 civil rights and privacy advocacy groups said:</p>

<blockquote>
<p>Linking iRobot devices to the already intrusive Amazon home system incentivizes more data collection from more connected home devices, potentially including private details about our habits and our health that would endanger human rights and safety.</p>
</blockquote>

<p>IoT-collected data can also change hands with third parties through data partnerships (which are very common), often without customers’ explicit consent.</p>

<figure class="align-center zoomable">
            <figure style="width: 744px" class="wp-caption alignnone"><a href="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=1000&#038;fit=clip"><img loading="lazy" decoding="async" alt="" src="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" srcset="https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=338&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=338&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=338&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=424&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=424&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/499953/original/file-20221209-25000-9tmah6.jpeg?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=424&#038;fit=crop&#038;dpr=3 2262w" sizes="auto, (min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" width="754" height="424" /></a><figcaption class="wp-caption-text">Smart speakers with digital assistants consistently raise data privacy concerns among experts.</figcaption></figure>
</figure>

<h2>Big tech and big data</h2>

<p>In 2017, the <a href="https://www.economist.com/leaders/2017/05/06/the-worlds-most-valuable-resource-is-no-longer-oil-but-data">value of big data exceeded</a> that of oil. Private companies have driven the majority of that growth.</p>

<p>For tech platforms, the expansive collection of users’ personal information is business as usual: more data means more precise analytics, more effective targeted ads <a href="https://www.facebook.com/business/help/716180208457684?id=1792465934137726">and more revenue</a>.</p>

<p>This logic of profit-making through targeted advertising has been <a href="https://journals.sagepub.com/doi/full/10.1177/1095796018819461">dubbed</a> “surveillance capitalism”. As <a href="https://quoteinvestigator.com/2017/07/16/product/">the old saying</a> goes, if you’re not paying for it, then you’re the product.</p>

<p>Meta (which owns both Facebook and Instagram) <a href="https://www.forbes.com/sites/bradadgate/2022/11/03/revenue-of-alphabet-and-meta-the-digital-duopoly-have-been-slipping/?sh=2ebf3dad2fed">generated</a> almost US$23 billion in advertising revenue in the third quarter of this year.</p>

<p>The vast machinery behind this is illustrated well in the 2020 documentary The Social Dilemma, albeit in a dramatised way. It <a href="https://theconversation.com/netflixs-the-social-dilemma-highlights-the-problem-with-social-media-but-whats-the-solution-147351">showed us how</a> social media platforms rely on our psychological weaknesses to keep us online for as long as possible, measuring our actions down to the seconds we spend hovering over an ad.</p>

<figure class="align-center ">
            <figure style="width: 744px" class="wp-caption alignnone"><img loading="lazy" decoding="async" alt="" src="https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" srcset="https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=600&#038;h=247&#038;fit=crop&#038;dpr=1 600w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=600&#038;h=247&#038;fit=crop&#038;dpr=2 1200w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=600&#038;h=247&#038;fit=crop&#038;dpr=3 1800w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=45&#038;auto=format&#038;w=754&#038;h=310&#038;fit=crop&#038;dpr=1 754w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=30&#038;auto=format&#038;w=754&#038;h=310&#038;fit=crop&#038;dpr=2 1508w, https://images.theconversation.com/files/497297/original/file-20221124-24-idgeki.gif?ixlib=rb-1.1.0&#038;q=15&#038;auto=format&#038;w=754&#038;h=310&#038;fit=crop&#038;dpr=3 2262w" sizes="auto, (min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" width="754" height="310" /><figcaption class="wp-caption-text">A graphic excerpt from Social Dilemma.</figcaption></figure>
</figure>

<h2>Loyalty programs</h2>

<p>Although many people don’t realise it, loyalty programs are one of the biggest personal data collection gimmicks out there.</p>

<p>In a particularly intrusive example, in 2012 one <a href="https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/?sh=706b0cd96668">US retailer</a> sent a teenage girl a catalogue dotted with pictures of smiling infants and nursery furniture. The girl’s angry father went to confront managers at the local store, only to learn that predictive analytics knew more about his daughter than he did.</p>

<p>It’s estimated 88% of Australian consumers <a href="https://www.oaic.gov.au/privacy/privacy-assessments/loyalty-program-assessment-woolworths-rewards-woolworths-limited">over age 16 are members</a> of a loyalty program. These schemes build your consumer profile to sell you more stuff. Some may even charge you <a href="https://www.abc.net.au/everyday/making-loyalty-cards-worth-your-time-and-money/10998806">hidden fees</a>, and lure you in with the promise of future perks while selling you products at marked-up prices.</p>

<p>As technology journalist <a href="https://www.choice.com.au/consumers-and-data/data-collection-and-use/who-has-your-data/articles/loyalty-program-data-collection">Ros Page notes</a>:</p>

<blockquote>
<p>[T]he data you hand over at the checkout can be shared and sold to businesses you’ve never dealt with.</p>
</blockquote>

<p>As a cheeky sidestep, you could find a buddy to swap your loyalty cards with. Predictive analytics is only strong when it can recognise behavioural patterns. When the patterns are disrupted, the data turn into noise. <!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/194917/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" referrerpolicy="no-referrer-when-downgrade" width="1" height="1" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/not-big-brother-but-close-a-surveillance-expert-explains-some-of-the-ways-were-all-being-watched-all-the-time-194917">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>After the Optus data breach, Australia needs mandatory disclosure laws</title>
		<link>https://privacy.org.au/2022/10/21/after-the-optus-data-breach-australia-needs-mandatory-disclosure-laws/</link>
		
		<dc:creator><![CDATA[Jane Andrew]]></dc:creator>
		<pubDate>Fri, 21 Oct 2022 01:47:49 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=5183</guid>

					<description><![CDATA[The Optus data breach, which has affected close to 10 million Australians, has sparked calls for changes to Australia’s privacy laws, placing limits on what and for how long organisations can hold our personal data. Equally important is to strengthen obligations for organisations to publicly disclose data breaches. Optus made a public announcement about its breach, but was not legally required to do so. In fact, beyond the aggregated data produced by the Office of the Australian Information Commissioner, the public is not made aware of the vast majority of data breaches that occur in Australia every year. <span class="excerpt-more"><a href="https://privacy.org.au/2022/10/21/after-the-optus-data-breach-australia-needs-mandatory-disclosure-laws/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<figure>
    <figure style="width: 744px" class="wp-caption aligncenter"><img loading="lazy" decoding="async" src="https://images.theconversation.com/files/490238/original/file-20221017-17274-9l47al.jpg?ixlib=rb-1.1.0&#038;rect=0%2C895%2C6709%2C3571&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip" alt="" width="754" height="401" /><figcaption class="wp-caption-text">Image from Shutterstock</figcaption></figure>
</figure>

<p><span><a href="https://theconversation.com/profiles/jane-andrew-10314">Jane Andrew</a>, Professor, University of Sydney Business School, <em><a href="https://theconversation.com/institutions/university-of-sydney-841">University of Sydney</a></em>; <a href="https://theconversation.com/profiles/max-baker-25553">Max Baker</a>, Senior lecturer, <em><a href="https://theconversation.com/institutions/university-of-sydney-841">University of Sydney</a></em>, and <a href="https://theconversation.com/profiles/monique-sheehan-1387048">Monique Sheehan</a>, Research officer, <em><a href="https://theconversation.com/institutions/university-of-sydney-841">University of Sydney</a></em></span></p>

<p>The Optus data breach, which has affected close to 10 million Australians, has sparked calls for changes to Australia’s privacy laws, placing limits on what and for how long organisations can hold our personal data.</p>

<p>Equally important is to strengthen obligations for organisations to publicly disclose data breaches. Optus made a public announcement about its breach, but was not legally required to do so.</p>



<p>In fact, beyond the aggregated data produced by the Office of the Australian Information Commissioner, the public is not made aware of the vast majority of data breaches that occur in Australia every year.</p>

<p>Australia has had a “<a href="https://www.oaic.gov.au/privacy/notifiable-data-breaches">Notifiable Data Breaches</a>” scheme since February 2018. It requires all organisations to notify affected individuals, as well as the Office of the Australian Information Commissioner, when a breach of personal information is likely to result in serious harm.</p>

<p>However, no notification is required if the organisation takes remedial action to prevent harm. Most importantly, public disclosure is never required.</p>

<p>This gives a lot of discretion to organisations. They can make their own assessment about the risks and decide not to disclose a breach at all.</p>

<p>Companies listed on the Australian Securities Exchange (ASX) are also obliged to disclose any data breach expected to have a “material economic impact” on their share price. But material economic impact is notoriously difficult to measure, so these announcements are not a reliable source of information for the public.</p>

<h2>Notified data breaches</h2>

<p>While the <a href="https://www.oaic.gov.au/privacy/notifiable-data-breaches">Notifiable Data Breaches</a> scheme is a step in the right direction, it’s impossible to know whether the disclosures made reflect the true scale and scope of data breaches.</p>

<p>The most recent <a href="https://www.oaic.gov.au/__data/assets/pdf_file/0010/12205/Final-Notifiable-Data-Breaches-Report-Jul-Dec-2021.pdf">Notifiable Data Breaches Report</a>, covering the six months from July to December 2021, lists 464 notifications (up 6% from the previous period).</p>

<p>Of these, 256 (55%) were attributed to malicious or criminal attacks, and 190 (41%) to human error, such as emailing personal information to the wrong recipient, publishing information by accident, or losing data storage devices <a href="https://www.oaic.gov.au/__data/assets/pdf_file/0010/12205/Final-Notifiable-Data-Breaches-Report-Jul-Dec-2021.pdf">or paperwork</a>. Another 18 (4%) were attributed to system errors.</p>

<p>The sectors that reported the most breaches were health service providers (83 notifications); finance (56); and legal, accounting and management services (51).</p>

<p>About 70% of all incidents reportedly affected fewer than 100 people. But one event affected at least a million people. Despite the scale, the public has not been given details of these events, or the identities of the organisations responsible.</p>

<hr />

<p align="center"><iframe loading="lazy" width="100%" height="400px" style="border: none;" id="Ccd5f" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/Ccd5f/1/" frameborder="0"></iframe></p>

<hr />

<p>Regardless of the scale or reason, all data breaches have an impact on people and organisations. Despite this, we rarely learn about anything other than the most spectacular and most criminal of these events.</p>

<p>Without mandatory disclosure, there is insufficient public accountability.</p>

<h2>How should minimum disclosure work?</h2>

<p>A minimum disclosure framework <a href="https://www.sciencedirect.com/science/article/pii/S1045235421001155">should include</a> information about the type of data breached, the sensitivity of the data, the cause and size of the breach, and the risk-mitigation strategies the organisation has adopted.</p>

<p>The framework should require both a standardised public announcement when any significant data breach occurs, and a mandatory annual public report of data breaches. Reports and announcements should be published on the company’s website (just like an annual report) and filed with the Office of the Australian Information Commissioner.</p>



<p>This would ensure public access to a coherent historical record of breach-related events and organisational responses. The disclosures would allow community groups, regulators and interested parties to analyse breaches of our data and act accordingly.</p>

<p>At its simplest, a mandatory disclosure framework encourages annual disclosures that are comparable and publicly available. At the very least it creates opportunities for scrutiny and discussion.<!-- Below is The Conversation's page counter tag. Please DO NOT REMOVE. --><img loading="lazy" decoding="async" src="https://counter.theconversation.com/content/192612/count.gif?distributor=republish-lightbox-basic" alt="The Conversation" style="border: none !important; box-shadow: none !important; margin: 0 !important; max-height: 1px !important; max-width: 1px !important; min-height: 1px !important; min-width: 1px !important; opacity: 0 !important; outline: none !important; padding: 0 !important;" referrerpolicy="no-referrer-when-downgrade" width="1" height="1" /><!-- End of code. If you don't see any code above, please get new code from the Advanced tab after you click the republish button. The page counter does not collect any personal data. More info: https://theconversation.com/republishing-guidelines --></p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/after-the-optus-data-breach-australia-needs-mandatory-disclosure-laws-192612">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
