<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Jake Goldenfein &#8211; Australian Privacy Foundation</title>
	<atom:link href="https://privacy.org.au/author/jake-goldenfein/feed/" rel="self" type="application/rss+xml" />
	<link>https://privacy.org.au</link>
	<description>Defending your right to be free from intrusion</description>
	<lastBuildDate>Tue, 17 Mar 2020 09:26:00 +0000</lastBuildDate>
	<language>en-AU</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
<image>
	<url>https://privacy.org.au/wp-content/uploads/2021/04/cropped-logo_horizontal2-32x32.png</url>
	<title>Jake Goldenfein &#8211; Australian Privacy Foundation</title>
	<link>https://privacy.org.au</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Australian police are using the Clearview AI facial recognition system with no accountability</title>
		<link>https://privacy.org.au/2020/03/17/australian-police-are-using-the-clearview-ai-facial-recognition-system-with-no-accountability/</link>
		
		<dc:creator><![CDATA[Jake Goldenfein]]></dc:creator>
		<pubDate>Tue, 17 Mar 2020 09:26:00 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=4193</guid>

					<description><![CDATA[Jake Goldenfein, Swinburne University of Technology Australian police agencies are reportedly using a private, unaccountable facial recognition service that combines machine learning and wide-ranging data-gathering practices to identify members of the public from online photographs. The service, Clearview AI, is like a reverse image search for faces. You upload an image of someone’s face and&#8230; <span class="excerpt-more"><a href="https://privacy.org.au/2020/03/17/australian-police-are-using-the-clearview-ai-facial-recognition-system-with-no-accountability/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<span><a href="https://theconversation.com/profiles/jake-goldenfein-10998">Jake Goldenfein</a>, <em><a href="https://theconversation.com/institutions/swinburne-university-of-technology-767">Swinburne University of Technology</a></em></span>

<p>Australian police agencies are <a href="https://www.buzzfeed.com/hannahryan/clearview-ai-australia-police">reportedly using</a> a private, unaccountable facial recognition service that combines machine learning and wide-ranging data-gathering practices to identify members of the public from online photographs.</p>

<p>The service, <a href="http://www.clearview.ai/">Clearview AI</a>, is like a reverse image search for faces. You upload an image of someone’s face and Clearview searches its database to find other images that contain the same face. It also tells you where the image was found, which might help you determine the name and other information about the person in the picture.</p>

<p>Clearview AI built this system by collecting several billion publicly available images from the web, including from <a href="https://www.vice.com/en_au/article/5dmkyq/heres-the-file-clearview-ai-has-been-keeping-on-me-and-probably-on-you-too">social media sites</a> such as Facebook and YouTube. It then used machine learning to make a biometric template for each face and match those templates to the online sources of the images.</p>
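<p>In outline, a service like this is a nearest-neighbour search over face embeddings. The sketch below is purely illustrative &#8211; Clearview AI has never published its pipeline &#8211; and substitutes a toy function for the trained embedding network:</p>

<pre><code># Illustrative sketch only: embedding-based face search of the kind
# described above. NOT Clearview AI's actual (unpublished) pipeline;
# embed_face() is a toy stand-in for a trained face-embedding network.
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    """Toy embedding: flatten pixels and L2-normalise.
    A real system would use a deep network trained on faces."""
    v = image.astype(np.float64).ravel()
    return v / (np.linalg.norm(v) + 1e-12)

def build_index(images, source_urls):
    """One biometric template per scraped image, paired with the URL
    the image was collected from."""
    templates = np.stack([embed_face(img) for img in images])
    return templates, source_urls

def search(query_image, templates, source_urls, threshold=0.6, top_k=5):
    """Return the source URLs whose faces most resemble the query."""
    q = embed_face(query_image)
    sims = templates @ q                 # cosine similarity of unit vectors
    order = np.argsort(-sims)[:top_k]
    return [(source_urls[i], float(sims[i]))
            for i in order if sims[i] >= threshold]
</code></pre>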

<p>It was <a href="https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html">revealed</a> in January that hundreds of US law enforcement agencies are using Clearview AI, sparking a storm of discussion about the system’s privacy implications and the legality of the web-scraping used to build the database.</p>

<p>Australian police agencies <a href="https://www.abc.net.au/news/2020-01-23/australian-founder-of-clearview-facial-recognition-interview/11887112">initially denied</a> they were using the service. The denial held until a list of Clearview AI’s customers was <a href="https://www.buzzfeednews.com/article/ryanmac/clearview-ai-fbi-ice-global-law-enforcement">stolen and disseminated</a>, revealing users from the Australian Federal Police as well as the state police in Queensland, Victoria and South Australia.</p>

<h2>Lack of accountability</h2>

<p>This development is particularly concerning as the Department of Home Affairs, which oversees the federal police, is seeking to increase the use of <a href="https://beta.idmatch.gov.au">facial recognition</a> and other biometric identity systems. (An attempt to introduce new legislation was <a href="https://www.afr.com/politics/federal/facial-recognition-bill-knocked-back-20191024-p533s6">knocked back</a> last year for not being adequately transparent or privacy-protecting.)</p>

<p>Gaining trust in the proper use of biometric surveillance technology ought to be important for Home Affairs. And being deceptive about the use of these tools is a bad look.</p>

<p>But the lack of accountability may go beyond poor decisions at the top. It may be that management at law enforcement agencies did not know their employees were using Clearview AI. The company <a href="https://clearviewai.typeform.com/to/SFnULY">offers free trials</a> to “active law enforcement personnel”, but it’s unclear how it verifies this beyond requiring a government email address.</p>

<p>Why aren’t law enforcement agencies enforcing rules about which surveillance tools officers can use? Why aren’t their internal accountability mechanisms working?</p>

<p>There are also very real concerns around security when using Clearview AI. It monitors and logs every search, and we know it has already had <a href="https://www.thedailybeast.com/clearview-ai-facial-recognition-company-that-works-with-law-enforcement-says-entire-client-list-was-stolen">one data breach</a>. If police are going to use powerful surveillance technologies, there must be systems in place for ensuring those technological tools do what they say they do, and in a secure and accountable way.</p>

<h2>Is it even accurate?</h2>

<p>Relatively little is known about how the Clearview AI system actually works. To be accountable, a technology used by law enforcement should be tested by a standards body to ensure it is fit for purpose.</p>

<p>Clearview AI, on the other hand, has had its own testing done – and as a result its developers claim it is 100% accurate.</p>

<p><a href="https://www.documentcloud.org/documents/6772775-Clearveiw-Ai-Accuracy-Test-Oct-2019.html">That report</a> does not represent the type of testing that an entity seeking to produce an accountable system would undertake. In the US at least, there are agencies like the National Institute for Standards and Technology that do precisely that kind of accuracy testing. There are also many qualified researchers in universities and labs that could properly evaluate the system.</p>

<p>Instead, Clearview AI gave the task to a trio composed of a retired judge turned private attorney, an urban policy analyst who wrote some open source software in the 1990s, and a former computer science professor who is now a Silicon Valley entrepreneur. There is no discussion of why those individuals were chosen.</p>

<p>The method used to test the system also leaves a lot to be desired. Clearview AI based its testing on the American Civil Liberties Union’s test of Amazon’s Rekognition image analysis tool.</p>

<p>However, the ACLU test was a <a href="https://www.buzzfeednews.com/article/carolinehaskins1/clearview-ai-facial-recognition-accurate-aclu-absurd">media stunt</a>. The ACLU ran headshots of members of Congress against a mugshot database in which none of the politicians appeared, meaning any match returned would be an error. Because the test only required the system to be 80% certain of its results, it was quite likely to return false matches, and it did: 28 of them.</p>
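<p>The effect of that threshold choice is easy to demonstrate. The toy simulation below (invented numbers, not the ACLU&#8217;s data) shows how lowering the required confidence sharply inflates the number of false matches returned for probe faces that have no true match in the database:</p>

<pre><code># Toy simulation with invented numbers: a lower confidence threshold
# inflates false matches when no true match exists in the database.
import numpy as np

rng = np.random.default_rng(0)
n_probes = 535  # e.g. headshots of every member of Congress
# Invented model: the best impostor similarity per probe skews high,
# because a large database offers many chances for a near-match.
best_impostor_score = rng.beta(8, 2, size=n_probes)

for threshold in (0.99, 0.95, 0.80):
    false_matches = int((best_impostor_score >= threshold).sum())
    print(f"threshold {threshold:.2f}: {false_matches} false matches")
</code></pre>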

<p>The Clearview AI test also used headshots of politicians taken from the web (front-on, nicely framed, well-lit images), but ran them against its database of several billion images, which did include those politicians.</p>

<p>The hits returned by the system were then confirmed visually by the three report authors as 100% accurate. But what does 100% mean here?</p>

<p>The report states that the first two hits provided by the system were accurate. But we don’t know how many other hits there were, or at what point they stopped being accurate. Politicians have lots of smiling headshots online, so finding two matching images should not be difficult.</p>

<p>What’s more, law enforcement agencies are unlikely to be working with nice clean headshots. Poor-quality images taken from strange angles – the kind you get from surveillance or CCTV cameras – are closer to what law enforcement agencies actually use.</p>

<p>Despite these and other criticisms, Clearview AI CEO Hoan Ton-That <a href="https://www.buzzfeednews.com/article/carolinehaskins1/clearview-ai-facial-recognition-accurate-aclu-absurd">stands by the testing</a>, telling BuzzFeed News he believes it is diligent and thorough.</p>

<h2>More understanding and accountability are needed</h2>

<p>The Clearview AI case shows there is not enough understanding or accountability around how this and other software tools work in law enforcement. Nor do we know enough about the company selling it and its security measures, nor about who in law enforcement is using it or under what conditions.</p>

<p>Beyond the ethical arguments around facial recognition, Clearview AI reveals Australian law enforcement agencies have such limited technical and organisational accountability that we should be questioning their competency even to evaluate, let alone use, this kind of technology.</p>

<p>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/australian-police-are-using-the-clearview-ai-facial-recognition-system-with-no-accountability-132667">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Close up: the government’s facial recognition plan could reveal more than just your identity</title>
		<link>https://privacy.org.au/2018/03/06/close-up-the-governments-facial-recognition-plan-could-reveal-more-than-just-your-identity/</link>
		
		<dc:creator><![CDATA[Jake Goldenfein]]></dc:creator>
		<pubDate>Tue, 06 Mar 2018 04:30:57 +0000</pubDate>
				<category><![CDATA[Commentary]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=3046</guid>

					<description><![CDATA[Jake Goldenfein, Swinburne University of Technology A Bill to set up the federal government’s biometric identity system is currently going through Parliament. But there are concerns over just how much information the system would be allowed to gather, and how that might be used to establish more than just the identity of a person. Strongly&#8230; <span class="excerpt-more"><a href="https://privacy.org.au/2018/03/06/close-up-the-governments-facial-recognition-plan-could-reveal-more-than-just-your-identity/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<span><a href="https://theconversation.com/profiles/jake-goldenfein-10998">Jake Goldenfein</a>, <em><a href="http://theconversation.com/institutions/swinburne-university-of-technology-767">Swinburne University of Technology</a></em></span>

<p>A Bill to set up the federal government’s biometric identity system is currently going through Parliament. But there are concerns over just how much information the system would be allowed to gather, and how that might be used to establish more than just the identity of a person.</p>

<p>Strongly based on the FBI model in the United States, the <a href="http://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;query=Id%3A%22legislation%2Fbillhome%2Fr6031%22">Identity Matching Services Bill</a> and its <a href="http://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;query=Id%3A%22legislation%2Fems%2Fr6031_ems_faf1ba43-d9a2-46d6-b626-e5df9ec6f504%22">Explanatory Memoranda</a> prescribe what data can be collected, shared and processed, by whom and for what purposes.</p>

<p>The Bill is based on the <a href="https://www.coag.gov.au/meeting-outcomes/special-meeting-council-australian-governments-counter-terrorism-communique">Council of Australian Governments (COAG) agreement</a>, signed in October 2017.</p>

<p>The public purpose of the system is to provide identity-matching services to government agencies and some private entities (such as banks and telcos). But the Bill will also establish the Department of Home Affairs as an incredibly data-rich law-enforcement and security agency, with a wide remit for data collection and use.</p>

<h2>Accessing the ‘hub’</h2>

<p>The first layer of the identity matching system is what’s called the “interoperability hub”. This is the interface for those government and private entities seeking access to identity services.</p>

<p>These identity services effectively answer two questions: “Is this person who they say they are?” and “Who is this person?”</p>

<p>The hub works on a query and response model. This means that users of the system do not have access to any of the underlying data powering the biometric processing. They won’t be able to browse the databases; they will only have their identity verification questions answered.</p>
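<p>A minimal sketch of that query-and-response pattern is shown below. Every name and field in it is hypothetical &#8211; the Bill specifies no interface &#8211; but it makes the key property concrete: the caller learns only the answer to its question, never the underlying records:</p>

<pre><code># Hypothetical sketch of a query-and-response identity service.
# All names and fields are invented; the point is the interface shape:
# callers receive an answer, not access to the underlying databases.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Record:
    photo: bytes                # image held by the issuing authority

@dataclass
class VerifyRequest:
    claimed_name: str
    document_type: str          # e.g. "driver_licence"
    face_image: bytes           # probe image supplied by the requester

@dataclass
class VerifyResponse:
    match: bool                 # the only thing the caller learns

def lookup_document(name: str, doc_type: str) -> Optional[Record]:
    return None                 # stub: server-side database lookup

def faces_match(stored: bytes, probe: bytes) -> bool:
    return False                # stub: server-side biometric comparison

def verify_identity(req: VerifyRequest) -> VerifyResponse:
    record = lookup_document(req.claimed_name, req.document_type)
    ok = record is not None and faces_match(record.photo, req.face_image)
    return VerifyResponse(match=ok)
</code></pre>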

<p>The second layer of the system, beneath the hub, consists of the databases that drive the biometric identity matching. These include passport and citizenship information as well as the new National Driver Licence Facial Recognition Solution database, which will be housed in the Department of Home Affairs.</p>

<p>Along with images, these databases include an extraordinary amount of personal information. Roads agencies, like VicRoads in Victoria, hold rich databases of biographical information including names and addresses, age and gender. Those records are also linked to information about vehicle ownership and registration.</p>

<p>Commonwealth criminal intelligence agencies have been seeking access to state-held driver’s licence images and associated personal information for years. The 2017 COAG agreement is what will finally enable a Commonwealth agency to have custodianship over this data on behalf of states.</p>

<h2>You ask, it collects</h2>

<p>But the use of the hub for identity-matching services means that the amount of data in these databases will grow. Each time a user makes a request for identity-matching services, the hub will supply more data to the Department.</p>

<p>The Department can collect and process all information included in an identity document that has a photograph. It can also collect all of the information associated with that document held by the authority that created it.</p>

<p>When an entity (like law enforcement) seeks identity verification, it will likely supply images from its own camera or CCTV systems (or supplied by other parties), along with whatever data associated with those images might help identify the person.</p>

<p>That could include where and when the images were taken or supplied, and potentially what a person was doing at the time the image was taken or supplied. All of those data are provided to the Department of Home Affairs when an identity verification is done.</p>

<p>Similarly, when banks and telecommunications companies use the hub, that potentially links their records to the Department&#8217;s databases &#8211; or at least facilitates those linkages down the track.</p>

<p>This creates the possibility of aggregated criminal and civil histories in a single identity record, like what has occurred with the <a href="https://www.eff.org/wp/law-enforcement-use-face-recognition">FBI’s biometric system in the US</a>.</p>

<p>This is all without access to the largest, most sophisticated facial recognition database in existence: Facebook. If sources such as public CCTV and social media are eventually linked into the system, its significance changes again, radically.</p>

<h2>Joining the dots</h2>

<p>So what is all this data for? On one hand, it provides identity services to hub users. But on the other hand, it generates insights on behalf of the Department of Home Affairs for the sake of policing. Data at a large scale, and especially when used in the context of security and intelligence, means insights and predictions.</p>

<p>The purposes for which the Department can use the information it gathers are very broad. They include preventing and detecting identity fraud, law enforcement, national security, protective security (protection of government assets, persons or facilities), community safety (for instance where a person is acting suspiciously in a crowded public place), and road safety.</p>

<p>Those categories include criminal intelligence gathering and profiling, policing of public spaces and public events, and policing of activist communities and protests.</p>

<p>Many of these policing exercises are highly data-driven, using new predictive techniques to identify criminal suspects and political agitators before any activity has even occurred.</p>

<p>Identity technologies have historically been used by governments to answer two questions:</p>

<ol>
<li>What person is that?</li>
<li>What kind of person is that?</li>
</ol>

<p>Identity is more than merely biographical information. It is a narrative that we tell about ourselves and that others tell about us. That narrative is what is at stake in this type of security surveillance.</p>

<p>Beyond facial recognition, we have already seen machine vision systems designed to <a href="http://psycnet.apa.org/doi/10.1037/pspa0000098">predict sexuality on the basis of a person’s image</a>. It’s research that has already <a href="https://www.vox.com/science-and-health/2018/1/29/16571684/michal-kosinski-artificial-intelligence-faces">generated plenty of controversy</a>.</p>

<p>If the data sets can be constructed, and the results are to be accepted, there is no reason why machine vision systems cannot be targeted to answer questions about criminal propensity, IQ, suitability for certain tasks, political leanings, or anything else.</p>

<p>We need to understand what these new database arrangements will enable in terms of high-level or political policing by the Department of Home Affairs, and what this new technical and bureaucratic architecture means for Australia’s broader surveillance arrangements.</p>

<p>This article was originally published on <a href="http://theconversation.com">The Conversation</a>. Read the <a href="https://theconversation.com/close-up-the-governments-facial-recognition-plan-could-reveal-more-than-just-your-identity-92261">original article</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>MEDIA RELEASE: We need to have a serious talk about snooping</title>
		<link>https://privacy.org.au/2018/02/01/media-release-we-need-to-have-a-serious-talk-about-snooping/</link>
		
		<dc:creator><![CDATA[Kat Lane]]></dc:creator>
		<pubDate>Thu, 01 Feb 2018 03:50:12 +0000</pubDate>
				<category><![CDATA[Media Release]]></category>
		<guid isPermaLink="false">https://privacy.org.au/?p=3021</guid>

					<description><![CDATA[Protection from public sector snoops is okay for people in Britain but not for Australians? That’s the question being asked by the Australian Privacy Foundation – the nation’s independent privacy advocate. For more than 30 years the Foundation has been fighting for a respectful privacy regime. Just because something is politically advantageous, administratively convenient or&#8230; <span class="excerpt-more"><a href="https://privacy.org.au/2018/02/01/media-release-we-need-to-have-a-serious-talk-about-snooping/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<p>Protection from public sector snoops is okay for people in Britain but not for Australians?<br /> <br />That’s the question being asked by the Australian Privacy Foundation – the nation’s independent privacy advocate. For more than 30 years the Foundation has been fighting for a respectful privacy regime.<br /> <br />Just because something is politically advantageous, administratively convenient or commercially attractive doesn’t mean politicians, officials and businesses should do what they like.<br /> <br />Just because your nosy neighbour has a digital camera or a drone doesn’t mean private snooping is okay.<br /> <br />We need a national conversation – one informed by respect rather than fear and political opportunism – about Australia’s privacy regime. We can’t let privacy be eroded drip by drop.<br /> <br />This week a UK court damned that nation’s surveillance regime. The court savaged indiscriminate official access to personal data based on mandatory retention of mobile phone traffic and internet searches. The UK ‘snoopers charter’ (under the Data Retention and Investigatory Powers Act and Investigatory Powers Act) covers records of internet use, location-tracking of mobile phones, and records of who people call and when they call.<br /> <br />The UK regime is similar to the mandatory retention of metadata in Australia and to creeping access – one step after the other in the shadows through quiet changes to Commonwealth, State and Territory law – by a growing range of public and private bodies.<br /> <br />The UK court said that the UK regime is legally wrong: access was not restricted to fighting serious crime and there was no meaningful safeguard of prior authorisation by a court or independent body. It is not good enough to say that we can rely on an official or a minister: in Australia, just like elsewhere, those people sometimes get it wrong.<br /> <br />News about the UK coincides with the ABC reporting that secret Commonwealth government documents were left in a filing cabinet sold on the second-hand market. The report highlights sensitive documents left behind in offices. If we can’t trust the servants of the people to take more care, we need to talk about changing the rules.<br /> <br />The Foundation is calling for better law – more coherent, more transparent, with real remedies – and better administration. If you are in Western Australia or South Australia you might ask your state government why you still don’t have an Act of Parliament that deals with the information state and local governments collect about you. Every Australian should be asking why the national government (and the Opposition parties) haven’t done anything about the Australian Law Reform Commission’s major report on digital snooping and snapping.<br /> <br />We need to talk about such things, and we shouldn’t have to wait for courts to come to the rescue.</p><p>Contacts:</p><table><tbody><tr><td>Kat Lane</td><td>0447 620 694</td><td>Kat.Lane@privacy.org.au</td></tr><tr><td>Dr Jake Goldenfein</td><td>(03) 9214 8942</td><td>Jake.Goldenfein@privacy.org.au</td></tr></tbody></table>


]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Federal Court Decision Guts the Privacy Act</title>
		<link>https://privacy.org.au/2017/01/19/federal-court-decision-guts-the-privacy-act/</link>
		
		<dc:creator><![CDATA[Jake Goldenfein]]></dc:creator>
		<pubDate>Thu, 19 Jan 2017 02:00:21 +0000</pubDate>
				<category><![CDATA[Media Release]]></category>
		<guid isPermaLink="false">http://privacy.org.au/wpfiles/?p=657</guid>

					<description><![CDATA[The judgment of the Federal Court of Australia, in Privacy Commissioner v Telstra Corporation Pty Ltd, that key metadata is not "personal information" is “disastrous for ordinary Australians and misunderstands how a digital footprint identifies a person”, the Australian Privacy Foundation said today.
This decision will severely impair the Privacy Commissioner in regulating privacy online. <span class="excerpt-more"><a href="https://privacy.org.au/2017/01/19/federal-court-decision-guts-the-privacy-act/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<p>The judgment of the Federal Court of Australia, in Privacy Commissioner v Telstra Corporation Pty Ltd, that key metadata is not &#8220;personal information&#8221; is “disastrous for ordinary Australians and misunderstands how a digital footprint identifies a person”, the Australian Privacy Foundation said today.</p>
<p>This decision will severely impair the Privacy Commissioner in regulating privacy online.</p>
<p>In what may be the most important Australian privacy decision to date, the Federal Court has effectively gutted the Privacy Act.</p>
<p>The judgment means that a person&#8217;s internet browsing history (URL addresses visited), assigned IP addresses, and geo-location (cell tower) data are not personal information because they do not identify that person&#8217;s name or telephone number. The reality, however, is that those pieces of information taken together do identify a person.</p>
<p>The judgment effectively introduces a new hurdle in determining whether personal information is protected, by requiring a person to be the subject matter of each piece of information. This very narrow reading ignores how data matching and data linking work: together they permit a person to be identified from pieces of information even though no single piece specifically refers to him or her.</p>
<p>&#8220;The approach taken by the Federal Court ignores the effect of significant technological developments in data matching and linking. It is an analog decision for a digital age. The decision leaves Australia with one of the weakest and least effective privacy protections in the Western world,&#8221; said Jake Goldenfein, board member of the Australian Privacy Foundation. Dr Goldenfein said the approach taken by the Court ignores overseas court decisions that have recognised how linking different items of digital information can identify a person.</p>
<p>Dr Goldenfein said today&#8217;s judgment was a bad day for privacy protection in Australia. This data is very valuable to security and law enforcement agencies, yet there are now no protections for ordinary citizens as to the use of such data.</p>
<p>&#8220;This judgment has rendered Australia an outlier in the protection of privacy online,&#8221; said Dr Goldenfein. &#8220;It reduces the protections ordinary Australians should expect and have. It is a boon for organisations that want to collect vast amounts of information about individuals without being accountable to the regulator for it.&#8221;</p>
<p>The Australian Privacy Foundation called on the Commonwealth Government to amend the Privacy Act without delay to fix the damage the Federal Court judgment may cause.</p>
<p>Background<br />
This case began in 2013 when Ben Grubb, then a reporter with the Fairfax press, sought the metadata Telstra Corporation held about his mobile phone. Telstra refused to provide it, claiming it was not personal information. He complained to the Privacy Commissioner, who held that the metadata was personal information. Telstra successfully appealed that decision in the Administrative Appeals Tribunal in 2015. In 2016 the Privacy Commissioner appealed that decision to the Full Bench of the Federal Court.</p>
<p>The Australian Privacy Foundation, together with the New South Wales Council for Civil Liberties, applied to be heard as an amicus curiae, a friend of the court.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Decision in Privacy Commissioner v Telstra Corporation to be handed down tomorrow morning</title>
		<link>https://privacy.org.au/2017/01/18/decision-in-privacy-commissioner-v-telstra-corporation-to-be-handed-down-tomorrow-morning/</link>
		
		<dc:creator><![CDATA[Jake Goldenfein]]></dc:creator>
		<pubDate>Wed, 18 Jan 2017 02:01:40 +0000</pubDate>
				<category><![CDATA[Media Release]]></category>
		<guid isPermaLink="false">http://privacy.org.au/wpfiles/?p=647</guid>

					<description><![CDATA[The Full Bench of the Federal Court of Australia will hand down its judgment tomorrow in Privacy Commissioner v Telstra Corporation Pty Ltd.
The details are as follows:
Date: 19 January 2017.
Venue: Court 8C (Level 8) Commonwealth Law Courts, 305 William Street, Melbourne, 3000
Time: 9 am <span class="excerpt-more"><a href="https://privacy.org.au/2017/01/18/decision-in-privacy-commissioner-v-telstra-corporation-to-be-handed-down-tomorrow-morning/">Read More</a></span>]]></description>
										<content:encoded><![CDATA[<p>The Full Bench of the Federal Court of Australia will hand down its judgment tomorrow in Privacy Commissioner v Telstra Corporation Pty Ltd.</p>
<p>The details are as follows:</p>
<p>Date: 19 January 2017.<br />
Venue: Court 8C (Level 8) Commonwealth Law Courts, 305 William Street, Melbourne, 3000<br />
Time: 9 am</p>
<p>The judgment is important: it determines what constitutes &#8220;personal information&#8221; for the purposes of the Privacy Act 1988, and will decide whether the metadata we create is part of our personal information. It is likely to be the most significant privacy decision by an Australian court to date. This case began in 2013 when Ben Grubb, then a reporter with the Fairfax press, sought the metadata Telstra Corporation held about his mobile phone. Telstra refused to provide it, claiming it was not personal information. He complained to the Privacy Commissioner, who held that the metadata was personal information. In 2015, Telstra successfully appealed that decision in the Administrative Appeals Tribunal. In 2016, the Privacy Commissioner appealed that decision to the Full Bench of the Federal Court.</p>
<p>The Australian Privacy Foundation, together with the New South Wales Council for Civil Liberties, applied to be heard as an amicus curiae, a friend of the court.</p>
<p>The judgment is also notable in that it is one of the last decisions of Justice Edelman in the Federal Court before he is sworn in as a justice of the High Court later this month.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
