<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">Ali Winston | The Verge</title>
	<subtitle type="text">The Verge is about technology and how it makes us feel. Founded in 2011, we offer our audience everything from breaking news to reviews to award-winning features and investigations, on our site, in video, and in podcasts.</subtitle>

	<updated>2020-08-13T13:53:43+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.theverge.com/author/ali-winston" />
	<id>https://www.theverge.com/authors/ali-winston/rss</id>
	<link rel="self" type="application/atom+xml" href="https://www.theverge.com/authors/ali-winston/rss" />

	<icon>https://platform.theverge.com/wp-content/uploads/sites/2/2025/01/verge-rss-large_80b47e.png?w=150&amp;h=150&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Ali Winston</name>
			</author>
			
			<title type="html"><![CDATA[Feds are treating BlueLeaks organization as ‘a criminal hacker group,’ documents show]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2020/8/13/21365448/blueleaks-dhs-distributed-denial-secrets-dds-ddosecrets-police" />
			<id>https://www.theverge.com/2020/8/13/21365448/blueleaks-dhs-distributed-denial-secrets-dds-ddosecrets-police</id>
			<updated>2020-08-13T09:53:43-04:00</updated>
			<published>2020-08-13T09:53:43-04:00</published>
			<category scheme="https://www.theverge.com" term="Policy" /><category scheme="https://www.theverge.com" term="Report" /><category scheme="https://www.theverge.com" term="Security" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[The transparency activist organization Distributed Denial of Secrets (DDoSecrets) has been formally designated as a &#8220;criminal hacker group,&#8221; following the publication of 296 gigabytes of sensitive law enforcement data earlier this summer, known colloquially as &#8220;BlueLeaks.&#8221; The description comes from a bulletin circulated to fusion centers around the country in late June by the Department [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Photo illustration by William Joel / The Verge" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/21730429/VRG_ILLO_4143_001.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>The transparency activist organization Distributed Denial of Secrets (DDoSecrets) has been formally designated as a &ldquo;criminal hacker group,&rdquo; following the publication of 296 gigabytes of sensitive law enforcement data earlier this summer, known colloquially as <a href="https://www.wired.com/story/blueleaks-anonymous-law-enforcement-hack/">&ldquo;BlueLeaks.&rdquo;</a> The description comes from <a href="https://www.documentcloud.org/documents/7034323-CCSO-00000003.html">a bulletin</a> circulated to fusion centers around the country in late June by the Department of Homeland Security&rsquo;s Office of Intelligence and Analysis. The bulletin&rsquo;s language mirrors earlier US government descriptions of WikiLeaks, Anonymous, and LulzSec.</p>

<p>&ldquo;A criminal hacker group Distributed Denial of Secrets (DDS) on 19 June 2020 conducted a hack-and-leak operation targeting federal, state, and local law enforcement databases, probably in support of or in response to nationwide protests stemming from the death of George Floyd,&rdquo; the bulletin reads. &ldquo;DDS leaked ten years of data from 200 police departments, fusion centers, and other law enforcement training and support resources around the globe, according to initial media and DHS reporting. DDS previously conducted hack-and-leak activity against the Russian Government.&rdquo;</p>

<p>The document was obtained by <a href="https://www.lucyparsonslabs.com/">Lucy Parsons Lab</a> researcher Brian Waters through an Illinois Freedom of Information Act request with the Cook County Sheriff&rsquo;s Office.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“Calling us criminal hackers gives them the excuse to circumvent the First Amendment.”</p></blockquote></figure>
<p>The BlueLeaks data was reportedly provided to Distributed Denial of Secrets by a hacker claiming ties to Anonymous, comprising 10 years of information from more than 200 police departments and fusion centers. The records include police and FBI reports, bulletins, guides, and technical information about surveillance techniques and intelligence gathering. A number of news organizations have used BlueLeaks data to publish stories about law enforcement tactics, including the <a href="https://web.archive.org/save/https://twitter.com/CyberPunkJake/status/1275069320880517121">counter-surveillance methods</a> of Black Lives Matter protesters, a <a href="https://theintercept.com/2020/07/15/george-floyd-protests-police-far-right-antifa/">skewed analysis</a> on the antifa threat to law enforcement, and worries about widespread mask-wearing during the COVID-19 pandemic <a href="https://theintercept.com/2020/07/16/face-masks-facial-recognition-dhs-blueleaks/">foiling facial recognition algorithms</a>.</p>

<p>From the beginning, DDoSecrets has faced intense difficulties keeping the BlueLeaks material online. In late June, Twitter suspended DDoSecrets&rsquo;s account in response to the leaks and mass-blocked hyperlinks to the leaked dataset, making it impossible to share on the platform. It was a remarkably draconian step for a company that has long allowed links to extremist content and active election interference efforts like DCLeaks to remain online. Last month, German authorities <a href="https://www.zdnet.com/article/german-authorities-seize-blueleaks-server-that-hosted-data-on-us-cops/">seized the DDoSecrets server</a> that hosted the BlueLeaks data, effectively shutting down the organization&rsquo;s online repository of the records. The seizure was made on the request of American authorities.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“Unlike WikiLeaks and Assange, we have no involvement in actual hacks.”</p></blockquote></figure>
<p>The bulletin&rsquo;s description of &ldquo;a criminal hacker group&rdquo; will only strengthen suspicions that federal law enforcement is building a criminal case against DDoSecrets, particularly when combined with the recent server seizures. Emma Best, one of DDoSecrets&rsquo;s founders, told <em>The Verge</em> that they &ldquo;absolutely&rdquo; believe the document shows that American authorities are investigating their organization in the same manner as they did WikiLeaks, whose founder, Julian Assange, is <a href="https://www.justice.gov/opa/pr/wikileaks-founder-charged-superseding-indictment">charged</a> with conspiring to steal and publish classified Pentagon documents.</p>

<p>Crucially, Best maintains that the group has never been involved in any intrusions to obtain documents and merely publishes files after they&rsquo;ve been obtained by others. &ldquo;Unlike WikiLeaks and Assange, we have no involvement in actual hacks and don&rsquo;t provide material support to hackers,&rdquo; they told <em>The Verge</em>.</p>

<p>It is not illegal to publish classified information in the United States, and most of the BlueLeaks data is marked &ldquo;For Official Use Only&rdquo; rather than classified.</p>

<p>Still, Best maintains that DDoSecrets is simply a publisher devoted to freedom of expression and transparency both at home and abroad. &ldquo;Calling us &lsquo;criminal hackers&rsquo; (while ignoring the numerous facts and evidence that undermines that accusation) gives them the excuse to circumvent the First Amendment,&rdquo; Best told <em>The Verge</em>.</p>

<p>One of the odder claims in the three-page bulletin is an assertion that Distributed Denial of Secrets conducted a similar &ldquo;hack-and-leak&rdquo; operation in 2019 on Russian government personnel. &ldquo;Russian media speculated the incident was a response to Russia&rsquo;s hack-and-leak activities targeting the Democratic Party to influence the outcome of the 2016 US presidential election,&rdquo; the bulletin reads.</p>

<p>The January 2019 DDoSecrets release referenced in the bulletin, called the <a href="https://www.nytimes.com/2019/01/25/world/europe/russian-documents-leaked-ddosecrets.html">Dark Side of the Kremlin</a>, included 175 gigabytes of information &mdash; some previously released on Russian-language websites &mdash; about the dealings of the Kremlin, the Russian Orthodox Church, and Russia&rsquo;s war in Ukraine. It included a significant amount of hacked material from the Russian Interior Ministry that WikiLeaks refused to release in 2016. According to <a href="https://www.theguardian.com/world/2017/feb/09/russian-hacking-groups-last-member-at-liberty-comes-out-of-the-shadows">media reports</a>, the Russian hacking group Shaltai Boltai and other Eastern European hackers were responsible for the materials referenced in the bulletin.</p>

<p>The Department of Homeland Security did not respond to a request for comment.</p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Ali Winston</name>
			</author>
			
			<title type="html"><![CDATA[The NYC subway’s new tap-to-pay system has a hidden cost — rider data]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2020/3/16/21175699/mta-omny-privacy-security-smartphone-identifier-location-nyc" />
			<id>https://www.theverge.com/2020/3/16/21175699/mta-omny-privacy-security-smartphone-identifier-location-nyc</id>
			<updated>2020-03-16T09:00:00-04:00</updated>
			<published>2020-03-16T09:00:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Features" /><category scheme="https://www.theverge.com" term="Mass Transit" /><category scheme="https://www.theverge.com" term="Transportation" />
							<summary type="html"><![CDATA[New York City&#8217;s subway system, a 24/7 behemoth that logs a billion and a half trips per year, is synonymous with archaic technology, from a signals system that dates to the Great Depression era to rail cars in service for more than four decades. The introduction last year of OMNY, a $574 million new contactless [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/19788682/acastro_200320_3931_OMNY_0001.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>New York City&rsquo;s subway system, a 24/7 behemoth that logs a billion and a half trips per year, is synonymous with archaic technology, from a signals system that dates to the Great Depression era to rail cars in service for more than four decades. <a href="https://www.nytimes.com/2019/07/30/nyregion/metrocard-mta-subway-discontinued.html">The introduction last year</a> of OMNY, a $574 million new contactless payment system for city buses and subways, bucks that trend.&nbsp;</p>

<p>However, experts say the OMNY payment scheme is rife with problems, based on the limited information about the system made public in its <a href="https://omny.info/terms-of-service">terms of service</a> and <a href="https://omny.info/privacy">privacy policy</a>. The system collects significant amounts of information from users, including smartphone device identifiers and location, which, coupled with payment and transportation data, could be used to map out riders&rsquo; patterns of life in minute detail and create a privacy nightmare.</p>

<p>Created for the MTA by Cubic Corporation, OMNY uses near-field communication (NFC) technology to enable tap payment at turnstiles via debit cards, smartphone payment apps, and eventually a loadable card such as those used by transit riders in London, Sydney, San Francisco, and Washington, DC. Cubic has created NFC card payment systems for transit systems in San Diego, Sydney, Vancouver, and the Bay Area in recent years, and is also expected to debut mobile payment apps for the Chicago Transit Authority later this year.</p>

<p>This replacement for the venerable MetroCard (the magnetic stripe swipe card introduced in 1992 to replace the subway token) will supposedly speed up bus service and entry to the subway system &ndash; and spare countless out-of-towners the embarrassment of not knowing how to correctly swipe in at a turnstile.&nbsp;</p>

<p>In addition to privacy concerns, there are also questions related to the security of such data, whether OMNY could be used by the MTA to unilaterally exclude people from New York City&rsquo;s transit system, and language in the payment system&rsquo;s terms of service that indemnifies the MTA from liability for customers being <a href="https://gothamist.com/news/greedy-omny-scanners-are-double-charging-some-subway-riders-who-use-metrocards">double-charged</a> for rides.</p>

<p>The problems have been exacerbated by the MTA&rsquo;s refusal to engage with questions about different aspects of OMNY, even as the payment system is being rapidly introduced throughout the city. By the end of 2020, OMNY validators will be in every subway station and bus in the city. Early next year, the new payment system will be introduced to the Metro North and Long Island Rail Road commuter lines.&nbsp;</p>
<img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/16304011/akrales_190528_3453_0099.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" alt="Fare gates at a NYC subway station." title="Fare gates at a NYC subway station." data-has-syndication-rights="1" data-caption="" data-portal-copyright="Photo by Amelia Holowaty Krales / The Verge" />
<p>Riders of New York City subways and buses are well accustomed to the reality that they are being tracked on public transportation. Surveillance cameras proliferated as anti-terrorism and crime measures in the years after 9/11, while MetroCard data is routinely used by police to recreate a criminal suspect&rsquo;s movements.</p>

<p>NYPD officers and district attorney investigators have for years used judicial subpoenas to retrieve MetroCard information stored by the MTA to track criminal suspects. One such instance involved the murder of a baby boy by his father, who worked as a subway cleaner. Detectives used the man&rsquo;s MTA-issued MetroCard to track his movements from Co-Op City in the Bronx to lower Manhattan, where he <a href="https://www.nytimes.com/2018/08/08/nyregion/east-river-baby-father-detained.html">allegedly threw his son&rsquo;s body</a> into the East River.&nbsp;</p>

<p>However, the introduction of a payment system that ties a rider&rsquo;s movements not only to their bank card, but potentially to their smartphone via payment apps, creates a raft of privacy and data security issues. In OMNY&rsquo;s privacy policy, the MTA states that information including, but not limited to, payment information, billing address, and the point of entry to the transit system will be logged in Cubic Corporation&rsquo;s servers.&nbsp;</p>

<p>Steve Brunner, Cubic&rsquo;s general manager for the tri-state area, said the firm had multiple local data centers to safeguard against losing information in a catastrophic event. &ldquo;If there is an outage or failure of a component at one data center, it will automatically either partially or fully roll over to the other data center,&rdquo; Brunner said in an interview with <em>The Verge</em> last year.</p>

<p>In addition, the privacy policy authorizes the MTA and Cubic to retain the data for an indefinite period &mdash; the MTA claims that it stores transaction information for six months, but keeps other portions of it for up to seven years. Riders can log in to their OMNY account and review their movement history for the 90 days prior.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>Privacy advocates say that OMNY’s retention of individual rider data warrants greater disclosure </p></blockquote></figure>
<p>Privacy advocates say that OMNY&rsquo;s retention of individual rider data and smartphone device identifiers for an indeterminate period that could run over half a decade warrants greater disclosure and public discussion.&nbsp;</p>

<p>&ldquo;If you&rsquo;re using OMNY on your phone &ndash; there&rsquo;s no card yet &ndash; it&rsquo;s not clear to me what other information they&rsquo;re taking from your phone or how that can identify you,&rdquo; said Jerome Greco, a staff attorney at the Legal Aid Society&rsquo;s digital forensics unit who specializes in surveillance technology.</p>

<p>OMNY&rsquo;s privacy policy also includes a carve-out for the collection of additional information &ldquo;that is not specifically listed&rdquo; in the document, allowing the transit authority broad leeway to harvest additional data from riders. According to the MTA, such information includes IP addresses and device numbers from phones used to pay for rides, creating a whole new category of sensitive information that could be used either to push advertisements toward riders or track their movements outside of the transit system via Bluetooth, Wi-Fi, or their device&rsquo;s MAC address.</p>

<p>The MTA maintains that it retains all information securely with triple DES encryption and that such data is never decrypted.&nbsp;</p>

<p>&ldquo;Our transactions are encrypted from the moment you touch the validator,&rdquo; said Al Putre, the MTA&rsquo;s program director for OMNY. &ldquo;We keep them in an encrypted state even when we store it in our account-based processor. We use state of the art encryption methods and security module hardware. We do absolutely everything we can do to maintain the integrity of the transaction to ensure it&rsquo;s secure. If we have just one little glitch, our credibility goes out the door.&rdquo;</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“Our transactions are encrypted from the moment you touch the validator”</p></blockquote></figure>
<p>Indeed, the OMNY terms of service contain specific language that admits riders run the risk of incursions to their privacy by using the payment system. &ldquo;Security risk is inherent in all internet and information technologies, and we cannot guarantee the security of your Personal Information,&rdquo; the policy reads. While the MTA maintains this is standard contractual language for information technology products, it is telling straphangers they will be sacrificing privacy for convenience.</p>

<p>Cubic, the company in charge of designing and implementing the OMNY system, has run into problems around data security before: in San Francisco, information from its contactless payment system for the Muni light rail system was <a href="https://www.citylab.com/transportation/2016/11/san-francisco-never-considered-paying-ransom-for-its-hacked-transit/508952/">hacked</a> and held ransom for $73,000 in Bitcoin, forcing the system to let riders use it for free. Last year, London&rsquo;s Oyster payment system was taken offline after a <a href="https://www.theregister.co.uk/2019/08/08/tfl_oyster_card_outage_online_topup/">credential stuffing spree</a> compromised the accounts of an untold number of riders.</p>

<p>&ldquo;We&rsquo;re definitely concerned about issues on privacy and how the MTA is using data,&rdquo; said Jaqi Cohen, the campaign director for the New York Public Interest Research Group&rsquo;s Straphangers Campaign. &ldquo;Any way the MTA is planning on using and protecting data should be known to riders and the public &ndash; the terms of service should not be hidden from the riders and the way the MTA plans to use these data should be made very explicit.&rdquo;</p>

<p>The MTA is already facing an<a href="https://www.stopspying.org/latest-news/2020/1/7/stop-sues-mta-for-facial-recognition-records"> open records lawsuit</a> in New York regarding its unannounced deployment of facial recognition technology in the Times Square station last spring.&nbsp;In London, where former New York City Transit President Andy Byford drew much inspiration for his projects, Cubic has<a href="https://www.digitalspy.com/tech/a839232/london-underground-pay-with-face-technology/"> already tested facial recognition options</a> for payment. The MTA denies that facial recognition is being considered for any integration into the OMNY payment system.&nbsp;</p>

<p>The retention of cellphone device identifiers by OMNY was singled out by advocates as a significant matter for concern. Law enforcement makes particular use of cellphone location data to identify and track persons of interest. In New York City, US Immigration and Customs Enforcement agents <a href="https://www.univision.com/local/nueva-york-wxtv/ice-used-stingray-phone-tracking-spy-tool-551-times-in-three-years">use cell-site simulators to track down undocumented immigrants</a> by their cellphones. ICE has shown an appetite for both criminal justice and <a href="https://www.theverge.com/2018/1/26/16932350/ice-immigration-customs-license-plate-recognition-contract-vigilant-solutions">transportation</a> data to locate deportation targets nationwide, and recently <a href="https://www.nydailynews.com/new-york/ny-ice-subpoenas-illegal-immigrants-sanctuary-20200118-t6sy4we43rg6jgzsnvui734wku-story.html">subpoenaed</a> New York City authorities for data on four people slated for deportation.</p>

<p>&ldquo;If they&rsquo;re able to single out your individual phone, then can they get more data from your phone company or iCloud backup, and those would require warrants,&rdquo; said Greco of the Legal Aid Society. However, he pointed out that OMNY&rsquo;s privacy policy does not require a warrant to turn information over to law enforcement.</p>
<img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/16304010/akrales_190528_3453_0028.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" alt="" title="" data-has-syndication-rights="1" data-caption="" data-portal-copyright="Photo by Amelia Holowaty Krales / The Verge" />
<p>Much like how cashless payments have <a href="https://www.nbcnews.com/business/business-news/when-stores-go-cashless-it-discrimination-n973676">come under fire</a> for discriminating against people without bank accounts, debit cards, or mobile payment apps, OMNY&rsquo;s implementation is running into questions over access and equity. Last summer, in the early stages of the new payment system&rsquo;s rollout, riders who paid with a MasterCard debit card were reimbursed $5.50 every Friday. In other transit systems run by Cubic, customers can earn fare discounts by <a href="https://adtechdaily.com/2020/01/31/cubic-to-reward-public-transit-riders-with-rollout-of-cubic-interactive/">watching ads on their cellphone</a>.</p>

<p>At a moment when the MTA and Governor Andrew Cuomo are taking a hard line with fare evasion, the idea that OMNY&rsquo;s promotions are effectively subsidizing wealthier riders&rsquo; trips has proven galling for some.</p>

<p>&ldquo;We&rsquo;re creating a system where wealthy riders pay less while Cuomo is deploying an army to crack down on black and brown riders,&rdquo; said Albert Fox Cahn, the director of the Surveillance Technology Oversight Project, which issued a <a href="https://www.stopspying.org/omny">critical report</a> on OMNY last year and filed the open records suit over the MTA&rsquo;s use of facial recognition.</p>

<p>Transit advocates say many low-income riders often pay more per ride because they cannot afford to purchase the weekly or monthly passes that cost riders less money per ride. The MTA has said it will continue providing discounted rides for students and seniors, as well as the discounted weekly and monthly cards, in the coming months.&nbsp;</p>

<p>With regard to fares, the MTA has also included language in its terms of service that indemnifies the transit agency for accidental <a href="https://abc7ny.com/5829278/">double payments</a>, several of which have recently taken place when riders swiped into the subway system with a MetroCard, only for their cellphone&rsquo;s Apple Pay app to accidentally deduct a $2.75 ride from their account after coming into contact with the display.&nbsp;</p>

<p>The relevant passage from OMNY&rsquo;s terms of service essentially blames riders for failing to properly use their devices, and states that the &ldquo;MTA is not responsible if your fare is charged to a card or through a smart device that you did not intend to use.&rdquo; The MTA maintains that such language is necessary to indemnify the transit authority against fraud, that the faulty double payments were caused by Apple&rsquo;s unannounced update last November to the Express Transit mode payment option, and that it has fully refunded fares in all of the roughly 500 instances of double payment. However, transit advocates are not satisfied with the response.&nbsp;&nbsp;</p>

<p>&ldquo;It&rsquo;s particularly outrageous that there&rsquo;s explicit language in the Terms of Service saying that if you don&rsquo;t pay, it&rsquo;s your fault, while the MTA is claiming fare evasion is a huge issue and using it to hire 500 new cops,&rdquo; said Cohen from NYPIRG&rsquo;s Straphangers&rsquo; Campaign. &ldquo;How much [money] has been collected in error?&rdquo;&nbsp;</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“How much [money] has been collected in error?”</p></blockquote></figure>
<p>Aside from concerns over surveillance and functionality, OMNY&rsquo;s terms of service also hint that the MTA is looking to use the tap payment system as a new method to unilaterally exclude people from city subway stations, buses, and commuter railways.</p>

<p>Access to the transit system, according to OMNY&rsquo;s terms of service, can be blocked for &ldquo;suspicion of other illegal activity, in MTA&rsquo;s sole discretion.&rdquo; What&rsquo;s more, the MTA claims the right to suspend access to OMNY &ldquo;if you engage in activity that we conclude, in our sole and absolute discretion, breach our code of conduct.&rdquo; Behaviors deemed illegal by the MTA in recent years include putting your feet up on a seat, sleeping on the train, or passing between subway cars.</p>

<p>In response to queries by <em>The Verge</em>, the MTA&rsquo;s Putre said the terms of service language would be amended to remove prohibitions on people accessing the transit system.&nbsp;</p>

<p>&ldquo;The purpose of OMNY is to provide our customers with an easy and convenient way to pay the fare and we are committed to protecting NYC Transit riders&rsquo; privacy and preventing fraud,&rdquo; Putre said in a statement. &ldquo;For clarity and effective immediately, the OMNY Terms of Service have been amended to remove references to actions that might summarily prohibit access to OMNY services &mdash; a provision that has never been used. The Terms, as they did previously, will continue to protect customers from fraudulent use of their accounts by allowing interruption of OMNY charges in that situation.&rdquo;</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“the OMNY Terms of Service have been amended to remove references to actions that might summarily prohibit access to OMNY services”</p></blockquote></figure>
<p>&ldquo;By putting your feet up or falling asleep, you could get your OMNY account suspended,&rdquo; said Cohen. &ldquo;That&rsquo;s why the MTA needs to be transparent and explain to the public how this will work.&rdquo;</p>

<p>&ldquo;Public transit is public space, it&rsquo;s part of the public sphere. The idea of banning anyone from public transit raises prominent constitutional issues for us,&rdquo; said Daniel Pearlstein, the policy and communications director for the New York Riders Alliance, a transit advocacy organization.</p>

<p>Pearlstein said that the possibility of the MTA issuing unilateral bans to individuals for perceived offenses outside the criminal justice system could amount to a de facto form of segregation.</p>

<p>&ldquo;We are skeptics around the MTA&rsquo;s narrative on fare evasion. Their rhetoric is about blaming low income riders of color for the ills of a transit system that are overwhelmingly the fault of powerful people going back a generation.&rdquo;</p>

<p>To date, individual exclusion from public transit is not something that has taken place outside of specific criminal cases. NYPD Commissioner Dermot Shea has pushed legislators in Albany to pass a law <a href="https://newyork.cbslocal.com/2019/03/20/sex-offender-new-york-city-subway-ban/">banning repeat sex offenders from using city subways</a>. However, the MTA&rsquo;s codification of unilateral authority to ban people for incidents that may not even rise to the level of criminality may also run into problems around due process.</p>

<p>&ldquo;Here, they&rsquo;re just talking about suspicion. They&rsquo;re not talking about people who&rsquo;ve been convicted: this is suspicion by the MTA. The MTA becomes the judge, jury and executioner,&rdquo; said Greco of the Legal Aid Society.</p>

<p class="has-end-mark">&ldquo;It seems to be even more egregious if it is in the MTA&rsquo;s sole discretion. How do I appeal that? How do they make that determination? Who makes that determination? What standards are they using? Is this going to become like the no-fly list?&rdquo;</p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Ali Winston</name>
			</author>
			
			<author>
				<name>Ingrid Burrington</name>
			</author>
			
			<title type="html"><![CDATA[A pioneer in predictive policing is starting a troubling new project]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2018/4/26/17285058/predictive-policing-predpol-pentagon-ai-racial-bias" />
			<id>https://www.theverge.com/2018/4/26/17285058/predictive-policing-predpol-pentagon-ai-racial-bias</id>
			<updated>2018-04-26T13:36:05-04:00</updated>
			<published>2018-04-26T13:36:05-04:00</published>
			<category scheme="https://www.theverge.com" term="Verge Archives" />
							<summary type="html"><![CDATA[Jeff Brantingham is as close as it gets to putting a face on the controversial practice of &#8220;predictive policing.&#8221; Over the past decade, the University of California-Los Angeles anthropology professor adapted his Pentagon-funded research in forecasting battlefield casualties in Iraq to predicting crime for American police departments, patenting his research and founding a for-profit company [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Garret Beard" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/10302845/gbeard_2253_001_social.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Jeff Brantingham is as close as it gets to putting a face on the controversial practice of &ldquo;predictive policing.&rdquo; Over the past decade, the University of California-Los Angeles anthropology professor adapted his Pentagon-funded research in forecasting battlefield casualties in Iraq to predicting crime for American police departments, patenting his research and founding a for-profit company named PredPol, LLC.</p>

<p>PredPol quickly became one of the market leaders in the nascent field of crime prediction around 2012, but also came under fire from activists and civil libertarians who argued the firm provided a sort of &ldquo;tech-washing&rdquo; for racially biased, ineffective policing methods.</p>

<p>Now, Brantingham is using military research funding for another tech and policing collaboration with potentially damaging repercussions: using machine learning, the Los Angeles Police Department&rsquo;s criminal data, and an outdated gang territory map to automate the classification of &ldquo;gang-related&rdquo; crimes.</p>

<p>Being classified as a gang member or related to a gang crime can result in additional criminal charges, heavier prison sentences, or inclusion in a civil gang injunction that restricts a person&rsquo;s movements and ability to associate with other people. Generally, law enforcement determines gang links through a highly subjective, individualized assessment of criminal histories, arrests, interviews, and other intelligence. In recent years, activists in California, Illinois, and other states have pushed back against gang policing measures such as databases and gang injunctions, and in the case of California, <a href="https://www.revealnews.org/blog/legislature-approves-ground-breaking-transparency-for-california-gang-database/">succeeded</a> in winning residents the right to review and appeal their gang classification.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>Being classified as a gang member or related to a gang crime can result in additional criminal charges</p></blockquote></figure>
<p>But in a paper on &ldquo;<a href="http://www.aies-conference.com/wp-content/papers/main/AIES_2018_paper_93.pdf">Partially Generative Neural Networks for Gang Crime Classification</a>&rdquo; presented in February at the inaugural <a href="http://www.aies-conference.com/">Artificial Intelligence, Ethics, and Society (AIES)</a> conference, Brantingham and his co-authors propose automating this complex and subjective assessment.</p>

<p>The paper attempts to predict whether crimes are gang-related using a neural network, a complex computational system modeled after a human brain that &ldquo;learns&rdquo; to classify or identify items based on ingesting a training dataset. The authors selected what they determined to be the four most important features (number of suspects, primary weapon used, the type of premises where the crime took place, and the narrative description of the crime) for identifying a gang-related crime from 2014&ndash;16 LAPD data and cross-referenced the crime incidents with a 2009 LAPD map of gang territory to create a training dataset for their neural network.</p>

<p>Researchers tested the accuracy of the network&rsquo;s predictions by seeing how well it classified crime data without one key feature: the narrative text description of the crime, the most time-consuming data for police to collect. This is where the &ldquo;partially generative&rdquo; aspect in the title comes in. In the absence of a written description, the neural network generates new text &mdash; effectively, an algorithmically written crime report based on the three other features used in the training model. The generated text isn&rsquo;t actually <em>read</em> by anyone, nor is it presumed to provide meaningful narrative context replacing a police report, but it is turned into a mathematical vector and incorporated into a final prediction of whether a crime is gang-related.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>Effectively, an algorithmically written crime report</p></blockquote></figure>
<p>This paper is the first to be published by a research team co-led by Brantingham studying &ldquo;<a href="https://www.cais.usc.edu/projects/gametheory/">Spatio-Temporal Game Theory &amp; Real-Time Machine Learning for Adversarial Groups</a>&rdquo; at the University of Southern California&rsquo;s Center for Artificial Intelligence and Society (CAIS). CAIS&rsquo; mission states a goal of &ldquo;[sharing] our ideas about how AI can be used to tackle the most difficult societal problems.&rdquo;</p>

<p>Funding for the USC research team that includes Brantingham&rsquo;s project comes from the Minerva Initiative, a Pentagon research program intended to improve the military&rsquo;s understanding of social, political, and behavioral drivers of conflict. According to the <a href="http://minerva.defense.gov/Minerva/Objectives/">Minerva Initiative website</a>, funding is provided to projects that address &ldquo;specific topic areas determined by the Secretary of Defense.&rdquo; Via email, CAIS co-founder and paper co-author Milind Tambe said that the Minerva grant for this project is &ldquo;roughly&rdquo; $1.2 million, to be distributed over three years.</p>

<p>The website for the research team&rsquo;s efforts, including the gang classification paper, opens with references to ISIS and Jabhat al-Nusra before shifting to the terrain of Los Angeles street gangs, a conflation that echoes the earlier DOD-funded work that led Brantingham to co-found PredPol. PredPol has sold its services to police everywhere from California to Georgia, as well as the United Kingdom. In 2015, <a href="https://www.revealnews.org/article/arizona-bill-would-fund-predictive-policing-technology/">PredPol</a> unsuccessfully lobbied the Arizona legislature to approve a $2 million appropriation bill to use the firm&rsquo;s forecasting technology to predict gang activity.</p>

<p>First reported in <a href="http://www.sciencemag.org/news/2018/02/artificial-intelligence-could-identify-gang-crimes-and-ignite-ethical-firestorm"><em>Science</em></a>, the paper was met with significant concern over its ethical implications. However, reporting on the paper and its fallout made no mention of Brantingham&rsquo;s business connections to PredPol or the military funding of his past and present research.</p>

<p>When asked in a phone interview about whether this research might inform future business endeavors, Brantingham said, &ldquo;This is a separate project, and that&rsquo;s how we&rsquo;re thinking about it.&rdquo; Pointing out that it took a decade for his previous military-funded research to become PredPol, Brantingham emphasized that the paper reflected very preliminary work. &ldquo;It&rsquo;s our job to do careful basic research and make sure we understand how and why things are the way they are, long before any thoughts of use in the field might be contemplated.&rdquo;</p>

<p>However preliminary the research might be and however good its authors&rsquo; intentions, the paper and Brantingham&rsquo;s involvement raise eyebrows among critics of increasingly automated, data-driven policing tech.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“Any time you take out the human perspective or interaction, I don’t believe there’s any positives.”</p></blockquote></figure>
<p>Aaron Harvey, a San Diego resident who successfully fended off local prosecutors&rsquo; gang conspiracy charges that could have landed him in prison for over a decade, has since become a prominent California activist pushing back against the state&rsquo;s gang laws, which are the oldest and most severe in the United States.</p>

<p>&ldquo;Any time you take out the human perspective or interaction, I don&rsquo;t believe there&rsquo;s any positives,&rdquo; Harvey said of Brantingham&rsquo;s research. Aside from removing human discretion from the process, Harvey believed that automating such decisions based on historical criminal data from police departments alone would only reinforce past allegations of gang involvement, whether they were true or not. &ldquo;You&rsquo;re making algorithms off a false narrative that&rsquo;s been created for people &mdash; the gang documentation thing is the state defining people according to what they believe,&rdquo; Harvey said. &ldquo;When you plug this into the computer, every crime is gonna be gang-related.&rdquo;</p>

<p>Christo Wilson, assistant professor in computer and information science at Northeastern University and a co-organizer of the Fairness, Accountability, and Transparency in Machine Learning conference, also has concerns about the model&rsquo;s potential to reinforce errors and biases. &ldquo;If I train a model to predict people&rsquo;s height, we know how to interpret the output and gauge its accuracy.&rdquo; But, Wilson noted, &ldquo;gang-related&rdquo; is a complex, subjective determination. &ldquo;So the algorithm is accurate at predicting what? Whether LAPD officers would label a crime as gang-related. Now, maybe the LAPD is 100 percent objective in their determinations of what is and is not gang-related. But if they are not, then the algorithm is going to reproduce their errors and biases.&rdquo;</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“So the algorithm is accurate at predicting what? Whether LAPD officers would label a crime as gang-related”</p></blockquote></figure>
<p>There is ample evidence in the public record of widespread inaccuracies in gang data &mdash; a <a href="https://www.revealnews.org/blog/california-state-auditor-rampant-flaws-in-gang-database/">2016 state audit of California&rsquo;s CalGang database</a> found rampant errors, files that should have been purged years earlier, and unsubstantiated claims of gang involvement.</p>

<p>Micha Gorelick, senior research engineer at machine intelligence research company Cloudera Fast Forward Labs, adds a further objection: the training data assumes gang territories haven&rsquo;t shifted in at least five years. When asked about the use of the 2009 map with 2014&ndash;16 crime data, Brantingham said that it was the most recent one available to him and that &ldquo;there is some movement of territories over time but not as much as you would think, actually.&rdquo;</p>

<p>Harvey, who grew up in the Blood-affiliated neighborhood of Lincoln Park in southeast San Diego, pointed out that gang territories and allegiances are highly fluid, and five years is an eternity in street life. &ldquo;You&rsquo;re able to come up with a conclusion of something and never have that on-the-ground interaction with the community,&rdquo; Harvey said of Brantingham&rsquo;s research approach.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“Encoding racial bias.”</p></blockquote></figure>
<p>Gorelick says many of the technical decisions in the paper are simplistic and rudimentary, but he believes the gang territory map is &ldquo;the most nefarious of the features used.&rdquo; Evaluating the likelihood that a crime is gang-related based on a blanket labeling of a neighborhood as gang territory &ldquo;is encoding geographic bias, which, especially in a place like LA, is encoding racial bias.&rdquo;</p>

<p>Wilson also pointed out that the paper fails to incorporate documented approaches to evaluating biased outcomes in machine learning: &ldquo;The authors could have [looked at] whether their algorithm achieves statistical parity across races and ethnicities &hellip; They also could have looked for so-called disparate mistreatment by looking to see if the classification errors are evenly spread across these groups. But they did none of this, even though the methods to do so are well-known in the fair algorithms and even the predictive policing literature.&rdquo;</p>

<p>As for Brantingham and his co-authors&rsquo; insistence on how preliminary this research is, Wilson noted that a similar defense was used for controversial research using AI to identify <a href="https://www.theguardian.com/technology/2017/sep/07/new-artificial-intelligence-can-tell-whether-youre-gay-or-straight-from-a-photograph">sexual orientation</a> or <a href="https://www.telegraph.co.uk/technology/2016/11/24/minority-report-style-ai-learns-predict-people-criminals-facial/">criminality</a>. Like Brantingham&rsquo;s paper, &ldquo;both of these studies also had fundamental methodological problems. But that doesn&rsquo;t obviate the essential ethics of the research itself: should we be doing this research at all?&rdquo;</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“Should we be doing this research at all?”</p></blockquote></figure>
<p>Milind Tambe of USC&rsquo;s Center for Artificial Intelligence and Society emphasized that his research center, which houses Brantingham&rsquo;s new research and collaborates with the university&rsquo;s school of social work, &ldquo;focuses on AI for Social Good,&rdquo; and that this preliminary research contributes to improving domain understanding in order to facilitate that social good. But the breadth of criticism of the paper&rsquo;s technical and ethical shortcomings raises questions about whose version of social good is being served by this research.</p>

<p>For years, PredPol has been dogged by criticism that its software brings little depth, richness, or rigor to policing. This new line of research suggests that Brantingham has not taken critiques of his methodology to heart and is pressing forward with a project founded on incomplete data, dubious methods, and a premise that, if applied in the field, could result in more people of color behind bars.</p>

<p><strong>Correction:</strong><em> An earlier version of this report quoted Hau Chan, one of the researchers involved in the 2018 AIES paper, as responding to ethical concerns by saying &ldquo;I&rsquo;m just an engineer,&rdquo; which he did not say. The erroneous quote was based on a transcription error in the Science report that originally reported the remarks. Chan&rsquo;s remarks are more accurately quoted as &ldquo;as a researcher I don&rsquo;t know what&rsquo;s the appropriate answer for that question.&rdquo; The erroneous quote has been removed, and copy has been updated with accurate context. </em>The Verge<em> regrets the error.</em></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Ali Winston</name>
			</author>
			
			<title type="html"><![CDATA[New Orleans ends its Palantir predictive policing program]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2018/3/15/17126174/new-orleans-palantir-predictive-policing-program-end" />
			<id>https://www.theverge.com/2018/3/15/17126174/new-orleans-palantir-predictive-policing-program-end</id>
			<updated>2018-03-15T15:50:21-04:00</updated>
			<published>2018-03-15T15:50:21-04:00</published>
			<category scheme="https://www.theverge.com" term="Verge Archives" />
							<summary type="html"><![CDATA[Two weeks ago, The Verge reported the existence of a six-year predictive policing collaboration between the New Orleans Police Department and Palantir Technologies, a data mining giant co-founded by Peter Thiel. The nature of the partnership, which used Palantir&#8217;s network-analysis software to identify potential aggressors and victims of violence, was unknown to the public and [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Garret Beard" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/10302845/gbeard_2253_001_social.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p><a href="https://www.theverge.com/2018/2/27/17054740/palantir-predictive-policing-tool-new-orleans-nopd">Two weeks ago</a>, <em>The Verge</em> reported the existence of a six-year predictive policing collaboration between the New Orleans Police Department and Palantir Technologies, a data mining giant co-founded by Peter Thiel. The nature of the partnership, which used Palantir&rsquo;s network-analysis software to identify potential aggressors and victims of violence, was unknown to the public and key members of the city council prior to publication of <em>The Verge&rsquo;s </em>findings.</p>

<p>Yesterday, outgoing New Orleans Mayor Mitch Landrieu&rsquo;s press office told the <a href="http://www.nola.com/crime/index.ssf/2018/03/palantir_new_orleans_gang_case.html"><em>Times-Picayune</em></a><em> </em>that his office would not renew its pro bono contract with Palantir, which has been extended three times since 2012. The remarks were the first from Landrieu&rsquo;s office concerning Palantir&rsquo;s work with the NOPD. The mayor did not respond to repeated requests for comment from <em>The Verge</em> for the February 28th article, done in partnership with <a href="https://www.theinvestigativefund.org/">Investigative Fund</a>, or from local media since news of the partnership broke.</p>

<p>There is also potential legal fallout from the revelation of New Orleans&rsquo; partnership with Palantir. Several defense attorneys interviewed by <em>The Verge, </em>including lawyers who represented people accused of membership in gangs that, according to documents and interviews, were identified at least in part through the use of Palantir software, said they had never heard of the partnership nor seen any discovery evidence referencing Palantir&rsquo;s use by the NOPD.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>Landrieu’s office will not renew its contract with Palantir, which has been extended three times since 2012</p></blockquote></figure>
<p>Yesterday, Orleans Criminal District Court Judge Camille Buras agreed to hear a motion from Kentrell Hickerson challenging his racketeering and drug conspiracy convictions. Hickerson&rsquo;s attorney, Kevin Vogeltanz, filed his <a href="https://www.documentcloud.org/documents/4411697-Hickerson-Appeal-Defendant-s-Motion-to.html">motion</a> with the court on March 8th, citing the nondisclosure of any relevant intelligence from Palantir about his client&rsquo;s alleged involvement in the 3NG street gang as a potential violation of Hickerson&rsquo;s rights. Under the Supreme Court&rsquo;s ruling in <em>Brady v. Maryland</em>, defendants have the right to receive any potentially exculpatory evidence assembled against them by law enforcement.</p>

<p>After the hearing, Orleans Parish District Attorney spokesperson Ken Daley told <em>The</em> <em>New Orleans Advocate </em>that Hickerson was grasping at straws. &ldquo;The NOPD&rsquo;s Palantir software played no role whatsoever in Mr. Hickerson&rsquo;s indictment and prosecution. Furthermore, any claim that the NOPD&rsquo;s Palantir program contained exculpatory evidence to Mr. Hickerson&rsquo;s defense is without merit,&rdquo; Daley told the paper.</p>

<p>However, according to Vogeltanz, during the hearing, Assistant District Attorney Alex Calenda admitted that he was an &ldquo;end user&rdquo; of NOPD&rsquo;s Palantir system. During an interview with <em>The Verge, </em>former New Orleans Police Chief Ronal Serpas said Palantir was used to identify potential members of the 3NG, the 39ers, and the 110ers gangs in several prominent racketeering cases.</p>

<p>On April 3rd, there will be a status hearing where Judge Buras will rule on whether or not to grant Vogeltanz&rsquo;s motion. Should that come to pass, subpoenas will be issued to compel individuals with knowledge of the NOPD-Palantir partnership to testify at an evidentiary hearing.</p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Ali Winston</name>
			</author>
			
			<title type="html"><![CDATA[Palantir has secretly been using New Orleans to test its predictive policing technology]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2018/2/27/17054740/palantir-predictive-policing-tool-new-orleans-nopd" />
			<id>https://www.theverge.com/2018/2/27/17054740/palantir-predictive-policing-tool-new-orleans-nopd</id>
			<updated>2018-02-27T15:25:25-05:00</updated>
			<published>2018-02-27T15:25:25-05:00</published>
			<category scheme="https://www.theverge.com" term="Features" /><category scheme="https://www.theverge.com" term="Policy" /><category scheme="https://www.theverge.com" term="Report" />
							<summary type="html"><![CDATA[In May and June 2013, when New Orleans&#8217; murder rate was the sixth-highest in the United States, the Orleans Parish district attorney handed down two landmark racketeering indictments against dozens of men accused of membership in two violent Central City drug trafficking gangs, 3NG and the 110ers. Members of both gangs stood accused of committing [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/10302629/gbeard_2253_001.gif?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>In May and June 2013, when New Orleans&rsquo; murder rate was the sixth-highest in the United States, the Orleans Parish district attorney handed down two landmark racketeering indictments against dozens of men accused of membership in two violent Central City drug trafficking gangs, 3NG and the 110ers. Members of both gangs stood accused of committing 25 murders as well as several attempted killings and armed robberies.</p>

<p>Subsequent investigations by the Bureau of Alcohol, Tobacco, Firearms and Explosives, the Federal Bureau of Investigation, and local agencies produced further RICO indictments, including that of a 22-year-old man named Evans &ldquo;Easy&rdquo; Lewis, a member of a gang called the 39ers who was accused of participating in a drug distribution ring and several murders.</p>
<div class="wp-block-vox-media-highlight vox-media-highlight alignnone">

<img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/10307033/MiscLogos_Investigative_Fund.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" alt="" title="" data-has-syndication-rights="1" data-caption="" data-portal-copyright="" />


<p><em>This article was reported in partnership with </em><a href="https://www.theinvestigativefund.org"><em>The Investigative Fund at The Nation Institute</em></a><em>.</em></p>
</div>
<p>According to Ronal Serpas, the department&rsquo;s chief at the time, one of the tools used by the New Orleans Police Department to identify members of gangs like 3NG and the 39ers came from the Silicon Valley company Palantir. The company provided software to a secretive NOPD program that traced people&rsquo;s ties to other gang members, outlined criminal histories, analyzed social media, and predicted the likelihood that individuals would commit violence or become victims. As part of the discovery process in Lewis&rsquo; trial, the government turned over more than 60,000 pages of documents detailing evidence gathered against him from confidential informants, ballistics, and other sources &mdash; but made no mention of the NOPD&rsquo;s partnership with Palantir, according to a source familiar with the 39ers trial.</p>

<p>The program began in 2012 as a partnership between New Orleans Police and Palantir Technologies, a data-mining firm founded with seed money from the CIA&rsquo;s venture capital firm. According to interviews and documents obtained by <em>The Verge,</em> the initiative was essentially a predictive policing program, similar to the <a href="https://www.theverge.com/2016/8/19/12552384/chicago-heat-list-tool-failed-rand-test">&ldquo;heat list&rdquo; in Chicago</a> that purports to predict which people are likely drivers or victims of violence.</p>

<p>The partnership has been <a href="https://www.documentcloud.org/documents/4344819-K14-182-Palantir-Technologies-Amendment-1.html">extended</a> <a href="https://www.documentcloud.org/documents/4344818-K16-430-Palantir-Technologies-Amendment-2.html">three</a> <a href="https://www.documentcloud.org/documents/4344817-K17-453-Palantir-Technologies-Inc-Amd-3.html">times</a>, with the third extension scheduled to expire on February 21st, 2018. The city of New Orleans and Palantir have not responded to questions about the program&rsquo;s current status.</p>

<p>Predictive policing technology has proven highly controversial wherever it is implemented, but in New Orleans, the program escaped public notice, partly because Palantir established it as a philanthropic relationship with the city through Mayor Mitch Landrieu&rsquo;s signature NOLA For Life program. Thanks to its philanthropic status, as well as New Orleans&rsquo; &ldquo;strong mayor&rdquo; model of government, the agreement never passed through a public procurement process.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“No one in New Orleans even knows about this, to my knowledge.”</p></blockquote></figure>
<p>In fact, key city council members and attorneys contacted by <em>The Verge</em> had no idea that the city had any sort of relationship with Palantir, nor were they aware that Palantir used its program in New Orleans to market its services to another law enforcement agency for a multimillion-dollar contract.</p>

<p>Even James Carville, the political operative instrumental in bringing about Palantir&rsquo;s collaboration with NOPD, said that the program was not public knowledge. &ldquo;No one in New Orleans even knows about this, to my knowledge,&rdquo; Carville said.</p>

<p>More than half a decade after the partnership with New Orleans began, Palantir has patented at least one <a href="https://www.google.com/patents/US9129219?dq=inassignee:">crime-forecasting system</a> and has sold similar software to foreign intelligence services for predicting the likelihood that individuals will commit acts of terrorism.</p>

<p>Even within the law enforcement community, there are concerns about the potential civil liberties implications of the sort of individualized prediction Palantir developed in New Orleans, and whether it&rsquo;s appropriate for the American criminal justice system.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“It’s not the right tool for local and state law enforcement.” </p></blockquote></figure>
<p>&ldquo;They&rsquo;re creating a target list, but we&rsquo;re not going after Al Qaeda in Syria,&rdquo; said a former law enforcement official who has observed Palantir&rsquo;s work first-hand as well as the company&rsquo;s sales pitches for predictive policing. The former official spoke on condition of anonymity to freely discuss their concerns with data mining and predictive policing. &ldquo;Palantir is a great example of an absolutely ridiculous amount of money spent on a tech tool that may have some application,&rdquo; the former official said. &ldquo;However, it&rsquo;s not the right tool for local and state law enforcement.&rdquo;</p>

<p>Six years ago, one of the world&rsquo;s most secretive and powerful tech firms developed a contentious intelligence product in a city that has served as a neoliberal laboratory for everything from charter schools to radical housing reform since Hurricane Katrina. Because the program was never public, important questions about its basic functioning, risk for bias, and overall propriety were never answered.</p>
<img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/10307803/spot_03_01.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" alt="" title="" data-has-syndication-rights="1" data-caption="" data-portal-copyright="" />
<p>Co-founded in 2004 by Alexander Karp and Peter Thiel (the company&rsquo;s single largest shareholder), Palantir Technologies&rsquo; rapid ascent to becoming one of the highest-valued private Silicon Valley companies has been driven by lucrative contracts with the Pentagon and United States intelligence services, as well as foreign security services. In recent years, Palantir has sought to expand its data fusion and analysis business to the private sector, with <a href="https://www.buzzfeed.com/williamalden/palantir-was-dumped-by-a-key-cybersecurity-client?utm_term=.pczB4QkerY#.vtLqAdj9b6">mixed success</a>.</p>

<p>Prediction is not new territory for Palantir. Since at least 2009, the Pentagon has used Palantir to predict the location of improvised explosive devices in Afghanistan and Iraq &mdash; a wartime risk-assessment program absent the civil liberties concerns that come with individualized predictive policing. Its commercial software platform, Metropolis, also <a href="https://www.nasdaq.com/article/palantirs-ipo-plans-are-just-as-secretive-as-the-company-itself-cm846619">reportedly</a> uses predictive analytics to help businesses develop consumer markets and streamline investments. But there is no publicly available record of Palantir venturing into predictive policing before the New Orleans program began in 2012.</p>

<p>Interest and investment in predictive policing technology accelerated after 2009, when the National Institute of Justice began issuing grants for pilot projects in crime forecasting. Those grants underpin some of the best-known &mdash; and most scrutinized &mdash; predictive policing efforts in Chicago and Los Angeles. Programs vary, and the algorithms are often proprietary, but they all aim to ingest vast stores of data &mdash; geography, criminal records, the weather, social media histories &mdash; and make predictions about individuals or places likely to be involved in a crime. In the years since, many startup firms have struggled to monetize crime forecasting &mdash; most notably <a href="https://archives.sfweekly.com/sanfrancisco/all-tomorrows-crimes-the-future-of-policing-looks-a-lot-like-good-branding/Content?oid=2827968">PredPol</a>, a California startup whose contract awards foundered after an initial blitz of publicity in the early 2010s.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>There is no publicly available record of Palantir venturing into predictive policing before the New Orleans program began in 2012</p></blockquote></figure>
<p>As more departments and companies began experimenting with predictive policing, government-funded research <a href="https://www.rand.org/blog/2016/10/pitfalls-of-predictive-policing.html">cast doubts on its efficacy</a>, and independent academics found it can have a <a href="https://hrdag.org/2016/10/10/predictive-policing-reinforces-police-bias/">disparate impact</a> on poor communities of color. A <a href="http://onlinelibrary.wiley.com/doi/10.1111/j.1740-9713.2016.00960.x/pdf">2016 study</a> reverse-engineered PredPol&rsquo;s algorithm and found that it replicated &ldquo;systemic bias&rdquo; against over-policed communities of color and that historical crime data did not accurately predict future criminal activity. One of the researchers, a Michigan State PhD candidate named William Isaac, had not previously heard of New Orleans&rsquo; partnership with Palantir, but he recognized the data-mapping model at the heart of the program. &ldquo;I think the data they&rsquo;re using, there are serious questions about its predictive power. We&rsquo;ve seen very little about its ability to forecast violent crime,&rdquo; Isaac said.</p>

<p>According to interviews and documents obtained by <em>The Verge</em>, Palantir first approached New Orleans in 2012 through a well-known intermediary: James Carville, the Democratic Party power broker and architect of Bill Clinton&rsquo;s successful 1992 presidential campaign. Carville is a paid adviser to Palantir whose involvement with the data-mining company <a href="https://twitter.com/palantirtech/status/111138535163179009">dates back at least to 2011</a>.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“I am the sole driver of that project.”</p></blockquote></figure>
<p>In an interview, Carville told <em>The Verge</em> that he was the impetus for the collaboration between Palantir and New Orleans. &ldquo;I am the sole driver of that project. It was entirely my idea,&rdquo; said Carville, adding that he and Palantir CEO Alex Karp flew down to New Orleans to meet with Mayor Landrieu. &ldquo;To me, it was a case of morality. Young people were shooting each other, and the public wasn&rsquo;t as involved as they should have been.&rdquo;</p>

<p>The documents outlining Palantir&rsquo;s relationship with New Orleans describe the company&rsquo;s role as &ldquo;pro bono&rdquo; and philanthropic. In 2015, Palantir mentioned its work in New Orleans in its <a href="https://www.palantir.com/philanthropy-engineering/annual-report/2015/murder-reduction/">annual philanthropic report</a>, characterizing the effort as collaborative &ldquo;network analysis&rdquo; for law enforcement and other city stakeholders. &nbsp;</p>

<p>Carville&rsquo;s remarks on a Bay Area public radio station four years ago elucidate how Palantir&rsquo;s relationship with the city came about. In a <a href="https://ww2.kqed.org/forum/2014/01/16/love-and-war-with-mary-matalin-and-james-carville/">January 2014 appearance</a> on KQED&rsquo;s Forum talk show, Carville and his wife Mary Matalin touted Palantir&rsquo;s work in New Orleans as a major driver in the city&rsquo;s two-year decline in the murder rate.</p>

<p>&ldquo;The CEO of a company called Palantir &ndash; the CEO, a guy named Alex Karp &mdash; said that they wanted to do some charitable work, and what&rsquo;d I think? I said, we have a really horrific crime rate in New Orleans,&rdquo; Carville told KQED Forum&rsquo;s host Michael Krasny, without mentioning his professional relationship to Palantir. &ldquo;And so he came down and met with our mayor&hellip; they both had the same reaction as to the utter immorality of young people killing other young people and society not doing anything about it. And we were able to, at no cost to the city, start integrating data and predict and intervene as to where these conflicts were going to be arising. We&rsquo;ve seen probably a third of a reduction in our murder rate since this project started.&rdquo;</p>

<p>Matalin, who is also a political consultant, made it clear to Krasny that the prediction work the Palo Alto firm was doing with NOPD was a prototype and could potentially sweep up innocent people.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“Unless you’re the cousin of some drug dealer that went bad, you’re going to be okay.”</p></blockquote></figure>
<p>&ldquo;We&rsquo;re kind of a prototype,&rdquo; said Matalin. &ldquo;Unless you&rsquo;re the cousin of some drug dealer that went bad, you&rsquo;re going to be okay.&rdquo;</p>

<p>Ronal Serpas, the New Orleans police chief from 2010 through August 2014, recalled his initial contact with Palantir&rsquo;s staff during a meeting initiated by Mayor Landrieu&rsquo;s office. &ldquo;They came over and discussed the kind of work they do in theaters of war, the kind of work they do in other parts of the world,&rdquo; Serpas said during an interview in his office at Loyola University. &ldquo;My impression was Palantir was also interested in trying to develop products that could do some predicting of crime.&rdquo;</p>

<p>The relationship between New Orleans and Palantir was finalized on February 23rd, 2012, when Mayor Landrieu signed an <a href="https://www.documentcloud.org/documents/4344821-K12-168-Palantir-Technologies.html">agreement</a> granting New Orleans free access to the firm&rsquo;s public sector data integration platform. Licenses and tech support for Palantir&rsquo;s law enforcement platform can run to millions of dollars annually, <a href="https://www.documentcloud.org/documents/4350052-LASD-Palantir-Audit.html">according to an audit</a> of the Los Angeles County Sheriff&rsquo;s Department.</p>

<p>In January 2013, New Orleans also allowed Palantir to use its law enforcement account for LexisNexis&rsquo; Accurint product, which comprises millions of searchable public records, court filings, licenses, addresses, phone numbers, and social media data. The firm also got free access to city criminal and non-criminal data in order to train its software for crime forecasting. Neither the residents of New Orleans nor the key city council members charged with overseeing the use of municipal data were aware of Palantir&rsquo;s access to reams of their data.</p>
<hr class="wp-block-separator" />
<p>Palantir has a history of secrecy, and New Orleans is not the only instance of the company conducting business with government agencies through associated nonprofits, avoiding the public procurement process.</p>

<p>Palantir provides data analysis and integration for the Los Angeles Police Department, but the arrangement was made through the <a href="https://www.propublica.org/article/private-donors-supply-spy-gear-to-cops">LA Police Foundation</a> rather than the LAPD itself. In New York, the firm&rsquo;s contract was not disclosed by the city comptroller for security reasons (<a href="https://theintercept.com/2017/07/07/nypd-surveillance-post-act-lies-misinformation-transparency/">NYPD does this with surveillance equipment contracts</a>), and it was never brought to the city council for approval. Palantir&rsquo;s work with NYPD only became public when documents about its <a href="https://www.buzzfeed.com/williamalden/theres-a-fight-brewing-between-the-nypd-and-silicon-valley?utm_term=.xvkaMKDX3#.qqj2vlK0A">tumultuous relationship</a> with the country&rsquo;s largest police force were leaked to <em>BuzzFeed</em> reporter William Alden. &nbsp;&nbsp;</p>
<img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/10307807/spot_01_01.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" alt="" title="" data-has-syndication-rights="1" data-caption="" data-portal-copyright="" />
<p>In New Orleans, according to extensive reporting by <em>The Verge</em>, Mayor Landrieu&rsquo;s office, the city attorney, and the NOPD appear to be the only entities aware of the firm&rsquo;s work in the city. Key members of the city council were not aware of Palantir&rsquo;s work in New Orleans until approached by <em>The Verge</em>.</p>

<p>The Palantir partnership would have likely received more scrutiny from the city council had it been itemized in a budget, but the council&rsquo;s approval isn&rsquo;t required for such a program. The structure of city government in New Orleans is predicated on a &ldquo;strong mayor&rdquo; model where the council does not have approval authority over contracts or policies for the city police department.</p>

<p>Cities around the country have recently begun to grapple with whether and how municipalities should regulate data sharing and privacy. Some cities, like Seattle and Oakland, have passed legislation establishing committees to craft guidelines and conduct oversight, while others, like New York, are discussing what role city councils should play regarding privacy in the digital age.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“I don’t think there’s anyone in the council that would say they were aware that this had even occurred.”</p></blockquote></figure>
<p>Several civil and criminal attorneys who are heavily involved in New Orleans&rsquo; criminal justice system were also unaware of any predictive policing efforts by the NOPD. Multiple criminal attorneys had never seen Palantir analytic products among the discovery materials turned over to them in the course of trial cases, although such analysis would typically have to be given to defense counsel if it had been used as part of an NOPD investigation.</p>

<p>Jason Williams, the president of the New Orleans city council and a former defense attorney, reviewed documentation of Palantir&rsquo;s collaboration with NOPD at the request of <em>The Verge</em>. Williams said he had never heard of the company&rsquo;s involvement with NOPD.</p>

<p>&ldquo;I don&rsquo;t think there&rsquo;s anyone in the council that would say they were aware that this had even occurred because this was not part and parcel to any of our budget allocations or our oversight,&rdquo; Williams said in an interview during a council meeting.</p>

<p>Williams, who also served as a criminal court judge before his election to the city council in 2014, said that he wasn&rsquo;t necessarily opposed to using data-driven methods to help at-risk New Orleanians.</p>

<p>&ldquo;My primary concern would be how this was used in my city. If it was used to identify marginalized people that are at risk of being harmed, to stop them from being harmed, I&rsquo;m going to have a whole different appreciation of that than I&rsquo;m gonna have if this system was used nefariously.&rdquo;</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“It’s almost as if New Orleans were contracting its own version of the NSA.”</p></blockquote></figure>
<p>Councilwoman Susan Guidry, who chairs the council&rsquo;s criminal justice committee and has been in office since 2010, was also unaware of New Orleans&rsquo; partnership with Palantir and NOPD&rsquo;s crime-forecasting work. When shown NOPD documentation of the program, Guidry told <em>The Verge</em> she had never encountered it before.</p>

<p><em>The Verge</em> shared documentation of the program with a group of New Orleans civil rights attorneys. None were previously aware of NOPD&rsquo;s prediction work &mdash; though one had heard rumors that Palantir was collaborating with NOPD &mdash; and they were troubled by the secrecy that surrounded the program.</p>

<p>&ldquo;It&rsquo;s especially disturbing that this level of intrusive research into the lives of ordinary residents is kept virtually a secret,&rdquo; said Jim Craig, the director of the Louisiana office of the Roderick and Solange MacArthur Justice Center. Craig, who reviewed documentation of the program at <em>The Verge</em>&rsquo;s request, compared the predictive policing effort to signals intelligence work. &ldquo;It&rsquo;s almost as if New Orleans were contracting its own version of the NSA to conduct 24/7 surveillance of the lives of its people,&rdquo; Craig said. Authorities, he believes, have kept the program under wraps because it would elicit widespread outrage. &ldquo;Right now, people are outraged about traffic cameras and have no idea this data-mining project is going on,&rdquo; Craig said. &ldquo;The South is still a place where people very much value their privacy.&rdquo;</p>

<p>Nicholas Corsaro and Robin Engel are two University of Cincinnati professors who conducted a recent evaluation of New Orleans&rsquo; violence reduction strategy, in which Palantir was used, and who helped design an NOPD gang database that the Palantir forecasting model draws on. Both Engel and Corsaro were unaware of New Orleans&rsquo; predictive policing efforts, its involvement with Palantir, or even the fact that the database they designed was feeding into the program. &ldquo;Trying to predict who is going to do what based on last year&rsquo;s data is just horseshit,&rdquo; Corsaro said in an interview.</p>

<p>Palantir did sometimes publicly refer to its work in New Orleans. However, none of Palantir&rsquo;s public presentations about the program that <em>The Verge</em> was able to identify went into detail about individualized crime forecasting, scraping of social media data, or the use of social network analysis for crime prediction. Instead, the company <a href="https://www.palantir.com/philanthropy-engineering/annual-report/2015/murder-reduction/">represented its work in New Orleans</a> as &ldquo;developing a better understanding of violent crime propensity and designing targeted interventions to protect the city&rsquo;s most vulnerable populations.&rdquo;</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“Trying to predict who is going to do what based on last year’s data is just horseshit.”</p></blockquote></figure>
<p>In a public speaking appearance where he touted the efficacy of their work in New Orleans, Courtney Bowman, a Palantir civil liberties engineer heavily involved with the company&rsquo;s work with NOPD, acknowledged that excessive secrecy could deepen the rift between law enforcement and over-policed communities. During a <a href="https://dataedge.ischool.berkeley.edu/2016/schedule/using-data-drive-social-impact">May 6th, 2016 presentation</a> at the UC Berkeley School of Information&rsquo;s DataEdge conference, Bowman said, &ldquo;These sorts of programs only work if the community is comfortable with the degree to which this type of information is being applied and if they&rsquo;re aware of how the information is being used.&rdquo;</p>

<p>The city of New Orleans and Palantir both declined requests for comment about how their partnership was formed, and what sort of input other elected officials and the public had into the data-mining firm&rsquo;s predictive policing efforts.</p>

<p>Ronal Serpas, who ran NOPD when the partnership with Palantir began, said that he believed the city council and public at large should have been informed about the police department&rsquo;s decision to engage in predictive policing with Palantir. The role of local legislatures and governing bodies in overseeing the sharing of government data is far from settled, but Serpas believes that agreements with firms like Palantir warrant greater scrutiny.</p>

<p>&ldquo;It is, to me, something that certainly requires a view, requires a look,&rdquo; Serpas said.</p>
<hr class="wp-block-separator" />
<p>Though neither Palantir staff nor current New Orleans officials would talk about the day-to-day functioning of the crime-forecasting initiative, documents obtained by <em>The Verge</em>, external studies, and the recollections of former Chief Serpas offer a portrait of how the predictive policing beta test has functioned over the past six years.</p>
<div class="wp-block-vox-media-highlight vox-media-highlight alignnone">

<div class="documentcloud-embed"><a href="https://www.documentcloud.org/documents/4344815-Nola-hc3-Final-20140403.html" target="_blank" rel="noopener noreferrer">View Link</a></div>


<p><em>A slide deck for a presentation by New Orleans city employees at Palantir&rsquo;s 2014 &ldquo;HobbitCon&rdquo; internal conference about the company&rsquo;s pro bono work in New Orleans.&nbsp;</em></p>
</div>
<p>Palantir&rsquo;s prediction model in New Orleans used an intelligence technique called social network analysis (or SNA) to draw connections between people, places, cars, weapons, addresses, social media posts, and other indicia in previously siloed databases. Think of the analysis as a practical version of a Mark Lombardi painting that highlights connections between people, places, and events. After entering a query term &mdash; like a partial license plate, nickname, address, phone number, or social media handle or post &mdash; NOPD&rsquo;s analyst would review the information scraped by Palantir&rsquo;s software and determine which individuals are at the greatest risk of either committing violence or becoming a victim, based on their connection to known victims or assailants.</p>
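The query-and-review workflow described above can be sketched as a simple graph traversal. Everything in this sketch is invented for illustration &mdash; the entity names, the edges, the two-hop cutoff &mdash; and stands in for the general technique, not for Palantir&rsquo;s actual software:

```python
from collections import deque

# Hypothetical association graph: an edge links two entities that
# co-appear in some record (a field interview card, an arrest report,
# a social media tie, a shared address or vehicle).
edges = [
    ("victim_A", "person_1"), ("victim_A", "person_2"),
    ("person_1", "person_3"), ("person_2", "person_3"),
    ("person_3", "person_4"), ("person_4", "person_5"),
]

def build_adjacency(edge_list):
    """Collect the edges into an undirected adjacency map."""
    adj = {}
    for a, b in edge_list:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    return adj

def within_hops(adj, seed, max_hops):
    """Breadth-first search from a query term: every entity reachable
    from `seed` within `max_hops` associations, with its distance."""
    dist = {seed: 0}
    queue = deque([seed])
    while queue:
        node = queue.popleft()
        if dist[node] == max_hops:
            continue  # don't expand past the hop limit
        for neighbor in adj.get(node, ()):
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

adj = build_adjacency(edges)
# An analyst's query: who sits within two hops of a known victim?
risk_ring = within_hops(adj, "victim_A", max_hops=2)
```

In a real deployment the edges would come from the merged databases, and a human analyst &mdash; not the traversal itself &mdash; would decide which of the returned names merit attention.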

<p>The data on individuals came from information scraped from social media as well as NOPD <a href="https://www.documentcloud.org/documents/4344815-Nola-hc3-Final-20140403.html#document/p5/a398052">criminal databases</a> for ballistics, gangs, probation and parole information, jailhouse phone calls, calls for service, the central case management system (i.e., every case NOPD had on record), and the department&rsquo;s repository of field interview cards. The latter database represents every documented encounter NOPD has with citizens, even those that don&rsquo;t result in arrests. In 2012, <em>The</em> <em>Times-Picayune</em> revealed that Chief Serpas had <a href="http://www.nola.com/crime/index.ssf/2012/07/as_nopd_files_away_mountain_of.html">mandated</a> that the collection of field interview cards be used as a measure of officer and district performance, resulting in over 70,000 field interview cards filled out in 2011 and 2012. The practice resembled NYPD&rsquo;s &ldquo;stop and frisk&rdquo; program and was instituted with the express purpose of gathering as much intelligence on New Orleanians as possible, regardless of whether or not they committed a crime.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>A partial license plate, a nickname, an address, a phone number, a social media handle</p></blockquote></figure>
<p>NOPD then used the list of potential victims and perpetrators of violence generated by Palantir to target individuals for the city&rsquo;s CeaseFire program. CeaseFire is a form of the decades-old carrot-and-stick strategy developed by David Kennedy, a professor at John Jay College in New York. In the program, law enforcement informs potential offenders with criminal records that it knows of their past actions and will prosecute them to the fullest extent if they re-offend. If the subjects choose to cooperate, they are &ldquo;called in&rdquo; to a required meeting as part of their conditions of probation and parole and are offered job training, education, potential job placement, and health services. In New Orleans, the CeaseFire program runs under the broader umbrella of NOLA For Life, Mayor Landrieu&rsquo;s pet project, which he has funded with millions of dollars from private donors.</p>

<p>According to Serpas, the person who initially ran New Orleans&rsquo; social network analysis from 2013 through 2015 was Jeff Asher, a former intelligence agent who joined NOPD from the CIA. If someone had been shot, Serpas explained, Asher would use Palantir&rsquo;s software to find people associated with them through field interviews or social media data. &ldquo;This data analysis brings up names and connections between people on FIs [field interview cards], on traffic stops, on victims of reports, reporting victims of crimes together, whatever the case may be. That kind of information is valuable for anybody who&rsquo;s doing an investigation,&rdquo; Serpas said.</p>

<p>According to Palantir&rsquo;s own <a href="https://www.documentcloud.org/documents/4344816-NOLA-Murder-Reduction-White-Paper.html">documentation</a>, Asher and his colleagues ran social network analyses of every victim of a fatal or non-fatal shooting in New Orleans from 2011 through 2013. Through this technique, which Asher dubbed &ldquo;The NOLA Model,&rdquo; the city devised a list of roughly 3,900 people who were at the highest risk of being involved in gun violence because of their connection to a previous shooter or victim. &ldquo;We can identify 30-40% of shooting victims,&rdquo; Asher claimed at Palantir&rsquo;s 2014 internal conference. Asher declined repeated requests for an interview.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“The NOLA Model.”</p></blockquote></figure>
<p>In theory, Asher&rsquo;s approach draws substantially on the research of Andrew Papachristos, a Yale professor who tracked violence as if it were a communicable disease spreading through networks of association. However, since his work was cited as the academic underpinning for crime-forecasting models employed by PredPol and the Chicago Police Department, Papachristos has <a href="https://yaledailynews.com/blog/2017/01/17/yale-study-models-gun-violence-as-a-social-epidemic/">sought to distance</a> his research from those methods.</p>
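The epidemic framing &mdash; violence transmitted through networks of association &mdash; can be illustrated with a toy &ldquo;contagion&rdquo; score in which the risk radiating from a known shooting victim decays with each hop of association. The graph, the decay rate, and the hop limit below are all invented; none of these parameters reflects any real model:

```python
# Hypothetical association map: each person lists their known associates.
associations = {
    "victim_A": ["p1", "p2"],
    "p1": ["victim_A", "p3"],
    "p2": ["victim_A", "p3"],
    "p3": ["p1", "p2", "p4"],
    "p4": ["p3"],
}

def contagion_scores(adj, sources, decay=0.5, max_hops=3):
    """Each source (a known victim) exposes the people connected to it;
    the exposure is decay**d at shortest-path distance d, summed over
    all sources."""
    scores = {}
    for src in sources:
        # Shortest-path distances from this source, capped at max_hops.
        dist = {src: 0}
        frontier = [src]
        for hop in range(1, max_hops + 1):
            next_frontier = []
            for node in frontier:
                for nb in adj.get(node, []):
                    if nb not in dist:
                        dist[nb] = hop
                        next_frontier.append(nb)
            frontier = next_frontier
        # Accumulate decayed exposure for everyone reached.
        for node, d in dist.items():
            if node != src:
                scores[node] = scores.get(node, 0.0) + decay ** d
    return scores

scores = contagion_scores(associations, ["victim_A"])
```

Ranking people by this score reproduces the basic intuition of the contagion model: closer association with a victim means higher modeled risk, which is also why such scores inherit whatever bias sits in the underlying association records.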

<p>Once NOPD generated its list of likely shooters and victims, the police department and social service providers &mdash; for the &ldquo;carrot&rdquo; side of NOLA For Life &mdash; would select people who were either incarcerated or on court supervision for a &ldquo;call-in meeting.&rdquo;</p>

<p>Mayor Landrieu&rsquo;s office touted the program frequently, referring to it as an <a href="http://www.nolaforlife.org/progress/stop-the-shootings/ceasefire/">essential part</a> of New Orleans&rsquo; criminal justice policy. Palantir also claimed credit: &ldquo;we&rsquo;re helping to break the cycle of violence&rdquo; in New Orleans, read a passage in the company&rsquo;s 2015 <a href="https://www.palantir.com/philanthropy-engineering/annual-report/2015/murder-reduction/">Philanthropy Engineering report</a>. But its actual impact is unclear.</p>

<p>Of the 308 people who participated in call-ins <a href="https://www.documentcloud.org/documents/4344807-GVRS-Services-Summary.html">from October 2012 through March 2017</a>, seven completed vocational training, nine completed &ldquo;paid work experience,&rdquo; none finished a high school diploma or GED course, and 32 were employed at one time or another through referrals. Fifty participants were detained following their call-in, and two have since died.</p>

<p>By contrast, law enforcement vigorously pursued its end of the program. From November 2012, when the new Multi-Agency Gang Unit was founded, through March 2014, racketeering indictments escalated: 83 alleged gang members in eight gangs were indicted in the 16-month period, according to an <a href="https://www.documentcloud.org/documents/4344815-Nola-hc3-Final-20140403.html">internal Palantir presentation</a>.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>The city devised a list of roughly 3,900 people who were at the highest risk of being involved in gun violence </p></blockquote></figure>
<p>Call-ins declined precipitously after the first few years. According to city records, eight group call-ins took place from 2012 to 2014, but only three took place in the following three years. Robert Goodman, a New Orleans native who became a community activist after completing a prison sentence for murder, worked as a &ldquo;responder&rdquo; for the city&rsquo;s CeaseFire program until August 2016, discouraging people from engaging in retaliatory violence. Over time, Goodman noticed a growing emphasis on the &ldquo;stick&rdquo; component of the program and greater control by city hall over its non-punitive aspects, which he believes undermined the intervention work. &ldquo;It&rsquo;s supposed to be ran by people like us instead of the city trying to dictate to us how this thing should look,&rdquo; he said. &ldquo;As long as they&rsquo;re not putting resources into the hoods, nothing will change. You&rsquo;re just putting on Band-Aids.&rdquo;</p>

<p>After the first two years of Palantir&rsquo;s involvement with NOPD, the city saw a marked drop in murders and gun violence, but it was <a href="http://www.nola.com/crime/index.ssf/2017/01/shootings_new_orleans_2016_chi.html">short-lived</a>. Even former NOPD Chief Serpas believes that the preventative effect of calling in dozens of at-risk individuals &mdash; and indicting dozens of them &mdash; began to diminish.</p>

<p>&ldquo;When we ended up with nearly nine or 10 indictments with close to 100 defendants for federal or state RICO violations of killing people in the community, I think we got a lot of people&rsquo;s attention in that criminal environment,&rdquo; Serpas said, referring to the racketeering indictments. &ldquo;But over time, it must&rsquo;ve wore off because before I left in August of &lsquo;14, we could see that things were starting to slide.&rdquo;</p>

<p>Nick Corsaro, the University of Cincinnati professor who helped build NOPD&rsquo;s gang database, also worked on <a href="https://www.documentcloud.org/documents/4350047-University-of-Cincinnatti-NOPD-Research.html">an evaluation</a> of New Orleans&rsquo; CeaseFire strategy. He found that New Orleans&rsquo; overall decline in homicides coincided with the city&rsquo;s implementation of the CeaseFire program, but the Central City neighborhoods targeted by the program &ldquo;did not have statistically significant declines that corresponded with November 2012 onset date.&rdquo;</p>

<p>Put plainly, the study did not confirm claims by Palantir and city officials that data-driven interventions were behind the temporary drop-off in violent crime. &nbsp;</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“It’s exactly this facet of Chicago’s ‘heat’ list model that has exposed CPD to a great deal of public scrutiny.”</p></blockquote></figure>
<p>Though the call-ins dropped off, emails obtained by <em>The Verge</em> indicate that the NOPD continued to use Palantir for law enforcement. Palantir declined repeated requests for comment, but the emails also show that the company was aware of the potential risks posed by predictive policing algorithms, and the negative publicity that comes with them. On May 23rd, 2016, Palantir civil liberties engineer Courtney Bowman responded to a request by NOPD crime analyst Zach Donnini about whether Palantir could help generate numerical rankings for individuals&rsquo; risk for committing or becoming the victim of a shooting.</p>

<p>&ldquo;I have some serious concerns about instituting a ranking or numeric scoring approach,&rdquo; <a href="https://www.documentcloud.org/documents/4344803-Emails-Winston-17-4839.html">Bowman wrote</a>. &ldquo;It&rsquo;s exactly this facet of Chicago&rsquo;s &lsquo;heat&rsquo; list model that has exposed CPD to a great deal of public scrutiny,&rdquo; the email reads, linking to two articles critiquing Chicago&rsquo;s predictive policing approach.</p>

<p>&ldquo;The looming concern is that an opaque scoring algorithm substitutes the veneer of quantitative certainty for more holistic, qualitative judgement and human culpability,&rdquo; Bowman wrote. &ldquo;One of the lasting virtues of the SNA work we&rsquo;ve done to date is that we&rsquo;ve kept human analysts in the loop to ensure that networks are being explored and analyzed in a way that passes the straight-face test.&rdquo;</p>
<img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/10307809/spot_02_01.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" alt="" title="" data-has-syndication-rights="1" data-caption="" data-portal-copyright="" />
<p>Regardless of the sustainability of New Orleans&rsquo; murder reduction, Palantir used its work with the NOPD to solicit large contracts with other American cities. Later, the company won lucrative contracts for predictive programs with foreign governments.</p>

<p>According to <a href="https://www.documentcloud.org/documents/4377436-Chicago-PD-Palantir-emails.html">emails</a> obtained by <em>The Verge</em>, Palantir marketing staff first contacted the Chicago Police Department in late 2013 about the possibility of selling a predictive policing package based on the firm&rsquo;s New Orleans work, eventually settling on a $3 million price tag. Through a series of federal grants awarded to CPD beginning in 2009, Chicago Police and academics at the Illinois Institute of Technology had already created their own crime-forecasting program that assigned a risk score to individuals based on criminal data and social media histories.</p>

<p>On August 19th, 2014, Katie Laidlaw, a marketing executive at Palantir, emailed Chicago Police commander Jonathan Lewin. &ldquo;I would like to follow-up on connecting with Superintendent McCarthy, specifically to frame potential Palantir engagement around our proven outcomes in supporting homicide reduction in New Orleans,&rdquo; Laidlaw wrote.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>Palantir used its work with NOPD to solicit large contracts</p></blockquote></figure>
<p>The emails also show that Chicago Police hoped to receive grant money from the Department of Homeland Security to fund the Palantir software acquisition. However, the Chicago Police Department never piloted or purchased Palantir&rsquo;s software.</p>

<p>Commander Lewin, who is in charge of Chicago&rsquo;s &ldquo;heat list&rdquo; model of predictive policing and who was on the receiving end of Katie Laidlaw&rsquo;s sales pitch for Palantir, said in an interview that he was aware of Palantir&rsquo;s work with other law enforcement agencies but never approved either a test run or purchase of Palantir software.</p>

<p>Though Palantir did not succeed in selling its New Orleans-tested tools to Chicago Police, the data-mining company has successfully sold forecasting products to foreign security services.</p>

<p>In 2016, the Danish national police and intelligence services signed an <a href="https://www.documentcloud.org/documents/4350050-Danish-POL-INTEL-Procurement-Doc-1.html">84-month contract with Palantir</a> &mdash; reported in the Danish press to have been worth between $14.8 and $41.4 million &mdash; for a predictive technology package meant to identify potential terrorists. According to procurement documents, the program uses law enforcement data like license plate reader records, CCTV video, and police reports to make predictions about individuals&rsquo; likelihood to commit terrorism. Denmark&rsquo;s national legislature had to <a href="https://edri.org/new-legal-framework-for-predictive-policing-in-denmark/">pass an exemption</a> to the European Union&rsquo;s data protection regulations in order to purchase Palantir&rsquo;s software.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>Denmark had to pass an exemption to the European Union’s data protection regulations in order to purchase Palantir’s software</p></blockquote></figure>
<p>Prior to the 2016 contract with Denmark, Palantir Technologies&rsquo; reported work with law enforcement never mentioned forecasting or prediction capabilities.</p>

<p>Last year, the liberal Israeli newspaper <em>Haaretz</em> <a href="https://www.haaretz.com/israel-news/.premium-1.792206">reported</a> that Israel&rsquo;s security services used analytics systems that scraped social media and other data to predict potential &ldquo;lone-wolf&rdquo; attackers from Palestinian communities in the West Bank, and that Palantir was one of only two technology companies to provide predictive intelligence systems to Israeli security organizations. The New Orleans project is the first reported instance of Palantir using social media data as a part of the company&rsquo;s social network analysis.</p>

<p>&ldquo;I&rsquo;m not surprised to find out that people are being detained abroad using that information,&rdquo; said New Orleans council president Jason Williams, pointing out the differences between the legal systems of Israel and the United States. &ldquo;My concern is, the use of technology to get around the Constitution &mdash; that is not something that I would want to see in the United States.&rdquo;</p>
<hr class="wp-block-separator" />
<p>Around the country, cities like New York are weighing legislation about how to oversee the algorithms government agencies use to make decisions. These debates have yet to begin in New Orleans, where the city&rsquo;s intractable crime rate takes up much of the oxygen in public discourse. However, the secrecy of Palantir&rsquo;s relationship with NOPD raises red flags to outside observers and prompts questions about how the company&rsquo;s algorithms are being used.</p>

<p>William Isaac, the Michigan State researcher who has analyzed predictive policing systems for bias, said he has long had suspicions that Palantir engaged in some sort of individual forecasting program. &ldquo;They had only publicly acknowledged the extent to which their technology is data deconfliction and visualization,&rdquo; Isaac said.</p>

<p>After being walked through the documentation of Palantir&rsquo;s New Orleans project, Isaac said the program was remarkably similar to Chicago&rsquo;s individual &ldquo;heat list&rdquo; model, which a RAND Corporation study found had no impact on violent crime and was overwhelmingly composed of young African-American and Latino men with extensive law enforcement contact.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“The same flaws that were in the Chicago predictive program are going to be amplified in New Orleans’ data set.”</p></blockquote></figure>
<p>&ldquo;If you&rsquo;re trying to predict anything, you need to have some representation across the universe that you&rsquo;re trying to predict. If you&rsquo;re trying to predict crime, you need to have positive and negative examples for every possible offense,&rdquo; Isaac said. Police departments tend to have good data about communities where they are present but little data about communities where they do not patrol as vigorously &mdash; which tend to be affluent and white.</p>

<p>&ldquo;The same flaws that were in the Chicago predictive program are going to be amplified in New Orleans&rsquo; data set,&rdquo; Isaac said.</p>

<p>The secrecy surrounding the NOPD program also raises questions about whether defendants have been given evidence they have a right to view. Sarah St. Vincent, a researcher at Human Rights Watch, recently published an 18-month investigation into parallel construction, the practice of law enforcement concealing the surveillance origins of its evidence. In an interview, St. Vincent said that when law enforcement withholds intelligence gathering or analysis like New Orleans&rsquo; predictive policing work, it effectively kneecaps the checks and balances of the criminal justice system. At the Cato Institute&rsquo;s 2017 Surveillance Conference in December, St. Vincent <a href="https://www.cato.org/multimedia/events/2017-cato-surveillance-conference-afternoon-flash-talks">raised concerns</a> about why information garnered from predictive policing systems was not appearing in criminal indictments or complaints.</p>

<p>&ldquo;It&rsquo;s the role of the judge to evaluate whether what the government did in this case was legal,&rdquo; St. Vincent said of the New Orleans program. &ldquo;I do think defense attorneys would be right to be concerned about the use of programs that might be inaccurate, discriminatory, or drawing from unconstitutional data.&rdquo;</p>

<p class="has-end-mark">If Palantir&rsquo;s partnership with New Orleans had been public, the issues of legality, transparency, and propriety could have been hashed out in a public forum during an informed discussion with legislators, law enforcement, the company, and the public. For six years, that never happened.</p>

<p><em>Correction: The article previously stated incorrectly that Palantir was the world&rsquo;s fifth most valuable company. It is among the highest-valued private companies in Silicon Valley. </em></p>
						]]>
									</content>
			
					</entry>
	</feed>
