PALANTIR-virus threat = excuse to push through decades planned MEGA surveillance




Dr. Naomi Hunter


Is Peter Thiel’s Palantir a privatized version of Promis?

Palantir Technologies is a private American software company that specializes in big-data analytics. It was founded by Peter Thiel, Facebook’s first outside investor, PayPal co-founder, and executive committee member of the Trump transition team, along with Joe Lonsdale, Stephen Cohen, and Alex Karp. The company is known for two products in particular: Palantir Gotham and Palantir Metropolis. Palantir Gotham is used by counter-terrorism analysts at offices in the United States Intelligence Community (USIC) and Department of Defense, fraud investigators at the Recovery Accountability and Transparency Board, and cyber analysts at Information Warfare Monitor, while Palantir Metropolis is used by hedge funds, banks, and financial services firms. Palantir’s original clients were federal agencies of the USIC; it has since expanded its customer base to serve state and local governments, as well as private companies in the financial and healthcare industries.

J.P. Morgan’s executives were caught spying on employees through Palantir: “Aided by as many as 120 “forward-deployed engineers” from the data mining company Palantir Technologies Inc., which J.P. Morgan engaged in 2009, Cavicchia’s group vacuumed up emails and browser histories, GPS locations from company-issued smartphones, printer and download activity, and transcripts of digitally recorded phone conversations. Palantir’s software aggregated, searched, sorted, and analyzed these records, surfacing keywords and patterns of behavior that Cavicchia’s team had flagged for potential abuse of corporate assets. Palantir’s algorithm, for example, alerted the insider threat team when an employee started badging into work later than usual, a sign of potential disgruntlement. That would trigger further scrutiny and possibly physical surveillance after hours by bank security personnel.”

“Palantir cut its teeth working for the Pentagon and the CIA in Afghanistan and Iraq. The company’s engineers and products don’t do any spying themselves; they’re more like a spy’s brain, collecting and analyzing information that’s fed in from the hands, eyes, nose, and ears. The software combs through disparate data sources—financial documents, airline reservations, cell-phone records, social media postings—and searches for connections that human analysts might miss. It then presents the linkages in colorful, easy-to-interpret graphics that look like spider webs.” (Waldman, Chapman, Robertson 2018)

“It all ended when the bank’s senior executives learned that they, too, were being watched, and what began as a promising marriage of masters of big data and global finance descended into a spying scandal. The misadventure, which has never been reported, also marked an ominous turn for Palantir, one of the most richly valued startups in Silicon Valley. An intelligence platform designed for the global War on Terror was weaponized against ordinary Americans at home.”

Imagine an entire world running on this type of technology: a society run by artificial intelligence. This is the fourth industrial revolution and the introduction of the ‘internet of things’, “a network of physical devices, vehicles, home appliances, and other items embedded with electronics, software, sensors, actuators, and connectivity which enables these things to connect, collect, and exchange data.”

One crucial aspect of the internet of things will be the planned 5G cellular network, which is being designed at Intel’s five R&D centers in Israel. The Israel Daily News brags, “The Israeli team at Israel’s Holy Land offices are reportedly leading the charge for the jump to 5G, that makes sense, given the number of Israel firms Intel has gobbled up over the years. As of now, Intel is probably the single largest employer of tech jobs in the country, with almost 1100 Israeli Employees to date. The company has sunk close to $35 billion dollars in Israel’s economy.” That’s $35 billion that should have been invested in the United States, where Intel was established and is headquartered.

Intel receives $3.87 billion in US government subsidies, the third-largest amount of corporate welfare allocated to a US company. Yet Intel invests $35 billion in a foreign country, Israel, with the American taxpayers picking up the tab. Intel also designs all its CPU core processors in Israel. Cyber-security researcher Christopher Domas found that Intel CPUs have patented “hidden backdoors that let you seize root by sending a command to an undocumented RISC core that manages the main CPU.” (Wagenseil) These backdoors were put in place by Israeli technicians at Intel’s Israeli design and fabrication centers.

NHS partners with Amazon, Google, Microsoft and Palantir in fight against coronavirus

Tom McArthur
Page editor
Yahoo Finance UK, March 29, 2020

A member of the military at the ExCel centre in London, which is being made into a temporary hospital - the NHS Nightingale hospital, comprising two wards, each of 2,000 people, to help tackle coronavirus. (Photo by Yui Mok/PA Images via Getty Images)

The NHS is teaming up with some of the world’s biggest tech companies to fight the coronavirus pandemic.

In an official blog, it confirmed that the firms are to create computer ‘dashboard’ systems to show the spread of the virus across the UK and the organisation’s ability to deal with it.

Four tech companies are named in the post: US giants Microsoft (MSFT), Google (GOOG) and Palantir - a controversial organisation that provides services for US agencies as well as international NGOs. The fourth firm is Faculty AI, which is based in London.

According to the BBC, Amazon is also involved in the campaign against Covid-19, and will eventually distribute home testing kits to the public.

NHSX - a unit within the healthcare system responsible for digital innovation - is heading the effort to harness a range of data sources, so that they could be used in combination.

The ambitious goal is a dashboard that pulls in real-time information about the pandemic which will help medics and the government to:

• See how the virus is spreading and identify risks to particularly vulnerable groups of people
• Increase resources in emerging hotspots almost immediately
• Get critical equipment to hospitals and other facilities in greatest need
• Send patients to hospitals best able to care for them based on current demand, resources and staffing levels

The NHS says that the information would mostly be drawn from existing data sources, and would be anonymised so that individual patients could not be identified. This process will involve scrubbing the data of identifiers like names and addresses and replacing them with a “pseudonym,” it said.
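That scrub-and-replace step is straightforward to picture. Here is a minimal sketch, assuming nothing about the NHS's actual tooling; the field names, salt, and hash-based approach below are illustrative, not details from the programme:

```python
import hashlib

def pseudonymise(record, salt, identifiers=("name", "address")):
    """Drop direct identifiers and replace them with a salted-hash
    pseudonym, keeping the rest of the record for aggregate analysis."""
    raw = "|".join(str(record.get(k, "")) for k in identifiers)
    cleaned = {k: v for k, v in record.items() if k not in identifiers}
    cleaned["pseudonym"] = hashlib.sha256((salt + raw).encode()).hexdigest()[:16]
    return cleaned

patient = {"name": "Jane Doe", "address": "1 High St", "ward": "A&E", "days_in": 3}
print(pseudonymise(patient, salt="demo-salt"))
```

The point of a scheme like this is that the same patient always maps to the same pseudonym, so records can still be linked across data sources without exposing who they belong to - which is also why campaigners note that pseudonymised data is weaker protection than full anonymisation.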
Who’s doing what?

The NHS said:

• Microsoft had built a data store on its Azure cloud computing platform to hold the information in a single, secure location
• Palantir is providing use of its Foundry software tool, which analyses records to deliver a "single source of truth"
• Faculty AI is making dashboards, models and simulations that decision-makers would view
• Google's G Suite of productivity apps could be used to collect real-time operational data like occupancy levels and Accident & Emergency capacity

Privacy campaigners have raised concerns, namely with Palantir, which is controversial for its role in helping US immigration agents track and deport undocumented immigrants, as well as its work with the CIA overseas.

Anna Bacciarelli @a_bacci

The NHS is partnering with Palantir, of ICE fame, for data management and analytics in the corona crisis. A break for Palantir in the UK – and a somewhat risky choice for the NHS. Would love to see the due diligence on this.
The NHS says that all the data involved would be made "open source wherever we can".

Regarding Palantir, they say: “Palantir is a data processor, not a data controller, and cannot pass on or use the data for any wider purpose without the permission of NHS England”.

They continue: “After the emergency is over, we hope to be able to use what we have learned from our technology partners to get better within the Government at data collection, aggregation and analysis in a way that protects the privacy of our citizens.”

Do we really want Palantir embedded in the NHS?
Beware bad policy made in haste because of coronavirus, say campaigners

Tough times call for strong measures. During the Second World War, the British public accepted the need for ID cards to manage rationing and monitor the population as a necessary restriction on their liberty. After the war, ID cards were dropped, although some of the monitoring mechanisms and restrictions remained in place. In response to the coronavirus crisis, governments around the world have brought in sweeping and occasionally Draconian measures to control the movement of their citizens by digital means, including tracking of smartphones to enforce social distancing and quarantine. The measures taken in the UK currently enjoy the support of the public who recognise the necessity of fighting the virus, but their efficacy is in most cases unproven, and as lockdowns drag on inevitably these constraints will start to chafe.

Transparency and accountability

In a democracy, trust is an essential ingredient. People must have faith that these measures really are necessary or they will work around them, blunting the attack on the virus and possibly leading to social disorder. To retain trust, the authorities need to be crystal clear and consistent when communicating why interventions are necessary, and we also need to be reassured that once the emergency is over they will be rolled back. The power and reach of digital technologies mean that drifting into a Chinese-style surveillance state, where all of our activities are pinned to a central ID, is an all-too-realistic scenario and one that must be avoided.

Which is why NHS England's announcement last week that it is to employ the services of Palantir to help coordinate the distribution of ventilators and other equipment to hospitals sets off alarm bells. Over the years Palantir, the secretive, CIA-funded data-mining company set up by PayPal billionaire Peter Thiel, has become a watchword for intrusive surveillance through its involvement in the US ‘War on Terror', predictive policing and Immigration and Customs Enforcement (ICE) deportations. Palantir employees are also implicated in the Cambridge Analytica scandal, and the company is accused of helping political operatives smear their opponents. A Bloomberg article, "Palantir Knows Everything About You", alleges that Palantir was only dropped by investment bank JP Morgan after senior executives found out that they themselves were being spied on by overzealous operatives.

"Palantir Foundry is a very powerful data integration tool that allows you to take a view of data across disparate systems, disparate schema, in very different forms and with very mismatched metadata, and to overlay that with certain things," said Phil Booth, coordinator of medConfidential - a group campaigning for medical data privacy. "But I really doubt that proper procurement processes were gone through for this particular project. It's obviously one that's been assembled at that scale and speed."

Without exception, the privacy campaigners we spoke to recognised the need for emergency data access measures. All were broadly supportive of the provisions rolled out so far, as allowed for under GDPR and other data protection legislation provided they are evidence-led, and all wanted to ensure public trust is maintained so that the interventions can be as effective as possible. However, they were critical of the tendency for officials to communicate via leaks instead of being up-front.

"Never has there been a more important time for citizens to trust their government. I, like the average citizen, want to trust our government," said Geoff Revill of Krowdthink.

He continued: "There are three pillars to trust - transparency, control and accountability. As citizens are disempowered and lose control in lockdown and the coronavirus legislation, it becomes ever more critical for the government to explicitly increase its transparency and accountability."
Avoiding overreach

The big tech companies have been trying to get into the NHS for years, says Booth, and while their expertise is welcome, we must be careful during this crisis not to make bad policy based on a particularly hard case.

Which brings us back to Palantir. Do we really want a powerful, secretive data-mining company based outside of our jurisdiction and with little prior healthcare expertise embedded in the NHS? The precedents aren't good. Google was given access to patient data in a way that was later ruled to be illegal and in December health minister Matt Hancock and the Department of Health and Social Care granted Amazon access to healthcare data in a manner that was heavily criticised by campaigners. "Matt Hancock tends to be a bit of a tech fanboy and reaches for the big shiny object," said Booth, who obtained details of the deal via a Freedom of Information request. "Amazon's people clearly wrote the contract. They got far more out of it than anyone else would have done."

As well as Palantir, NHS England is deploying the services of Microsoft, Google and UK software consultancy Faculty AI. On the face of it, the Covid-19 intervention announced last week, which aims to create a data platform to track occupancy levels at hospitals and capacity of A&E departments and collate aggregated statistics about the lengths of stay for patients, does not involve access to personal data. And to its credit, although a little late, NHSX, the cross-departmental health initiative, addresses privacy concerns head-on.

"The data brought into the back end datastore held by NHS England and NHS Improvement and NHSX will largely be from existing data sources, for example, data already collected by NHS England, NHS Improvement, Public Health England and NHS Digital. All NHS data remains under NHS England/ NHS Improvement control," its directors say in an open letter.

They also state the Covid-19 datastore will be closed once the outbreak has been contained. This is welcome news and a good example of the sort of transparent communications that have often been lacking during the crisis. However, the border between personal and non-personal data can be hazy (what about information on NHS staff and volunteers for example?), and, crucially, there's no mention of what will happen to the platform and the key players after that. Will they become an integral part of the NHS, and if so what will that mean?

"Palantir is not a company that inspires confidence," said Jim Killock, executive director of privacy and free speech advocates Open Rights Group on a public conference call on Friday. "For them to suddenly be involved through government fiat without any kind of procurement procedure or competitive tendering process is concerning, although understandable. Now, currently, that's non-personal data, but do we think that this company's ambitions are going to stop at that point? I think that's very debatable.

"It would be really helpful to hear the government say, ‘this is only temporary, they will cease at the end of this crisis and Palantir will not be invited to share personal data in this period until there are competitive tendering arrangements in place'."

This is at the heart of the dilemma. For years, going back to NPfIT and before, the NHS has been crying out for some sort of dashboard-based integrated system to coordinate resources, and now suddenly there's a chance to build it. If it can be created properly, with all the right checks and balances, then it should be welcomed. But it's a big if. There's a danger that bad policy could be made on the hoof that will have damaging long-term consequences.

"I deeply fear mission creep," said Revill. "Data in the hands of a company like Palantir is knowledge, and knowledge is power. I don't know of any politician that voluntarily relinquishes power once they have it."

Forget Apple And Google—Here’s The Real Challenge For COVID-19 Contact-Tracing
Zak Doffman Contributor

The news on Friday (April 10) that Apple and Google are partnering to simplify coronavirus contact-tracing is a big deal. Just like that, read the headlines, more than 3 billion people globally might have an effective warning system if they come into contact with newly diagnosed COVID-19 patients. Unfortunately, that’s not the case. Putting aside how many devices actually carry the right Bluetooth technology, there are two critical factors that stand in the way of this being effective.

Bluetooth contact tracing uses a relative signal strength indicator to detect when one device is near another, and for how long. Your phone collects unique identifiers for the other phones you are near throughout your day, and those other phones do the same for you. Your phone also downloads unique identifiers for those newly testing positive for COVID-19. If there’s a match, you receive a locally relevant alert—monitor for symptoms, get tested, self-isolate—without breaching your privacy.
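The match step itself is simple to illustrate. The sketch below is a simplified model of that decentralised check, not the actual Apple/Google or TraceTogether protocol; the identifier format and the 15-minute exposure threshold are assumptions for the example:

```python
def exposure_alerts(seen_contacts, positive_ids, min_minutes=15):
    """seen_contacts: {ephemeral_id: minutes in proximity}, logged on-device.
    positive_ids: identifiers published for newly diagnosed patients.
    Returns the IDs that both match a positive case and exceed the
    exposure threshold - everything happens locally on the phone."""
    return {eid for eid, minutes in seen_contacts.items()
            if eid in positive_ids and minutes >= min_minutes}

seen = {"a1f3": 22, "9bc0": 4, "77de": 45}  # gathered over the day
positives = {"a1f3", "0000"}                # downloaded from health authority
print(exposure_alerts(seen, positives))     # only "a1f3": matched and long enough
```

Note that "9bc0" would not trigger an alert even if it appeared in the positive list, because four minutes of proximity falls below the threshold; this kind of filtering is what keeps fleeting contacts from flooding users with alerts.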

Apple and Google have joined forces to remove the core technical issues associated with a Bluetooth contact-tracing app. This will provide an API initially, and core OS functionality beyond that, for national programs to build upon, making their apps as effective and user-friendly as possible, working in the background, preserving battery life and safeguarding data privacy.

Let’s take a look at Singapore. The country’s TraceTogether was launched last month and has become the public catalyst for a global push towards the privacy-friendly use of Bluetooth proximity tracking to alert users who may be at risk. The push to such platforms in Europe and the U.S. references Singapore as an example of what good looks like. The software behind TraceTogether has now been made open-source, as Singapore encourages other countries to follow its lead.

Singapore was lauded for its fast approach to COVID-19, built around rigorous contact-tracing for new patients, although it is now suffering a second wave of confirmed infections. Deploying a government-backed system in Singapore is very different to Europe and North America. The culture is more compliant and citizens are subject to far more surveillance and control than we see in the west. Even so, only around one million citizens, some 20% of the population, have downloaded the app.

"In order for TraceTogether to be effective, we need something like three-quarters—if not everyone—of the population to have it,” the country’s development minister Lawrence Wong told local media early this month. “Then we can really use that as an effective contact-tracing tool.” Not only do citizens need to install the app, they need to get it up and running on their devices. Some of those technical challenges are resolved by Apple and Google—the decision to install, though, is not.

This is the first critical issue with making such voluntary contact-tracing apps work—inertia versus compliance. China is the only country that can claim highly effective digital contact tracing thus far, and that is because it did not require any citizen compliance. The country simply deployed its existing surveillance state. That’s not an option elsewhere. As Oxford University researchers in the U.K. (which is also building a national Bluetooth app) have warned: It only works “if used by enough people,” a figure thought to be around 60%. TraceTogether’s million installs is “a record for a [Singapore] government app,” highlighting the scale of the challenge.
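The arithmetic behind those thresholds is unforgiving: a contact is only recorded when both phones run the app, so if installs are roughly independent, effective coverage falls off as the square of adoption. A quick back-of-the-envelope illustration:

```python
# A contact is detectable only when BOTH parties have the app,
# so coverage of contacts ~ adoption ** 2 (assuming independence).
for adoption in (0.20, 0.60, 0.75):
    print(f"{adoption:.0%} adoption -> ~{adoption ** 2:.0%} of contacts covered")
```

At Singapore's roughly 20% take-up, only about one contact in twenty-five is even visible to the system, which is why the 60-75% figures quoted by researchers and ministers matter so much.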

Beyond the highest-level numbers, there are other issues—the percentage of older citizens with capable smartphones, for example, exacerbating the age disparity we are already seeing with COVID-19. Older or less healthy citizens worry about becoming infected; younger, healthy citizens less so. Given behavior and mobility, younger citizens are likely a major cause of new infections. Not only are they far less likely to install an app, given fewer health concerns, they are also much more likely to be asymptomatic or to suffer a mild infection and not seek testing.

And that brings us to the second critical challenge—testing. Bluetooth tracing apps link to national health efforts to test and monitor the population. You cannot have an app at scale where anyone can just tick a box to say they have symptoms and then all those they've been near immediately self-isolate. That would be chaos. Instead, a formal test would flag a user’s infection and the app would then alert those they had been near. If most people do not get tested, it does not work.

Much of the focus on the Google and Apple announcement has been on technical privacy and ease of use. It is beyond doubt that for such Bluetooth contact-tracing to work, it needs this move by the smartphone OS giants—but that doesn't make it happen. The best analogy is to suggest that better battery technology in and of itself created the electric car industry—a major factor, yes, but it was far more complicated than that.

The risk with the type of publicity greeting Apple and Google’s move is that it has welcomed the good news aspects and overlooked the “no news” aspects. There is a mountain to climb to get even 60% of national populations to use an app or enable an OS function, never mind 75% or more. Let’s not forget, the idea for bluetooth contact tracing is almost a decade old—the FluPhone app was touted by Cambridge University back in 2011. It did not catch on, despite WHO estimating that “290,000 to 650,000 respiratory deaths occur each year associated with seasonal influenza.”

Beyond the technical enablers, the privacy safeguards touted by Apple and Google should also be welcomed, taking the platform out of government hands to protect privacy and limiting access to data. In that regard, an ACLU spokesperson welcomed an approach that “appears to mitigate the worst privacy and centralization risks,” while warning that “there is still room for improvement [and] any contact tracing app [should] remain voluntary and decentralized, and used only for public health purposes and only for the duration of this pandemic.”

The DP-3T team working across eight European universities, including University College London, has been campaigning against centralized tracking and tracing apps, arguing that data must be kept on devices. As such they welcome the Google and Apple news.

Michael Veale, lecturer in digital rights and regulation at the university, told me this “enforces a decentralized approach, where no personal data leaves your device. The shape of the protocol is very much the same as ours,” he explained, although the tech giants can go much further in the actual implementation, coding it into the OS and removing the need to keep phones on or play with Bluetooth settings.

That said, Veale also warns that take-up below 60% “would endanger the quality of the data and its core functionality, so that's where the focus needs to be,” adding that “if testing is not widely available, I think an app provides more false security than actual support fighting the virus... Making it solve societal problems in practice, however, needs much more than just a technical protocol.”

ACLU also acknowledged these two critical challenges to making such a system work, “widespread, free, and quick testing,” as well as generating unprecedented levels of trust in a government-backed platform to push people into installing the app, with “such contact tracing likely to exclude many vulnerable members of society who lack access to technology.”

Unless and until governments can either develop or mandate a system that deploys this across the majority of their populations, and then back it up with the rigorous testing regime that is stitched into the core concept of operation, such apps will be helpful but not game-changing. As countries around the world progress the development and launch of their apps, all of which will benefit from the Apple and Google move, you can expect to see take-up and testing as the twin pillars of success.

To make this effective, we may need to accept some form of mandatory proximity tracing as the pandemic evolves. This is already being explored in the U.K., where its own version of a Bluetooth contact-tracing app might be mandatory for those returning to work, according to the Sunday Times today (April 12). As unlikely as that sounds now, think how much has changed from a tracking perspective in recent weeks.

Other articles:

Campus activists find a target at the intersection of immigration and technology: Palantir

Peter Thiel’s Palantir Spreads Its Tentacles Throughout Europe
The $20 billion data mining startup tripled revenue in Europe and plans to keep expanding there.
How Peter Thiel’s company Palantir was built with CIA funding and has helped the likes of the NSA and GCHQ spy since its inception
Peter Thiel's 'invasive' Palantir on push to ramp up secretive UK government contracts
The world’s biggest women’s tech conference just dropped Palantir as a sponsor
It’s the third time in recent months an outside group has severed ties with Palantir over its controversial work for ICE.

CMU students protest university's involvement with Palantir, a company contracted with ICE



Dr. Naomi Hunter


Palantir Knows Everything About You

Peter Thiel’s data-mining company is using War on Terror tools to track American citizens. The scary thing? Palantir is desperate for new customers.
By Peter Waldman, Lizette Chapman, and Jordan Robertson
April 19, 2018

High above the Hudson River in downtown Jersey City, a former U.S. Secret Service agent named Peter Cavicchia III ran special ops for JPMorgan Chase & Co. His insider threat group—most large financial institutions have one—used computer algorithms to monitor the bank’s employees, ostensibly to protect against perfidious traders and other miscreants.

Aided by as many as 120 “forward-deployed engineers” from the data mining company Palantir Technologies Inc., which JPMorgan engaged in 2009, Cavicchia’s group vacuumed up emails and browser histories, GPS locations from company-issued smartphones, printer and download activity, and transcripts of digitally recorded phone conversations. Palantir’s software aggregated, searched, sorted, and analyzed these records, surfacing keywords and patterns of behavior that Cavicchia’s team had flagged for potential abuse of corporate assets. Palantir’s algorithm, for example, alerted the insider threat team when an employee started badging into work later than usual, a sign of potential disgruntlement. That would trigger further scrutiny and possibly physical surveillance after hours by bank security personnel.
Featured in Bloomberg Businessweek, April 23, 2018.

Over time, however, Cavicchia himself went rogue. Former JPMorgan colleagues describe the environment as Wall Street meets Apocalypse Now, with Cavicchia as Colonel Kurtz, ensconced upriver in his office suite eight floors above the rest of the bank’s security team. People in the department were shocked that no one from the bank or Palantir set any real limits. They darkly joked that Cavicchia was listening to their calls, reading their emails, watching them come and go. Some planted fake information in their communications to see if Cavicchia would mention it at meetings, which he did.

It all ended when the bank’s senior executives learned that they, too, were being watched, and what began as a promising marriage of masters of big data and global finance descended into a spying scandal. The misadventure, which has never been reported, also marked an ominous turn for Palantir, one of the most richly valued startups in Silicon Valley. An intelligence platform designed for the global War on Terror was weaponized against ordinary Americans at home.

Founded in 2004 by Peter Thiel and some fellow PayPal alumni, Palantir cut its teeth working for the Pentagon and the CIA in Afghanistan and Iraq. The company’s engineers and products don’t do any spying themselves; they’re more like a spy’s brain, collecting and analyzing information that’s fed in from the hands, eyes, nose, and ears. The software combs through disparate data sources—financial documents, airline reservations, cellphone records, social media postings—and searches for connections that human analysts might miss. It then presents the linkages in colorful, easy-to-interpret graphics that look like spider webs. U.S. spies and special forces loved it immediately; they deployed Palantir to synthesize and sort the blizzard of battlefield intelligence. It helped planners avoid roadside bombs, track insurgents for assassination, even hunt down Osama bin Laden. The military success led to federal contracts on the civilian side. The U.S. Department of Health and Human Services uses Palantir to detect Medicare fraud. The FBI uses it in criminal probes. The Department of Homeland Security deploys it to screen air travelers and keep tabs on immigrants.

Police and sheriff’s departments in New York, New Orleans, Chicago, and Los Angeles have also used it, frequently ensnaring in the digital dragnet people who aren’t suspected of committing any crime. People and objects pop up on the Palantir screen inside boxes connected to other boxes by radiating lines labeled with the relationship: “Colleague of,” “Lives with,” “Operator of [cell number],” “Owner of [vehicle],” “Sibling of,” even “Lover of.” If the authorities have a picture, the rest is easy. Tapping databases of driver’s license and ID photos, law enforcement agencies can now identify more than half the population of U.S. adults.

JPMorgan was effectively Palantir’s R&D lab and test bed for a foray into the financial sector, via a product called Metropolis. The two companies made an odd couple. Palantir’s software engineers showed up at the bank on skateboards. Neckties and haircuts were too much to ask, but JPMorgan drew the line at T-shirts. The programmers had to agree to wear shirts with collars, tucked in when possible.

As Metropolis was installed and refined, JPMorgan made an equity investment in Palantir and inducted the company into its Hall of Innovation, while its executives raved about Palantir in the press. The software turned “data landfills into gold mines,” Guy Chiarello, who was then JPMorgan’s chief information officer, told Bloomberg Businessweek in 2011.

The founder of Palantir is extremely well connected. Here’s how his life might appear in the company’s model.

Cavicchia was in charge of forensic investigations at the bank. Through Palantir, he gained administrative access to a full range of corporate security databases that had previously required separate authorizations and a specific business justification to use. He had unprecedented access to everything, all at once, all the time, on one analytic platform. He was a one-man National Security Agency, surrounded by the Palantir engineers, each one costing the bank as much as $3,000 a day.

Senior investigators stumbled onto the full extent of the spying by accident. In May 2013 the bank’s leadership ordered an internal probe into who had leaked a document to the New York Times about a federal investigation of JPMorgan for possibly manipulating U.S. electricity markets. Evidence indicated the leaker could have been Frank Bisignano, who’d recently resigned as JPMorgan’s co-chief operating officer to become CEO of First Data Corp., the big payments processor. Cavicchia had used Metropolis to gain access to emails about the leak investigation—some written by top executives—and the bank believed he shared the contents of those emails and other communications with Bisignano after Bisignano had left the bank. (Inside JPMorgan, Bisignano was considered Cavicchia’s patron—a senior executive who protected and promoted him.)

JPMorgan officials debated whether to file a suspicious activity report with federal regulators about the internal security breach, as required by law whenever banks suspect regulatory violations. They decided not to—a controversial decision internally, according to multiple sources at the bank. Cavicchia negotiated a severance agreement and was forced to resign. He joined Bisignano at First Data, where he’s now a senior vice president. Chiarello also went to First Data, as president. After their departures, JPMorgan drastically curtailed its Palantir use, in part because “it never lived up to its promised potential,” says one JPMorgan executive who insisted on anonymity to discuss the decision.

The bank, First Data, and Bisignano, Chiarello, and Cavicchia didn’t respond to separately emailed questions for this article. Palantir, in a statement responding to questions about how JPMorgan and others have used its software, declined to answer specific questions. “We are aware that powerful technology can be abused and we spend a lot of time and energy making sure our products are used for the forces of good,” the statement said.

Much depends on how the company chooses to define good. In March a former computer engineer for Cambridge Analytica, the political consulting firm that worked for Donald Trump’s 2016 presidential campaign, testified in the British Parliament that a Palantir employee had helped Cambridge Analytica use the personal data of up to 87 million Facebook users to develop psychographic profiles of individual voters. Palantir said it has a strict policy against working on political issues, including campaigns, and showed Bloomberg emails in which it turned down Cambridge’s request to work with Palantir on multiple occasions. The employee, Palantir said, worked with Cambridge Analytica on his own time. Still, there was no mistaking the implications of the incident: All human relations are a matter of record, ready to be revealed by a clever algorithm. Everyone is a spidergram now.

Thiel addresses the 2016 Republican National Convention. Jim Watson/AFP/Getty Images

Thiel, who turned 50 in October, has long reveled in his role as the libertarian black sheep of left-leaning Silicon Valley. He contributed $1.25 million to Trump’s presidential victory, spoke at the Republican convention, and has dined with Trump at the White House. But Thiel has told friends he’s had enough of the Bay Area’s “monocultural” liberalism. He’s ditching his longtime base in San Francisco and moving his personal investment firms this year to Los Angeles, where he plans to establish his next project, a conservative media empire.

As Thiel’s wealth has grown, he’s gotten more strident. In a 2009 essay for the Cato Institute, he railed against taxes, government, women, poor people, and society’s acquiescence to the inevitability of death. (Thiel doesn’t accept death as inexorable.) He wrote that he’d reached some radical conclusions: “Most importantly, I no longer believe that freedom and democracy are compatible.” The 1920s was the last time one could feel “genuinely optimistic” about American democracy, he said; since then, “the vast increase in welfare beneficiaries and the extension of the franchise to women—two constituencies that are notoriously tough for libertarians—have rendered the notion of ‘capitalist democracy’ into an oxymoron.”

Thiel went into tech after missing a prized Supreme Court clerkship following his graduation from Stanford Law School. He co-founded PayPal and then parlayed his winnings from its 2002 sale to EBay Inc. into a career in venture investing. He made an early bet on Facebook Inc. (where he’s still on the board), which accounts for most of his $3.3 billion fortune, as estimated by Bloomberg, and launched his career as a backer of big ideas—things like private space travel (through an investment in SpaceX), hotel alternatives (Airbnb), and floating island nations (the Seasteading Institute).

He started Palantir—named after the omniscient crystal balls in J.R.R. Tolkien’s Lord of the Rings trilogy—three years after the attacks of Sept. 11, 2001. The CIA’s investment arm, In-Q-Tel, was a seed investor. For the role of chief executive officer, he chose an old law school friend and self-described neo-Marxist, Alex Karp. Thiel told Bloomberg in 2011 that civil libertarians ought to embrace Palantir, because data mining is less repressive than the “crazy abuses and draconian policies” proposed after Sept. 11. The best way to prevent another catastrophic attack without becoming a police state, he argued, was to give the government the best surveillance tools possible, while building in safeguards against their abuse.

Legend has it that Stephen Cohen, one of Thiel’s co-founders, programmed the initial prototype for Palantir’s software in two weeks. It took years, however, to coax customers away from the longtime leader in the intelligence analytics market, a software company called I2 Inc.

In one adventure missing from the glowing accounts of Palantir’s early rise, I2 accused Palantir of misappropriating its intellectual property through a Florida shell company registered to the family of a Palantir executive. A company claiming to be a private eye firm had been licensing I2 software and development tools and spiriting them to Palantir for more than four years. I2 said the cutout was registered to the family of Shyam Sankar, Palantir’s director of business development.

I2 sued Palantir in federal court, alleging fraud, conspiracy, and copyright infringement. In its legal response, Palantir argued it had the right to appropriate I2’s code for the greater good. “What’s at stake here is the ability of critical national security, defense and intelligence agencies to access their own data and use it interoperably in whichever platform they choose in order to most effectively protect the citizenry,” Palantir said in its motion to dismiss I2’s suit.

The motion was denied. Palantir agreed to pay I2 about $10 million to settle the suit. I2 was sold to IBM in 2011.

Sankar, Palantir employee No. 13 and now one of the company’s top executives, also showed up in another Palantir scandal: the company’s 2010 proposal for the U.S. Chamber of Commerce to run a secret sabotage campaign against the group’s liberal opponents. Hacked emails released by the group Anonymous indicated that Palantir and two other defense contractors pitched outside lawyers for the organization on a plan to snoop on the families of progressive activists, create fake identities to infiltrate left-leaning groups, scrape social media with bots, and plant false information with liberal groups to subsequently discredit them.

After the emails emerged in the press, Palantir offered an explanation similar to the one it provided in March for its U.K.-based employee’s assistance to Cambridge Analytica: It was the work of a single rogue employee. The company never explained Sankar’s involvement. Karp issued a public apology and said he and Palantir were deeply committed to progressive causes. Palantir set up an advisory panel on privacy and civil liberties, headed by a former CIA attorney, and beefed up an engineering group it calls the Privacy and Civil Liberties Team. The company now has about 10 PCL engineers on call to help vet clients’ requests for access to data troves and pitch in with pertinent thoughts about law, morality, and machines.

During its 14 years in startup mode, Palantir has cultivated a mystique as a haven for brilliant engineers who want to solve big problems such as terrorism and human trafficking, unfettered by pedestrian concerns such as making money. Palantir executives boast of not employing a single salesperson, relying instead on word-of-mouth referrals.

The company’s early data mining dazzled venture investors, who valued it at $20 billion in 2015. But Palantir has never reported a profit. It operates less like a conventional software company than like a consultancy, deploying roughly half its 2,000 engineers to client sites. That works at well-funded government spy agencies seeking specialized applications but has produced mixed results with corporate clients. Palantir’s high installation and maintenance costs repelled customers such as Hershey Co., which trumpeted a Palantir partnership in 2015 only to walk away two years later. Coca-Cola, Nasdaq, American Express, and Home Depot have also dumped Palantir.

Karp recognized the high-touch model was problematic early in the company’s push into the corporate market, but solutions have been elusive. “We didn’t want to be a services company. We wanted to do something that was cost-efficient,” he confessed at a European conference in 2010, in one of several unguarded comments captured in videos posted online. “Of course, what we didn’t recognize was that this would be much, much harder than we realized.”

Palantir’s newest product, Foundry, aims to finally break through the profitability barrier with more automation and less need for on-site engineers. Airbus SE, the big European plane maker, uses Foundry to crunch airline data about specific onboard components to track usage and maintenance and anticipate repair problems. Merck KGaA, the pharmaceutical giant, has a long-term Palantir contract to use Foundry in drug development and supply chain management.

Deeper adoption of Foundry in the commercial market is crucial to Palantir’s hopes of a big payday. Some investors are weary and have already written down their Palantir stakes. Morgan Stanley now values the company at $6 billion. Fred Alger Management Inc., which has owned stock since at least 2006, revalued Palantir in December at about $10 billion, according to Bloomberg Holdings. One frustrated investor, Marc Abramowitz, recently won a court order for Palantir to show him its books, as part of a lawsuit he filed alleging the company sabotaged his attempt to find a buyer for the Palantir shares he has owned for more than a decade.

As shown in the privacy breaches at Facebook and Cambridge Analytica—with Thiel and Palantir linked to both sides of the equation—the pressure to monetize data at tech companies is ceaseless. Facebook didn’t grow from a website connecting college kids into a purveyor of user profiles and predilections worth $478 billion by walling off personal data. Palantir says its Privacy and Civil Liberties Team watches out for inappropriate data demands, but it consists of just 10 people in a company of 2,000 engineers. No one said no to JPMorgan, or to whomever at Palantir volunteered to help Cambridge Analytica—or to another organization keenly interested in state-of-the-art data science, the Los Angeles Police Department.

Screenshots of Palantir’s Gotham program, from a promotional video. Source: Youtube

Palantir began work with the LAPD in 2009. The impetus was federal funding. After several Sept. 11 postmortems called for more intelligence sharing at all levels of law enforcement, money started flowing to Palantir to help build data integration systems for so-called fusion centers, starting in L.A. There are now more than 1,300 trained Palantir users at more than a half-dozen law enforcement agencies in Southern California, including local police and sheriff’s departments and the Bureau of Alcohol, Tobacco, Firearms and Explosives.

The LAPD uses Palantir’s Gotham product for Operation Laser, a program to identify and deter people likely to commit crimes. Information from rap sheets, parole reports, police interviews, and other sources is fed into the system to generate a list of people the department defines as chronic offenders, says Craig Uchida, whose consulting firm, Justice & Security Strategies Inc., designed the Laser system. The list is distributed to patrolmen, with orders to monitor and stop the pre-crime suspects as often as possible, using excuses such as jaywalking or fix-it tickets. At each contact, officers fill out a field interview card with names, addresses, vehicles, physical descriptions, any neighborhood intelligence the person offers, and the officer’s own observations on the subject.

The cards are digitized in the Palantir system, adding to a constantly expanding surveillance database that’s fully accessible without a warrant. Tomorrow’s data points are automatically linked to today’s, with the goal of generating investigative leads. Say a chronic offender is tagged as a passenger in a car that’s pulled over for a broken taillight. Two years later, that same car is spotted by an automatic license plate reader near a crime scene 200 miles across the state. As soon as the plate hits the system, Palantir alerts the officer who made the original stop that a car once linked to the chronic offender was spotted near a crime scene.
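The alerting mechanism described above — every new sighting automatically joined to prior events sharing an identifier, with a notification when a flagged entity resurfaces — can be sketched in a few lines. This is a toy illustration only; Palantir's actual data model and APIs are not public, and every class and field name here is invented.

```python
from dataclasses import dataclass, field

@dataclass
class EntityGraph:
    """Toy record-linkage store: events that share an identifier
    (a plate, a name) accumulate on one entity over time."""
    links: dict = field(default_factory=dict)   # identifier -> list of events
    watchlist: set = field(default_factory=set) # identifiers tied to flagged people

    def record_event(self, identifier, event):
        self.links.setdefault(identifier, []).append(event)

    def flag(self, identifier):
        self.watchlist.add(identifier)

    def on_plate_read(self, plate, location):
        # A new sighting is joined to every prior event involving the
        # same plate; if the plate was ever linked to a flagged person,
        # an alert fires and carries the full history with it.
        history = list(self.links.get(plate, []))
        self.record_event(plate, {"type": "plate_read", "location": location})
        return {"alert": plate in self.watchlist, "prior_events": history}

graph = EntityGraph()
graph.record_event("7ABC123", {"type": "traffic_stop",
                               "note": "chronic offender was a passenger"})
graph.flag("7ABC123")
result = graph.on_plate_read("7ABC123", "ALPR hit near crime scene")
```

The point of the sketch is that no human initiates the second lookup: once identifiers are shared across feeds, yesterday's routine stop and today's license plate read link themselves.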

The platform is supplemented with what sociologist Sarah Brayne calls the secondary surveillance network: the web of who is related to, friends with, or sleeping with whom. One woman in the system, for example, who wasn’t suspected of committing any crime, was identified as having multiple boyfriends within the same network of associates, says Brayne, who spent two and a half years embedded with the LAPD while researching her dissertation on big-data policing at Princeton University and who’s now an associate professor at the University of Texas at Austin. “Anybody who logs into the system can see all these intimate ties,” she says. To widen the scope of possible connections, she adds, the LAPD has also explored purchasing private data, including social media, foreclosure, and toll road information, camera feeds from hospitals, parking lots, and universities, and delivery information from Papa John’s International Inc. and Pizza Hut LLC.
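Brayne's "secondary surveillance network" is, in data terms, an association graph: people are nodes, and ties harvested from field interview cards are edges anyone with a login can traverse. A minimal sketch, with all names and tie types invented for illustration:

```python
from collections import defaultdict

# Undirected graph of person-to-person ties from field interview cards.
ties = defaultdict(set)

def add_tie(a, b, kind):
    # Each tie is visible from both endpoints.
    ties[a].add((b, kind))
    ties[b].add((a, kind))

def neighbors(name):
    # Everything the system "knows" about a person's associations,
    # retrievable from a single name.
    return sorted(ties[name])

add_tie("Person A", "Person B", "dating")
add_tie("Person A", "Person C", "dating")
add_tie("Person B", "Person D", "associate")
```

Note that "Person C" and "Person D" end up in the graph, and thus queryable, without ever having been suspected of anything — which is exactly the concern Brayne raises.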

The Constitutionality Question

Why the courts haven’t ruled on whether Palantir’s analytical tools are legal

Civil rights advocates say the compilation of a digital dossier of someone’s life, absent a court warrant, is an unlawful intrusion under the U.S. Constitution. Law enforcement officials say that’s not the case. For now, the question is unsettled, and that may be no accident. Civil liberties lawyers are seeking a case to challenge the constitutionality of Palantir’s use, but prosecutors and immigration agents have been careful not to cite the software in evidentiary documents, says Paromita Shah, associate director of the National Lawyers Guild’s National Immigration Project. “Palantir lives on that secrecy,” she says.

Since the 1970s, the Supreme Court has differentiated between searching someone’s home or car, which requires a warrant, and searching material out in the open or shared with others, which doesn’t. The justices’ thinking seems to be evolving as new technologies rise.

In a 2012 decision, U.S. v. Jones, the justices said that planting a GPS tracker on a car for 28 days without a warrant created such a comprehensive picture of the target’s life that it violated the public’s reasonable expectation of privacy.

Similarly, the court’s 2014 decision in Riley v. California found that cellphones contain so much personal information that they provide a virtual window into the owner’s mind, and thus necessitate a warrant for the government to search. Chief Justice John Roberts, in his majority opinion, wrote of cellphones that “with all they contain and all they may reveal, they hold for many Americans ‘the privacies of life.’” Justice Louis Brandeis, 86 years earlier, wrote a searing dissent in a wiretap case that seems to perfectly foresee the advent of Palantir.

“Ways may someday be developed,” Brandeis warned, “by which the government, without removing papers from secret drawers, can reproduce them in court, and by which it will be enabled to expose to a jury the most intimate occurrences.”
—Peter Waldman

The LAPD declined to comment for this story. Palantir sent Bloomberg a statement about its work with law enforcement: “Our [forward-deployed engineers] and [privacy and civil liberties] engineers work with the law enforcement customers (including LAPD) to ensure that the implementation of our software and integration of their source systems with the software is consistent with the Department’s legal and policy obligations, as well as privacy and civil liberties considerations that may not currently be legislated but are on the horizon. We as a company determine the types of engagements and general applications of our software with respect to those overarching considerations. Police Agencies have internal responsibility for ensuring that their information systems are used in a manner consistent with their policies and procedures.”

Operation Laser has made L.A. cops more surgical—and, according to community activists, unrelenting. Once targets are enmeshed in a spidergram, they’re stuck.

Manuel Rios, 22, lives in the back of his grandmother’s house at the top of a hill in East L.A., in the heart of the city’s gang area. Tall with a fair complexion and light hair, he struggled in high school with depression and a learning disability and dropped out to work at a supermarket.

He grew up surrounded by friends who joined Eastside 18, the local affiliate of the 18th Street gang, one of the largest criminal syndicates in Southern California. Rios says he was never “jumped in”—initiated into 18. He spent years addicted to crystal meth and was once arrested for possession of a handgun and sentenced to probation. But except for a stint in county jail for a burglary arrest inside a city rec center, he’s avoided further trouble and says he kicked his meth habit last year.

In 2016, Rios was sitting in a parked car with an Eastside 18 friend when a police car pulled up. His buddy ran, pursued by the cops, but Rios stayed put. “Why should I run? I’m not a gang member,” he says over steak and eggs at the IHOP near his home. The police returned and handcuffed him. One of them took his picture with a cellphone. “Welcome to the gang database!” the officer said.

Since then he’s been stopped more than a dozen times, he says, and told that if he doesn’t like it he should move. He has nowhere to go. His girlfriend just had a baby girl, and he wants to be around for them. “They say you’re in the system, you can’t lie to us,” he says. “I tell them, ‘How can I be in the hood if I haven’t got jumped in? Can’t you guys tell people who bang and who don’t?’ They go by their facts, not the real facts.”

The police, on autopilot with Palantir, are driving Rios toward his gang friends, not away from them, worries Mariella Saba, a neighbor and community organizer who helped him get off meth. When whole communities like East L.A. are algorithmically scraped for pre-crime suspects, data is destiny, says Saba. “These are systemic processes. When people are constantly harassed in a gang context, it pushes them to join. They internalize being told they’re bad.”

In Chicago, at least two immigrants have been detained for deportation by Immigration and Customs Enforcement officers based on erroneous information in gang databases, according to a pair of federal lawsuits. Chicago is a sanctuary city, so it isn’t clear how ICE found out about the purported gang affiliations. But Palantir is a likely link. The company provided an “intelligence management solution” for the Cook County Sheriff’s Office to integrate information from at least 14 different databases, including gang lists compiled by state and local police departments, according to county records. Palantir also has a $41 million data mining contract with ICE to build the agency’s “investigative case management” system.

One of the detained men, Wilmer Catalan-Ramirez, a 31-year-old body shop mechanic, was seriously injured when six ICE agents burst into his family’s home last March without a warrant. He’d been listed in the local gang database twice—in rival gangs. Catalan-Ramirez spent the next nine months in federal detention, until the city of Chicago admitted both listings were wrong and agreed to petition the feds to let him stay in the U.S. ICE released him in January, pending a new visa application. “These cases are perfect examples of how databases filled with unverified information that is often false can destroy people’s lives,” says his attorney, Vanessa del Valle of Northwestern University’s MacArthur Justice Center.

Palantir is twice the age most startups are when they cash out in a sale or initial public offering. The company needs to figure out how to be rewarded on Wall Street without creeping out Main Street. It might not be possible. For all of Palantir’s professed concern for individuals’ privacy, the single most important safeguard against abuse is the one it’s trying desperately to reduce through automation: human judgment.

As Palantir tries to court corporate customers as a more conventional software company, fewer forward-deployed engineers will mean fewer human decisions. Sensitive questions, such as how deeply to pry into people’s lives, will be answered increasingly by artificial intelligence and machine-learning algorithms. The small team of Privacy and Civil Liberties engineers could find themselves even less influential, as the urge for omnipotence among clients overwhelms any self-imposed restraints.

Computers don’t ask moral questions; people do, says John Grant, one of Palantir’s top PCL engineers and a forceful advocate for mandatory ethics education for engineers. “At a company like ours with millions of lines of code, every tiny decision could have huge implications,” Grant told a privacy conference in Berkeley last year.

JPMorgan’s experience remains instructive. “The world changed when it became clear everyone could be targeted using Palantir,” says a former JPMorgan cyber expert who worked with Cavicchia at one point on the insider threat team. “Nefarious ideas became trivial to implement; everyone’s a suspect, so we monitored everything. It was a pretty terrible feeling.” —With Michael Riley

ICO approves use of British mobile phone tracking data to fight spread of coronavirus
Regulator okays the use of anonymised phone tracking data to help tackle the spread of COVID-19

The UK's Information Commissioner's Office (ICO) has granted the government permission to use people's anonymised mobile phone tracking data in efforts to tackle the spread of coronavirus.

In a statement, the ICO's Deputy Commissioner Steve Wood said that analysis of generalised location data trends is helping agencies to fight the spread of the virus, and that its use does not fall under data protection law if it is properly anonymised and aggregated.

"In these circumstances, privacy laws are not breached as long as the appropriate safeguards are in place," Wood said.
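"Properly anonymised and aggregated" typically means individual pings are collapsed into coarse per-area counts, with small groups suppressed so no single device stands out. The actual operator pipelines are not public; the sketch below illustrates the general idea with an invented schema and a crude k-anonymity-style threshold:

```python
from collections import defaultdict

def aggregate_locations(pings, k):
    """Collapse per-device location pings into per-area device counts,
    dropping any area seen by fewer than k distinct devices so that
    sparse areas can't identify individuals."""
    devices_per_area = defaultdict(set)
    for device_id, area in pings:
        devices_per_area[area].add(device_id)
    return {area: len(devices)
            for area, devices in devices_per_area.items()
            if len(devices) >= k}

pings = [("dev1", "Area A"), ("dev2", "Area A"), ("dev3", "Area B")]
counts = aggregate_locations(pings, k=2)
# "Area B" is suppressed: only one device was seen there
```

The privacy advocates' worry, in these terms, is about what happens if the threshold is lowered, the areas shrink, or the raw per-device pings are retained upstream.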

He further added that the security and safety of citizens remains the primary concern for the ICO and that it would continue to work with the government to provide advice about the application of data protection law during the coronavirus outbreak.

Last week, it emerged that the government was considering enlisting British mobile operators to use anonymised usage and location data from mobile users to create movement maps, which would help agencies discover whether people are following lockdown rules.

In addition to revealing the movement of people, such maps would also help the government identify hotspots where people are congregating.

Privacy advocates, however, have branded the step as extremely concerning. They warned that such measures could lead to massive surveillance of people in future.

They urged the government to put in place clear time limits on the extended powers.

It remains unknown how much personal data the British government wants to use in its efforts to tackle the spread of coronavirus, but other countries, including China, Hong Kong, Israel and South Korea, have taken similar surveillance measures to varying degrees.

In these countries, infected people are required to download a smartphone app to reveal their contacts and movements to government agencies.

Spain, Slovakia, Poland and Romania are also using similar tracking apps to trace people's movement.

A report by The Washington Post last week stated that the US government was in discussions with major tech firms, including Google and Facebook, as well as health experts, about how they could use location data from Americans' mobile phones to fight the outbreak.

As per the report, American public health experts are currently more interested in using anonymous aggregate data that could help map the spread of the virus in the country.

Last Edit by Gladstone


Palantir, The $20 Billion, Peter Thiel-Backed Big Data Giant, Is Providing Coronavirus Monitoring To The CDC

Peter Thiel, co-founder and chairman of Palantir Technologies. The company is working with the CDC ... © 2019 Bloomberg Finance LP

In the last week, staff at the Centers for Disease Control and Prevention (CDC) started logging into a new web app. It promises to help them watch where COVID-19 is spreading and check how well equipped hospitals are to deal with the spike in cases of the fatal virus, according to two sources familiar with the work. According to those sources, it was built by Palantir, a $20 billion-valued big data company whose data harvesting work for the U.S. Immigration and Customs Enforcement agency has provoked criticism from human rights groups.

With the CDC project, it’s avoiding any such controversy, partly because it isn't ingesting personally-identifiable information, said the sources, who spoke on condition of anonymity due to the sensitivities of the government contract. Instead, the sources said the tech, based on its big data gathering and analysis technology called Palantir Foundry, takes in a range of anonymized data from U.S. hospitals and healthcare agencies, including lab test results, emergency department statuses, bed capacity and ventilator supply. Palantir is also developing models for the outbreak of the virus to help CDC predict where resources are required, they added.

“In the U.S. we are continuing to work closely with our partners at HHS, including CDC, and across the government agencies to ensure they have the most comprehensive, accurate and timely view of information as the COVID-19 response effort evolves,” a Palantir spokesperson said.

The CDC hadn’t responded to a request for comment at the time of publication.

Such tech would give the CDC a clear understanding of what's happening in any given U.S. geography, whether at state, county or city level, at a single moment in time. The information would help the CDC decide where to allocate resources, such as masks and ventilators, one source said. That could prove vital given the rush to meet a pervasive and urgent need for ventilators, in particular.
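The allocation problem described here — take in bed capacity and ventilator counts, project demand, send scarce stock where the gap is widest — reduces to ranking regions by unmet need. The real CDC/Foundry feeds and models are not public; the sketch below uses an invented schema and a deliberately crude demand assumption purely to show the shape of the calculation:

```python
# Hypothetical feed records; the actual CDC/Foundry schema is not public.
regions = [
    {"county": "County X", "ventilators": 5,  "projected_patients": 60},
    {"county": "County Y", "ventilators": 50, "projected_patients": 40},
]

def ventilator_shortfall(region):
    # Illustrative assumption only: roughly one in four projected
    # COVID-19 patients will need a ventilator.
    needed = region["projected_patients"] // 4
    return max(0, needed - region["ventilators"])

# Rank regions by unmet need so supplies go where the gap is widest.
priority = sorted(regions, key=ventilator_shortfall, reverse=True)
```

The hard part in practice is not this ranking but the data integration feeding it — normalizing lab results, bed statuses, and supply counts from thousands of hospitals into one consistent table, which is the work the article attributes to Foundry.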

Palantir is one of several tech companies, including Google and Oracle, flexing their prowess in data gathering and analysis in efforts to stem the coronavirus. Some ideas, such as using locations from mobile phones to track movements of people, have prompted concerns that once the crisis ebbs, increased surveillance will be hard to unwind. Palantir’s tool does not use any personally-identifiable data at this point, but could do so in the future, said one of the sources.

Similar to Palantir’s U.K. work

The app, which CDC staff started to use in the last few days, is hosted by Amazon Web Services as part of a partnership for the CDC project, one of the sources said. Palantir has long used the cloud giant for back-end infrastructure.

The U.S. data gathering app looks a lot like a project revealed in the U.K. last week, where reports indicated Palantir was also providing its Foundry platform, alongside Amazon Web Services and Microsoft, to assist the National Health Service (NHS) in the coronavirus crisis.

Palantir’s Foundry will help the NHS determine current occupancy levels at hospitals, down to the number and type of beds, as well as the capacity of accident and emergency departments and waiting times, wrote the U.K. government late last week. The tool is also gathering details of the lengths of stay for coronavirus patients, the U.K. project coordinators said.

“Palantir is a data processor, not a data controller, and cannot pass on or use the data for any wider purpose without the permission of NHS England,” it added.

The response to Palantir’s involvement in the U.K. has been cautious in light of its previous surveillance work, notably its production of tools that helped ICE target undocumented immigrants in America. It has close ties to U.S. intelligence and law enforcement agencies, including the CIA, an investor via the agency’s In-Q-Tel venture fund, and was credited with helping find Osama Bin Laden before his killing. The company was co-founded by Alex Karp, a social theory Ph.D. and long-time associate of Palantir investor Peter Thiel, the billionaire venture capitalist who was also an early backer of Facebook.

It’s unclear just how much Palantir will make from the work. According to public records, the most recent contract signed by Palantir with the CDC was in early February for $675,000 for unspecified hardware and software license renewals. Palantir also signed a contract for just $28,000 with the Food and Drug Administration late last month for use of the Palantir Gotham tool, which is typically used to help government agencies find criminals or criminal groups within masses of data.

The app only launched in the last week, though work on the coronavirus project with CDC started two weeks ago, a source with knowledge of the work said. Palantir is also working with Health and Human Services and other federal government customers, they added.
_____________________________ _____________
The U.S. Coast Guard Just Ordered $8 Million In Palantir Tech For Help With COVID-19 ‘Readiness’

Palantir, the $20 billion data-crunching giant funded by Peter Thiel, is working with the U.S. Coast Guard, which has found itself on the frontlines of the COVID-19 pandemic as ships become floating incubation laboratories for deadly outbreaks of the coronavirus.

According to federal procurement records, Palantir’s tools will help the U.S. Coast Guard’s “Readiness System in response to [the] COVID-19 pandemic.” The sum is small—just $11,250 is assigned to the order—but it’s a “blanket purchase agreement call,” meaning it’s one order that forms part of wider work with the contracting organization. (As Jack Poulson pointed out after publication, the full value of the contract stands at $8.1 million.)

Palantir, based in Palo Alto, California, has been sealing deals across the world to help governments respond to the coronavirus pandemic. Last week, Forbes revealed how the company, once heavily criticized for its work with American agencies tracking down illegal immigrants, had sold its tech to the Centers for Disease Control and Prevention (CDC) as the U.S. agency sought to monitor what masks, ventilators and staff hospitals required to fight the pandemic.

It’s unclear what the U.S. Coast Guard’s “readiness system” is or how Palantir tech is helping. Typically its tools draw in masses of data from various sources - in the case of CDC, for instance, the myriad hospitals and health organizations across America - and make it easy for customers to analyze, organize and find patterns in the information. Palantir declined to comment and the U.S. Coast Guard hadn’t responded to requests for more detail.

Coast Guard rescues Americans on cruises

Part of the Department of Homeland Security, the Coast Guard has been working alongside the CDC to help evacuate cruise ships hit by COVID-19 outbreaks. It’s now facilitated the screening, quarantine and repatriation of 250,000 passengers from more than 120 vessels in the last three weeks. That included the Grand Princess, the cruise ship that was held off the coast of San Francisco in February and was linked to a burst of coronavirus cases in California. The government agency not only provided protective gear to those on the embattled cruise ship, but also evacuated infected passengers. On April 3, the Coast Guard said it helped offload more than 1,200 passengers from cruise ships Zaandam and Rotterdam at Port Everglades, Florida.

Palantir has a long history of working with U.S. intelligence and military, one that’s often caused concern amongst human rights organizations. The CIA was an early investor via the agency’s In-Q-Tel venture fund and the company came into the spotlight after it was credited with helping hunt down Osama bin Laden. According to public contract records, it has hundreds of millions of dollars in contracts with the United States Special Operations Command and the U.S. Army. The latter signed a $30 million contract in December 2019.

Palantir’s healthy side

Palantir’s recent work has included various health-related contracts. In January, Palantir signed a $3.6 million contract with the National Institutes of Health to provide “comprehensive data capabilities” to the President's Emergency Plan For AIDS Relief, “to support administration of HIV-related programs.”

Palantir had previously told Forbes it was working with more than 12 national governments in their respective coronavirus efforts. The U.K.’s NHS had already revealed it was using Palantir tech in much the same way as the CDC. But there was some pushback from privacy groups in the U.K., largely thanks to the Silicon Valley company’s surveillance work and its sales to the Immigration Customs Enforcement department, which used Palantir data harvesting tech during its work chasing down undocumented immigrants.

The company was co-founded by Alex Karp, a social theory Ph.D. and close associate of Palantir investor Thiel, the billionaire venture capitalist and Trump ally. It’s one of several tech giants, including Google and Oracle, that are trying to use their data-crunching capabilities to help fight COVID-19. As Forbes reported last week, Larry Ellison, the billionaire founder of Oracle, has built a database tracking every single COVID-19 treatment and its effectiveness on patients.

Last Edit by Gladstone

