Can big data really help screen immigrants?
Can an algorithm tell whether someone is a terrorist? Can it predict whether a person will become a productive member of society?
American immigration officials are trying to answer these questions. They want to build an automated computer system to help decide who gets to visit or immigrate to the United States.
U.S. Immigration and Customs Enforcement (ICE) wants to use the tools of the big-data world to screen applicants. The project would comb through all publicly available data, including social media.
But some critics of the idea, including many technologists, are worried.
President Trump wants an immigration system that admits people who will contribute to the "national interest," as he puts it, and screens out anyone likely to commit crimes or acts of terrorism. So ICE has been asking software companies whether they can build a computer system to make these judgments automatically.
"Any self-respecting data scientist should be running away screaming," says Cathy O'Neil. She is the author of Weapons of Math Destruction, a book about how algorithms designed by governments and the private sector reinforce bias. She is one of more than 50 top computer-science experts in the United States who signed a letter urging the Trump administration to abandon the plan.
Any self-respecting data scientist should be running away screaming.
Cathy O'Neil
O'Neil says algorithms need past data to make accurate predictions about the future; the more data, the better. But terrorist attacks are extremely rare compared with, say, consumer transactions.
So O'Neil worries that the algorithm will make mistakes, with no accountability. This, she says, is "a pseudoscientific excuse to prevent many very good people from entering our country as immigrants." And because it has the trappings of a scientific method, "since you're not an expert in science and math, you can't ask any questions; you just have to believe it."
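O'Neil's concern is an instance of the base-rate problem: when the event being predicted is extremely rare, even a very accurate classifier flags mostly innocent people. A minimal sketch, using purely hypothetical numbers not drawn from the article:

```python
# Hypothetical illustration of the base-rate problem O'Neil describes.
# All numbers are assumptions for the sake of the arithmetic.

applicants = 10_000_000      # assumed pool of visa applicants
true_threats = 100           # assumed number of actual threats in the pool
sensitivity = 0.99           # classifier catches 99% of true threats
false_positive_rate = 0.01   # classifier flags 1% of harmless applicants

flagged_threats = true_threats * sensitivity
flagged_innocent = (applicants - true_threats) * false_positive_rate

# Of everyone flagged, what fraction is actually a threat?
precision = flagged_threats / (flagged_threats + flagged_innocent)
print(f"innocent people flagged: {flagged_innocent:,.0f}")
print(f"chance a flagged person is a real threat: {precision:.2%}")
```

Under these assumptions, roughly 100,000 harmless applicants are flagged for every 99 real threats caught, so fewer than one in a thousand flagged people is an actual threat. This is why a rare target event makes prediction so error-prone.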
Critics also say an algorithm cannot measure how a person will contribute to society. They argue it will inevitably fall back on things that are easier to measure, such as income, which does not tell an applicant's full story.
Immigration officials say these fears are exaggerated. They insist that an algorithm like this is years away, if it ever arrives, and they have backed away from the original idea of having a computer make the final decision on who gets a visa.
"That's not the case," says Clark Settles, the ICE assistant director for national security investigations who oversees the project. "There will be no artificial intelligence, no machine, determining whether people can come to this country or whether they can stay here."
There will be no artificial intelligence, no machine, determining whether people can come to this country or whether they can stay here.
The government already collects a great deal of information about people applying for visas, he says. What the agency needs is an automated tool that helps human analysts sort through all that data so nothing gets missed, such as information about a planned terrorist attack.
"If there were information about an attack sitting on a public message board, I would be very sad if we didn't do anything about it within the law and within the rules," he says.
Companies are interested in building the tool. Several software vendors attended a conference ICE hosted over the summer. One of them was Giant Oak, a data-analytics company in Virginia that has done data analysis for the agency.
"Technology is changing," says Gary Shiffman, chief executive of Giant Oak. "We should use this technology to help people make better decisions."
Last month, ICE officials held another meeting with tech companies. But amid the growing controversy, they have changed the project's name: from the "Extreme Vetting Initiative" to the "Visa Lifecycle Review."