https://news.stanford.edu/2017/11/28/ne ... -leanings/

Stanford News wrote:
An artificial intelligence algorithm developed by Stanford researchers can determine a neighborhood’s political leanings by its cars
Stanford researchers are using computer algorithms that can see and learn to analyze millions of publicly available images on Google Street View to determine the political leanings of a given neighborhood just by looking at the cars on the streets.
https://www.nytimes.com/2017/10/09/scie ... study.html

The NYT wrote:
Why Stanford Researchers Tried to Create a ‘Gaydar’ Machine
Michal Kosinski felt he had good reason to teach a machine to detect sexual orientation.
An Israeli start-up had started hawking a service that predicted terrorist proclivities based on facial analysis. Chinese companies were developing facial recognition software not only to catch known criminals — but also to help the government predict who might break the law next.
And all around Silicon Valley, where Dr. Kosinski works as a professor at Stanford Graduate School of Business, entrepreneurs were talking about faces as if they were gold waiting to be mined.
Few seemed concerned. So to call attention to the privacy risks, he decided to show that it was possible to use facial recognition analysis to detect something intimate, something “people should have full rights to keep private.”
https://www.wired.com/story/ai-research ... -watchdog/

Wired wrote:
AI Research Is in Desperate Need of an Ethical Watchdog
So researchers have to take ethics into their own hands. Take a recent example: Last month, researchers affiliated with Stony Brook University and several major internet companies released a free app, a machine learning algorithm that guesses ethnicity and nationality from a name to about 80 percent accuracy. They trained the algorithm using millions of names from Twitter and from e-mail contact lists provided by an undisclosed company—and they didn't have to go through a university review board to make the app.
The app, called NamePrism, allows you to analyze millions of names at a time to look for society-level trends. Stony Brook computer scientist Steven Skiena, who used to work for the undisclosed company, says you could use it to track the hiring tendencies in swaths of industry. “The purpose of this tool is to identify and prevent discrimination,” says Skiena.
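The articles don't describe NamePrism's actual model or API, but the general technique behind this kind of tool — classifying a name by its character patterns — is easy to sketch. Below is a toy Naive Bayes classifier over character bigrams; the training names, labels, and class name are all made up for illustration and have nothing to do with NamePrism's real data:

```python
# Toy sketch of name-based nationality guessing via character bigrams.
# NOT NamePrism's actual model: the real service was trained on millions
# of names; this tiny training set is invented purely for illustration.
from collections import Counter, defaultdict
import math

def char_ngrams(name, n=2):
    """Character n-grams, with ^/$ marking the name's boundaries."""
    s = "^" + name.lower() + "$"
    return [s[i:i + n] for i in range(len(s) - n + 1)]

class NameClassifier:
    def __init__(self):
        self.ngrams = defaultdict(Counter)  # label -> bigram counts
        self.labels = Counter()             # label -> number of training names
        self.vocab = set()                  # all bigrams seen in training

    def fit(self, names, labels):
        for name, label in zip(names, labels):
            self.labels[label] += 1
            for g in char_ngrams(name):
                self.ngrams[label][g] += 1
                self.vocab.add(g)

    def predict(self, name):
        total = sum(self.labels.values())
        best, best_lp = None, float("-inf")
        for label in self.labels:
            lp = math.log(self.labels[label] / total)  # class prior
            denom = sum(self.ngrams[label].values()) + len(self.vocab)
            for g in char_ngrams(name):
                # add-one (Laplace) smoothing handles unseen bigrams
                lp += math.log((self.ngrams[label][g] + 1) / denom)
            if lp > best_lp:
                best, best_lp = label, lp
        return best

clf = NameClassifier()
clf.fit(["Giovanni Rossi", "Luca Bianchi", "Hiroshi Tanaka", "Yuki Sato"],
        ["Italian", "Italian", "Japanese", "Japanese"])
print(clf.predict("Hiroshi Tanaka"))  # -> Japanese (a training example)
```

Trained on millions of real names instead of four invented ones, this kind of model is plausibly how a service reaches the ~80 percent accuracy Wired reports — and it also shows why the predictions are purely statistical guesses about spelling, not facts about a person.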
"Prevent"? Well, it can be used differently…
Of course I tried it out (http://www.name-prism.com/):
Needs some fine-tuning (and what the heck are these "nationalities"?).