AI To Determine if You Work or Not

Most disturbing: in 10 years or less, AI may determine your fitness for a job, any job, as stated in this article:


Companies will decide to make you a job offer (or let you keep your job) based upon the sum total of the traces you’ve left on the Internet: where you live, what you bought, where you’ve been, when you work out, what you read, the porn you watch, and every comment you’ve ever entered on any website, even those you thought were anonymous.

This is already happening, according to Steve Goodman, founder and CEO of Restless Bandit, a recruiting platform used by companies like Four Seasons, Applebee’s and Gannett. Here’s how he describes the future:

“Smartphones and the internet have made data so ubiquitous and far more valuable and accessible to companies like Restless Bandit. Whether you are going for a run, watching TV or even just sitting in traffic, virtually every activity creates a digital trace–more raw material.

As devices from watches to cars connect to the internet, the volume is increasing. Meanwhile, using AI techniques such as machine learning can extract more value from data. These algorithms can predict whether you’re happy in your job, whether your patterns have changed, or if you’re open to new opportunities.

And so much of this data is now sold all over the place. How do you think traffic shows up on Google Maps? It’s coming from your iPhone, sold by Apple to a consolidator, then sold to Google – all in real time.

It’s more than just your resume. It’s using all of these data points to triangulate your interest in new opportunities and your fitness for a different position.”

While Goodman clearly believes this is all a good thing, the potential for abuse is enormous.

To cite a parallel case, algorithms are already being used in the legal system to determine 1) whether an arrested individual should be offered bail, 2) what evidence is relevant and what conclusions can be drawn from it, and 3) the appropriate sentence length for convicted offenders. Chillingly, these algorithms (which are developed by private companies) are considered trade secrets, so individuals who are denied bail, convicted and sentenced have no legal way to find out why they were singled out.

Similarly, when employment algorithms become widely applied, you might get fired and find it impossible to get another job, but you’ll have no way of knowing why.

For example, suppose you’re an unemployed engineer who ought to be getting plenty of job offers. However, you’ve purchased a handgun in the past six months and posted “anonymous” complaints about a former employer. The algorithm, seeing this, concludes that, if hired, there’s a risk you’ll “go postal.” Hence no job offers.
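To make the scenario concrete, here is a purely hypothetical sketch of how such an opaque scoring system might work. The feature names, weights, and rejection threshold below are all invented for illustration; no real hiring algorithm is being described.

```python
# Toy illustration (entirely hypothetical): an opaque "hiring risk" score
# that combines unrelated behavioral signals into a single verdict.
# All feature names, weights, and the cutoff are invented for this sketch.

def risk_score(signals, weights):
    """Weighted sum of behavioral signals; higher means 'riskier' to the model."""
    return sum(weights[k] * v for k, v in signals.items())

weights = {
    "recent_firearm_purchase": 0.6,   # invented weight
    "negative_employer_posts": 0.3,   # invented weight
    "years_since_last_job": 0.1,      # invented weight
}

candidate = {
    "recent_firearm_purchase": 1,     # bought a handgun in the past six months
    "negative_employer_posts": 1,     # "anonymous" complaints traced back
    "years_since_last_job": 0.5,
}

score = risk_score(candidate, weights)
REJECT_THRESHOLD = 0.5                # arbitrary cutoff, never disclosed
verdict = "reject" if score > REJECT_THRESHOLD else "consider"
print(score, verdict)                 # the candidate never sees either number
```

The point of the sketch is not the arithmetic, which is trivial, but the asymmetry: the inputs, weights, and threshold are all invisible to the person being scored, so a rejection is unchallengeable.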

Or maybe it wasn’t the handgun and the impolitic comments. Was it something else? The porn you watch? The speeding ticket you got in 2005? Your political affiliation? The junk food you buy? Your lack of a gym membership? All you’ll know is that something is seriously wrong, but you’ll be utterly powerless to do anything about it.

The same thing could happen if you’re already employed. For example, suppose your company runs an algorithm that concludes you’re the sort of person who steals. Even though you’re a model employee and honest as the day is long, suddenly you’re not getting raises or promotions, and when a layoff comes, you’re the first to go. Now you’re unemployed, and (because prospective employers use the same algorithm) you won’t be offered another job.

And you’ll never know why.

The scenarios described above are already happening because employers have long established the legal right to monitor everything you do, both at work and in your private life. For example, according to Facebook, while employers can’t legally demand your Facebook password, they can legally demand that you scroll through your “Friends Only” posts while they look over your shoulder. As the American Bar Association put it:

“The battle for workplace privacy is over; privacy lost.”

The situation became even more dire when Trump signed a bill making it legal for Internet providers to sell your browsing history. Now employers can see everywhere you’ve been online, in addition to the information they can buy from the providers of your apps, your watch, your car, online stores, and the Internet of Things.

In short, employers are now like Santa Claus: they “know if you’ve been bad or good… so be good for goodness sake.” Of course, if you’ve already been “bad”–as defined, of course, by a secret algorithm applying secret rules programmed by somebody you’ll never meet–and that “bad” behavior has left traces on the Web, well, say “Hello!” to being permanently unemployed.

And you’ll never know why.

Oh, and don’t think you can escape this by starting your own company or joining the “gig economy.” Do you think for a moment that the companies that might hire you as a freelancer won’t run the algorithm before they offer you a temp job? You won’t get the gigs because the algorithm says you’re a risk.

And you’ll never know why.

h/t Maiya
