Source: go.theregister.com – Author: Thomas Claburn
The US Consumer Financial Protection Bureau on Thursday published guidance advising businesses that third-party reports about workers must comply with the consent and transparency requirements set forth in the Fair Credit Reporting Act.
The Fair Credit Reporting Act (FCRA) was enacted in 1970 to ensure the accuracy, fairness, and privacy of information in profiles maintained by credit reporting agencies. But it also includes provisions that apply when a consumer report is used to make employment decisions.
The Bureau (CFPB) is concerned that companies may be using third-party reports about worker activity or behavior to inform adverse employment decisions (such as firing workers) based on undisclosed surveillance or opaque algorithmic scores.
“Workers shouldn’t be subject to unchecked surveillance or have their careers determined by opaque third-party reports without basic protections,” declared CFPB director Rohit Chopra in a statement. “The kind of scoring and profiling we’ve long seen in credit markets is now creeping into employment and other aspects of our lives. Our action today makes clear that longstanding consumer protections apply to these new domains just as they do to traditional credit reports.”
Worries about workplace surveillance and unaccountable algorithmic decision-making have proliferated with the adoption of machine learning models, the growing sophistication of online analytics, the uptake of sensor-laden mobile phones, and the need to manage remote workers. Cracked Labs, an Austrian nonprofit research group, has been exploring the topic in a recent series of reports.
While algorithmic wage discrimination remains a significant concern for employees, the CFPB is focused specifically on consumer reports used to predict worker behavior (such as whether a worker will join a union), to drive automatic job assignment systems that rely on worker performance data, to issue warnings or disciplinary actions without human oversight, and to assess social media usage as part of employee evaluation.
Chopra elaborated on these concerns in public remarks on Thursday at the Michigan Nurses Association, noting that he has received questions from workers about being required to carry a device or install an app that surveils them.
“I have serious concerns about how background dossiers and reputation scores can be used in hiring, promotion, and reassignment,” he noted. “If an employer purchases a report that details whether a worker was a steward in a union, utilized family leave, enrolled their spouse and children in benefits programs, was cited for poor performance, or was deemed to be productive, this can raise serious issues about privacy and fairness. And if this information is converted into some sort of score using an opaque algorithm, that makes it even more suspicious.”
Consumer reporting agencies and background screening companies, according to the CFPB, now offer employers data about workers’ activities and personal lives.
“For example, some employers now use third parties to monitor workers’ sales interactions, to track workers’ driving habits, to measure the time that workers take to complete tasks, to record the number of messages workers send and the quantity and duration of meetings they attend, and to calculate workers’ time spent off-task through documenting their web browsing, taking screenshots of computers, and measuring keystroke frequency,” the agency reported. “In some circumstances, this information might be sold by ‘consumer reporting agencies’ to prospective or current employers.”
The CFPB circular explains the agency’s legal basis for applying the FCRA to data collected about workers and sets out its requirements for businesses that rely on such data. Companies that wish to use such data must obtain employee consent before doing so. They must provide detailed information about the data used to make adverse employment decisions. When workers dispute that data, companies must correct inaccuracies. And any such data can only be used for purposes allowed under the law – companies can’t, for example, sell the information or use it for marketing financial products to workers. ®
Original Post URL: https://go.theregister.com/feed/www.theregister.com/2024/10/26/worker_surveillance_credit_reporting_privacy_requirement/