The Use of AI to Close the Equal Pay Gap
The rapid development of AI has the potential to reduce pay inequality, with platforms such as Gapsquare using AI to track and analyse pay data by gender, ethnicity, and other categories. The analysis and insights produced by this platform can help businesses achieve pay equity.
Inherited systems of inequity and wider societal inequalities are currently the biggest barriers to employers closing their pay gaps. The UK’s biggest employers continue to make little progress on pay parity, despite five years of mandatory gender pay gap reporting. Whilst the algorithmic bias of some AI systems has the potential to exacerbate societal inequalities, platforms such as Gapsquare show how AI can increase the transparency of pay gaps and drive equity in business.
Utilising new technology platforms can help businesses achieve pay equity by understanding where disparities may lie. By introducing equity and inclusion key performance indicators based on this understanding and reporting the pay ranges for every job in their operations, businesses can hold themselves accountable for their progress in closing the pay gap.
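To make the idea concrete, the minimal sketch below computes a median hourly pay gap by group from a hypothetical payroll extract. The column names, figures, and pandas-based approach are illustrative assumptions only and do not represent Gapsquare’s actual methodology.

```python
# Illustrative sketch only: compute the median hourly pay gap by group from a
# hypothetical payroll extract. Data and column names are invented for the example.
import pandas as pd

payroll = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5, 6],
    "gender":      ["F", "M", "F", "M", "F", "M"],
    "hourly_pay":  [14.50, 16.00, 21.00, 24.50, 18.25, 19.75],
})

def median_pay_gap(df, group_col, pay_col, reference_group):
    """Return the % by which each group's median pay trails the reference group."""
    medians = df.groupby(group_col)[pay_col].median()
    ref = medians[reference_group]
    return ((ref - medians) / ref * 100).round(1)

print(median_pay_gap(payroll, "gender", "hourly_pay", reference_group="M"))
```

The same calculation can be repeated for ethnicity or other categories, which is the kind of disparity analysis that can feed the equity and inclusion key performance indicators described above.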
What can your organisation do?
- Utilise technology platforms to generate insights on possible pay gaps and create an equal pay strategy that fits your business’s unique position.
- Invest in diversity and inclusion governance to support your business’s commitment to pay equity.
Past Issues
CRYPTOCURRENCY & MODERN SLAVERY RISK
Traffickers are increasingly taking advantage of the pseudo-anonymity of cryptocurrency to move and hide the proceeds of their crimes, yet the link between cryptocurrency and modern slavery remains poorly understood. Financial institutions must use their unique position to understand the modern slavery risks of these new technologies and implement due diligence frameworks to disrupt and prevent the money flows of these crimes.
Human traffickers often exploit traditional banking systems to launder the profits of their crimes through financial institutions. This is becoming increasingly difficult as financial institutions continually improve their systems for identifying and flagging suspicious transactions, preventing traffickers from moving the proceeds through them. To circumvent these controls, traffickers are turning to cryptocurrency to launder their proceeds, drawn by the pseudo-anonymity of cryptocurrency transactions and the lack of KYC controls in many crypto businesses.
Because cryptocurrency transactions carry no personally identifiable information, such as a trafficker’s name or date of birth, criminals can more easily conceal their identity and their illegal transactions. However, in accordance with KYC and AML regulations, crypto exchanges can collect the originating address, destination address, and amount of every transaction, data which can be analysed to find patterns that might identify the user.
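As a simplified illustration of the kind of pattern analysis this data enables, the sketch below flags destination addresses that receive several small transfers from distinct originating addresses within a short window, a basic “fan-in” indicator. The thresholds, field names, and example transactions are assumptions for demonstration, not a production AML rule set.

```python
# Simplified illustration: flag destination addresses receiving many small
# transfers from distinct originating addresses within a short time window.
# Thresholds, field names, and example data are illustrative assumptions only.
from collections import defaultdict
from datetime import datetime, timedelta

transactions = [
    # (timestamp, originating_address, destination_address, amount)
    (datetime(2023, 3, 1, 10, 0), "addr_a", "addr_x", 120.0),
    (datetime(2023, 3, 1, 10, 5), "addr_b", "addr_x", 95.0),
    (datetime(2023, 3, 1, 10, 9), "addr_c", "addr_x", 110.0),
    (datetime(2023, 3, 1, 11, 0), "addr_d", "addr_y", 5000.0),
]

def flag_fan_in(txs, window=timedelta(minutes=15), min_sources=3, max_amount=200.0):
    """Return destinations with >= min_sources small inbound transfers
    from distinct origins inside a single time window."""
    flagged = set()
    by_dest = defaultdict(list)
    for ts, src, dest, amount in txs:
        if amount <= max_amount:
            by_dest[dest].append((ts, src))
    for dest, events in by_dest.items():
        events.sort()
        for i, (start, _) in enumerate(events):
            sources = {src for ts, src in events[i:] if ts - start <= window}
            if len(sources) >= min_sources:
                flagged.add(dest)
                break
    return flagged

print(flag_fan_in(transactions))  # {'addr_x'}
```

In practice, flags like this would feed into existing transaction-monitoring and escalation processes rather than acting as proof of trafficking on their own.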
What can your organisation do?
- Leverage your data and technology through enhanced data analysis to identify patterns, such as the flow of funds, that might help identify the user and therefore the trafficker.
- Strengthen your anti-human trafficking policies so they work with your AML procedures, such as transaction monitoring, KYC and due diligence, to detect traffickers within your systems.
Whilst the anonymity of cryptocurrency is what draws many people to it, it also draws in traffickers, who can use it to evade routine transaction monitoring.
- Challenger banks and large investment banks that now offer cryptocurrency need to collaborate and exchange information, including the trends you are seeing and the best practices you have found for detecting human trafficking-related transactions.
- Work with law enforcement to provide transactional data and help authorities develop effective industry controls.
ONLINE SAFETY BILL
The UK Online Safety Bill is moving through Parliament and, if adopted, will put in place new online safety laws requiring tech companies to protect online users. The Bill seeks to protect children from harmful content online, which will be particularly important in keeping them safe from exploitation and traffickers.
The promotion of safe online usage is vital, as internet platforms, particularly social media, are used globally by traffickers to recruit and exploit vulnerable people. This year’s Safer Internet Day (February 7) is particularly relevant, as the UK government’s Online Safety Bill had its second reading in the House of Lords on February 1.
All companies whose services host user-generated content, such as images and videos, or provide UK users with a platform to talk to other people online, will fall within the scope of the Bill. These firms will need to protect children, tackle illegal activity, and uphold their terms and conditions.
This Bill has come under great scrutiny since it was proposed in 2021 and has been through numerous revisions. Clause 52(4)(d) brings the Modern Slavery Act into the scope of the Bill and uses ‘priority offences’ to name some of the identifiable symptoms of modern slavery, such as money laundering and sexual exploitation. These ‘priority offences’ give the Office of Communications (Ofcom), the UK communications regulator, specific indicators with which to exercise faster enforcement powers against tech firms that fail to remove illegal content. However, there have been calls to include human trafficking itself as a ‘priority offence’, rather than treating it only as the source of these ‘symptoms’.
What can your organisation do?
- Ensure senior managers are aware of the criminal liabilities they face for breaching this Bill and begin implementing effective and robust systems to tackle illegal content on your sites.
- Assess how likely children are to access your online sites and take robust action to protect children from harmful and illegal content, such as pornography.
- Amend your company’s terms and conditions to explicitly set out what types of content adults are allowed to post on your sites, and enforce those terms and conditions transparently.
WATER SCARCITY LINKED TO DATA CENTRES’ WATER-COOLING SYSTEMS
Tech companies are under scrutiny for their use of dwindling water resources to cool the data centres powering the cloud, decreasing the freshwater sources available for local communities. This drives the mass displacement of these communities, increasing their vulnerability to exploitation.
The vast data centres which power the cloud and other remote software services require millions of litres of water per day to keep the computers cool and functional. This water is usually sourced from already limited potable water sources, with many plants located in areas where increasing global temperatures are already contributing to general water scarcity.
Water scarcity is one of the key drivers of climate-related migration, causing mass displacement of communities that are then put at risk of exploitation. This heightened risk results from the absence of systems to support safe passage or resettlement in new communities. As individuals seek safe passage or new forms of work, they become more susceptible to exploitative labour practices, including debt bondage.
Data centres are one of many factors contributing to water scarcity. However, despite these challenges, technology remains at the forefront of solving these problems, for example through monitoring water use via the Internet of Things. Alternative cooling technologies, such as outside-air cooling and recycled (non-potable) water circulation, can also minimise data centres’ use of fresh water.
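As a rough sketch of what Internet of Things-based monitoring might look like in practice, the snippet below compares simulated flow-meter readings against a daily per-site water budget and reports sites that exceed it. The site names, readings, and budget figure are invented for illustration.

```python
# Rough sketch: compare simulated flow-meter readings against a daily water
# budget per site and report sites that exceed it. All values are invented.
DAILY_BUDGET_LITRES = 1_500_000  # illustrative per-site budget

readings = {
    # site -> hourly flow-meter readings in litres
    "site_north": [70_000] * 24,   # 1,680,000 L total
    "site_south": [55_000] * 24,   # 1,320,000 L total
}

for site, hourly in readings.items():
    total = sum(hourly)
    if total > DAILY_BUDGET_LITRES:
        print(f"ALERT: {site} used {total:,} L, exceeding the {DAILY_BUDGET_LITRES:,} L budget")
    else:
        print(f"OK: {site} used {total:,} L")
```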
What can your organisation do?
- Monitor your water use closely, particularly in areas more at risk of drought or existing water scarcity-related concerns. Include this as a factor when choosing sites for data centre development.
- Invest in water-positivity initiatives which replenish sources used and support communities systemically at risk of exploitation by providing free access to safe water.
- Encourage senior leadership to join the CEO Water Mandate for cross-sectoral learning and accountability beyond individual-level disclosures or strategies.
-
CORPORATE USE OF SURVEILLANCE TECHNOLOGY DOING MORE HARM THAN GOOD TO WORKERS’ SAFETY
As technology increasingly optimises performance management and remote working, the adverse effects on worker wellbeing are beginning to show. To protect workers, companies must ensure employee surveillance tools do not become a high-tech means of exploitation.
The Covid-19 pandemic saw a rise in the use of employee surveillance tools as working remotely became the norm. These tools can be positive, such as ‘wearables’ monitoring the fatigue of workers operating manufacturing machinery and step-counters encouraging staff to do exercise.
Despite this positive side of optimised performance management, multiple case studies have highlighted the risks to health and safety inherent to the use of surveillance tools. This includes using surveillance technology to ensure workers hit demanding productivity targets, which can result in severe mental strain, overwork, and occupational injuries.
Left unmitigated, use of surveillance technology may harm worker rights and wellbeing. In one case, a tech giant allegedly used monitoring software to automatically fire underperforming workers and a heat map tool to assess where workers might be likely to unionise.
While “right to disconnect” legislation protecting workers from surveillance abuse exists within Europe, it has yet to be implemented in the UK. It is therefore important that UK-based companies considering surveillance tools weigh the full impact they may have on employee wellbeing.
What can your organisation do?
- Come to agreements with staff on whether surveillance technology should be used, and if so, how, to ensure protection of the right to privacy.
- Be aware of and prepare for potential legislative changes pertaining to flexible working that may increase the prevalence of corporate surveillance of staff.
- Establish clear reporting and grievance pathways, ensuring workers can provide feedback about surveillance systems if they are being used by the company.
HARNESSING THE POWER OF AI AND DATA TO DISRUPT TRAFFICKING OPERATIONS
The advent of AI tools, such as ChatGPT, is forcing society to reflect upon technology’s power to change how we live. Whilst questions are focused on how this growing phenomenon might alter how we work, there are emerging discussions about how to harness the transformative power of AI and data to tackle entrenched global issues such as modern slavery and make it part of the fight for social justice.
The recent rapid developments in AI that led to the creation of ChatGPT have prompted some, including its creator, to call for regulation of its use, amid fears that it may be put to nefarious purposes. Workers in the Global South responsible for much of its development have also shared their experiences of poor working conditions and low wages, raising concerns about the human cost of this technology.
However, the advent of such powerful technology also brings huge benefits, including its potential for tackling modern slavery, as identified by the Alan Turing Institute. Various projects utilising AI and data that aim to prevent and tackle modern slavery are currently in development. For example, natural language processing is being used alongside chain event graph technologies to better understand survivor narratives. In a sector-specific example, AI technology and statistical models are being used to tackle forced labour at sea in capture fishing.
One of the most important aspects of these projects is their ability to expose the hidden nature of modern slavery in areas where it has previously not been possible to extract significant amounts of data, including domestic servitude, online sexual exploitation and marine labour exploitation. Collecting data on those in vulnerable and isolated circumstances will help identify at-risk communities, aiding protection and the prevention of exploitation.
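The projects above depend on specialised research methods, but a heavily reduced sketch of one underlying idea, surfacing exploitation indicators in free-text narratives, might look like the following. The indicator list and example narratives are invented, and real systems rely on trained NLP models and chain event graphs rather than simple keyword matching.

```python
# Heavily reduced sketch: count occurrences of illustrative exploitation
# indicators in free-text narratives. Real systems use trained NLP models,
# not keyword lists; all data here is invented.
import re
from collections import Counter

INDICATORS = {"passport", "debt", "wages", "threatened", "locked", "recruiter"}

narratives = [
    "The recruiter kept my passport and said I owed a debt for travel.",
    "We were locked in at night and threatened when we asked for wages.",
]

def indicator_counts(texts):
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())
        counts.update(t for t in tokens if t in INDICATORS)
    return counts

print(indicator_counts(narratives))
```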
What can your organisation do?
- Research developments in AI and discuss how you can contribute, based on your capacities, to its use to combat the crime of modern slavery and human trafficking.
- Connect with other organisations pioneering the way in this field to assess whether partnerships may be viable. The Alan Turing Institute is one leading example of an organisation thinking actively about how to use AI to tackle modern slavery and other human rights issues.
HARNESSING THE POWER OF SOCIAL MEDIA TO DISRUPT DANGEROUS BEHAVIOUR
With the growth of social media, traffickers have adapted, and social media platforms are increasingly being used to approach and recruit children into exploitation. Whilst social media has the potential to enable trafficking, these platforms are also being used by STOP THE TRAFFIK to reach at-risk communities and disrupt dangerous behaviours online.
It is estimated that in 2023 more than half of the world’s population has access to and uses social media. Traffickers are now targeting young people, aged 11-24, on specific platforms through adverts. County lines, a common form of criminal exploitation, now relies on social media adverts resembling professional job recruitment to draw young people into criminal activity.
However, the development of social media has also created the potential for these platforms to reach communities at risk of exploitation, specifically young people, and prevent their recruitment. STOP THE TRAFFIK utilises the power of social media platforms to run our geo-targeted prevention campaigns, which are designed to reach vulnerable communities. Our current campaign targets young people in London, who are among the most vulnerable to involvement with county lines gangs, and demonstrates the power of social media as a tool for disruption. As of June 2023, over 22,000 young people have clicked the campaign’s ‘learn more’ link, increasing their knowledge of how to keep themselves and others safe from exploitation. By using the power of social media, we can reach more young people who could be targeted by exploiters or traffickers online and equip them with the knowledge needed to avoid potentially exploitative situations.
Whilst social media can facilitate dangerous behaviour of exploitation, the success of STOP THE TRAFFIK’s geo-targeted campaign highlights how these platforms can transform the way vulnerable communities can be reached and empowered against exploitation.
What can your organisation do?
- Dedicate time and resources to train staff on monitoring dangerous content and accounts that appear to be recruiting children or young people into forms of criminal exploitation, such as county lines.
- Monitor and enforce age restrictions for all users on social media platforms, as there are reports of children as young as 7 years old being recruited into county lines through social media.
- Partner with law enforcement and survivor protection organisations to share suspicions of exploitative accounts for further investigation and remediation.
FALSE JOB ADVERTS AND THE RISE OF ONLINE SCAM SYNDICATES IN ASIA
Online scam syndicates are luring thousands of migrant workers into exploitative schemes across Asia through false job adverts posted on social media and online job recruitment sites. Businesses need to be proactive in using intelligence to tackle these false adverts, as traffickers are exploiting recruitment websites to lure victims into large human trafficking networks.
Across Southeast Asia, in countries such as Cambodia, Myanmar and Thailand, online scam syndicates are luring migrant workers into exploitation through false job adverts offering lucrative incomes, inclusive of accommodation and travel. Some argue that this recent growth reflects traffickers taking advantage of the joblessness of white-collar workers across Asia following the pandemic, and the intended targets of these scam operations are often college graduates. A Hong Kong NGO discovered an individual who flew to Thailand for a job advertised on Facebook at $6,400 a month, only to be smuggled to Myanmar and forced to work 12 hours a day running phone scams. Many victims of online scam syndicates are forced into criminal activities such as investment fraud, making them less likely to report their situation to the local authorities.
Traffickers are using social media platforms and legitimate recruitment websites to target those searching for employment. On social media, they infiltrate community groups set up to help people find work and present false, lucrative employment opportunities; recruitment websites also host false job adverts, often posted under a legitimate company’s trading name. Interpol has discovered evidence of similar models being replicated in West Africa. However, by harnessing the power of technology and intelligence, we are confident businesses can successfully spot and remove these false job adverts.
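As a simple illustration of how a platform or recruiter might triage adverts for manual review, the sketch below applies a few heuristic checks, such as unrealistically high pay and all-expenses-paid offers, to example listings. The rules, thresholds, and adverts are assumptions for demonstration, not a validated detection model.

```python
# Simple illustration: heuristic triage of job adverts for manual review.
# Rules, thresholds, and example adverts are assumptions for demonstration only.
SUSPICIOUS_PHRASES = ("no experience needed", "free flights", "accommodation included",
                      "contact on telegram", "contact on whatsapp")
MONTHLY_SALARY_CEILING_USD = 5000  # illustrative threshold for entry-level roles

adverts = [
    {"title": "Customer service agent", "salary_usd": 6400,
     "text": "No experience needed, free flights and accommodation included."},
    {"title": "Junior accountant", "salary_usd": 2200,
     "text": "Apply via our careers page with a CV and references."},
]

def review_flags(ad):
    flags = []
    if ad["salary_usd"] > MONTHLY_SALARY_CEILING_USD:
        flags.append("salary far above market rate")
    text = ad["text"].lower()
    flags.extend(p for p in SUSPICIOUS_PHRASES if p in text)
    return flags

for ad in adverts:
    print(ad["title"], "->", review_flags(ad) or "no flags")
```

Flags like these only prioritise adverts for human review; takedown decisions and any referrals to law enforcement still require investigation.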
What can your organisation do?
- Invest in intelligence tasking and ensure online trails and information regarding false job adverts are not lost once content is removed; this can provide crucial evidence to law enforcement authorities.
- Partner with law enforcement and other online platforms to share suspicions of exploitative accounts, as often traffickers are using several aliases to post adverts across different platforms.
TECHNOLOGY PROVIDING A VOICE TO INDIGENOUS PEOPLES: THE INDIGENOUS NAVIGATOR
Technology plays an important role in monitoring human rights and ensuring that the voices of those most often excluded from mainstream conversation are included. One innovative project, the Indigenous Navigator, is bringing this to life for communities around the world.
Historically, indigenous communities have not been consulted when businesses conduct human rights impact assessments, evaluating the potential impacts of their operations and/or supply chain on the rights of individuals. As a solution to this, the Indigenous Navigator was developed by and for indigenous peoples with the aim of allowing indigenous groups to generate data and use this to communicate their needs. They might, for example, gather data on how an area of land is used to show how a new project on this land would impact their way of life.
An example of this is work with the Sámi people to fill vast knowledge gaps about their rights through detailed surveys. The Navigator was used to support the community in gathering data about their activities, rights and history in the Sápmi region, information not currently collected by the government. This data can then be used to defend their rights should they be threatened: if a business wanted to open a mine in Sápmi, the community could show, using data, the extent to which they would be affected and help lessen those impacts.
The project shows the potential for technology to enable community-led data projects that government initiatives have been unable to fulfil on a large scale. Information gathering aims to empower communities to support themselves by presenting an accurate picture of their lives and livelihoods in a way that makes sense to international bodies or businesses.
What can your organisation do?
- Consult the Indigenous Navigator tools as initial desk research when considering moving operations to new locations or conducting the initial phases of a human rights impact assessment.
- Evaluate the ways in which your own technology can be used to help elevate the voices of indigenous peoples, such as by providing access to secure data storage or data-gathering tools.