
Technology and Decent Work Charter

Transport is a technology-intensive sector of the economy.

Today, digital technology is part of how work is done, how it is organised and how it is controlled, and it is vital to the way information flows around the goods and people that transport workers move. As such, technology is a direct or indirect part of many issues that workers face.

New digital technology has contradictory impacts on workers. It has the capacity to improve working conditions, make work safer and lead to greater job satisfaction, but workers’ experience is that it often creates significant negative impacts.

These impacts can include job destruction or displacement; loss of safety, privacy, labour rights and union organising; reduced pay and punishing work rates; and discriminatory impacts on women and other groups of workers. Overall, digital technology has serious implications for how work is organised and who is responsible for working conditions. It can also transform the knowledge needed to do a job, with subsequent impacts on skills, training and competency requirements.

Workers therefore have a clear interest in ensuring technology is deployed in ways that minimise negative impacts and maximise positive ones.

Digital technologies are often sold as a silver bullet that optimises workforces and energy use. But in reality the outcome depends on how the technology is introduced.

Employers too can be negatively affected by technology. They can waste money on systems that do not work properly and reduce productivity; they can damage good labour and community relations and lose highly experienced or qualified workers; and, importantly, they can unintentionally lose effective control over important aspects of their business.

There are many transport companies around the world that have suffered these consequences. We believe this means there is an important commonality of interest in ensuring technology protects decent work.

Seven-point charter

This charter informs unions and employers of the rights workers should enjoy in order to ensure that technology is introduced in a way that guarantees success for the employer and proper protections for workers.

The right to negotiation and worker-centric co-design

The negotiation of technology is the keystone to successful and safe technology introduction. The preconditions for meaningful negotiation include:

  • Early information, ongoing consultation and final negotiation of the technology
  • For digital technologies, negotiation should cover:
    ◦ The criteria used by algorithms in software, AI and decision-making systems
    ◦ The datasets used by automated decision-making systems
    ◦ Standards and maintenance procedures for sensors and software

Technology designers and employers have an ethical duty to avoid negative impacts on workers. Technologies can often create new problems for workers, forcing them to develop ‘work-arounds’ that add stress or reduce productivity. To avoid this, designers need to employ a co-design process that puts the worker experience at the centre.

  • Tech developers should be informed of relevant labour regulations
  • Workers should be included in the design process by:
    ◦ Being informed of the problem the technology is meant to fix
    ◦ Being asked to identify potential issues with the technology
    ◦ Being asked to identify any training needs that might arise
    ◦ Ensuring that worker feedback includes specific input from women workers
    ◦ Committing to incorporate feedback into the technology design

The right to risk and labour impact assessments

Many of the potential issues technology brings can be predicted if they are examined beforehand. Employers should commit to carrying out a risk assessment before a technology is deployed, and impact assessments after deployment, as part of the negotiation process around technology.

Risks and impacts should be assessed on the following criteria:

  • Potential impacts on the environment (energy use, water use, toxicity)
  • Potential impacts on workers’ labour rights

In particular, digital technologies and their sub-components (sensors, software, datasets, algorithms, etc.) should be tested for their individual and collective impacts, such as:

  • Gender discrimination impacts on women and non-binary workers.
  • Racial discrimination impacts on ethnic minority workers.
  • Differential impacts on different age groups (particularly young workers, and those over 50).
  • OSH impacts, including physical and psychosocial impacts and sanitation, particularly where related to productivity criteria.
  • Privacy impacts.
  • Safety issues raised by cybersecurity testing.

In addition, as part of risk assessments, employers and technology developers need to ensure that clear liability paths exist for system component malfunctions.

The right to shared control of workers’ data and algorithmic transparency

Digital technologies are built on a foundation of data. This data is either from pre-existing datasets, or is gathered in real-time in the workplace. Often data is gathered from multiple sources and combined. Data sources in the workplace can include:

  • Video-audio recordings
  • Geolocation of vehicles and equipment
  • Data from scanning guns or RFID tags
  • Data from ID cards and electronic key fobs
  • Keystroke monitoring and productivity software
  • Operation/process management software
  • Data from sensors embedded in machinery or vehicles
  • Mobile phone pings, messaging data (emails or SMS)
  • Security gates or doors – fingerprint scanners, number plate scanners
  • HR data, including contact information and performance or training data

Much of this data is generated by workers while they do their jobs. It describes a work process and the workers as part of that process. It feeds the algorithms and software that apply criteria to this data to make automated decisions about what is desirable and what is not.

Data that describes workers as individuals or as a collective, and which would not exist without workers’ interaction with a person, a machine or a vehicle, should be considered workers’ data and subject to shared control through negotiation.

Once gathered, data can be reproduced freely and easily transferred to third parties or used for purposes beyond those for which it was initially gathered.

Most transport workplaces make some use of data, but as our workplaces become more digitalised, data collection and use will inevitably grow. Data therefore has many implications for workers, but is not something that most unions are used to negotiating around.

In order to negotiate successfully around data and algorithms we need a basic level of transparency; otherwise unions do not know how performance is being assessed.

The foundations for this are:

  • Explanation rights: what data is collected, why it is collected, where it is stored, who can access it, how long it is stored, and how the algorithms using it function (see the illustrative record sketch after this list).
  • A commitment not to sell workers’ data on to third parties.
  • Individual and collective access to datasets and algorithms used to inform AI. 
  • Allowing workers effective control over any biometric or health data and a commitment not to use ‘emotion recognition’ technologies.
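
For illustration only, a negotiated data transparency register could record one structured entry per data flow. The sketch below is an assumption about what such an entry might look like; the class name, field names and example values are illustrative, not prescribed by this charter.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: field names and example values are assumptions,
# not part of the charter. The idea is one entry per workplace data flow,
# answering the "explanation rights" questions listed above.
@dataclass
class WorkerDataRecord:
    data_collected: str                   # what data is collected
    purpose: str                          # why it is collected
    storage_location: str                 # where it is stored
    access_roles: list[str]               # who can access it
    retention_period_days: int            # how long it is stored
    algorithms_using_it: list[str] = field(default_factory=list)  # how it feeds automated decisions
    shared_with_third_parties: bool = False  # the charter commits employers to keep this False

# Hypothetical example entry for vehicle geolocation data
geolocation_entry = WorkerDataRecord(
    data_collected="GPS position of delivery vehicles, sampled every 30 seconds",
    purpose="Route planning and proof of delivery",
    storage_location="Company fleet-management server",
    access_roles=["dispatch supervisor", "fleet manager"],
    retention_period_days=90,
    algorithms_using_it=["route optimiser", "shift scheduling system"],
)
```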

Algorithmic transparency is the counterpart of data controls. For effective collective bargaining around algorithms unions need:

  • A register of all the automated decision-making systems in the workplace (an example register entry is sketched after this list).
  • An explanation of who wrote the algorithms (to establish liability), how each algorithm works (what it is measuring and what criteria are being applied), who decided upon these criteria, what data the algorithm makes use of (is this data providing a fair picture of the work process?), whether the software has been previously tested, and who operates, updates and otherwise maintains the automated decision-making systems.
  • Algorithmic parameters should also be time-bound and subject to change only upon negotiated agreement.
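
For illustration only, the sketch below shows how one entry in such a register might be kept, together with a check that algorithmic parameters remain time-bound. All names, fields and figures are hypothetical assumptions, not a prescribed format.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative sketch only: one entry per automated decision-making system.
@dataclass
class AlgorithmRegisterEntry:
    system_name: str
    vendor_or_author: str           # who wrote the algorithm (to establish liability)
    what_it_measures: str           # how the algorithm works
    decision_criteria: str          # what criteria are being applied
    criteria_decided_by: str        # who decided upon these criteria
    data_sources: list[str]         # what data it makes use of
    previously_tested: bool         # has this software been previously tested
    maintained_by: str              # who operates, updates and maintains it
    parameters_agreed_until: date   # time-bound: parameters lapse and must be renegotiated

def needs_renegotiation(entry: AlgorithmRegisterEntry, today: date) -> bool:
    """Flag systems whose negotiated parameters have lapsed."""
    return today > entry.parameters_agreed_until

# Hypothetical register entry
register = [
    AlgorithmRegisterEntry(
        system_name="Warehouse pick-rate scoring",
        vendor_or_author="Example Analytics Ltd",
        what_it_measures="Items picked per hour, from scanning-gun data",
        decision_criteria="Flags workers below 85% of the site average",
        criteria_decided_by="Operations management, agreed with the union",
        data_sources=["scanning guns", "shift rosters"],
        previously_tested=True,
        maintained_by="Site IT team",
        parameters_agreed_until=date(2026, 6, 30),
    ),
]

overdue = [e.system_name for e in register if needs_renegotiation(e, date.today())]
```
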
The right to clarity over liability and responsibility

Digital technologies are a combination of hardware, sensors, data stores, and software. The software and sensor elements are highly customisable. These technologies are usually manufactured, operated and maintained by a collection of different companies, which makes determining liability and responsibility more complicated. 

In some cases workers risk becoming part of a ‘moral crumple zone’ (Elish, 2019) in which they carry the can for failures elsewhere in the technology supply chain. 

To avoid this, as part of the consultation and negotiation process workers should be informed of the legal framework for establishing liability and responsibility, and of the measures taken to ensure that all parties have clarity in this regard. Employers should strive to provide clarity even where national laws have yet to do so in full.

Workers should be adequately insured to protect them in cases where liability is still disputed. 

The right to human points of contact and ultimate human control

The growing use of semi- or fully automated decision-making systems in the workplace underlines the importance of measures to protect workers from errors, glitches or hacking. In cases of autonomous or automated machinery or vehicles, such issues could have implications for the immediate physical safety of workers.

  • To protect workers from problems with automated or highly automated decision-making systems, they should have a human point of contact within the company who has the authority to enact necessary changes to these systems.
  • To protect workers from problems with automated or autonomous machinery and vehicles, there should be a system in place that triggers an automated power cut, alongside an alarm to a human operator who retains an off switch (see the sketch after this list).
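
For illustration only, the sketch below shows the safety pattern described above: on a detected fault the machine cuts its own power automatically, a human operator is alerted, and that operator keeps an independent off switch. The machine and console interfaces (read_status, cut_power, raise_alarm, off_switch_pressed) and the timeout value are hypothetical assumptions, not a real vendor API.

```python
import time

HEARTBEAT_TIMEOUT_S = 2.0  # assumed value; in practice agreed through negotiation

def monitor(machine, operator_console):
    """Watchdog combining an automated power cut with ultimate human control."""
    while True:
        status = machine.read_status()  # hypothetical machine interface
        stale = (time.monotonic() - status.timestamp) > HEARTBEAT_TIMEOUT_S
        if status.fault or stale:
            machine.cut_power()                   # automated power cut on fault or stale data
            operator_console.raise_alarm(status)  # alarm to a human operator
            return
        if operator_console.off_switch_pressed():  # the human retains an independent off switch
            machine.cut_power()
            return
        time.sleep(0.1)
```
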
The right to redress, training and compensation

Experience has shown that to protect workers from issues with automated decision-making systems, they must be provided with organisational mechanisms to appeal these decisions, or the data and algorithms they are based on. 

Further, the introduction of new technologies often leads to changes in the skills or competencies required of specific groups of workers, including women and young workers. Employers should therefore ensure they provide quality training that is specific to the technologies being introduced.

Moreover, workers whose roles are fully or largely eliminated by technology should be offered the opportunity to retrain for new roles created by it. This is particularly relevant for women workers who are more likely than men to lose their jobs to new technologies. 

Where workers do lose their jobs and do not wish to undertake retraining, they should be offered adequate financial compensation or be supported to find work with another employer.

The right to share in the benefits of technology

Technology introduction can result in increases in productivity. Furthermore, the digitalisation of the workplace can enable employers to monetise data about the workplace and the workers in it. Since workers produce much of the data that drives productivity improvements and can then be monetised, it is right that they should share in the benefits.

Effective ways of doing so include:

  • Negotiation of a reduction in working time without reducing pay 
  • Negotiation of an increase in pay

In cases where data is monetised, a share of the income could accrue to the workers in the form of an annual bonus or a similar periodic payment, as agreed through negotiation, based on a percentage of the total profit derived from the data over that same period.
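
As a worked illustration only, with entirely hypothetical figures, such a payment could be calculated as follows:

```python
# Worked example only; the figures and the negotiated share are hypothetical.
annual_profit_from_data = 2_000_000   # profit attributed to monetised data, in local currency
negotiated_share = 0.05               # 5% share agreed through negotiation
number_of_workers = 400

bonus_pool = annual_profit_from_data * negotiated_share   # 100,000 for the period
per_worker_bonus = bonus_pool / number_of_workers         # 250 per worker for the period
```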
