1. Falling into the delivery platform trap.
By Nuria Soto
This text is published in two parts; this is the second part.
The algorithm as boss
Given my previous experience in other jobs, joining Deliveroo intrigued and attracted me. I could make my own way around the city, without a boss to watch over me, give me directions or admonish me about my work performance. To understand why that image soon faded, revealing a very different reality far removed from anything seductive, we need to address a central aspect of working on delivery platforms. This is the role of the app and its algorithm in the day-to-day organisation of work.
As soon as you started working, the first thing you had to do was log on to a website called Staffomatic. There was a calendar divided into time slots where you could sign up (‘apply’) and see which colleagues had also signed up. However, each slot only allowed a limited number of people to sign up. Therefore, after the weekly schedule was published at a certain time, only the fastest were able to apply for certain slots. There were also workers – those better rated or evaluated by the app – who could access those schedules earlier, giving them an advantage over everyone else. We will address the issue of evaluation later.
In any case, the initial promise of “work when you want” vanished not only because of factors such as the speed needed to apply for a slot, the limited quota of people for each slot, and each person’s rating, but also because the hours you managed to apply for were not guaranteed at that moment: the company then decided which of them it would accept. The system was similar at Glovo. Juanjo Lavergne, a fellow worker in Riders X Derechos and one of the first Glovo riders back in 2015, described it in the Riders X Derechos workshop organised by La Laboratoria as follows:
I remember the situation, for example, when you were told that you had to sign up for working hours and the working hours were opened. Opened two days a week and very quickly there was only one moment, meaning exactly one minute, when the timetables were opened. Then everyone had to be on it to sign up for the working hours. I could already see that the scoring algorithm was there, working in the sense that there were a certain number of delivery drivers needed on a Friday, a Saturday and/or a Sunday and then they could open or close the hours whenever they wanted. So there you were, in a situation of total work stress that I had never experienced in my life, that is, in any job. I had to sign up for working hours through a machine. But I’m sure that there were people behind it who looked at all the screens and made allocation decisions according to the needs of the company.
Moreover, if for whatever reason you didn’t make it on time when the timetables opened, “you had to start negotiating with your colleagues to see who would give you an hour or hunt for the hours that people left free”, as another Riders X Derechos fellow worker who was working for Deliveroo at the time told the same workshop.
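The shift-booking race described above – limited quotas, first come first served, and earlier access for better-rated riders – can be modelled in a few lines. This is purely an illustration: the class names, capacities and thresholds are invented, not taken from any platform’s actual code, and the final step (the company choosing which applied hours to accept) is left out.

```python
from dataclasses import dataclass, field

@dataclass
class Slot:
    capacity: int
    applicants: list[str] = field(default_factory=list)

    def apply(self, rider: str) -> bool:
        # First come, first served, up to the quota.
        if len(self.applicants) < self.capacity:
            self.applicants.append(rider)
            return True
        return False

def booking_open_hour(reputation: float) -> int:
    # Better-rated riders see the calendar earlier (thresholds invented).
    return 10 if reputation >= 90 else 12

friday_dinner = Slot(capacity=2)
friday_dinner.apply("rider_a")
friday_dinner.apply("rider_b")
assert not friday_dinner.apply("rider_c")  # quota full: too slow, no shift
```

Even a toy model like this makes the dynamic visible: whoever is slower, or sees the calendar later because of a lower rating, simply gets no hours.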
Then it was time to go to work the hours of the slots that the company had confirmed. The first thing I had to do was log in, enter the username and password the company had given me, and then I could start my shift – or my ‘mission’, as the company called it. Back then, in 2017, Deliveroo divided workers into zones. In my case, my zone was Ciutat Vella, though they changed it from time to time. I had to be in the Ciutat Vella zone to be able to log in. Otherwise, the app – which had my geolocation – would not let me start my shift. Once inside the area, we slid a button across on the app to ‘connect’ and start our ‘mission’.
When I logged in, I had to go to the designated ‘centroid’, the area centre, usually some place set by the company, where we had to wait for orders to come in through the app. When we finished, we always had to go back there. As I write this, I am counting the number of times I have repeated ‘had to’ in my description, and this number reinforces for me the resounding certainty that the idea of “being your own boss” was just one of many empty promises. There was absolutely a boss; he just wasn’t wearing a tie or smoking cigars in his office.
To understand why there were and are so many ‘have to’s’, it is enough to know that if a rider did not comply with certain guidelines, this had a direct impact on the number of hours the company would accept them for the following week. Thus, the number of requirements that we met or did not meet was measured by a score – invisible to start with, but viewable after a few months. It was called your ‘reputation’ and was displayed as a percentage. If, for example, I did not work at peak hours or on weekends, my score-reputation, my evaluation, went down. If I rejected an order, or did not accept in time one that appeared on my screen, my score went down. If a customer complained or gave me a bad rating (the end customer can rate the delivery person who brings them the food), more of the same. And all this was measured by an algorithm. Here was our real boss.
In short, in this ‘be-your-own-boss’ job I HAD TO accept all orders, work at peak hours, connect in the area they had designated, comply with standardised pick-up and delivery times, have a fully charged mobile phone (the company knows your device’s battery level at all times), carry the box with the Deliveroo logo (I had painted it black and, when Deliveroo staff found out, they forced me to change it back for one with the logo). I HAD TO follow the route indicated by the app to reach my destination, be careful with how I spoke to the customer, and at the same time be lucky enough to get good customers. The customer, incidentally, could rate you badly for many reasons that had nothing to do with you, or if you didn’t react well to certain clearly sexually-driven insinuations, as has happened to a colleague. If I didn’t succeed in all this, someone behind a number called LiveOps would send me a message on Telegram: “Hi, Núria, why haven’t you delivered the order yet?” “Hi, Núria, you have an order to accept.” “Hello, Núria…”. If I got a flat tyre or had a problem, I also had to message this channel to report the incident and ask for an order to be reassigned if I couldn’t deliver it. Usually, in these situations, we would send a photo of what had happened, fearful that we would be downgraded or disconnected for leaving an order half done or taking too long to deliver it. If we didn’t send a photo, we were sometimes asked for one anyway.
Reputation was all this and much more. It was played out in a myriad of everyday situations that cut across virtually all of your performance; for example, if the app wasn’t working right – which happened on a regular basis – and this resulted in you being unable to accept an onscreen order for a limited time, your score would drop.
It could also happen that, if our shift ended at 00:00, we would get a completely out-of-zone order at 23:59. In these situations you knew that the order was the one that several colleagues had rejected a few seconds earlier. You had a split second to decide whether you were going to extend your working day or not. If you didn’t, you felt bad, because your score would go down and with it came the worry about the income you would make at the end of the month. If you did, it was most likely the result of the pressure of an already low score, with the risk it implied for the amount of hours you would be given the following week. And that’s not all. An excessively low score could mean waking up the next day with an email informing you that the company was ‘ceasing to collaborate’ with you. Moreover, the reputation system seemed to work in one direction only: your score went down incredibly easily, but it was very difficult to get it to go up. You had to put in a lot of hours and orders to raise it back up, little by little.
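The asymmetry just described – a score that drops in seconds but takes weeks of orders to rebuild – can be sketched as follows. This is not Deliveroo’s real scoring code; every event name and weight here is invented for illustration.

```python
# Hypothetical penalty weights and recovery rate (all values invented).
PENALTIES = {
    "rejected_order": 5.0,       # declining or missing an on-screen order
    "missed_peak_shift": 8.0,    # not working Friday/weekend peaks
    "late_delivery": 3.0,        # exceeding standard pick-up/delivery times
    "customer_complaint": 6.0,   # bad rating from the end customer
}
RECOVERY_PER_ORDER = 0.2         # each completed order claws back very little

def update_score(score: float, events: list[str], completed_orders: int) -> float:
    for event in events:
        score -= PENALTIES.get(event, 0.0)
    score += RECOVERY_PER_ORDER * completed_orders
    return max(0.0, min(100.0, score))

score = update_score(90.0, ["rejected_order", "customer_complaint"],
                     completed_orders=10)
# One bad evening (two penalties, -11) outweighs ten completed orders (+2).
```

With numbers like these, losing eleven points takes one shift; winning them back takes fifty-five orders. That is the one-way street the text describes.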
Faced with these variations in your reputation (sometimes for reasons as arbitrary as a customer’s bad day or a random incident with your bike), you were given no opportunity to explain or justify what happened. There was no one but an algorithm on the other side of the reputation system. The feeling of insecurity and fragility was constant. Not even in the face of a dismissal could we ask questions, reason or ask for explanations. At most, we could send an email to the head office, but in the end it didn’t matter what we said in it. Although there was a human decision behind every disconnection, it was supposedly always for one reason only: our score, i.e. the algorithm. Despite the fact that we were supposedly our own bosses, I have never felt so closely monitored in a job, nor have I ever been governed by a system that penalised the slightest deviation from what was expected of you.
The algorithm was like an omnipresent boss with whom you could not argue, but who could impose how, when and in what way you were going to work, and when you would be fired. All this would take place regardless of any regulation, agreement or arrangement. Who we were, what was happening to us, where we were in our lives and how this was related to the score determined for us did not matter. The only thing that counted was the result, not any of the reasons behind it.
The algorithm acts not only through the parameters that make up the scoring system; these parameters also weigh in on the payment of orders and the way it is calculated. The most relevant case here is Glovo’s payment system, nicknamed by many workers the subasta (auction) system. This is a method of payment that works by means of a coefficient multiplier. A base price, defined by the distance travelled or the waiting time, among other variables, is multiplied by a coefficient determined by factors such as the weather (e.g. a rain bonus). Unions and workers complained about the method, and the platform removed multipliers with a value of less than 1, keeping the 1.0–1.3 range.
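A multiplier system of this kind can be sketched in a few lines. The base fee, per-kilometre and waiting-time rates below are invented for illustration, not Glovo’s actual tariffs; only the clamping of the multiplier to the 1.0–1.3 range reflects the change described above.

```python
# Hypothetical tariff components (all values invented).
BASE_FEE = 1.50          # base per order, in euros
PER_KM = 0.30            # distance component
PER_WAIT_MIN = 0.05      # waiting-time component

def order_payment(distance_km: float, wait_min: float, multiplier: float) -> float:
    base = BASE_FEE + PER_KM * distance_km + PER_WAIT_MIN * wait_min
    # After union complaints, multipliers below 1 were removed:
    multiplier = max(1.0, min(1.3, multiplier))
    return round(base * multiplier, 2)

order_payment(3.0, 10.0, 1.3)  # rain bonus raises the pay: 3.77
order_payment(3.0, 10.0, 0.8)  # a sub-1 "auction" value no longer cuts it: 2.9
```

The point of the sketch is that the rider controls none of the inputs: distance, waiting time and the coefficient are all set elsewhere.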
The triumvirate of measurement, evaluation and monitoring imposed by the algorithm hardly fitted with the romanticised, uncomplicated idea of the delivery biker who goes freely around the city picking up and dropping off orders, combining her passion for cycling with the chance to earn some money in a job without a boss constantly checking up on her.
Far from it, you are playing by the rules of the game imposed by an arbitrary and at the same time unfathomable system of penalties in which what you say or do matters little. It injects a high dose of anxiety and tension into your daily life: the traffic light that won’t turn green; the cyclists in front of you going at a different pace; the roadworks that make you take unnecessary extra turns; the app not working properly; the container that doesn’t close right and spills soup all over your backpack; the burger they probably took out before ringing up the bill, meaning you deliver it at a temperature that will not make for a happy customer; the shift that everyone wants but you don’t get to apply for, for some reason; or the shift that you applied for but can’t carry out because of an unforeseen personal situation. And so on, in a long list of factors that turn your daily work into a constant anguish-filled endeavour not to fall out of favour with the algorithm. This is a struggle that, as the months go by, you normalise without realising it. This is always set against the backdrop of the pressure to stay physically fit, despite overdoing it, jumping traffic lights, and working in the rain. Otherwise, if anything happens to you, you don’t have access to another basic right: time off work.
One of the days I remember most from my work as a rider is one of the days when a huge storm broke out in Barcelona. It was a very rainy week. I was riding in the Ciutat Vella neighbourhood, the old part of the city centre. It is always quite busy and its narrow streets don’t let you pick up much speed or cycle long straight stretches. My times were probably not very fast because of this; neither were those of some of my colleagues. On top of this, the app kept crashing and not letting me check some orders as accepted and they were automatically unassigned, which lowered your reputation rating. In previous weeks I had seen how all too often, either due to timing, customer complaints or unknown or arbitrary reasons, a colleague would receive the infamous email informing them that their “collaboration” with the company was being terminated. That week I received the following e-mail:
Hello, Nuria:
We have noticed in recent weeks that your service levels have dropped and, according to several factors analysed, the quality of your performance may affect deliveries in the area. The factors analysed are:
– rejected or de-assigned orders;
– speed between acceptance and arrival at the restaurant;
– speed between restaurant and customer.
We hope that having this information will be useful for you in assessing the quality of your services.
Best regards.
A few days after receiving this mail, the day of the big storm arrived. You could barely see two metres in front of you and my rainproof coat had long since ceased being of any use. I had not released my allocated working hours because, if no one took them, it meant a penalty that could affect the number of hours I would be given the following week, and my continuing on the platform. I had already received the warning email.
Orders tend to increase when it rains and that was the case that day. A few days before, one of my colleagues had had an argument with a customer; the latter didn’t tip him at all and was rude to him, so my colleague, exhausted by the pressure, the cold, the stress and the tiredness, had snapped at them and left. The next day, through the Ciutat Vella delivery driver group chat, he forwarded us the termination of collaboration (dismissal) email that Deliveroo had just sent him.
It was clear that, if we didn’t want to end up like him, we had to avoid bad ratings and complaints from all those customers who, far from appreciating the extreme weather and putting themselves in the shoes of what it means to deliver by bike in those conditions, wanted something hot to eat at home for a ridiculous price. We also had to be quick and stick to the hours we had applied for the week before, when we didn’t know what the weather conditions would be like. If we didn’t, our score could drop and we could lose hours or find ourselves out of a job.
I was still delivering on this difficult night shift when, after delivering an order, my soaked mobile phone started to fail, and wouldn’t let me check the last order as delivered. This meant minutes were ticking away and my times were getting worse. I started pedalling home as fast as I could, not paying attention to any traffic lights, so that I could dry my phone as quickly as possible and check the delivery. On the way, I was almost hit by a car because of the low visibility in the rain. My heart was racing with fear and tension. I finally managed to check “delivered” on the order. I went to sleep exhausted and terrified of receiving the famous disconnection mail the next day.
I have never been able to forget that day, not so much because of what it is like to cycle in such a storm, but because of the anguish and anxiety I went through at the thought of the consequences that the app failing would have on my reputation. It was not the first time, nor would it be the last time, that I had felt such pressure at work. What I felt was nothing more than the result of pushing a person to the peak of their performance, no matter what. Rain, fatigue, cold, a sick body; these are circumstances that are not factored in. I felt like I was nothing. I was more concerned about the timing of the order than my own safety. That’s how much I had been taken over by the algorithm. We riders play Russian roulette, taking risk after risk as we ride, hoping to increase or maintain our reputation that day. Meanwhile, for delivery platforms, we are just another soulless, context-free number; a pair of legs that deliver according to the rules of the game imposed by the company, without the intervention of the state or any kind of labour regulation.
It is therefore easy to understand that conflict with the platforms has revolved – practically all over the world – around the main tool of work. Contrary to what it might appear to be at first glance, in our case it is not our bikes or scooters, but the app we use for work. Without it, it would be impossible for us to work within the framework of the platform economy. Although I have mentioned some of the parameters that we know define and shape the algorithm at the core of the app, there are a whole series of other variables that are totally unknown and inaccessible. We do not know for sure how it works, what data it collects, what parameters operate within it or how it evaluates our performance. Our working hours (and therefore our income), or our continuity in the company, are decided on the basis of impenetrable parameters and metrics that we are unable to decipher. In other words, lack of job security is inherent to the algorithm. Once you come under its sway, insecurity is at the heart of the job.
It should be noted that many of us were made to suffer by the algorithm before we began to understand it. When we started working on these platforms, we had barely heard of the concept. We didn’t know that the algorithm was somehow the thing that would cause us to put all that pressure on ourselves to check an order as delivered on time in the middle of a storm. We didn’t know it would cause us to receive an email alert that our times were low, or to imagine the week of zero working hours that would follow a weekend when we hadn’t been able to work the high-demand shifts. “That sequence of steps or instructions to carry out a task […], that abstraction endowed with an autonomous existence, independent of the particular programming language with which it was developed […], that part of an assembly that needs databases, memory or lists to function […] is also social software, simultaneously capable of programming the behaviours and actions of bodies”. Of our bodies. This is how Tiziana Terranova explained it in her thought-provoking article, “Marx en tiempos de algoritmos” (Marx in the Times of the Algorithm)2 and this is how it was for us riders in our daily lives.
The technological utopia of labour management based on computer parameters far removed from any human interference, which reduces labour relations to relations between things (app-service-money), falls apart the moment we introduce the question of its creators. The process of constructing an algorithm is traversed by economic criteria, by ethical criteria – or the absence of them – and by managerial or political criteria (David San Martín, 2023).3 It is people representing capital who create these tools for specific projects to do with economic growth. Algorithms are never neutral tools; they are political or, rather, technopolitical instruments. Management and managerial decision-making algorithms clearly encapsulate labour decisions, depersonalising them and making them abstract. With their irruption, they move us from discussing technology as an optimisation of production processes to seeing it as a substitute for, or complement to, managerial decisions. In other words, we are now talking about ‘decision-making technologies, or business organisation technologies’. It is not difficult to translate this idea into how companies like Glovo and Deliveroo function. Here, the figure of the boss is replaced by that of an opaque algorithm which, through intricate automated processes (set up by someone), is in charge of organising our work, deciding for itself the hours we will work next week, the penalties we deserve and the position we will occupy in the endless competition unleashed in the fight for hours.
However, and here is the key, algorithms are always socially and institutionally constructed and managed mechanisms. We cannot forget that, however technically perfect, aseptic or neutral these devices may appear to be, technological objects are designed with specific criteria, with very specific purposes and also – it must be said – with certain limitations, as San Martín reminded us. Digital technologies in the hands of the financial class are placed at the service of generating new forms of power and capital accumulation.
Managerial algorithms not only allow for seemingly neutral and economically maximised organisation, coordination and business management, but also have a very specific effect on workers. And we are not only referring to the organisation of the resulting working day, but also to the way in which we think of ourselves and project ourselves as workers. In other words, algorithms construct subjectivities, turning us into both object and subject at the same time: techniques always have a productive character, they configure the social space, that is to say, they configure our selves. Techniques generate subjectivation and objectivation, ways of seeing and approaching the world and, therefore, they also configure us as subjects (San Martín, 2023).
They ultimately condition us to the extent that we adapt our behaviour to them and they impose conditions of use on us. We are impelled to modulate our behaviour to the ways they function – which is imposed by the platforms themselves – so that we are not penalised and, in this way, we end up internalising a very specific way of thinking of ourselves as workers. Algorithms can be considered technologies with constitutive normative capacity in the sense that they are conceived and designed to govern people’s behaviour (San Martín, 2023). They are technical instruments or objects that are designed to condition our behaviour directly and with certain specific objectives. In other words, they define, in the case of both distribution and the work environment, what we as delivery workers have to do, when we have to do it and how we have to do it. They are, in short, normative technologies because they are designed to organise the distribution of work.
Felipe Díez is an ex-delivery driver and fellow member of Riders X Derechos, currently working on a doctoral thesis entitled “Mi empresa es mi cuerpo” (My company is my body). He argues that the platform economy has managed to articulate a system of labour organisation through flexible – ever-changing – measures which mean that we, in the full exercise of our autonomy and freedom, always generate the response that the company expects from us. This is a characteristic functioning of neoliberalism, in which “to govern is not to govern against freedom or in spite of it, it is to govern through freedom, that is, to actively play with the space of freedom given to individuals so that they end up submitting themselves to certain rules”.4
At this point, taking into account the political and labour relation consequences of the replacement of managerial decisions by the automation of processes and the regulatory capacity of algorithms, the question that arises is whether or not the efforts being made to generate algorithms and ethical artificial intelligence will have the desired effects.
San Martín pointed to a series of costs intrinsic to the use of algorithms in highly vulnerable contexts, such as conflict-ridden work environments. These costs are fundamentally represented – contrary to the vision sold by companies – by the biases that algorithms introduce. The functionality of an algorithm is based on its ability to discriminate, i.e. to exploit these biases. For example, in the case of algorithms applied in border management, one of the items introduced relates to subjects’ differing propensity to offend. However, this involves looking at characteristics that these individuals have that others do not. Often these characteristics are, for example, ethnicity. This is a functional variable for an algorithm, and adopting this system implies assuming that, directly or indirectly, ethnicity will be at the centre of this management.
In the case of the platform economy, racism forms an inherent component of the managerial decisions made by the algorithm, as it inevitably favours and reproduces the absolute availability and self-exploitation to which the most vulnerable migrant workers are pushed. The same is true of the generalised negative effects of sexism in how artificial intelligence works, penalising women in the distribution of slots and hours.
Far from being in the minority, sexism, and the discrimination it generates, now permeates the workings of artificial intelligence algorithms. This is a problem because we are increasingly using algorithms to make crucial decisions about our lives. For example, who is and who is not eligible for a job interview or a mortgage. […] Algorithms have been shown to inherit the gender biases that are prevalent in our society. […] Moreover, these biases often tend to increase due to the large amount of data that algorithms handle and their widespread use.5
The events at the online retail platform Amazon are a good example of this phenomenon. It had to remove its recruitment algorithm as it showed a strong gender bias that penalised CVs containing the word “woman”.6
Transparency is another important factor. If we accept the use of algorithms in certain areas, this inevitably means fighting against abstract decisions made through systems with little or no transparency. It is extremely costly and difficult to access the information we need about these systems. After the adoption of the Rider Law, which obliges companies to report on their algorithms, trade unions made huge efforts to gain access to certain algorithmic information relating to the organisation of work; these did not bear fruit until the implementation of the algorithm guide issued by the Ministry of Labour in June 2022. However, setting aside the need for training to be able to interpret the information received, a new conflict is brewing over whether the information received is complete, correct and real. Thus, the problems of transparency are not definitively solved even with access to the algorithm. As San Martín points out, we have to be socially aware that biases, lack of transparency and difficulties in understanding how the algorithm works are three of the costs of accepting this type of technology.
Walter “Gavitt” Ferguson was born in the early 20th century on the Caribbean coast of Central America. He lived all his life in a small town called Cahuita, in the southern Costa Rican Caribbean, where he devoted himself to writing and singing calypso music. His songs form a historical anecdotal record. In the song entitled Computer, he tells of his experience with what today we might call decision-making algorithms. He sings that he was once assigned a pension. However, when he went to withdraw it, the clerk refused to give it to him because the computer said that Don Walter had a lot of money and property. Appalled by the situation, Ferguson wrote in the song’s chorus: Nobody hate the computer, / Computer is a wicked talking parrot.7
The problem with the algorithm is that we can’t know whether the data it was handling about Ferguson was correct or not, or whether the algorithm was designed appropriately, or whether the problem was in how the data was presented for the clerk to make their decision.
Today, artificial intelligence-based tools and decision-making algorithms are that wicked talking parrot that shapes our lives. However, we now know that most of these tools are based on incomplete, biased or corrupted data; that the algorithms are adding a stereotypical view of historically marginalised populations; and that the way in which the results are delivered to us entrenches the biases and stereotypes, validating them by running them through the computer.8
So the debate that opens up here is whether the challenge is to build feminist and decolonial algorithms capable of shaping a more equitable society by correcting historical inequalities, or whether, in certain contexts where there is a significant level of vulnerability for the subjects (including the fields of labour, border management, male violence, and so on), we should consider excluding the algorithmic automation of certain managerial processes. In other words, as women workers, can we create non-discriminatory algorithmic proposals for data collection in transparent, free software that promote a different kind of management of work? Or, in terms of rights, would excluding the possibility of management (exclusively) through the algorithm afford more protection? After all, these are areas where the power relations involved determine the uses of the algorithm and its technological complexity would hinder any collective re-appropriation.
To explore these questions, within the framework of this research, I was fortunate to be able to talk to both Marga Padilla, founder of Dabne – a cooperative dedicated to the development and implementation of projects based on free software – and Ona, a member of Donestech. Donestech is a collective that researches and intervenes in the field of women and new technologies. Their feminist, anti-capitalist perspective, and approach rooted in the social and solidarity economy, combined with their knowledge as programmers, create the perfect combination to make listening to them a pleasure. Based on their knowledge and expertise, both of them advised us not to condemn the algorithm. They emphasised the importance of knowing how to separate the tool from those who own it, control it and programme it. Blaming the algorithm, Padilla told us, does not correctly situate the technopolitical reality, which is that technology never functions independently. It is always embedded in webs of power and power relations. If you take the algorithm to pieces, with the idea in your head that therein lies the problem, you fail to see the technopolitics behind it, because you are seeing technology as the enemy. So, ultimately, the machinations of power, the power relations, are obfuscated behind the technological aspect.
Indeed, both warned that upon careful analysis of the situation, it becomes clear that hatred of the algorithm could become a victory for algorithmic management. This is because it focuses the problem on the algorithm, overlooking the human managers, and thereby releasing companies from their ethical and political responsibilities.
Neither Marga nor Ona were against the exclusion of algorithmic management in fields where power relations are intrinsic and unilateral (border management or the police management system for gender violence VioGén,9 for example). However, instead of a comprehensive rejection of it, they advocate a process of democratisation of the algorithm that allows us to extend the knowledge we have of it in order to re-appropriate these technologies socially and collectively.
Coopcycle, the alternative delivery network in which Mensakas (the cooperative we at Riders X Derechos created in Barcelona in 2017) is registered along with more than eighty other cooperatives around the world, has an open-source app. It has no cookies, stores hardly any data and uses no algorithm. Given the labour conflict we were coming from, being taken into account once more as people – as humans, with our own lives and different contexts – was a non-negotiable premise for us. Instead of an algorithm, the Coopcycle app deploys the role of dispatcher: the person in charge of assigning tasks, who assesses the personal situation of each worker at all times and combines it with delivery needs. Thus, Coopcycle, in addition to being a federation of cooperatives, is an app, a technological tool which we use to organise our work and carry out our activity, but it has no automated processes for assigning orders.
In the case of Mensakas, my colleague Jordi and I perform the functions of the dispatcher. This allows us to get to know our colleagues, take into account each one’s skills, consider their needs and give them the opportunity to communicate with us whenever there is an incident. We know, for example, if a colleague is not on top form that day because she is on her period, or if someone is more tired than usual. I have to wonder: won’t this aspect of care fade away if we automate it?
Both Ona and Marga insist on the potential of thinking about interactive algorithms capable, for example, of incorporating just these variables – the fact that a colleague has her period, or that a member of the team is not feeling well. Algorithms can even interact with people to distribute decision-making that would otherwise be centralised in a single person, thus contributing to a democratic process of shaping the tool itself through transparency and supervision.
Perhaps it is precisely this process of democratisation that faces the strongest obstacles. Until it happens, we have to ask what effect automation has on workers – especially women – and on their capacity to self-organise and develop trade union action from the margins. We also need to examine how to confront the constant coercion that pushes us to take whatever actions are needed to “stay in the game”. Since this pertains precisely to technopolitics, I believe that as long as the lack of algorithmic transparency remains integral to companies’ own political strategy, it will be impossible to walk a transparent, collective social path with regard to the main tool of our labour. Of course, any counter-power sheds light on the design of alternative forms of work organisation and on critical, emancipatory ways of thinking about the tool itself. With the confidence this inspires, it is perhaps conceivable to imagine decentralised forms of power and interaction in which, rather than substituting for a managerial figure, algorithms encourage us to collectively build dynamics that streamline, optimise and make work profitable while contributing to the democratisation of various processes. In the meantime, however, it is essential to encourage and sustain a social debate that takes into account the changes and consequences that the use of algorithms entails for the organisation of work and for trade union struggle.
A race: everyone against everyone else
Be your own boss. Manage yourself. Be an entrepreneur. Concepts such as freedom, flexibility and innovation have permeated the labour market over the last five or six years since the arrival of delivery platforms and have become the legitimising core of their model. In fact, you only have to go to Glovo’s website to read this stirring call: “Hazte repartidor. ¡Sé tu propio jefe! Realiza entregas a través de Glovo y disfruta de flexibilidad, libertad y ganancias competitivas. Únete… ” (Become a courier – be your own boss! Make deliveries with Glovo and enjoy flexibility, freedom and competitive earnings. Join…).
The use of these concepts is not just discursive rhetoric orchestrated to attract future couriers. It goes much further. It speaks of a new labour landscape, fully consistent with neoliberal logic, which seeks to produce a new type of worker. Underlying the discourse is a set of technologies that the company deploys with the aim of generating a new subjectivity in the couriers: one that evokes the figure of the entrepreneur, an individualised concept of progress and success, and a new reality of competition between workers. All of this obviously affects our capacity for organisation and fighting back, as well as conditioning the position of workers in industrial disputes.
“If you want it enough, you will achieve it”, as we are constantly told. With platform work, we are all small-time ‘entrepreneurs’. Our successes and failures rest on our own shoulders. Everything depends on our efforts to stay on our feet in the race imposed on us by the platforms to stay connected to the app. “We are our own bosses. Besides, if that delivery driver without a leg can do it (as the Deliveroo billboard said), how can you not?”
The hours of work we get depend on our scores (the scores we earn by ‘freely’ choosing the hours and days we work, the risks we take on the streets to deliver orders as quickly as possible, and the degree of exhaustion we inflict on ourselves by pedalling ceaselessly in the hunt for a new order), as well as on our speed in picking up the hours another rider may have freed up. We are set at odds with each other – competing for work hours and orders – and an individualised idea of “success” is imposed on us. We are disconnected and fragmented, not allowed to feel part of something. We are alienated from any collective idea that could sow the seed of trade union organisation, of cooperation and mutual support as the backbone of progress and a fundamental part of our identity. Thus, if you don’t earn enough income, it’s because you didn’t work hard enough. Indeed, in recent years the retort “don’t be lazy” has surfaced repeatedly in the various WhatsApp and Telegram groups whenever a worker complains.
Neoliberal rationality is at the heart of the function and discourse of platform economies: it tells us that the poor are poor because they want to be – failed subjects who did not try hard enough in the competitive machine – and that those who succeed have earned their place through determination, a spirit of sacrifice and a willingness to take risks. We are all numbers there. The algorithm supposedly does not measure our class, our gender or our attributes (as we have already seen, the biases of the algorithm are never discussed). It only measures our productivity, leaving aside any social or human factor that might condition it. In short, it depersonalises us, strips us of any social determinant and overlooks the fact that we do not all start the race from the same position. Starting your shift after a poor night’s sleep in the shared room you sublet in a peripheral neighbourhood; having to cycle more slowly because it’s a period day; getting to work exhausted after being up all night with your little one; receiving a bad delivery score because the client judged you by the colour of your skin: none of this is starting from the same position. We are considered to be no more than pairs of legs that must achieve the goals the company imposes. All the same. Thus, when any of us misses a goal, it must be because we do not want to meet it, not because we have, for example, a long working day behind us that includes not just productive but – for women – also reproductive work.
At the same time, pushing to its limit the neoliberal rhetoric of the rider as a worker who is his own boss, delivery companies, as we will see below, try to shift a labour conflict from the collective logic of workers’ rights to a question of individual choice. Faced with possible regulation of the sector, the platforms focus the debate on riders’ preferences for one model or the other, all the while encouraging couriers to ‘opt’ for the deregulated model the companies promote. Many media outlets repeated this distortion, treating the trade union dispute as if it were actually a confrontation between riders with different preferences. News stories endeavoured to “give a voice to all parties” by collecting statements from riders with differing opinions, ignoring the absence of the main party to the conflict: the company. Focusing on whether a courier prefers to be self-employed or an employee – without an in-depth analysis of the structural conditions that can lead to one choice or the other – has helped to deepen the division between workers, although, as we have seen, the ground had already been prepared by the competitive incentive mechanisms the companies deploy in the distribution of hours and orders. The portrayal of the conflict thus shifts to a question of internal fights, worker against worker, which hides the violation of rights implicit in the worker-company relationship. This division between riders has also been used to gauge, for example, the legitimacy of European directives or of the Rider Law itself.
Precisely for this reason, Riders X Derechos, CGT Riders and other trade union groups have put huge efforts into trying to demonstrate that the conflict is not a sectoral problem, but the irruption of an economic model that entails new labour relations that make precarity even worse, if that is possible, for the most vulnerable workers.
In short, in the neoliberal framework of the platform economy, the logics of meritocracy, individualism and competition are not only the prism through which to shape workers’ subjectivities and labour relations, but also the lens through which labour conflicts are read. These then become objectified as a matter of subjective and individual desires and opinions, thus curtailing the possibility of a real debate on the social consequences of uberisation – the new economic model that platform economies are imposing at an accelerated pace.
Reality redefined and sugar-coated: positive adrenaline and other stories
In 2016, in our early days as riders, we laughed at the company’s messages encouraging us to be our “own boss”… From our perspective, the last thing we aspired to was to be a boss. And the idea of becoming one by riding a bike seemed almost absurd.
These days, we no longer see the humour in the invitation. We know, because we have experienced it first-hand, that it was not just a euphemism or even a lie. The discursive engineering deployed by the delivery platforms is not a simple mask that disguises a precarious reality. It is much more than that. It is an attempt to generate a new framework of interpretation that allows us to experience precarity as if it were not so. To redefine it in the service of a new economic order. To make it desirable.
Be your own boss, manage yourself, be an entrepreneur, develop yourself in a working environment of freedom, flexibility and innovation, etc. Delivery platforms have deployed this new lexicon from the start. It is not uncommon to come across news stories portraying a courier with one leg as a hero, an image of a mother with a delivery rucksack and her child in her arms presented as family-friendly work, or precarious job offers with zero social protection presented as opportunities for freedom and the future.
Governing means producing a regime of truths from which we define our world. These truths appear before us as the only field of possible interpretations, drawing the frontiers between the thinkable and the unthinkable.10 Platform economies, like all instruments of power, produce new truths as they expand: those that redefine the labour market, the relationships we establish within it, and the way we think and behave as workers and as customers.
In order to support these truths, a new language is generated that introduces its own vocabulary while changing the meaning of many terms (freedom and flexibility among them); uberisation thus delivers a new rationality to us.
While I was working at Deliveroo, the company sent an email to the trainers (the riders you have to follow on your first day at work to learn how the app works and the rider job itself). In this email there was a list of “Wrong and Non-legal words”. For example, you could not talk about working shifts, but instead had to call them ‘missions’. Nor could you talk about things like salary (‘pay for service’), guaranteed minimum (‘automatic allocation of orders for each delivery’) or recruitment (‘collaboration’).
Wrong and non-legal words
- SHIFTS (you can use: mission, delivery)
- WAGE (you can use: fee for service)
- WEEKLY SCHEDULES (you can use: weekly availability)
- HOURS PER WEEK (you can use: weekly schedule)
- GUARANTEED MINIMUM (you can use: automatic allocation of orders per delivery)
- UNIFORM (you can use: delivery clothes)
- JOB (you can use: activity, delivery)
- CONTRACTING (you can use: collaboration)
Wrong expressions
- FLEXIBILITY OF HOURS DURING THE WEEK: do not use words such as ‘weekly hours’, ‘fixed hours’, ‘schedules’; use: weekly availability.
- AVAILABILITY FOR THE THREE WEEKEND SHIFTS (FRIDAY NIGHT, SATURDAY NIGHT AND SUNDAY NIGHT FROM 20:00 TO 23:00 approx). Do not use words such as ‘compulsory’, ‘shifts’; use: deliveries, missions, availability.
- PAY PER ORDER: do not use ‘guaranteed minimum’, ‘salary’; you can use: pay per order or pay per service.
This is how a new labour reality that impoverishes us and leaves us on the margins of labour regulation (i.e. unprotected) is redefined and made palatable. We are no longer precarious workers without rights; we are small entrepreneurs who, with nothing more than a bike and a mobile phone, have the opportunity to start a career that, on the premise of effort and competence, will separate us from the class we belong to, opening up a range of possibilities for social advancement. The company is not in charge of you; it offers you opportunities so that, thanks to flexible work, you can cut your own path. There is no discomfort at work: only risks to take and positive adrenaline. You can’t be fired (because that would mean being recognised as an employee), only disconnected. There are no colleagues; they are not even mentioned.
On 8 March 2016, Uber offered a perfect example, issuing a press release arguing that the “freedom” of work afforded by its platform was “helping to drive forward – literally – another wave of women’s empowerment: the opportunity to fit work around life, not the other way around. Uber offers something unique: on-demand work, only when you want it. Drivers can earn money on their own terms and set their own hours.”11
All this sugarcoating began to weigh on us. It weighed more heavily than the rucksack on our backs. Faced with these neoliberal truths, we needed to create our own. Somehow everything we were and did had been redefined. But to be redefined in neoliberal terms, to be called upon by the platforms to conduct and conceive of ourselves as a business entity (Laval and Dardot, 2013), competing for our livelihoods and taking on every kind of risk, be that in terms of market fluctuation (more or fewer orders), or in terms of our own lives (accidents), was something we did not intend to accept.
- Tiziana Terranova (2018) in Nueva Sociedad, no. 277, p. 90.
- Training session with David San Martín, Professor of Law at the University of La Rioja, organised by La Laboratoria in the context of this research.
- Christian Laval and Pierre Dardot (2013): La nueva razón del mundo: Ensayo sobre la sociedad neoliberal (The New Logic of the World: Essay on Neoliberal Society). Barcelona: Gedisa.
- Naroa Martínez and Helena Matute (2020): “El sexismo en los algoritmos: una discriminación subestimada” (Sexism in Algorithms: an Overlooked Form of Discrimination), in mujeresconciencia.com, 22 July; available online at https://mujeresconciencia.com/2020/07/22/el-sexismo-en-los-algoritmos-una-discriminacion-subestimada/.
- Isabel Rubio (2018): “Amazon prescinde de una inteligencia artificial de reclutamiento por discriminar a las mujeres” (Amazon Scraps Recruitment AI for Discriminating against Women), in elpais.com, 12 October; available online at https://elpais.com/tecnologia/2018/10/11/actualidad/1539278884_487716.html.
- Jaime Gutiérrez and Caitlin Kraft-Buchman: “Un prólogo sobre las loras parlanchinas” (A Prologue on Talking Parrots), in Inteligencia artificial feminista. Hacia una agenda de investigación para América Latina y el Caribe. Hub de América Latina y el Caribe de la red FAIR de investigación en Inteligencia Artificial Feminista, p. 10; available online at https://archive.org/details/inteligencia-artificial-feminista/page/10/mode/1up.
- Ibid., pp. 11-12.
- For more information on the problems associated with the deployment of VioGén, see Carlos del Castillo (2022): “Las víctimas denuncian fallos en VioGén, el algoritmo contra la violencia de género” (Victims Report Flaws in VioGén, the Anti-gender violence Algorithm), in eldiario.es, 9 March; available online at https://www.eldiario.es/tecnologia/victimas-denuncian-fallos-viogen-algoritmo-violencia-genero_1_8815201.html.
- Michel Foucault (1979): “Verdad y poder” (Truth and Power), in Microfísica del poder. Madrid: La Piqueta, p. 189.
- Genoveva López (2020): “Mujeres, bienvenidas a la discriminación 3.0” (Women, Welcome to Discrimination 3.0), in El Salto, 10 April; available at https://www.elsaltodiario.com/economia-digital/mujeres-bienvenidas-a-la-discriminacion-3.0.