Breakpoints and Black Boxes: Information in Global Supply Chains

Abstract: Supply chain management (SCM) deals with the procurement and assembly of goods, from raw material to the consumer. With the growing prevalence of offshore manufacturing and suppliers' reliance on "just-in-time" inventory management, SCM has become both astoundingly complex and critical to companies' competitiveness. This essay examines how data works in global supply chains, focusing on SAP SCM, the vast but hard-to-access software suite that holds the largest share of the SCM market. It argues that SCM is characterized by two countervailing tendencies: the demand for perfect information about goods and their movement, and the need to erect strategic barriers to the fullest knowledge about supply chains. Counterintuitively, this selective obscurantism is what makes supply chains so fast and efficient.

out by supply chains, as well as the physical infrastructure necessary to support these networks (Cowen, Deadly Life; Lichtenstein, Wal-Mart). Ned Rossiter and John Durham Peters have each examined the relevance of logistical infrastructure for the field of media studies. Scholars in Black studies, such as Christina Sharpe, Stefano Harney, and Fred Moten, have described the inseparability of logistics from slavery. Separately, anthropologists and historians have conducted research at manufacturing plants and port cities, helping to shed light on the lives of the people whose labor keeps the supply chain moving (Thomas; Chu; West; Ngai and Chan). These studies inform a body of theoretical work that seeks to understand the implications of globalization for capitalism, politics, and human understandings of the world. These scholars of "critical logistics" argue for an understanding of logistics as "a calculative rationality and a suite of spatial practices aimed at facilitating circulation-including, in its mainstream incarnations, the circulatory imperatives of capital and war" (Chua et al., p. 618).
For all that has been written about logistics, however, we still know very little about how information moves along the supply chain and what that movement can tell us about the way data interacts with global capital. And yet, supply chains of course consist of information as much as they are composed of shipping containers, cranes, and ports. How else would companies be able to choreograph deliveries with such astounding speed and accuracy? This essay, then, investigates the ways that supply chains use data by focusing on one question in particular: how is it that supply chains can be so dependable, in the sense that we know exactly when our Amazon package will arrive, and yet so unknowable, in the sense that both suppliers and consumers have only the vaguest idea of where the package comes from? I argue in this essay that global supply chains work as efficiently as they do only because of strategic gaps in our knowledge about them. A supply-chain expert might find this observation surprising, since "visibility" and "transparency" are watchwords of the profession: with each new technological innovation comes new promises about more and increasingly complete data. Yet a thorough investigation of the structure and function of supply chains reveals that these circuits of commerce depend on black boxes and omissions as much as they depend on access to information. These strategic uncertainties are part of the appeal not just of global supply chains, but also of algorithmic decision-making in general. The truism that we live in the age of big data evokes visions of avalanches of information from all over the globe, synthesized and processed in order to arrive at unprecedentedly precise solutions. But, as I show here, a critical hallmark of what scholars have called "algorithmic life" is a pattern of strategic uncertainties that serve to elide precisely those moments in which our most important decisions get made (Amoore and Piotukh).
I highlight this peculiar informational zone-this dance between omniscience and ignorance-as a critical feature of supply-chain software and the industry at large.
Drawing on instructional materials, industry literature, and an analysis of supply-chain software, I argue that logistical capitalism's enabling condition is a careful disavowal of particular information, even as the supply chain ravenously consumes other data. I connect this state of knowing-while-not-knowing to defining moments in the supply chain's emergence, most notably in the calculations that characterized the maritime slave trade. Seeing like a supply chain depends upon and reinscribes some of the key assumptions of colonialism, both about human difference and about the way value can be captured and transmitted. Efficient as supply chains are under normal conditions, a major global disaster like the COVID-19 pandemic can bring large parts of the system to a halt. As we look closer at supply chains, we see how the constant alternation between knowing and not-knowing introduces hidden cracks and fault lines into global trade routes. Like scratched glass, the supply chain's cracks are shallow enough, under most conditions, that the glass remains intact. But when struck with an event of sufficient force, these fault lines have the ability-as we have seen-to compromise the entire structure.

Fantasies of Omniscience
Understanding how supply chains use data requires us to contextualize and historicize the industry itself. Logistics (the term is often used interchangeably with "supply-chain management") has always been, and still is, intimately tied to military activity (Cowen, Deadly Life). Logistics as a field began in earnest following World War II, when military veterans applied to business the lessons they had learned about transporting wartime materials. The name "logistics" is in fact a military export to the private sector, just as many corporate logistics experts started their careers in the military (Bonacich and Hardie).
The 1943 propaganda film Troop Train, produced by the United States Office of War Information, contains many of the elements that would continue to define rhetoric about logistics throughout the twentieth and twenty-first centuries. In the "nerve center" of a troop-transfer operation, military personnel busily make notes and phone calls before a wall painted with a map of the United States rail system, which resembles a giant semiconductor. A sequence of phone calls, flipped switches, and a battalion of typists creates the impression of an information epicenter, from which logistics personnel can monitor the movement of people and goods with seamless speed and accuracy. Troop Train depicts logistics headquarters as a zone of perfect information, where monitors keep tabs on conditions with flawless efficiency.
And yet, even as "nerve centers" attempt to monitor every movement of resources, a countervailing tendency has also characterized supply chains: that of the black box, the zone where information is disavowed in favor of efficiency. Its most visible emblem might be the shipping container, whose modularity allows it to travel the world with unprecedented speed, even as its contents remain shrouded in interchangeability. It, too, is a wartime invention, deployed in quantity for the first time to run goods between California and Cam Ranh Bay during the Vietnam War. The shipping container's main selling point was its modularity: by standardizing the container's dimensions, shipping companies and manufacturers began to regularize, automate, and impose order on what had been a relatively unpredictable industry (Levinson). In its modern incarnation, supply-chain management emerged in the context of increasing competition from Japanese manufacturers, and particularly Japanese innovations in just-in-time (or "lean") manufacturing, in which a supplier aims to keep the absolute minimum inventory on hand to fulfill demand. With just-in-time manufacturing, the supplier can avoid tying up capital in unsold inventory. But this mode of distribution also means that information about demand needs to be near-perfect and near-instantaneous in order to avoid product shortages (Lichtenstein, "Supply Chains"). When a company maintains very low levels of inventory, it cannot deliver goods to consumers effectively if any component gets stalled at any station along the supply chain. Japanese companies pioneered these techniques in the 1960s through the 1980s, and American companies quickly followed suit, outsourcing goods to a patchwork of suppliers in order to maximize speed and profit.
In the twenty-first century, supply-chain management techniques have evolved to respond with extraordinary subtlety to consumer demand. If a Wal-Mart store begins to run low on a certain brand of diapers, for example, that information is automatically communicated to a supplier who initiates the restocking process, all with very little human intervention. This shift from "push" to "pull"-that is, the tendency for manufacturing and delivery to be "pulled" into an outlet by retailers rather than "pushed" in by suppliers-means that suppliers must be prepared for a high degree of volatility in demand. Gone are the days when a manufacturer could plan stocking levels according to the season or even the month (Lichtenstein, Wal-Mart).
Sixty-five years after Troop Train, the information nerve center reemerges within the interface of SAP SCM, the most widely used supply-chain management software, which I describe in more detail below. The Supply Chain Cockpit, the highest-level view of a company's supply chain, depicts distribution centers, shipping routes, and manufacturing locations, all arrayed on a map, as though they can be monitored from above (see fig. 1). In fact, however, this vision of perfect information has always been a fantasy.
The Supply Chain Cockpit is not real-time; it is a forecast. Conditions on the ground can easily supersede the vision planners have laid out, and neither the Cockpit nor most actors in the supply chain will register the change. Despite planners' pleas for more information, however, this selective obscurity is not a bug but a feature: supply chains could not be as efficient as they are if they were centrally coordinated and observed.
In his essay on the geopolitics of capitalism, David Harvey lays out a model for understanding the relationship of space and capital that is startling for the way it seems to presage an interface like the Supply Chain Cockpit (Harvey). For a capitalist economy to function, Harvey explains, capital has to keep moving. It can't ever stop, and an economy must always grow. But as capital increases, it must be reabsorbed. If it can't be reabsorbed and recirculated, it accumulates, and this causes a crisis. If, for example, goods pile up in a warehouse, they're not circulating, and they lose their value. Because capital needs to find more and more places to go, the system of global capital has devised what Harvey terms the "spatial fix": it expands outward, looking for ever more ways to keep capital moving. But the fix Harvey describes is a double-edged sword: like a junkie's fix, it is temporarily satisfying but ultimately insufficient.
Capitalism, according to Harvey, is structurally compromised by this internal contradiction between the need to grow and the need to absorb, and it must at some point confront this contradiction.
When Harvey describes the spatial fix, he means physical infrastructure and the building of plants. The spatial fix explains why North American cities teem with empty luxury condos, while millions of people go unhoused. But perhaps this need to keep capital moving can also describe the Supply Chain Cockpit, the event handler, and the optimization algorithm. Capital's need to circulate unceasingly has been translated into a software suite that-at least in theory-addresses potential crises with marvelous speed and sophistication. No potential accumulation can take place without triggering a solution, and models borrowed from the structure of our brains chart paths designed to eliminate even the slightest latency. Yet even as the spatial fix seems to reach the apogee of its elaboration, it is pushed toward ever greater heights of information speed and efficiency.
In a complex supply chain, some waystations might be large, easily identifiable plants, such as the Foxconn facilities that make Apple products. But the components of goods must themselves be procured from another vendor, and those from another, and so on, until we reach the site where the source materials themselves are mined, extracted, or farmed. Some of these nodes may be no more than garages retrofitted into small workshops. It is only a node's neighbors, not a central planner, that must be aware of its existence. Should something happen to one of these workshops, such as a labor strike or natural disaster, the neighboring stops on the supply chain might easily substitute a replacement, without anyone else being aware of the change. This tangle of procurement is compounded in industries with high product turnover, such as the apparel industry. Supply-chain nodes might number in the thousands, and vendors are continuously swapped out and replaced. To attempt to coordinate this activity would mean slowing it down, potentially endangering a company's ability to bring products to consumers with the speed that the market demands.
This decentralized arrangement works well for commerce under normal conditions. A broken link can easily be popped out and replaced with another supplier.
But this assumes that a battalion of suppliers are standing at the ready, waiting for the word to take their new place. This was not the case during the COVID-19 pandemic.
The disease knocked out (at least temporarily) not just a smattering of suppliers but suppliers everywhere, and no one was available to plug these gaps. Had large companies been aware of potential weaknesses in their supply chains, they could have analyzed trade routes, prepared for disruption, and created redundancy within the supply chain. But of course they didn't know about these weaknesses, since they could not obtain that information without compromising the terrific speed of the system on which they depend (Choi et al.). This is not to say, however, that all supply-chain data eludes manufacturers.
Operations within the facilities a manufacturer itself controls may, in fact, be closely planned and monitored. And yet even within these monitoring operations, strategically placed barriers and black boxes prevent any individual onlooker from viewing the supply network in full.

Breakpoints and Black Boxes: Inside SAP
The industry standard for monitoring business operations-everything from sales projections to transportation to human resources-is a suite of software called SAP (the acronym stands for Systemanalyse und Programmentwicklung, or System Analysis and Program Development) (Pollock and Williams). SAP was founded in 1972 by five German ex-employees of IBM (Pollock and Williams). Its conception was part of the push toward "systems integration" that characterized management thinking of the period. Under this systems integration concept, businesses focus not on one single functional unit of a business at a time, like distribution or marketing, but on the throughline that connects each part of the product's travel through the company, from idea to the customer's hands. For the modern corporation, this is the supply chain. This crosscutting approach to a company's operations demanded software that could integrate information from every aspect of the business (Cowen, "Logistics' Liabilities").
SAP emerged to fill that role.

SAP is not the only software for business operations, but it is the system with the most market share. Although few laypeople have dealt directly with an SAP interface, SAP is one of the largest software companies in the world, earning $22 billion in revenue in 2016 (Plunkett et al.). SAP software handles a wide variety of businesses' functions, from human resources to financial planning, inventory control, and invoicing. Over its four decades of existence, SAP has grown rapidly in size and ambition. Its current initiatives include a push toward cloud-based computing (in which businesses outsource their functions to an offsite server, rather than manage everything in-house), machine learning, and its own proprietary database software.
As one might imagine, a business system designed to handle a product's journey from idea to raw material to the customer requires highly complex software. In fact, the term "software" is misleading, for SAP's product is not a single application, but a suite of tools joined together through a shared database. This model of database-driven software integration is a clear outcome of the push for systems integration that characterized the late 1960s and '70s; only if managers conceive of a business as a continuum of interlocking information would a shared database be desirable.
Modularity, in which highly complex functions are "black-boxed" so that an engineer need only deal with one set at a time, is a key feature of SAP software. The SAP suite is subdivided into numerous products, called "components," which are then subdivided into function-specific "transactions." SAP professionals usually specialize in one of these larger components, passing a job over to another professional when it enters the domain of another component. For example, an SCM (supply-chain management) specialist can manage production and transportation planning, but will look to an ECC (Enterprise Resource Planning Central Component) specialist to modify and maintain the suite's master data. Depending on a company's needs, SAP engineers can link various modules together to form a continuous suite of software.
In a company that makes use of an integrated SAP suite for supply chain planning, planners pass data along a chain. The data begins as a months-long forecast and moves, with increasing granularity, down the line, until it reaches workers on the factory floor. Planners forecast product demand in the Demand Planning (DP) component, and then pass that information to a Supply Network Planning (SNP) specialist, whose job is to ensure that production is timed and organized to meet the demand plans. The SNP specialist produces a plan, with necessary output keyed to particular dates and locations, and passes that plan to a specialist in Production Planning and Detailed Scheduling (PP/DS). The PP/DS specialist breaks the SNP plan into smaller units of time and space, determining workers' shifts, lot production, and product movements at a fine-grained level. The three specialists thus work in different time scales, or "time horizons," in the parlance of SAP: DP forecasts demand at the level of months; SNP works in weeks; and PP/DS breaks time into units as small as one second (Knolmayer et al.; Snapp; Wood).
At each point in the process, a planner has the ability to make use of predictive or optimizing algorithms. SAP's DP component incorporates several types of regressions, with which a planner can take into account seasonal variability, sales promotions, and historical trends. Within the SNP component, a planner can make use of tools SAP classifies as "heuristics," "capacity leveling," and "optimizers." SNP's heuristic function distributes production across dates in order to meet the targets identified in the demand plan, while capacity leveling considers the constraints of available materials, plant capacity, and labor in order to ensure that the plan is feasible.
Finally, the optimizing algorithm incorporates storage, transport, and labor costs, iterating rapidly through multiple scenarios to generate a production plan that minimizes cost and maximizes profit. PP/DS likewise makes use of heuristics and optimizers to ensure that labor and production is optimized to furnish the desired quantities of goods at the lowest possible cost.
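The cascade of time horizons described above can be sketched schematically. In the sketch below, every function name, field name, and number is hypothetical, chosen only to illustrate the structure of the handoffs: each stage receives the previous stage's output, never the data from which that output was derived.

```python
# A schematic sketch of the DP -> SNP -> PP/DS planning cascade.
# All names and numbers are hypothetical illustrations, not SAP's API.

def demand_plan(month_forecast_units):
    """DP: a monthly forecast -- the only thing SNP will ever see."""
    return {"horizon": "month", "units": month_forecast_units}

def supply_network_plan(dp_output, weeks=4):
    """SNP: split the monthly target into weekly production quotas."""
    weekly = dp_output["units"] / weeks
    return {"horizon": "week", "quota_per_week": weekly}

def detailed_schedule(snp_output, shifts_per_week=21):
    """PP/DS: break weekly quotas into per-shift targets on the floor."""
    per_shift = snp_output["quota_per_week"] / shifts_per_week
    return {"horizon": "shift", "units_per_shift": per_shift}

# Each call is a "breakpoint": the dictionary passed across carries the
# plan itself, but none of the reasoning or data behind it.
dp = demand_plan(84_000)
snp = supply_network_plan(dp)
ppds = detailed_schedule(snp)
print(ppds)  # the factory floor sees only a per-shift quota
```

The point of the sketch is structural: a worker handed `ppds` has no way to recover `dp`, just as, in the essay's terms, a production planner cannot see the data basis of the forecast upstream.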
The information chain within SAP is composed of a combination of explicit data and production imperatives derived from obscured algorithms. The locations of warehouses, the number of available machines, the length of time it takes to get from one place to another are all sensible, if sometimes elusive, data points from the business perspective. But the process by which these values are converted into concrete production plans is more mysterious. The algorithms used to make these calculations are derived from work published in journals of supply-chain management, but few supply-chain specialists can be expected to grasp them fully. (Indeed, that is why the algorithms are implemented within the software interface, rather than manually calculated.) Thus, the process of moving from knowable business values to actionable production plans always involves an algorithmic black box. At each interval within the planning process, a specialist passes a set of data through a breakpoint, where her expertise ends and another specialist's begins. The data package transmitted between specialists contains the information necessary to begin that stage of work, but not the underlying data from which the plan was derived. As a result, a production planning specialist, for example, could not understand the data basis of a supply network planner's production calendar, even if she wanted to.
A worker at a manufacturing plant feels the effects of this modularity keenly. Should she object to outrageous work demands, this person's supervisor might wield in his own defense a PP/DS plan, sent to him by someone who worked with a different set of data, produced by someone else entirely. A forecaster, meanwhile, can, with some justification, disclaim responsibility for factory conditions, since she produced only a high-level prediction about demand.
As Andrew Russell and others have demonstrated, modularity has particular effects on the communication and synthesis of information. Because no single person working on a modular system has access to all of the information contained within the circuit, it is possible for every person in the system to disclaim responsibility for the system's particular effects. Modularity is, as Russell puts it, a supremely useful "means for confronting and managing complexity in a dynamic and systemic context" (258), but it also strategically obscures knowledge.
The "bullwhip effect" is a mainstay of supply-chain management theory. First described by Jay Forrester in 1958, the effect refers to the way information about demand is distorted as it travels from a distribution site to the manufacturer (Forrester).
As a whip amplifies a flick of the wrist, so do supply chains' information networks inflate consumer demand as it travels from person to person. A store manager, noting that a product's stock levels are low, may order more cases than he thinks he needs immediately in order to avoid a shortage. The manager of a regional warehouse similarly nudges her numbers up, and so on, until the several-times-inflated number reaches the manufacturer. Meanwhile, consumer demand has not actually increased, so the manufacturer's product languishes on the shelves. SAP's market forecasting functions are designed to mitigate the bullwhip effect, but as they do so, they produce another kind of whipping motion. Small tweaks in forecasts and pricing travel through SAP's circuit of components. As the data moves, it is systematically scoured of any latency, as one might press down on an air mattress to squeeze it into the tightest possible roll. As the time horizons move from months, to weeks, to days, and finally to seconds, the demands for productivity become increasingly concrete and inexorable. The predictions meet reality only when they are conveyed to workers on assembly lines, where small increases in predicted demand are translated into longer working hours, lower pay, or unsafe working conditions.
The speed of manufacturing networks is sure to increase in the years to come, even as these networks maintain their distinctive combination of perfect detail and perfect ignorance. In the last five years, SCM software like SAP has incorporated machine-learning algorithms to help managers run supply chains more efficiently. In a supply chain, machine learning can work in two directions: "demand forecasting," in which corporations attempt to calibrate supplies to consumer behavior; and, on the other end, requisitioning supplies from the appropriate vendors.
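The bullwhip amplification described above can be made concrete with a toy simulation. The number of tiers, the safety factor, and the padding rule below are all hypothetical; the point is only that a small, uniform hedge against shortage compounds multiplicatively as the order travels upstream.

```python
# Illustrative simulation of the bullwhip effect: each tier in a supply
# chain pads the order it receives before passing it upstream.
# All figures are hypothetical, chosen only to show the amplification.

def propagate_orders(consumer_demand, tiers, safety_factor=1.15):
    """Return the order quantity seen at each tier, store -> manufacturer.

    Each tier orders slightly more than it was asked for, to guard
    against a shortage; the distortion compounds multiplicatively.
    """
    orders = [consumer_demand]
    for _ in range(tiers):
        orders.append(orders[-1] * safety_factor)
    return orders

# A flick of the wrist at the checkout counter...
chain = propagate_orders(consumer_demand=100, tiers=4)
tier_names = ["store", "regional warehouse", "distributor",
              "importer", "manufacturer"]
for tier, qty in zip(tier_names, chain):
    print(f"{tier:>18}: orders for {qty:.0f} units")
```

With a modest 15 percent hedge at each of four handoffs, the manufacturer sees demand for roughly 175 units where consumers wanted 100: the whip has amplified the wrist's flick by three quarters.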
Demand forecasting may be relatively familiar to most people: a large company can derive predictions from actual consumer behavior and then use those predictions to determine how many and what kind of goods to have on hand. For example, the retailer Target famously uses predictive analytics to guess, with eerie accuracy, whether a customer is pregnant (Duhigg).
But machine learning can work on the other end, too: to devise and revise supply routes for manufactured goods. These techniques can be applied in several different ways. In one approach, companies use neural networks (a kind of machine learning) to assign a degree of risk to individual suppliers based on "training data" composed of information about suppliers' past performance. The algorithm can then devise a critical path-that is, an optimized sequence of steps-to move the raw materials through the stations of the manufacturing cycle (Teuteberg).
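To make the supplier-scoring idea concrete, here is a toy stand-in for such a model. In place of a trained neural network, it uses a hand-weighted logistic score; every feature name and weight is hypothetical, and the only claim is structural: past-performance data goes in, a single opaque risk number comes out.

```python
import math

# Toy stand-in for a supplier-risk model: a hand-weighted logistic
# score over past performance (a trained neural network would learn
# these weights from data; here they are hypothetical constants).

def risk_score(late_delivery_rate, defect_rate, years_of_history):
    """Map past-performance features to a 0-1 risk score (higher = riskier)."""
    # Linear combination of features, squashed through a sigmoid --
    # the same final step a neural classifier would apply.
    z = 4.0 * late_delivery_rate + 6.0 * defect_rate - 0.2 * years_of_history
    return 1 / (1 + math.exp(-z))

suppliers = {
    "workshop-A": risk_score(0.02, 0.01, 10),  # reliable, long history
    "workshop-B": risk_score(0.30, 0.08, 1),   # erratic newcomer
}
# The planner sees only the scores, not the reasoning behind them.
chosen = min(suppliers, key=suppliers.get)
print(chosen, suppliers)
```

Even in this miniature form, the essay's black box is visible: the planner acts on a single number whose derivation is buried inside the scoring function.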
Another approach is based on a "multi-agent system" (MAS) model: a bundle of "agents" that interact with each other with little human intervention. An agent is really a piece of software designed to mimic a function of the supply chain. Each is programmed to act independently of its fellow agents in accomplishing its designated tasks. The "disruption management" agent monitors an incoming flow of data. When it detects an abnormality-such as slower-than-expected delivery times-the agent is programmed to trigger a solution, such as switching suppliers or reconfiguring products. The agent can then coordinate with the other agents in the system to adjust the entire model to accomplish its solution. This pool of potential solutions is itself refined and ranked in terms of desirability based on machine-learning algorithms. Over time, the MAS learns to favor those solutions that lead to optimal outcomes. In theory, at least, these agents can build, reconfigure, and optimize supply chains without the need for any ongoing human guidance (Giannakis and Louis). In this scenario, as in SAP's SCM components, no one holds all of the information. Rather, the supply path is created through machine-learning protocols into which no individual person can really claim full insight.
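A minimal sketch can suggest how such a disruption-management agent behaves. The class names, thresholds, and "solution" below are hypothetical simplifications of the MAS model described above, reduced to a single agent with a single remedy: swapping suppliers when deliveries slip.

```python
# Minimal sketch of a disruption-management agent (hypothetical names
# and thresholds): it monitors lead times and, when an abnormality
# exceeds its tolerance, triggers a solution without human intervention.

class Supplier:
    def __init__(self, name, expected_lead_days):
        self.name = name
        self.expected_lead_days = expected_lead_days

class DisruptionAgent:
    """Watches incoming delivery data; swaps suppliers on a slip."""

    def __init__(self, active, backups, tolerance_days=3):
        self.active = active
        self.backups = list(backups)
        self.tolerance_days = tolerance_days
        self.log = []

    def observe(self, actual_lead_days):
        slip = actual_lead_days - self.active.expected_lead_days
        if slip > self.tolerance_days and self.backups:
            # The "solution": swap in a backup supplier, autonomously.
            old, self.active = self.active, self.backups.pop(0)
            self.log.append(f"swapped {old.name} -> {self.active.name}")
        return self.active.name

agent = DisruptionAgent(
    active=Supplier("workshop-A", expected_lead_days=7),
    backups=[Supplier("workshop-B", expected_lead_days=8)],
)
agent.observe(8)    # within tolerance: nothing happens
agent.observe(14)   # a seven-day slip: the agent swaps suppliers
print(agent.active.name, agent.log)
```

In a full MAS, many such agents would negotiate with one another, and the ranking of candidate solutions would itself be learned; but even this fragment shows how a supply route can be reconfigured with no human, anywhere, holding the whole picture.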
Machine learning is a data-hungry field: "Data is the sole nutrient in a machine-learning diet," as one SAP white paper puts it. "Algorithms need to binge on it constantly to lead a healthy and successful life" (Wellers et al., 8). And yet, logistics professionals' pursuit of perfect data is (and always has been) much more dream than reality. Even though industry newsletters tout the precision and power of machine learning, SCM depends in reality on a great deal of "noisy" data and human intervention: invoices transmitted as PDFs, rather than machine-parsable data; vendors who disclose imperfect information; humans who scan the wrong barcodes; suppliers who prefer to conduct business over the phone. Yet, the fantasy of what companies call "end-to-end visibility" remains hugely compelling.
It is clear, however, that even what business managers call "visibility" is actually strategic obscurantism. Were one to view the supply network in its entirety, its complexity would likely make it illegible. Moreover, the "perfect data" that supply-chain managers demand is put to use in algorithms that apportion labor and resources in ways that SCM software is designed to obscure. From the way a T-shirt manufacturer swaps out a cotton supplier, to the way an optimization algorithm delineates workers' schedules, the supply chain is fine-tuned to produce speed and efficiency by way of strategically placed lacunae. This algorithmic no-man's land, this realm of both knowing and not-knowing the details of what transpires, is the subject of the next section of this piece. How and when did we authorize this informational regime in which it is possible to think about capital, movement, and human life in the terms offered by the supply chain?

Heavy Weather: Cargo, Capital, and Data
The slave ship Zong set out in September 1781 from West Africa, carrying 440 enslaved African people. Three months later, near to port but faced with illness and a shortage of water, the Zong's crew made a calculation: if the enslaved people were to die onshore, the ship's owners would not recover their cost. If they were to die onboard of "natural causes," their value to the enslavers would likewise be lost. The ship's insurance, however, did cover the jettisoning of cargo. So, over a period of three days in November 1781, the crew of the Zong threw 133 men, women, and children into the sea, where they perished.
The Zong became a flashpoint for the abolitionist movement because it so baldly exposed slavery's unthinkable inhumanity. Christina Sharpe observes that the Zong also has other layers of significance (Sharpe, ch. 2). But the Zong is relevant here not only because the episode demonstrates how value works in finance capital. It is also relevant for Ian Baucom's claims about the kind of knowledge on which the Zong and its creditors depended in order to make the specific calculation they made: that of the "typical" (96-107). The embrace of the notion of the typical-that is, the qualities one can expect from the kind of thing one is dealing with-is a hallmark of finance capital. An insurer or investor must cast the particular into types in order to sort them into categories on a table (or in a database), which he then uses to make a calculation about risk and reward. The translation of a human being into a predictable value, and thus a unit that can be circulated and traded, depends on the classification of that human being into a particular category. Without categories, there is no insurance, and without insurance, there is no Zong. This is the metamorphosis that took place even before the slaves were forced onto the Zong: that of the particular body into one of a range of computable values on the actuarial spreadsheet. Indeed, it is precisely to recover Black lives from this merciless logic of categorization and value that M. NourbeSe Philip, in her poem cycle Zong!, disarticulates and then reassembles the court case that declared the enslaved passengers insurable. Philip rips apart the clockwork madness of the Zong's legal case in order to argue for a different kind of confrontation with the material of history. In Sharpe's words, "The dead appear in Philip's Zong! beyond the logic of the ledger, beyond the mathematics of insurance" (38).
This classification of human beings is the oft-unspoken precondition for globalized labor. The sophisticated software and dizzying speed of today's supply-chain networks tempt us to see these chains as symptoms of a very modern kind of global hypercapital. But the Zong reminds us that the actions of the contemporary supply chain have been authorized and made thinkable at least since the eighteenth century. The first kind of data necessary for a supply chain is data about labor-which is to say, about human beings. What happens to human beings in a supply chain may be disastrous, but it is also an algorithmic imperative. A calculation about human value demanded the murder of enslaved people on the Zong, just as it demands that workers at a Samsung supplier in Huizhou, Guangdong, earn an average of 238.55 USD per month (An Investigative Report on HEG Technology). These decisions, at least rhetorically, are beyond anyone's immediate control. Companies like Apple and Nike may occasionally say they want to clean up working conditions for their subcontractors, but in truth, of course, they depend intimately on the kind of logic that categorizes and assigns lower value to the labor of people in the global South; otherwise, we wouldn't have global supply chains, at least not to any great extent.
Echoes of the Zong reverberated in 2015, when the Associated Press broke the news that Burmese migrants were being forced into slavery aboard shrimp boats off the coast of Thailand. Logan Kock, a shrimp company's vice president for responsible sourcing, responded: "The supply chain is quite cloudy, especially when it comes from offshore. Is it possible a little of this stuff is leaking through? Yeah, it is possible. We are all aware of it" (Mason et al.). For the shrimp executive, the problem in the supply chain is climatological, not systemic; supply chains are cloudy, they leak, events unfold. Again, we find ourselves in a no-man's-land, in which events simply unfold and all responsibility can be disclaimed. We know when our shrimp will arrive, but their route to us is shrouded in a dense fog.
The supply chain depends on this balance between the information it can assimilate into itself and the information it cannot; information that is accepted and information that is refused. Data in the supply-chain model is supremely interchangeable. In practice, in structure, and by design, SAP and similar systems have taken great pains to assimilate the heterogeneous data that passes through them. But even as the supply chain cannot abide heterogeneity in its data, it depends, ultimately, on other kinds of difference for its very existence-difference in standards of living, so that Chinese workers can be paid so little; racial and gender difference, so that it seems natural that Bangladeshi women, for example, should work for a pittance in a dangerous apparel sweatshop, or that Burmese men would find themselves enslaved on shrimping boats far from home. To accommodate these competing demands, the company in supply-chain capitalism must operate in a peculiar informational zone, one in which it ravenously consumes some data-such as that about price and location-even as it cannot absorb other data-such as that about labor practices. In other words, it wants perfect access to information, but only some information. The result of all this is goods whose arrival we can predict to the hour, but whose conditions of assembly remain mysterious by design.
It is worth considering whether this pattern of strategic omission is unique to supply chains or whether it extends to other informational infrastructures. We might think here of the Uber driver, who is told exactly where he must go to pick up a passenger but has no idea when or if he will be called upon to drive that evening. We might also consider the retail employee whose working hours are determined not by her own availability but by an algorithm that requires her to show up exactly when she's needed, with very little advance notice. Like global supply chains, these informational landscapes combine exquisite precision with a larger context of enormous uncertainty. This particular combination of the exacting and the vague seems only to grow more common as gig-economy labor takes a deeper hold on the structure of commerce.
Scholarship on infrastructure tends to focus on the way in which its component parts come together to make a functional whole. Yet this essay's observations of global supply chains suggest an alternate line of inquiry: what goes unsaid and unseen in the way business teaches us to perceive the world? Alberto Toscano and Jeff Kinkle have argued, following Fredric Jameson, that an inability to map the operations of capital leads to a state of disorientation-a situation of helplessness when we confront a faceless, shapeless edifice of power (Toscano and Kinkle, "Introduction"). To accurately visualize the operations of global capital would require us to connect the nodes represented by labor and resources at every station of a commodity's journey across the globe. Close scrutiny of information's movement through global supply chains can help us understand why visualizing global commerce has proved so difficult. Its scale is massive and its speed is dizzying. But, more pressingly, information drops out of the circuits of global capital flow at crucial junctures in its journey. It is too easy to view global logistics as a kind of unstoppable juggernaut: an assemblage so powerful and omniscient as to be invulnerable. And yet, as we have seen, the power of global capital derives not from omniscience but from a kind of selective sight. To map the movement of commerce, as Toscano and Kinkle encourage us to do, would require filling in gaps that even the most sophisticated technology has been unable to ascertain. Understanding the power of logistics is therefore not a matter of ascribing potency to key people or corporations but of shedding light on precisely those operations that have ducked out of our sight. We need to understand these informational black boxes not as vulnerabilities within infrastructure but as strategic omissions that are as critical to the operation of the system as the parts that we can see at work.
Most of the time, the informational dance between the selectively known and the unknown works surprisingly well. But large portions of the global supply chain came crashing to a halt in the first quarter of 2020 and continued to founder late into 2021.
When enough waystations on the global supply chain shut down, it was as though supply-chain managers were left clutching the ends of strings that had suddenly been snipped. Ordinarily, products make their way through the maze of intermediaries; exactly how they do so is rarely clear. But since no one really knew what had happened all the way down the chain, no one could predict how long the chain would take to self-heal. Gathering intelligence on these suppliers and sub-suppliers could take years, as it did for one Japanese company that attempted to map its supply chain in the wake of the 2011 tsunami (Choi et al.).
In 2020, the supply chain eventually managed to self-heal to a great extent, thanks largely to China's rapid and effective pandemic response. But its sudden, if temporary, vulnerability is instructive. Critical logistics scholars are very interested in "choke points," junctures at which supply chains have hidden vulnerabilities (Alimahomed-Wilson and Ness). But the pandemic suggests that the most critical breakpoint is not one particular link in the chain, but the chain itself: the way that it coils and recoils, unsupervised, through a labyrinth of contractors. The dance between knowing and not-knowing is intricate and feverishly fast. Miss a step-know too much, or too little-and the players risk crashing down.