When Steve Jobs stepped on stage to thunderous applause at Apple’s Worldwide Developer Conference on 6 June, he announced a new product which everyone in the room had been expecting: iCloud, a seamless service for syncing users’ data, documents, music and photographs between devices. But he also did something unexpected, something rarely done in the world of network technologies: he pulled back the curtain to reveal what “the cloud” really is. “If you don’t think we’re serious about this,” said Jobs, referring to iCloud, “this is our third data centre. It’s in Maiden, North Carolina. This is what it looks like.” The slide behind him showed a vast, windowless and ground-hugging white building, set in a deep forest, the size of several football fields and abutted by squat round cooling towers. “It’s a pretty large place full of stuff. Very expensive stuff.”

The idea of “the cloud” is almost as old as the internet; indeed, it is one conception of the internet, as a ubiquitous, pervasive network of access points and data services, of computation as an amorphous public utility. But the reality of the cloud, of the internet itself, is that it is a physical infrastructure of cables which run beneath streets and oceans, connecting exchanges and switches to servers in offices, homes and data centres.

The fragility of this network has been emphasised by recent events. In January, the former Egyptian regime effectively cut the country off from the internet with a few phone calls to the small number of state-licensed internet service providers, which control virtually all the connections in and out of the country. The blackout lasted six days. In February 2008, a ship attempting to anchor in bad weather in the Mediterranean accidentally sliced through the Flag Europe-Asia and Sea-Me-We 4 fibre-optic cables, which between them carry 75 percent of all traffic to the Middle East and South Asia, a region with over 75 million internet users.
In April of this year, a 75-year-old Georgian woman scavenging for copper to sell as scrap cut the main fibre link to neighbouring Armenia. Georgia supplies 90 percent of Armenia’s connectivity, and the so-called “spade-hacker” plunged 3.2 million Armenians, and a significant number of Georgians and Azerbaijanis, into data darkness for over five hours.

Andrew Blum, a writer for Wired who is writing a book about the physical infrastructure of the internet, calls these physical points in the network “choke points”: geographical locations where the networks of networks connect to one another “through something as simple and tangible as a yellow-jacketed fibre-optic cable”. As well as the more overtly brand-owned data centres, like the one that Jobs dramatically revealed, many of these networks meet one another in nominally neutral “carrier hotels”.

One such carrier hotel is Terremark’s “NAP of the Americas” in Miami, Florida. Terremark is a multinational data centre and network infrastructure provider, and the NAP, or network access point, provides meeting points for 160 networks. It also switches the majority of digital traffic between South America, Central America and the Caribbean and the rest of the world, and hosts one of the 13 root servers of the internet’s domain name system (DNS), the critical technology that translates human-readable domain names into IP (internet protocol) addresses, the language of the internet itself. The NAP is a highly secure 750,000sq ft fortress, its seven-inch-thick, steel-reinforced concrete exterior panels designed to withstand a category-5 hurricane and a 100-year storm. Situated in a downtown location, some seven storeys high and topped by golf-ball radomes, the NAP is not easy to hide, but like many of its kind it is unmarked by corporate logos, and photography is strongly discouraged.
Google, which recently bought another similar and nominally independent facility, 111 8th Avenue in New York, has spent the last decade building the single largest network of data centres on Earth, nicknamed the Googleplex. In 2008, this was estimated to consist of 12 significant installations in the United States, with another three under construction, and at least five in Europe. Google, however, is notoriously secretive about these locations, to the extent that it obscures its facilities on its own mapping and satellite-viewing applications. In a 2008 article in Harper’s Magazine on Google’s data centre in The Dalles, Oregon, Ginger Strand wrote that the blueprints for such buildings “are proof that the Web is no ethereal store of ideas, shimmering over our heads like the aurora borealis. It is a new heavy industry, an energy glutton that is only growing hungrier.”

Data centres have evolved dramatically from their origins as the boxy rooms housing single large mainframes at the dawn of the computing era. The modern data centre incorporates back-up power generators, state-of-the-art inert-gas fire-suppression systems, multiple telecommunications-network connections, rack upon rack of quietly humming servers, switches and modems, and the electrical and water supplies required to keep them running under optimal conditions with minimal human presence or interaction. But the visible architecture of the data centre has changed little: typical examples are nondescript office buildings with mirrored or shuttered windows, deliberately dull to the point of deflecting unsought attention; or vast, distribution-park-style groundscrapers of the kind unveiled by Jobs: the size of football fields, but marked with few clues as to their actual function.

Counter-examples are rare. In 2008, Swedish internet service provider Bahnhof opened Pionen, a data centre located 100ft underground in a former nuclear bunker in the centre of Stockholm.
Bahnhof deliberately styled the facility after James Bond films and 1970s science fiction, with greenhouses, waterfalls, German submarine engines and klaxons, in order to stand out in a discreet industry. “The unique design makes it a ‘talk about’ facility,” said Bahnhof chief executive Jon Karlung. “If you have been inside Pionen, you will for sure tell somebody else about it.”

Citigroup’s LEED-certified data centre in Frankfurt, by Arup, announces its presence with a vast “green wall”, irrigated with recycled cooling water, while HSBC’s South Yorkshire National Data Centre (SYNDC), just off Junction 36 of the M1 and built by Midland Bank in the mid-1970s, was modelled after a supertanker, complete with a command-and-control bridge joining the two main computer rooms, and tall green cooling funnels erupting from well-tended lawns. Its rounded corners and white facades recall the Bauhaus, as well as the bold, functionally expressive design of another data centre from the 1970s: Hubbard, Ford and Partners’ fibreglass-clad Mondial House on the Thames, now demolished but once Europe’s largest international telecommunications complex.

But even these structures do their best to efface themselves. Pionen is accessed via a set of thick steel doors recessed into a cliff face, and the SYNDC is screened from the road by tall trees and fences, with inquisitive passers-by warned off by security; posters on local messageboards in Sheffield refer to it, conspiratorially, as “Teletubbyland”. Iain MacDonald, director of architect YRM, says the main reason for this discretion is security on three fronts: terrorism, industrial espionage and theft.
The legitimacy of the last of these is confirmed by news reports: the break-in at Level 3’s facility in Braham Street, east London, in March 2006, when thieves absconded with a valuable router and brought down a major London network in the process; or the burglary at Easynet’s data centre in nearby Brick Lane the same year, when equipment worth an estimated £6 million was loaded into the back of a van and spirited away. However, it is hard to equate discretion with real security when building owners and locations are easily discoverable on the web and in municipal records.

YRM has recently completed Telehouse West, a flagship facility at Telehouse’s data centre campus in Docklands, east London. Nine storeys high, with 19,000sq m of technical and customer space, the building stands out from other data centres, including the existing Telehouse and Global Switch facilities on the same site, and not just for its technical provisioning. The distinctive, windowless envelope incorporates a “disruptive pattern”, breaking up the facades with a series of tones drawn from a monochrome, silver-grey palette, resembling the pixellation of the low-resolution imagery inherent in the network itself. Together with high-quality cladding, expressed cross-bracing and angled louvres that create a “crown” of visual interest, Telehouse West attempts to balance what MacDonald calls “a Lloyds-type building which expresses its services” with “an aesthetic quality”. MacDonald cites as an influence William Gibson’s Pattern Recognition, a novel concerned with the human tendency to see patterns in meaningless data and with the tensions between art and corporatisation; he is also influenced by the way in which films like Blade Runner reset the urban landscape from a Logan’s Run-inspired modernism to a “dirty hybridity”.
MacDonald sees this era coming to an end as corporations seek to use architecture as branding, as at Mercedes-Benz World at Brooklands, an Aukett Fitzroy Robinson-designed scheme offering visitors a range of experiences from museum to circuit-driving to retail. “You could design a data centre and, depending what you clad it in, you might be hard pushed to see it as that different from an art gallery,” says MacDonald. New art facilities like David Chipperfield’s Turner Contemporary in Margate are morphing into digital content institutes, sharing the data centre’s challenges of managing complex internal requirements – lighting, atmosphere and temperature control – while projecting the appropriate brand values.

What is at stake in the new data centre architecture is the way in which architects help to define and shape the image of the network for the general public. Data centres are the outward embodiment of a huge range of public and private services, from banking to electronic voting, government bureaucracy to social networks. As such, they stand as a new form of civic architecture, at odds with their historical desire for anonymity.

Facebook’s largest facility is its new data centre in Prineville, Oregon, which taps into the same cheap electricity that powers Google’s project in The Dalles. Here, the social network of more than 600 million users manifests itself as a 307,000sq ft site that currently employs over 1,000 construction workers – a workforce that will dwindle to just 35 permanent jobs once the centre is operational. But in addition to the $110,000 a year Facebook has promised to local civic funds, and a franchise fee for power sold by the city, comes a new definition of data centres and their workers, articulated by site manager Ken Patchett: “We’re the blue-collar guys of the tech industry, and we’re really proud of that. This is a factory. It’s just a different kind of factory than you might be used to.
It’s not a sawmill or a plywood mill, but it’s a factory nonetheless.” This sentiment is echoed in MacDonald’s description of “a new-age industrial architecture”, of cities re-industrialised rather than trying to become “cultural cities”: a modern Milan emphasising the value of the engineering, craft and “making” inherent in information technology. The role of the architect in this new digital real estate is to work at many levels, in MacDonald’s words “from planning and building design right down to cultural integration with other activities”.

The cloud, the network, the “new heavy industry” is reshaping the physical landscape, from the reconfiguration of Lower Manhattan to provide low-latency access to the New York Stock Exchange, to the tangles of transatlantic fibre cables coming ashore at Widemouth Bay, an old smugglers’ haunt on the Cornish coast. A sector that was once in the shadows is now coming out into the open, revealing a tension between historical discretion and corporate projection, and bringing with it the opportunity to define a new architectural vocabulary for the digitised world.
Image Jack Featherstone
Words James Bridle