infrastructure


Let’s start with Facebook’s Surveillance Machine, by Zeynep Tufekci in last Monday’s New York Times. Among other things (all correct), Zeynep explains that “Facebook makes money, in other words, by profiling us and then selling our attention to advertisers, political actors and others. These are Facebook’s true customers, whom it works hard to please.”

Irony Alert: the same is true for the Times, along with every other publication that lives off adtech: tracking-based advertising. These pubs don’t just open the kimonos of their readers. They bring readers’ bare digital necks to vampires ravenous for the blood of personal data, all for the purpose of aiming “interest-based” advertising at those same readers, wherever those readers’ eyeballs may appear—or reappear in the case of “retargeted” advertising.

With no control by readers (beyond tracking protection, which relatively few know how to use, and for which there is no single approach, standard or experience), and no blood-valving by the publishers who bare those readers’ necks, who knows what the hell actually happens to the data?

Answer: nobody can, because the whole adtech “ecosystem” is a four-dimensional shell game with hundreds of players:

or, in the case of “martech,” thousands:

For one among many views of what’s going on, here’s a compressed screen shot of what Privacy Badger showed going on in my browser behind Zeynep’s op-ed in the Times:

[Added later…] @ehsanakhgari tweets pointage to WhoTracksMe’s page on the NYTimes, which shows this:

And here’s more irony: a screen shot of the home page of RedMorph, another privacy protection extension:

That quote is from Free Tools to Keep Those Creepy Online Ads From Watching You, by Brian X. Chen and Natasha Singer, and published on 17 February 2016 in the Times.

The same irony applies to countless other correct and important reporting on the Facebook/Cambridge Analytica mess by other writers and pubs. Take, for example, Cambridge Analytica, Facebook, and the Revelations of Open Secrets, by Sue Halpern in yesterday’s New Yorker. Here’s what RedMorph shows going on behind that piece:

Note that I have the data leak toward Facebook.net blocked by default.

Here’s a view through RedMorph’s controller pop-down:

And here’s what happens when I turn off “Block Trackers and Content”:

By the way, I want to make clear that Zeynep, Brian, Natasha and Sue are all innocents here, thanks both to the “Chinese wall” between the editorial and publishing functions of the Times, and to the simple fact that the route any ad takes between advertiser and reader through any number of adtech intermediaries is akin to a ball falling through a pinball machine. Refresh your page while reading any of those pieces and you’ll see a different set of ads, no doubt aimed by automata guessing that you, personally, should be “impressed” by those ads. (They’ll count as “impressions” whether you are or not.)

Now…

What will happen when the Times, the New Yorker and other pubs own up to the simple fact that they are just as guilty as Facebook of leaking their readers’ data to other parties, for—in many if not most cases—God knows what purposes besides “interest-based” advertising? And what happens when the EU comes down on them too? It’s game-on after 25 May, when the EU can start fining violators of the General Data Protection Regulation (GDPR). Key fact: the GDPR protects the data blood of what it calls “EU data subjects” wherever those subjects’ necks are exposed in the borderless digital world.

To explain more about how this works, here is the (lightly edited) text of a tweet thread posted this morning by @JohnnyRyan of PageFair:

Facebook left its API wide open, and had no control over personal data once those data left Facebook.

But there is a wider story coming: (thread…)

Every single big website in the world is leaking data in a similar way, through “RTB bid requests” for online behavioural advertising #adtech.

Every time an ad loads on a website, the site sends the visitor’s IP address (indicating physical location), the URL they are looking at, and details about their device, to hundreds (often thousands) of companies. Here is a graphic that shows the process.

The website does this to let these companies “bid” to show their ad to this visitor. Here is a video of how the system works. In Europe this accounts for about a quarter of publishers’ gross revenue.

Once these personal data leave the publisher, via “bid request”, the publisher has no control over what happens next. I repeat that: personal data are routinely sent, every time a page loads, to hundreds/thousands of companies, with no control over what happens to them.

This means that every person, and what they look at online, is routinely profiled by companies that receive these data from the websites they visit. Where possible, these data are combined with offline data. These profiles are built up in “DMPs”.

Many of these DMPs (data management platforms) are owned by data brokers. (Side note: the FTC’s 2014 report on data brokers is shocking. See https://www.ftc.gov/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014.) There is no functional difference between an #adtech DMP and Cambridge Analytica.

—Terrell McSweeny, Julie Brill and EDPS

None of this will be legal under the #GDPR. (See one reason why at https://t.co/HXOQ5gb4dL). Publishers and brands need to take care to stop using personal data in the RTB system. Data connections to sites (and apps) have to be carefully controlled by publishers.

So far, #adtech’s trade body has been content to cover over this wholesale personal data leakage with meaningless gestures that purport to address the #GDPR (see my note on @IABEurope current actions here: https://t.co/FDKBjVxqBs). It is time for a more practical position.

And advertisers, who pay for all of this, must start to demand that safe, non-personal data take over in online RTB targeting. RTB works without personal data. Brands need to demand this to protect themselves – and all Internet users too. @dwheld @stephan_lo @BobLiodice

Websites need to control
1. which data they release into the RTB system
2. whether ads render directly in visitors’ browsers (where DSPs’ JavaScript can drop trackers)
3. what 3rd parties get to be on their page
@jason_kint @epc_angela @vincentpeyregne @earljwilkinson 11/12

Let’s work together to fix this. 12/12
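To make the “bid request” in that thread concrete, here is a rough sketch in Python of the kind of payload an ad exchange broadcasts to bidders. It is loosely modeled on the OpenRTB protocol, but every field name and value here is invented for illustration; real requests carry far more detail:

```python
import json

# Illustrative only: a simplified bid request, loosely modeled on OpenRTB.
# All names and values below are hypothetical.
bid_request = {
    "id": "auction-8f2c",  # unique ID for this ad auction
    "site": {
        # the exact page the visitor is reading
        "page": "https://publisher.example/article-you-are-reading",
    },
    "device": {
        "ip": "203.0.113.42",  # visitor's IP address (implies physical location)
        "ua": "Mozilla/5.0 (X11; Linux x86_64)",  # browser and OS details
    },
    "user": {
        "id": "cookie-synced-id-123",  # identifier matched across tracker networks
    },
}

# This is what gets sent, per ad slot, per page load, to hundreds of companies.
payload = json.dumps(bid_request, indent=2)
print(payload)
```

Every company that receives this payload can log it whether or not it wins the auction, which is the leakage the thread describes.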

Those last three recommendations are all good, but they also assume that websites, advertisers and their third party agents are the ones with the power to do something. Not readers.

But there’s lots readers will be able to do. More about that shortly. Meanwhile, publishers can get right with readers by dropping #adtech and going back to publishing the kind of high-value brand advertising they’ve run since forever in the physical world.

That advertising, as Bob Hoffman (@adcontrarian) and Don Marti (@dmarti) have been making clear for years, is actually worth a helluva lot more than adtech, because it delivers clear creative and economic signals and comes with no cognitive overhead (for example, wondering where the hell an ad comes from and what it’s doing right now).

As I explain here, “Real advertising wants to be in a publication because it values the publication’s journalism and readership” while “adtech wants to push ads at readers anywhere it can find them.”

Going back to real advertising is the easiest fix in the world, but so far it’s nearly unthinkable because we’ve been defaulted for more than twenty years to an asymmetric power relationship between readers and publishers called client-server. I’ve been told that client-server was chosen as the name for this relationship because “slave-master” didn’t sound so good; but I think the best way to visualize it is calf-cow:

As I put it at that link (way back in 2012), “Client-server, by design, subordinates visitors to websites. It does this by putting nearly all responsibility on the server side, so visitors are just users or consumers, rather than participants with equal power and shared responsibility in a truly two-way relationship between equals.”

It doesn’t have to be that way. Beneath the Web, the Net’s TCP/IP protocol—the gravity that holds us all together in cyberspace—remains no less peer-to-peer and end-to-end than it was in the first place. Meaning there is nothing to the Net that prevents each of us from having plenty of power on our own.

On the Net, we don’t need to be slaves, cattle or blood bags. We can be human. In legal terms, we can operate as first parties rather than second ones. In other words, the sites of the world can click “agree” to our terms, rather than the other way around.

Customer Commons is working on exactly those terms. The first publication to agree to readers’ terms is Linux Journal, where I am now the editor-in-chief. The first of those terms will say “just show me ads not based on tracking me,” and is hashtagged #DoNotByte.

In Help Us Cure Online Publishing of Its Addiction to Personal Data, I explain how this models the way advertising ought to be done: by the grace of readers, with no spying.

Obeying readers’ terms also carries no risk of violating privacy laws, because every pub will have contracts with its readers to do the right thing. This is totally do-able. Read that last link to see how.

As I say there, we need help. Linux Journal still has a small staff, and Customer Commons (a California-based 501(c)(3) nonprofit) so far consists of five board members. What it aims to be is a worldwide organization of customers, as well as the place where terms we proffer can live, much as Creative Commons is where personal copyright licenses live. (Customer Commons is modeled on Creative Commons. Hats off to the Berkman Klein Center for helping bring both into the world.)

I’m also hoping other publishers, once they realize that they are no less a part of the surveillance economy than Facebook and Cambridge Analytica, will help out too.

[Later…] Not long after this post went up I talked about these topics on the Gillmor Gang. Here’s the video, plus related links.

I think the best push-back I got there came from Esteban Kolsky (@ekolsky), who (as I recall, anyway) saw less than full moral equivalence between what Facebook and Cambridge Analytica did to screw with democracy and what the New York Times and other ad-supported pubs do by baring the necks of their readers to dozens of data vampires.

He’s right that they’re not equivalent, any more than apples and oranges are equivalent. The sins are different; but they are still sins, just as apples and oranges are still both fruit. Exposing readers to data vampires is simply wrong on its face, and we need to fix it. That it’s normative in the extreme is no excuse. Nor is the fact that it makes money. There are morally uncompromised ways to make money with advertising, and those are still available.

Another push-back is the claim by many adtech third parties that the personal data blood they suck is anonymized. While that may be so, correlation is still possible. See Study: Your anonymous web browsing isn’t as anonymous as you think, by Barry Levine (@xBarryLevine) in Martech Today, which cites De-anonymizing Web Browsing Data with Social Networks, a study by Jessica Su (@jessicatsu), Ansh Shukla (@__anshukla__) and Sharad Goel (@5harad) of Stanford and Arvind Narayanan (@random_walker) of Princeton.

(Note: Facebook and Google follow logged-in users by name. They also account for most of the adtech business.)

One commenter below noted that this blog carries six trackers as well (most of which I block). Here is how those look in Ghostery:

So let’s fix this thing.

[Later still…] Lots of comments in Hacker News as well.

[Later again (8 April 2018)…] About the comments below (60+ so far): the version of commenting used by this blog doesn’t support threading. If it did, my responses to comments would appear below each one. Alas, some not only appear out of sequence, but others don’t appear at all. I don’t know why, but I’m trying to find out. Meanwhile, apologies.

Just before it started, the geology meeting at the Santa Barbara Central Library on Thursday looked like this from the front of the room (where I also tweeted the same pano):

Geologist Ed Keller

Our speakers were geology professor Ed Keller of UCSB and Engineering Geologist Larry Gurrola, who also works and studies with Ed. That’s Ed in the shot below.

As a geology freak, I know how easily terms like “debris flow,” “fanglomerate” and “alluvial fan” can clear a room. But this gig was SRO. That’s because around 3:15 in the morning of January 9th, debris flowed out of canyons and deposited fresh fanglomerate across the alluvial fan that comprises most of Montecito, destroying (by my count on the map below) 178 buildings, damaging more than twice that many, and killing 23 people. Two of those—a 2-year-old girl and a 17-year-old boy—are still buried in the fresh fanglomerate and sought by cadaver dogs. The whole thing is beyond sad and awful.

The town was evacuated after the disaster so rescue and recovery work could proceed without interference, and infrastructure could be found and repaired: a job that required removing twenty thousand truckloads of mud and rocks. That work continues while evacuation orders are gradually lifted, allowing the town to repopulate itself to the very limited degree it can.

I talked today with a friend whose business is cleaning houses. Besides grieving the dead, some of whom were friends or customers, she reports that the cleaning work is some of the worst she has ever seen, even in homes that were spared the mud and rocks. Refrigerators and freezers, sitting closed and without electricity for weeks, reek of death and rot. Other customers won’t be back because their houses are gone.

Highway 101, one of just two freeways connecting Northern and Southern California, runs through town near the coast, more than two miles from the mountain front. Three debris flows converged on the highway and used it as a catch basin, filling its deep parts to the height of at least one bridge before spilling over its far side and continuing to the edge of the sea. It took two weeks of constant excavation and repair work before traffic could move again. Most exits remain closed. Coast Village Road, Montecito’s Main Street, is open for employees of stores there, but little is open for customers yet, since infrastructural graces such as water are not fully restored. (I saw the Honor Bar operating with its own water tank, and a water truck nearby.) Opening the Upper Village will take longer. Some landmark institutions, such as San Ysidro Ranch and La Casa Santa Maria, will take years to restore. (From what I gather, San Ysidro Ranch, arguably the nicest hotel in the world, was nearly destroyed. Its website thanks firefighters for salvation from the Thomas Fire. But nothing, I gather, could have saved it from the huge debris flow that wiped out nearly everything on the flanks of San Ysidro Creek. All the top red dots along San Ysidro Creek in the map below mark lost buildings at the Ranch.)

Here is a map with final damage assessments. I’ve augmented it with labels for the canyons and creeks (with one exception: a parallel creek west of Toro Canyon Creek):

Click on the map for a closer view, or click here to view the original. On that one you can click on every dot and read details about it.

I should pause to note that Montecito is no ordinary town. Demographically, it’s Beverly Hills draped over a prettier landscape and attractive to people who would rather not live in Beverly Hills. (In fact, the number of notable persons Wikipedia lists for Montecito outnumbers those it lists for Beverly Hills by a score of 77 to 71.) Culturally, it’s a village. Last Monday in The New Yorker, one of those notable villagers, T. Coraghessan Boyle, unpacked some other differences:

I moved here twenty-five years ago, attracted by the natural beauty and semirural ambience, the short walk to the beach and the Lower Village, and the enveloping views of the Santa Ynez Mountains, which rise abruptly from the coastal plain to hold the community in a stony embrace. We have no sidewalks here, if you except the business districts of the Upper and Lower Villages—if we want sidewalks, we can take the five-minute drive into Santa Barbara or, more ambitiously, fight traffic all the way down the coast to Los Angeles. But we don’t want sidewalks. We want nature, we want dirt, trees, flowers, the chaparral that did its best to green the slopes and declivities of the mountains until last month, when the biggest wildfire in California history reduced it all to ash.

Fire is a prerequisite for debris flows, our geologists explained. So is unusually heavy rain in a steep mountain watershed. There are five named canyons, each its own watershed, above Montecito, as we see on the map above. There are more to the east, above Summerland and Carpinteria, the next two towns down the coast. Those towns also took some damage, though less than Montecito.

Ed Keller put up this slide to explain conditions that trigger debris flows, and how they work:

Ed and Larry were emphatic about this: debris flows are not landslides, nor do many start that way (though one did in Rattlesnake Canyon 1100 years ago). They are also not mudslides, so we should stop calling them that. (Though we won’t.)

Debris flows require sloped soils left bare and hydrophobic—resistant to water—after a recent wildfire has burned off the chaparral that normally (as geologists say) “hairs over” the landscape. For a good look at what those soil surfaces look like, and how they are likely to respond to rain, look at the smooth slopes on the uphill side of 101 east of La Conchita. Notice how the surface is not only a smooth brown or gray, but has a crust on it. In a way, the soil surface has turned to glass. That’s why water runs off of it so rapidly.

Wildfires are common, and chaparral is adapted to them, becoming fuel for the next fire as it regenerates and matures. But rainfalls as intense as this one are not common. In just five minutes, more than half an inch of rain fell in the steep and funnel-like watersheds above Montecito. This happens about once every few hundred years, or about as often as a tsunami.
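To put that burst in perspective, here is a quick back-of-the-envelope conversion (the half-inch-in-five-minutes figure is from the talk; the arithmetic is mine):

```python
# Convert the reported burst (at least 0.5 inch of rain in 5 minutes)
# into hourly rates, in inches and millimeters.
inches_in_burst = 0.5
burst_minutes = 5

rate_in_per_hr = inches_in_burst * (60 / burst_minutes)  # 6.0 inches per hour
rate_mm_per_hr = rate_in_per_hr * 25.4                   # about 152 mm per hour

print(rate_in_per_hr, rate_mm_per_hr)
```

Sustained for a full hour, that rate would drop six inches of rain, which helps explain why the runoff overwhelmed the canyons so quickly.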

It’s hard to generalize about the combination of factors required, but Ed has worked hard to do that, and this slide of his is one way of illustrating how debris flows happen eventually in places like Montecito and Santa Barbara:

From bottom to top, here’s what it says:

  1. Fires happen almost regularly, spreading most widely where chaparral has matured to become abundant fuel, as the firefighters like to call it.
  2. Flood events are more random, given the relative rarity of rain and even more rare rains of “biblical” volume. But they do happen.
  3. Stream beds in the floors of canyons accumulate rocks and boulders that roll down the gradually eroding slopes over time. The depth of these is expressed as basin instability. Debris flows clear out the rocks and boulders when a big flood event comes right after a fire, and the basin becomes stable (relatively rock-free) again.
  4. The sediment yield in a flood (F) is maximum when a debris flow (DF) occurs.
  5. Debris flows tend to happen once every few hundred years. And you’re not going to get the big ones if you don’t have the canyon stream bed full of rocks and boulders.

About this set of debris flows in particular:

  1. Destruction down Oak Creek wasn’t as bad as on Montecito, San Ysidro, Buena Vista and Romero Creeks because the canyon feeding it is smaller.
  2. When debris flows hit an obstruction, such as a bridge, they seek out a new bed to flow on. This is one of the actions that creates an alluvial fan. From the map it appears something like that happened—
    1. Where the flow widened when it hit Olive Mill Road, fanning east of Olive Mill to destroy all three blocks between Olive Mill and Santa Elena Lane before taking the Olive Mill bridge across 101 and down to the Biltmore, while also helping other flows fill 101 as well. (See Mac’s comment below, and his link to a topo map.)
    2. In the area between Buena Vista Creek and its East Fork, which come off different watersheds
    3. Where a debris flow forked south of Mountain Drive after destroying San Ysidro Ranch, continuing down both Randall and El Bosque Roads.

For those who caught (or are about to catch) Ellen’s FaceTime with Oprah visiting neighbors, that happened among the red dots at the bottom end of the upper destruction area along San Ysidro Creek, just south of East Valley Road. Oprah’s own place is in the green area beside it on the left, looking a bit like Versailles. (Credit where due, though: Oprah’s was a good and compassionate report.)

Big question: did these debris flows clear out the canyon floors? We (meaning our geologists, sedimentologists, hydrologists and other specialists) won’t know until they trek back into the canyons to see how it all looks. Meanwhile, we do have clues. For example, here are after-and-before photos of Montecito, shot from space. And here is my close-up of the latter, shot one day after the event, when everything was still bare streambeds in the mountains and fresh muck in town:

See the white lines fanning back into the mountains through the canyons (Cold Spring, San Ysidro, Romero, Toro) above Montecito? Ed explained that these appear to be the washed out beds of creeks feeding into those canyons. Here is his slide showing Cold Spring Creek before and after the event:

Looking back at Ed’s basin threshold graphic above, one might say that there isn’t much sediment left for stream beds to yield, and that those in the floors of the canyons have returned to stability, meaning there’s little debris left to flow.

But that photo was of just one spot. There are many miles of creek beds to examine back in those canyons.

Still, one might hope that Montecito has now had its required 200-year event, and a couple more centuries will pass before we have another one.

Ed and Larry caution against such conclusions, emphasizing that most of Montecito’s and Santa Barbara’s inhabited parts gain their existence, beauty or both by grace of debris flows. If your property features boulders, Ed said, a debris flow put them there, and did that not long ago in geologic time.

For an example of boulders as landscape features, here are some we quarried out of our yard more than a decade ago, when we were building a house dug into a hillside:

This is deep in the heart of Santa Barbara.

The matrix mud we now call soil here is likely a mix of Juncal and Cozy Dell shale, Ed explained. Both are poorly lithified silt and erode easily. The boulders are a mix of Matilija and Coldwater sandstone, which comprise the hardest and most vertical parts of the Santa Ynez mountains. The two are so similar that only a trained eye can tell them apart.

All four of those geological formations were established long after dinosaurs vanished. All also accumulated originally as sediments, mostly on ocean floors, probably not far from the equator.

To illustrate one chapter in the story of how those rocks and sediments got here, UCSB has a terrific animation of how the transverse (east-west) Santa Ynez Mountains came to be where they are. Here are three frames in that movie:

What it shows is how, when the Pacific Plate was grinding its way northwest about eighteen million years ago, a hunk of that plate about a hundred miles long and the shape of a bread loaf broke off. At the top end was the future Malibu hills, and at the bottom end was the future Point Conception, then situated south of what’s now Tijuana. The future Santa Barbara was west of the future Newport Beach. Then, when the Malibu end of this loaf got jammed at the future Los Angeles, the bottom end of the loaf swept out, clockwise and intact. At the start it was pointing at 5 o’clock, and at the end (which hasn’t come yet) it pointed at 9 o’clock. This was, and remains, a sideshow off the main event: the continuing crash of the Pacific Plate and the North American one.

Here is an image that helps, from that same link:

Find more geology, with lots of links, in Making sense of what happened to Montecito. I put that post up on the 15th and have been updating it since then. It’s the most popular post in the history of this blog, which I started in 2007. There are also 58 comments, so far.

I’ll be adding more to this post after I visit as much as I can of Montecito (exclusion zones permitting). Meanwhile, I hope this proves useful. Again, corrections and improvements are invited.

30 January

 

This post continues the inquiry I started with Making sense of what happened to Montecito. That post got a record number of reads for this blog, and 57 comments as well.

I expect to learn more at the community meeting this evening with UCSB geologist Ed Keller in the Faulkner Room in the main library in Santa Barbara. Here’s the Library schedule. Note that the meeting will be streamed live on Facebook.

Meanwhile, to help us focus on the geology questions, here is the final post-mudslide damage inspection map of Montecito:

I left out Carpinteria because, of the four structures flagged there, three were blue (affected) and one was yellow (minor); none were orange (major) or red (destroyed). I’m also guessing they were damaged by flooding rather than debris flow. And I want to make the map as legible as possible, so we can focus on where the debris flows happened, and how we might understand the community’s prospects for the future.

So here are my questions, all subject to revision and correction.

  1. How much of the damage was due to debris flow alone, and how much to other factors (e.g. rain-caused flooding, broken water pipes)?
  2. Was concentration of rain the main reason why we saw flows in the canyons above Montecito, but not (or less so) elsewhere?
  3. Where exactly did the debris flow from? And has the area been surveyed well enough to predict what future debris flows might happen if we get big rains this winter and ones to follow?
  4. Do we need bigger catch basins for debris, like they have at the base of the San Gabriels, above Los Angeles’ basin?
  5. How do the slopes above Montecito and Santa Barbara differ from other places (e.g. the San Gabriels) where debris flows (and rock falls) are far more common?
  6. What geology-advised changes in our infrastructure (especially water and gas) might we make, based on what we’ve learned so far?
  7. What might we expect (that most of us don’t now expect) in the form of other catastrophes that show up in the geologic record? For example, earthquakes and tsunamis. See here: “This earthquake was associated with by far the largest seismic sea wave ever reported for one originating in California. Descriptive accounts indicate that it may have reached elevations of 15 feet at Gaviota, 30 to 35 feet at Santa Barbara, and 15 feet or more in Ventura. It may have even shown visible effects in the San Francisco harbor.” There is also this, which links to questions about the former report. (Still, there have been a number of catastrophic earthquakes on or affecting the South Coast, and it has been 93 years since the 1925 quake — and the whole Pacific Coast is subject to tsunamis. Here are some photos of the quake.)

Note that I don’t want to ask Ed to play a finger-pointing role here. Laying blame isn’t his job, unless he’s blaming nature for being itself.

Additional reading:

  • Dan McCaslin: Rattlesnake Canyon Fine Now for Day Hiking (Noozhawk) Pull-quote: “Santa Barbara geologist Ed Keller has said that all of Santa Barbara is built on debris flows piled up during the past 60,000 years. Around 1100 A.D., a truly massive debris flow slammed through Rattlesnake Canyon into Mission Canyon, leaving large boulders as far down as the intersection of Alamar Avenue and State Street (go check). There were Chumash villages in the area, and they may have been completely wiped out then. While some saddened Montecitans claim that sudden flash floods and debris flows should have been forecast more accurately, this seems impossible.”
  • Those deadly mudslides you’ve read about? Expect worse in the future. (Wall Street Journal) Pull-quote: “Montecito is particularly at risk as the hill slopes above town are oversteepened by faulting and rapid uplift, and much of the town is built on deposits laid down by previous floods. Some debris basins were in place, but they were quickly overtopped by the hundreds of thousands of cubic yards of water and sediment. While high post-fire runoff and erosion rates could be expected, it was not possible to accurately predict the exact location and extreme magnitude of this particular storm and resulting debris flows.”
  • Evacuation Areas Map.
  • Thomas Fire: Forty Days of Devastation (LA Times) Includes what happened to Montecito. Excellent step-by-step 3D animation.

Montecito is now a quarry with houses in it:

So far twenty dead have been removed. It will take much more time to remove twenty thousand dump truck loads of what geologists call “debris,” just to get down to where civic infrastructure (roads, water, electric, gas) can be fixed. It’s a huge thing.

The big questions:

  1. Did we know a catastrophe this huge was going to happen? (And if so, which among us were the “we” who knew?)
  2. Was there any way to prevent it?

Geologists had their expectations, expressed as degrees of likelihood and detailed on this map by the United States Geological Survey:

That was dated more than a month before huge rains revised to blood-red the colors in the mountains above town. Worries of County Supervisors and other officials were expressed in The Independent on January 3rd and 5th. Edhat also issued warnings on January 5th and 6th.

Edhat’s first report began, “Yesterday, the National Weather Service issued a weather briefing of a potential significant winter storm for Santa Barbara County on January 9-10. With the burn scar created by the Thomas Fire, the threat of flash floods and debris/mud flows is now 10 times greater than before the fire.”

But among those at risk, who knew what a “debris/mud flow” was—especially when nobody had ever seen one of those anywhere around here, even after prior fires?

The first Independent story (on January 3rd) reported, “County water expert Tom Fayram said county workers began clearing the debris basins at San Ysidro and Gobernador canyons ‘as soon as the fire department would let us in.’ It is worth noting, Lewin said, that the Coast Village Road area flooded following the 1971 Romero Fire and the 1964 Coyote Fire. While touring the impact areas in recent days, (Office of Emergency Management Director Robert) Lewin said problems have already occurred. ‘We’re starting to see gravity rock fall,’ he said. ‘One rock could close a road.’”

The best report I’ve seen about what geologists knew, and expected, is The Independent‘s After the Mudslides, What Does the Next Rain Hold for Montecito?, published four days after the disaster. In that report, Kevin Cooper of the U.S. Forest Service said, “no one alive has probably ever seen one before.” [January 18 update: Nick Welch in The Independent reports, “Last week’s debris flow was hardly Santa Barbara’s first. Jim Stubchaer, then an engineer with County Flood Control, remembers the avalanche of mud that took 250 homes back in November 1964 when heavy rains followed quickly on the heels of the Coyote Fire. He was there in 1969 and 1971 when it happened again.” Here is a long 2009 report on the Coyote Fire in The Independent by Ray Ford, now with Noozhawk. No mention in it of the homes lost. Perhaps Ray can weigh in.]

My point is that debris flows over Montecito are a sure bet in geologic time, but not in human time. In the whole history of Montecito and Santa Barbara (of which Montecito is an unincorporated part), there are no recorded debris flows that started on mountain slopes and spread all the way to the sea. But on January 9th we had several debris flows on that scale, originating simultaneously in the canyons feeding Montecito, San Ysidro and Romero Creeks. Those creeks are dry most of the time, and beautiful areas in which to build homes: so beautiful, in fact, that Montecito is the other Beverly Hills. (That’s why all these famous people have called it home.)

One well-studied prehistoric debris flow in Santa Barbara emptied a natural lake that is now Skofield Park, dumping long-gone mud and lots of rocks in Rattlesnake Canyon, leaving its clearest evidence in a charming tree-shaded boulder field next to Mission Creek called Rocky Nook Park.

What geologists at UCSB learned from that flow is detailed in a 2001 report titled UCSB Scientists Study Ancient Debris Flows. It begins, “The next ‘big one’ in Santa Barbara may not be an earthquake but a boulder-carrying flood.” It also says that flood would “most likely occur every few thousand years.”

And we got one in Montecito last Tuesday.

I’ve read somewhere that studies of charcoal from campfires buried in Rocky Nook Park date that debris flow at around 500 years ago. This is a good example of how the geologic present fails to include present human memory. Still, you can get an idea of how big this flow was. Stand in Rattlesnake Canyon downstream from Skofield Park and look at the steep rocky slopes below houses on the south side of the canyon. It isn’t hard to imagine the violence that tore out the smooth hillside that had been there before.

To help a bit more with that exercise, here is a Google Street View of Skofield Park, looking down at Santa Barbara through Rattlesnake Canyon:

I added the red line to show the approximate height of the natural dam that broke and released that debris flow.

I’ve also learned that the loaf-shaped Riviera landform in Santa Barbara is not a hunk of solid rock, but rather what remains of a giant landslide that slid off the south face of the Santa Ynez Mountains and became free-standing after creeks eroded out the valley behind. I’ve also read that Mission Creek flows westward around the Riviera and behind the Mission because the Riviera itself is also sliding the same direction on its own tectonic sled.

We only see these sleds moving, however, when geologic and human time converge. That happened last Tuesday when rains Kevin Cooper calls “biblical” hit in the darkest hours, saturating the mountain face creek beds that were burned by the Thomas Fire just last month. As a result, debris flows gooped down the canyons and stream valleys below, across Montecito to the sea, depositing lots of geology on top of what was already there.

So in retrospect, those slopes in various colors in the top map above should have been dark red instead. But, to be fair, much of what geology knows is learned the hard way.

Our home, one zip code west of Montecito, is fine. But we can’t count how many people we know who are affected directly. One friend barely escaped. Some victims were friends of friends. Some of the stories are beyond awful.

We all process tragedies like this in the ways we know best, and mine is by reporting on stuff, hopefully in ways others are not, or at least not yet. So I’ll start with this map showing damaged and destroyed buildings along the creeks:

At this writing the map is 70% complete. [January 17 update: 95%.] I’ve clicked on all the red dots (which mark destroyed buildings, most of which are homes), and I’ve copied and pasted the addresses that pop up into the following outline, adding a few links.

Going downstream along Cold Spring Creek, Hot Springs Creek and Montecito Creek (which the others feed), gone are—
  1. 817 Ashley Road
  2. 817 Ashley Road (out building)
  3. 797 Ashley Road
  4. 780 Ashley Road. Amazing architectural treasure that last sold for $12.9 million in ’13.
  5. 809 Ashley Road
  6. 809 Ashley Road (there are two at one address)
  7. 747 Indian Lane
  8. 631 Parra Grande Lane. That’s the mansion where the final scene in Scarface was shot.
  9. 590 Meadowood Lane
  10. 830 Rockbridge Road
  11. 800 Rockbridge Road
  12. 790 Rockbridge Road
  13. 787 Riven Rock Road B
  14. 1261 East Valley Road
  15. 1240 East Valley Road A (mansion)
  16. 1240 East Valley Road B (out building)
  17. 1254 East Valley Drive
  18. 1255 East Valley Road
  19. 1247 East Valley Road A
  20. 1247 East Valley Road B (attached)
  21. 1231 East Valley Road A
  22. 1231 East Valley Road B (detached)
  23. 1231 East Valley Road C (detached)
  24. 1221 East Valley Road A
  25. 1221 East Valley Road B
  26. 369 Hot Springs Road
  27. 341 Hot Springs Road A
  28. 341 Hot Springs Road B
  29. 341 Hot Springs Road C
  30. 355 Hot Springs Road
  31. 335 Hot Springs Road A
  32. 335 Hot Springs Road B
  33. 333 Hot Springs Road (Not marked in final map)
  34. 341 Hot Springs Road A
  35. 341 Hot Springs Road B
  36. 341 Hot Springs Road C
  37. 340 Hot Springs Road
  38. 319 Hot Springs Road
  39. 325 Olive Mill Road
  40. 285 Olive Mill Road
  41. 275 Olive Mill Road
  42. 325 Olive Mill Road
  43. 220 Olive Mill Road
  44. 200 Olive Mill Road
  45. 275 Olive Mill Road
  46. 180 Olive Mill Road
  47. 170 Olive Mill Road
  48. 144 Olive Mill Road
  49. 137 Olive Mill Road
  50. 139 Olive Mill Road
  51. 127 Olive Mill Road
  52. 196 Santa Elena Lane
  53. 192 Santa Elena Lane
  54. 179 Santa Isabel Lane
  55. 175 Santa Elena Lane
  56. 142 Santo Tomas Lane
  57. 82 Olive Mill Road
  58. 1308 Danielson Road
  59. 81 Depot Road
  60. 75 Depot Road
Along Oak Creek—
  1. 601 San Ysidro Road
  2. 560 San Ysidro Road B
Along San Ysidro Creek—
  1. 953 West Park Lane
  2. 941 West Park Lane
  3. 931 West Park Lane
  4. 925 West Park Lane
  5. 903 West Park Lane
  6. 893 West Park Lane
  7. 805 West Park Lane
  8. 881 West Park Lane
  9. 881 West Park Lane (separate building, same address)
  10. 1689 Mountain Drive
  11. 900 San Ysidro Lane C (all the Lane addresses appear to be in San Ysidro Ranch)
  12. 900 San Ysidro Lane Cottage B
  13. 900 San Ysidro Lane Cottage A
  14. 900 San Ysidro Lane Cottage D
  15. 900 San Ysidro Lane E
  16. 900 San Ysidro Lane F
  17. 900 San Ysidro Lane G
  18. 900 San Ysidro Lane H
  19. 900 San Ysidro Lane I
  20. 900 San Ysidro Lane J
  21. 900 San Ysidro Lane K
  22. 900 San Ysidro Lane L
  23. 900 San Ysidro Lane M
  24. 900 San Ysidro Lane N
  25. 900 San Ysidro Lane O
  26. 900 San Ysidro Lane R
  27. 900 San Ysidro Lane S
  28. 900 San Ysidro Lane T
  29. 888 San Ysidro Lane A
  30. 888 San Ysidro Lane B
  31. 888 San Ysidro Lane C
  32. 888 San Ysidro Lane D
  33. 888 San Ysidro Lane E
  34. 888 San Ysidro Lane F
  35. 805 West Park Lane B
  36. 799 East Mountain Drive
  37. 1801 East Mountain Lane
  38. 1807 East Mountain Drive
  39. 771 Via Manana Road
  40. 899 El Bosque Road
  41. 771 Via Manana Road
  42. 898 El Bosque Road
  43. 800 El Bosque Road A (Casa de Maria)
  44. 800 El Bosque Road B (Casa de Maria)
  45. 800 El Bosque Road C (Casa de Maria)
  46. 559 El Bosque Road (This is between Oak Creek and San Ysidro Creek)
  47. 680 Randall Road
  48. 670 Randall Road
  49. 660 Randall Road
  50. 650 Randall Road
  51. 640 Randall Road
  52. 630 Randall Road
  53. 619 Randall Road
  54. 1685 East Valley Road A
  55. 1685 East Valley Road B
  56. 1685 East Valley Road C
  57. 1696 East Valley Road
  58. 1760 Valley Road A
  59. 1725 Valley Road A
  60. 1705 Glen Oaks Drive A
  61. 1705 Glen Oaks Drive B
  62. 1710 Glen Oaks Drive A
  63. 1790 Glen Oaks Drive A
  64. 1701 Glen Oaks Drive A
  65. 1705 Glen Oaks Drive A
  66. 1705 East Valley Road A
  67. 1705 East Valley Road B
  68. 1705 East Valley Road C
  69. 1780 Glen Oaks Drive N/A
  70. 1780 Glen Oaks Drive (one on top of the other)
  71. 1774 Glen Oaks Drive
  72. 1707 East Valley Road A
  73. 1685 East Valley Road C
  74. 1709 East Valley Road
  75. 1709 East Valley Road B
  76. 1775 Glen Oaks Drive A
  77. 1775 Glen Oaks Drive B
  78. 1779 Glen Oaks Drive A
  79. 1779 Glen Oaks Drive B
  80. 1779 Glen Oaks Drive C
  81. 1781 Glen Oaks Drive A
  82. 1711 East Valley Road (This and what follow are adjacent to Oprah)
  83. 1715 East Valley Road A
  84. 1715 East Valley Road B
  85. 1719 East Valley Road
  86. 1721 East Valley Road A (This might survive. See Dan Seibert’s comment below)
  87. 1721 East Valley Road B (This might survive. See Dan Seibert’s comment below)
  88. 1721 East Valley Road C (This might survive. See Dan Seibert’s comment below)
  89. 1694 San Leandro Lane A
  90. 1694 San Leandro Lane D
  91. 1690 San Leandro Lane C
  92. 1690 San Leandro Lane A
  93. 1694 San Leandro Lane B
  94. 1696 San Leandro Lane
  95. 1710 San Leandro Lane A
  96. 1710 San Leandro Lane B
  97. 190 Tiburon Bay Lane
  98. 193 Tiburon Bay Lane A
  99. 193 Tiburon Bay Lane B
  100. 193 Tiburon Bay Lane C
  101. 197 Tiburon Bay Lane A
Along Buena Vista Creek—
  1. 923 Buena Vista Avenue
  2. 1984 Tollis Avenue A
  3. 1984 Tollis Avenue B
  4. 1984 Tollis Avenue C
  5. 670 Lilac Drive
  6. 658 Lilac Drive
  7. 2075 Alisos Drive (marked earlier, but I don’t see it in the final map)
  8. 627 Oak Grove Lane
Along Romero Creek—
  1. 1000 Romero Canyon Road
  2. 1050 Romero Canyon Road
  3. 860 Romero Canyon Road
  4. 768 Winding Creek Lane
  5. 745 Winding Creek Lane
  6. 744 Winding Creek Lane
  7. 2281 Featherhill Avenue B

Below Toro Canyon—

  1. 876 Toro Canyon Road
  2. 572 Toro Canyon Park Road

Along Arroyo Paredon, between Summerland and Carpinteria, not far east of the Toro Canyon—

  1. 2000 Cravens Lane

Ten flanking Highway 101 by the ocean are marked as damaged, including four on Padero Lane.

When I add those up, I get 142 163* 178† among the destroyed alone.

[* This is on January 17, when the map says it is 95% complete. All the additions appear to be along San Ysidro Creek, especially on San Ysidro Lane, which I believe is mostly in San Ysidro Ranch. Apparently nearly the whole place has been destroyed. Adjectives such as “lovely” fail to describe what it was.]

[† This is on January 18, when the map is complete. I’ll need to go over it again, because there are subtractions as well as additions. Additional note: on March 22, the resident at 809 Ashley Road asked me to make sure that address was also added. There are two homes at that address, both gone.]
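For anyone following along with the map, here is a quick tally of the outline above. This is just a sketch: the per-creek counts come straight from the numbered lists, and the raw sum runs higher than the final figure of 178 because the map data includes duplicate addresses and entries later dropped from the final map.

```python
# Per-creek counts taken directly from the numbered lists above.
destroyed = {
    "Montecito Creek (with Cold Spring and Hot Springs)": 60,
    "Oak Creek": 2,
    "San Ysidro Creek": 101,
    "Buena Vista Creek": 8,
    "Romero Creek": 7,
    "Toro Canyon": 2,
    "Arroyo Paredon": 1,
}

# Raw sum of the outline; duplicates and removed entries make the
# author's final count (178) a bit lower.
raw_total = sum(destroyed.values())
print(raw_total)  # 181
```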

Now let’s go back and look at this more closely from the geological perspective.

What we see is a town revised by nature in full disregard for what was there before—and in full obedience to the pattern of alluvial deposition on the flanks of all fresh mountains that erode down almost as fast as they go up.

This same pattern accounts for much of California, including all of the South Coast and the Los Angeles basin.

To see what I mean, hover your mind above Atlanta and look north at the southern Appalachians. Then dial history back five million years. What you see won’t look much different. Do the same above Los Angeles or San Francisco and nothing will be the same, or even close. Or even there at all.

Five million years is about 1/1000th of Earth’s history. If that history were compressed to a day, California showed up in less than the last forty seconds. In that short time California has formed and re-formed constantly, and is among the most provisional landscapes in the world. All of it is coming up, sliding down, spreading out and rearranging itself, and will continue doing so through all the future that’s worth bothering to foresee. Debris flows are among nature’s most casual methods for revising landscapes. (By the way, I am writing this in a San Marino house that sits atop the Raymond Fault scarp, which on the surface takes the form of a forty-foot hill. The stack of rock strata under the bottom of that hill is displaced 17,000 feet from the identical suite under the base at the top. Many earthquakes produced that displacement, while erosion has buffed 16,960 feet of rock and soil off the top.)
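Those time-scale figures check out, as a quick sketch shows (assuming an Earth age of about 4.54 billion years): five million years is indeed about 1/1000th of that, and forty seconds on the compressed 24-hour clock corresponds to roughly the last two million years.

```python
# Sanity-check of the compressed-day arithmetic, assuming an Earth
# age of about 4.54 billion years.
EARTH_AGE_YR = 4.54e9
DAY_S = 86_400

def years_to_day_seconds(years):
    """Map a span of years onto seconds of a 24-hour 'Earth day'."""
    return years / EARTH_AGE_YR * DAY_S

print(round(5e6 / EARTH_AGE_YR, 4))   # ~0.0011, i.e. about 1/1000
print(round(years_to_day_seconds(5e6)))  # ~95 seconds for 5 million years
print(round(40 / DAY_S * EARTH_AGE_YR / 1e6, 1))  # 40 s on the clock ~ 2.1 million years
```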

So we might start to look at the Santa Ynez Mountains behind Santa Barbara and Montecito not as a stable landform but rather as a volcano of mud and rock that’s sure to go off every few dozen or few hundred years—and one that may deliver a repeat performance if we get more heavy rains and plenty of debris remains to flow out of the mountain areas adjacent to those that flowed on January 9th. If there’s a lot of it, why even bother saving Montecito?

Here’s why:

One enters the Engineering building at the University of Wyoming under that stone plaque, which celebrates what may be our species’ greatest achievement and conceit: controlling nature. (It’s also why geology is starting to call our present epoch the Anthropocene.)

This also forecasts exactly what we will do for Montecito. In the long run we’ll lose to nature. But meanwhile we strive on.

In our new strivings, it will help to look toward other places in California that are more experienced with debris flows, because they happen almost constantly there. The largest of these by far is Los Angeles, which has placed catch basins at the mouths of all the large canyons coming out of the San Gabriel Mountains. Most of these dwarf the ones above Montecito. All resemble empty reservoirs. Some are actually quarries for rocks and gravel that roll in constantly from the eroding creek beds above. None are pretty.

To understand the challenge involved, it helps to read John McPhee’s classic book The Control of Nature, which takes its title from the inscription above. Fortunately, you can start right now by reading the first essay in a pair that became the relevant chapter of that book. It’s free on the Web and called Los Angeles Against the Mountains I. Here’s an excerpt:

Debris flows amass in stream valleys and more or less resemble fresh concrete. They consist of water mixed with a good deal of solid material, most of which is above sand size. Some of it is Chevrolet size. Boulders bigger than cars ride long distances in debris flows. Boulders grouped like fish eggs pour downhill in debris flows. The dark material coming toward the Genofiles was not only full of boulders; it was so full of automobiles it was like bread dough mixed with raisins.

The Genofiles were a family that barely survived a debris flow on a slope of Verdugo Mountain, overlooking Los Angeles from Glendale. Here’s another story, about another site not far away:

The snout of the debris flow was twenty feet high, tapering behind. Debris flows sometimes ooze along, and sometimes move as fast as the fastest river rapids. The huge dark snout was moving nearly five hundred feet a minute and the rest of the flow behind was coming twice as fast, making roll waves as it piled forward against itself—this great slug, as geologists would describe it, this discrete slug, this heaving violence of wet cement. Already included in the debris were propane tanks, outbuildings, picnic tables, canyon live oaks, alders, sycamores, cottonwoods, a Lincoln Continental, an Oldsmobile, and countless boulders five feet thick. All this was spread wide a couple of hundred feet, and as the debris flow went through Hidden Springs it tore out more trees, picked up house trailers and more cars and more boulders, and knocked Gabe Hinterberg’s lodge completely off its foundation. Mary and Cal Drake were standing in their living room when a wall came off. “We got outside somehow,” he said later. “I just got away. She was trying to follow me. Evidently, her feet slipped out from under her. She slid right down into the main channel.” The family next door were picked up and pushed against their own ceiling. Two were carried away. Whole houses were torn loose with people inside them. A house was ripped in half. A bridge was obliterated. A large part of town was carried a mile downstream and buried in the reservoir behind Big Tujunga Dam. Thirteen people were part of the debris. Most of the bodies were never found.

This is close to exactly what happened to Montecito in the wee hours of January 9th. (As of March 22, two of the 23 dead still haven’t been recovered, and probably never will be.)

As of now the 8000-plus residents of Montecito are evacuated and forbidden to return for at least another two weeks—and maybe much longer if officials declare the hills above town ready to flow again.

Highway 101, one of just two major freeways between Southern and Northern California, is closed indefinitely, because it is now itself a stream bed, and re-landscaping the area around it, to get water going where it should, will take some time. So will fixing the road, and perhaps bridges as well.

Meanwhile getting in and out of Santa Barbara from east of Montecito by car requires a detour akin to driving from Manhattan to Queens by way of Vermont. And there have already been accidents, I’ve heard, on Highway 166, which is the main detour road. We’ll be taking that detour or one like it on Thursday when we head home via Los Angeles after we fly there from New York, where I’m packing up now.

Expect this post to grow and change.


[Update: 7:22am Monday December 11] Two views of ThomasFire developments. First, MODIS fire detections, plotted on Google Earth Pro, current at 7am Pacific time:

Second, a screenshot of the NWCG (National Wildfire Coordinating Group) map of the area, 7:18am Pacific time:

On the map itself, you can click on each of those squares and get more specific data. Here is the latest from VIIRS, which appears to be the source of the five hot spots in Montecito, above:

This explains how MODIS and VIIRS work together.
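NASA distributes both MODIS and VIIRS active-fire detections as downloadable CSV through its FIRMS service. Here is a minimal sketch of filtering such a file to a bounding box around Montecito; the sample rows are made up, and the column names should be checked against an actual FIRMS export.

```python
import csv
import io

# Illustrative sample in the shape of a FIRMS active-fire CSV export.
# These rows are invented for the sketch; real files carry more columns.
sample = """latitude,longitude,acq_date,acq_time,satellite,confidence,frp
34.44,-119.65,2017-12-11,0712,N,nominal,12.3
34.46,-119.63,2017-12-11,0712,N,high,25.1
34.60,-119.40,2017-12-11,0712,N,low,3.2
"""

# Rough bounding box around Montecito.
LAT_MIN, LAT_MAX = 34.40, 34.50
LON_MIN, LON_MAX = -119.70, -119.58

hot_spots = [
    row for row in csv.DictReader(io.StringIO(sample))
    if LAT_MIN <= float(row["latitude"]) <= LAT_MAX
    and LON_MIN <= float(row["longitude"]) <= LON_MAX
]
print(len(hot_spots))  # 2
```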

Listening to what local media I can (over the Net, from where I am in Los Angeles), I’ve heard nothing about the five hot spots detected in Montecito. KCLU reports that winds are slack, and smoke moving straight up, which means that firefighters may be able to restrict growth of the fire to the back country behind the spine of the Santa Ynez Mountains, behind Santa Barbara and Montecito.

[December 10, 3:45pm] MODIS fire data, plotted on Google Earth. The view is straight east. You can see the Thomas Fire advancing through the back country westward toward Santa Barbara, and already encroaching on Carpinteria:

Those are fire detections. Radiative power data is also at that first link.

Here is a collection of links to sources of useful information about the #ThomasFire:

 


That was yesterday. Hard to tell from just looking at it, but that’s a 180° shot, panning from east to west across California’s South Coast, most of which is masked by smoke from the Thomas Fire.

We weren’t in the smoke then, but we are now, so there’s not much to shoot. Just something more to wear: a dust mask. Yesterday I picked up two of the few left at the nearest hardware store, and now I’m wearing one around the house. Since wildfire smoke is bad news for lungs, that seems like a good idea.

I’m also noticing dead air coming from radio stations whose transmitters have likely burned up, and station websites that seem dead to the fire as well. Here’s a list of signals that I’m pretty sure are off the air right now. All their transmitters are within the Thomas Fire perimeter:

Some are on Red Mountain (on the west of Highway 33, which connects Ventura with Ojai); some are in the Ventura Hills; and some are on Sulphur Mountain, which is the high ridge on the south side of Ojai. One is on Santa Paula Mountain, with a backup on Red Mountain. (That’s KOCP. I don’t hear it, and normally do.)

In some cases I’m hearing a live signal but dead air. In others I’m hearing nothing at all. In still other cases I’m hearing something faint. And some signals are too small, directional or isolated for me to check from 30 miles (give or take) away. So, fact checking is welcome. There’s a chance some of these are on the air with lower power at temporary locations.

The links in the list above go to technical information for each station, including exact transmitter locations and facilities, rather than to the stations themselves. Here’s a short cut to those, from the great Radio-Locator.com.

Nearly all the Ventura area FM stations — KHAY, KRUZ, KFYV, KMLA, KCAQ, KMRO, KSSC and KOCP — have nothing about the fire on their websites. Kinda sad, that. I’ve found only two local stations doing what they should be doing at times like this. One is KCLU/88.3, the public station in Thousand Oaks. KCLU also serves the South Coast with an AM and an FM signal in Santa Barbara. The other is KVTA/1590. The latter is almost inaudible here right now. I suppose that’s because of a power outage. Its transmitter, like those of the other two AM stations in town, is down in a flat area unlikely to burn.

KBBY, on Rincon Mountain (a bit west of Red Mountain, but in an evacuation area with reported spot fires), is still on the air. Its website also has no mention of the fire. Same with KHAY/100.7, on Red Mountain, which was off the air but is now back on. Likewise KMLA/103.7, licensed to El Rio but serving the Ventura area.

KXLM/102.9, which transmits from the flats, is on the air.

Other sources of fire coverage are KPCC, KCRW and KNX.

 

 

 


Here’s what I wrote about pirate radio in New York, back in 2013. I hoped to bait major media attention with that. Got zip.

Then I wrote this in 2015 (when I also took the screen shot, above, of a local pirate’s ID on my kitchen radio). I got a couple people interested, including one college student, but we couldn’t coordinate our schedules and the moments were lost.

Now comes news of pirate radio crackdowns by the FCC*, yet little of that news concerns the demand these stations supply. The default story is about FCC vs. Pirates, not how pirates address the inadequacies of FCC-licensed broadcast radio. (One good exception: this story in the Miami Herald about an FCC-fined pirate that programs for a population licensed radio doesn’t serve.)

To sample the situation, drive your car up Broadway north of 181st Street in Manhattan (above which the city gets very hilly, and there is maximal signal shadowing by big apartment buildings), or into the middle of the Bronx (same kind of setting), on any weekend evening. Then hit SCAN on your radio. Betcha a third of the stations you’ll hear are pirates, and the announcers will be speaking Spanish or Caribbean English. Some stations will have ads. Even if you only hear three or four signals (I’m on the wrong coast for checking on this), you’re tapping into something real happening which—far as I know—continues to attract approximately zero interest among popular media. (Could be it’s a thing on Twitter, but I don’t know.)

But there is a story here, about a marketplace of the literal sort. As I say in both those posts (at the top two links above), I wish I knew Spanish. For a reporter who does, there’s some great meat to chew on here. And it’s not just about the FCC playing a game of whack-a-mole. It’s about what licensed broadcasting alone can’t or won’t do.

Low power FM transmitters are cheap, by the way. The good ones are in low four figures. (One example.) The okay ones are in the two- and three-figure range. (Examples on Amazon and eBay.)

By the way, anything more than a small fraction of one watt is almost certainly in violation of Part 15 of the FCC rules, and therefore illegal. But hey, there’s a market for these things, so they sell.
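That “small fraction of one watt” can be made concrete. The Part 15 rule for the FM band (47 CFR §15.239) limits unlicensed transmitters to a field strength of 250 microvolts per meter measured at 3 meters; under a free-space idealization, that works out to a radiated power of about two hundredths of a microwatt:

```python
# Part 15 limit for unlicensed FM (47 CFR 15.239): a field strength
# of 250 microvolts/meter measured at 3 meters.
E_LIMIT_V_PER_M = 250e-6
DISTANCE_M = 3

# Free-space relation for an isotropic radiator: E = sqrt(30 * P) / d,
# so the implied radiated power is P = (E * d)**2 / 30.
p_watts = (E_LIMIT_V_PER_M * DISTANCE_M) ** 2 / 30
print(p_watts)  # ~1.9e-8 W, i.e. about 0.02 microwatts
```

Which is why anything audible across even one zip code is far over the line.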

By the way, is anyone visiting the topic of what will happen if Cumulus and/or iHeart can’t pay their debts? If either or both go down, a huge percentage of over-the-air radio in the U.S. goes with them.

The easy thing to blame is bad corporate decisions of one kind or another. The harder one is considering what the digital world is doing to undermine and replace the analog one.

If you’re wondering why pirate radio is so big in New York yet relatively nowhere in Los Angeles (the next-largest broadcast market), here’s the main reason: New York FM stations are weak. None are more than 6000 watts, and those are on the Empire State Building, only about 1300 feet up in the air above the center of a metro that’s thick with signal shadowing by buildings that bang up FM signals. In nearby New Jersey and the outer boroughs, you can put out a 10 or a 50 watt signal from a whip antenna on top of a house or a high-rise, on a channel right next to a licensed one, and cover a zip code or two with little trouble. It’s hard to do that in most of Los Angeles, where stations radiate from 6000-foot Mt. Wilson, at powers up to 110,000 watts, and strong signals pack the dial from one end to the other. There are similar situations in Seattle, Portland, San Diego, Denver and San Francisco (though with a few more terrain shadows to operate in). In flat places without thick clusters of high-rises in their outlying areas—Miami, New Orleans, Memphis, Houston, Dallas, Chicago, Minneapolis, Detroit—there are few places for pirates to hide among the buildings. In those places it’s relatively easy to locate and smack down a pirate, especially if they’re operating in a wide open way (as was the Miami example).
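The power gap alone is easy to quantify. Ignoring terrain, antenna height and gain, and using the standard free-space path-loss formula, a 110,000-watt LA-class signal arrives about 12.6 dB (roughly eighteen times) stronger than a 6000-watt New York-class signal at the same distance; Mt. Wilson’s extra height widens the gap further. A sketch:

```python
import math

def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 32.45."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.45

def rx_dbm(erp_watts, d_km, f_mhz=100.0):
    """Idealized received power in dBm, ignoring terrain and receive gain."""
    tx_dbm = 10 * math.log10(erp_watts * 1000)  # watts -> dBm
    return tx_dbm - fspl_db(d_km, f_mhz)

ny = rx_dbm(6_000, 30)     # Empire State Building class ERP
la = rx_dbm(110_000, 30)   # Mt. Wilson class ERP
print(round(la - ny, 1))   # ~12.6 dB stronger in LA at equal distance
```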

Santa Barbara is one of the world’s great sea coast towns. It’s also in a good position to be one of the world’s great Internet coast towns too.

Luckily, Santa Barbara is advantaged by its location not just on the ocean, but on some of the thickest Internet trunk lines (called “backbones”) in the world. These run through town beside the railroad and Highway 101. Some are owned by the state college and university system. Others are privately owned. In fact Level(3), now part of CenturyLink, has long had a tap on that trunk, and a large data center, in the heart of the Funk Zone. Here it is:

Last I checked, Level(3) was in the business of wholesaling access to its backbone. So was the UC system.

Yet Santa Barbara is still disadvantaged by depending on a single “high speed” Internet service provider: Cox Communications, which is also the town’s incumbent cable TV company. Like most cable companies, Cox is widely disliked by its customers. It has also recently imposed caps on data use.

Cox’s only competitor is Frontier Communications, which provides Internet access over old phone lines previously run by Verizon and GTE. Cable connections provide higher bandwidth than phone lines, but both are limited to fractions of the capacity provided by fiber optic cables. While it’s possible for cable companies to upgrade service to what’s called DOCSIS 3.1, there has been little in the history of Santa Barbara’s dealings with Cox to suggest that Cox will grace the city with its best possible service. (In fact Cox’s only hint toward fiber is in nearby Goleta, not in Santa Barbara.)
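For rough context on those capacity claims, here are nominal downstream ceilings from the respective standards. These are spec maxima, not what anyone actually sells: real-world service tiers run far lower, and cable and PON capacity is shared among neighbors.

```python
# Nominal downstream ceilings (Mbps) from the respective specs.
# Spec maxima only; actual service tiers are far lower.
downstream_mbps = {
    "ADSL2+ (phone line)": 24,
    "VDSL2 (phone line)": 100,
    "DOCSIS 3.0 (cable)": 1_000,
    "DOCSIS 3.1 (cable)": 10_000,
    "GPON (fiber)": 2_488,
    "XGS-PON (fiber)": 10_000,
}
for name, mbps in sorted(downstream_mbps.items(), key=lambda kv: kv[1]):
    print(f"{name:22} {mbps:>6} Mbps")
```

The table also shows why DOCSIS 3.1 matters: on paper it closes much of the gap with fiber, which is exactly the upgrade there is little sign Cox will bring to Santa Barbara.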

About a decade ago, when I was involved in a grass roots effort to get the city to start providing Internet service on its own over fiber optic connections, Cox told us that Santa Barbara was last in line for upgrading the company’s facilities. Other cities were bigger and more important to Cox, which is based in Atlanta.

Back then we lacked a champion for the Internet cause on the Santa Barbara City Council. The mayor liked the idea, and so did a couple of Council members, but the attitude was, “We’ll wait until Palo Alto does something like this and then copy that.” So the effort died.

But we have a champion now, running for City Council in the 6th District, which covers much of downtown: Jack Ucciferri. A story by Gwendolyn Wu in The Independent yesterday begins, “As District 6 City Council candidate Jack Ucciferri went door-to-door to campaign, he found that many Santa Barbara residents had one thing in common: a mutual disdain for the Cox Communications internet monopoly. ‘Every person I talk to agrees with me,’ Ucciferri said.” Specifically, “Ucciferri is dreaming of a fiber optic plan for Santa Barbara. Down south, the cities of Santa Monica and Oxnard already have or are preparing plans for fiber optic cable networks.”

One of the biggest issues for Santa Barbara is the decline of business downtown, especially along State Street, the city’s heart, where the most common sign on storefronts is “For Lease.” Jack’s district contains more of State Street than any other. I can think of nothing that will help State Street—and Santa Barbara—more than to have world-class Internet access and speeds, which would be a huge attraction for many businesses large and small.

So I urge readers in Jack’s district to give him the votes he needs to champion the cause of making Santa Barbara a leader in the digital world, rather than yet another cable backwater, which it will surely remain if he loses.

[Later…] Jack lost on Tuesday, but came in second of three candidates. The winner was the long-standing incumbent, Gregg Hart. (Here’s Noozhawk’s coverage.) I don’t see this as a loss for Jack or his cause. Conversations leading up to the election (including one with a candidate who won in another district) have led me to believe the time is right to at least fiber up Santa Barbara’s troubled downtown, where The Retail Apocalypse is well underway.

 

 

Nothing challenges our understanding of infrastructure better than a crisis, and we have a big one now in Houston. We do with every giant storm, of course. New York is still recovering from Sandy and New Orleans from Katrina. Reforms and adaptations always follow, as civilization learns from experience.

Look at aviation, for example. Houston is the 4th largest city in the U.S. and George Bush International Airport (aka IAH) is a major hub for United Airlines. For the last few days traffic there has been sphinctered down to emergency flights alone. You can see how this looks on FlightAware’s Miserymap:

Go there and click on the blue play button to see how flight cancellations have played over time, and how the flood in Houston has affected Dallas as well. Click on the airport’s donut to see what routes are most affected. Frequent fliers like myself rely on tools like this one, made possible by a collection of digital technologies working over the Internet.

The airport itself is on Houston’s north side, and not flooded. Its main problem instead has been people. Countless workers have been unable to come in because they’re trapped in the flood, busy helping neighbors or barely starting to deal with lives of their own and others that have been inconvenienced, ruined or in sudden need of large repair.

Aviation is just one of modern civilization’s infrastructures. Roads are another. Early in the flood, when cars were first stranded on roads, Google Maps, which gets its traffic information from cell phones, showed grids of solid red lines on the city’s flooded streets. Now those same streets are blank, because the cell phones have departed and the cars aren’t moving.

The cell phone system itself, however, has been one of the stars in the Houston drama. Harvey shows progress on emergency communications since Katrina, says a Wired headline from yesterday. Only 4% of the area’s cells were knocked out.

Right now the flood waters are at their record heights, or even rising. Learnings about extant infrastructures have already commenced, and will accumulate as the city drains and dries. It should help to have a deeper understanding of what infrastructure really is, and what it’s doing where it is, than we have so far.

I say that because infrastructure is still new as a concept. As a word, infrastructure has only been in common use since the 1960s:

In The Etymology of Infrastructure and the Infrastructure of the Internet, Stephen Lewis writes,

Infrastructure indeed entered the English language as a loan word from French in which it had been a railroad engineering term.  A 1927 edition of the Oxford indeed mentioned the word in the context of “… the tunnels, bridges, culverts, and ‘infrastructure work’ of the French railroads.”  After World War II, “infrastructure” reemerged as in-house jargon within NATO, this time referring to fixed installations necessary for the operations of armed forces and to capital investments considered necessary to secure the security of Europe…

Within my own memory the use of the word “infrastructure” had spilled into the contexts of urban management and regional and national development and into the private sector… used to refer to those massive capital investments (water, subways, roads, bridges, tunnels, schools, hospitals, etc.) necessary to a city’s economy and the lives of its inhabitants and business enterprises but too massive and too critical to be conceived, implemented, and run at a profit or to be trusted to the private sector…

In recent years, in the United States at least, infrastructure is a word widely used but an aspect of economic life and social cohesion known more by its collapse and abandonment and raffling off to the private sector than by its implementation, well-functioning, and expansion.

As Steve also mentions in that piece, he and I are among the relatively small number of people (at least compared to those occupying the familiar academic disciplines) who have paid close attention to the topic for some time.

The top dog in this pack (at least for me) is Brett Frischmann, the Villanova Law professor whose book Infrastructure: The Social Value of Shared Resources (Oxford, 2013) anchors the small and still young canon of work on the topic. Writes Brett,

Infrastructure resources entail long term commitments with deep consequences for the public. Infrastructures are a prerequisite for economic and social development. Infrastructures shape complex systems of human activity, including economic, cultural, and political systems. That is, infrastructures affect the behaviour of individuals, firms, households, and other organizations by providing and shaping the available opportunities of these actors to participate in these systems and to interact with each other.

The emphasis is mine, because I am curious about how shaping works. Specifically, how does infrastructure shape all those things—and each of us as well?

Here is a good example of people being shaped, in this case by mobile phones:

I shot that photo on my own phone in a New York subway a few months ago. As you see, everybody in that car is fully preoccupied with their personal rectangle. These people are not the same as they were ten or even five years ago. Nor are the “firms, households and other organizations” in which they participate. Nor is the subway itself, now that all four major mobile phone carriers cover every station in the city. At good speeds too:

We don’t know if Marshall McLuhan said “we shape our tools and then our tools shape us,” but it was clearly one of his core teachings (In fact the line comes from Father John Culkin, SJ, a Professor of Communication at Fordham and a colleague of McLuhan’s. Whether or not Culkin got it from McLuhan we’ll never know.) As aphorisms go, it’s a close relative to the subtitle of McLuhan’s magnum opus, Understanding Media: the Extensions of Man (Berkeley, 1964, 1994, 2003). The two are compressed into his most quoted line, “the medium is the message,” which says that every medium changes us while also extending us.

In The Medium is the Massage: an Inventory of Effects (Gingko, 1967, 2001), McLuhan explains it this way: “All media work us over completely. They are so pervasive… that they leave no part of us untouched, unaffected, unaltered… Any understanding of social and cultural change is impossible without a knowledge of the way media work as environments.”

Specifically, “All media are extensions of some human faculty—psychic or physical. The wheel is an extension of the foot. The book is an extension of the eye. Clothing, an extension of the skin. Electric circuitry, an extension of the central nervous system. Media, by altering the environment, evoke in us unique ratios of sense perceptions. The extension of any one sense alters the way we think and act—the way we perceive the world. When these things change, men change.”

He also wasn’t just talking communications media. He was talking about everything we make, which in turn make us. As Eric McLuhan (Marshall’s son and collaborator) explains in Laws of Media: The New Science (Toronto, 1988), “media” meant “everything man[kind] makes and does, every procedure, every style, every artefact, every poem, song, painting, gimmick, gadget, theory—every product of human effort.”

Chief among the laws Marshall and Eric minted is the tetrad of media effects. (A tetrad is a group of four.) It says every medium, every technology, has effects that refract in four dimensions that also affect each other. Here’s a graphic representation of them:

They apply these laws heuristically, through questions:

  1. What does a medium enhance?
  2. What does it obsolesce?
  3. What does it retrieve that had been obsolesced earlier?
  4. What does it reverse or flip into when pushed to its extreme (for example, by becoming ubiquitous)?
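Because the tetrad is applied heuristically rather than mechanically, its four questions work like a checklist you can fill in for any medium. Here is a minimal sketch in Python (the `Tetrad` structure and its field names are my own illustration, not the McLuhans’), filled in with the smartphone answers I give later in this post:

```python
from dataclasses import dataclass

@dataclass
class Tetrad:
    """The McLuhans' four laws of media, posed as questions about a medium."""
    medium: str
    enhances: str       # 1. What does the medium enhance?
    obsolesces: str     # 2. What does it obsolesce?
    retrieves: str      # 3. What does it retrieve that had been obsolesced?
    reverses_into: str  # 4. What does it reverse or flip into at its extreme?

# The smartphone tetrad, using the answers given later in this post.
smartphone = Tetrad(
    medium="smartphone",
    enhances="conversation",
    obsolesces="mass media (print, radio, TV, cinema)",
    retrieves="personal agency",
    reverses_into="isolation (and lost privacy through surveillance)",
)

# Print the four effects as statements rather than questions.
for law in ("enhances", "obsolesces", "retrieves", "reverses_into"):
    print(f"The {smartphone.medium} {law.replace('_', ' ')}: "
          f"{getattr(smartphone, law)}")
```

The point of the structure is not the code but the discipline: every answer is arguable and revisable, which is why the McLuhans frame the laws as questions.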

Questions are required because there can be many different effects, and many different answers. All can change. All can be argued. All can work us over.

One workover happened right here, with this blog. In fact, feeling worked over was one of the reasons I dug back into McLuhan, who I had been ignoring for decades.

Here’s the workover…

In the heyday of blogging, back in the early ’00s, this blog’s predecessor (at doc.weblogs.com) had about 20,000 subscribers to its RSS feed, and readers numbering up to dozens of thousands per day. Now it gets dozens. On a good day, maybe hundreds. What happened?

In two words, social media. When I put that in the middle of the tetrad, four answers jumped to mind:

In the ENHANCED corner, social media surely makes everyone more social, in the purely convivial sense of the word. Suddenly we have hundreds or thousands of “friends” (Facebook, Swarm, Instagram), “followers” (Twitter) and “contacts” (LinkedIn). Never mind that we know few of their birthdays, parents’ names or other stuff we used to care about. We’re social with them now.

Blogging clearly got OBSOLESCED, but—far more importantly—so did the rest of journalism. And I say this as a journalist who once made a living at the profession and now, like everybody else who once did the same, makes squat. What used to be the business of journalism is now the business of “content production,” because that’s what social media and its publishing co-dependents get paid by advertising robots to produce in the world. What’s more, anybody can now participate. Look at that subway car photo above. Any one of those people, or all of them, are journalists now. They write and post in journals of various kinds on social media. Some of what they produce is news, if you want to call it that. But hell, news itself is worked over completely. (More about that in a minute.)

We’ve RETRIEVED gossip, which journalism, the academy and the legal profession had obsolesced (by saying, essentially, “we’re in charge of truth and facts”). In Sapiens: A Brief History of Humankind (Harper, 2015), Yuval Noah Harari says gossip was essential for our survival as hunter-gatherers: “Social cooperation is our key for survival and reproduction. It is not enough for individual men and women to know the whereabouts of lions and bisons… It’s much more important for them to know who in their band hates whom, who is sleeping with whom, who is honest and who is a cheat.” And now we can do that with anybody and everybody, across the vast yet spaceless nowhere we call the Internet, and to hell with the old formalisms of journalism, education and law.

And social media has also clearly REVERSED us into tribes, especially in the news we produce and consume, much of it to wage verbal war with each other. Or worse. For a view of how that works, check out The Wall Street Journal‘s Red Feed / Blue Feed site, which shows the completely opposed (and hostile) views of the world that Facebook injects into the news feeds of people its algorithms consider “very liberal” or “very conservative.”

Is social media infrastructure? I suppose so. The mobile phone network certainly is. And right now we’re glad to have it, because Houston, the fourth largest city in the U.S., is suffering perhaps the worst natural disaster in the country’s history, and the cell phone system is holding up remarkably well, so far. Countless lives are being saved by it, and it will certainly remain the most essential communication system as the city recovers and rebuilds.

Meanwhile, however, it also makes sense to refract the mobile phone through the tetrad. I did that right after I shot the photo above, in this blog post. In it I said smartphones—

  • Enhance conversation
  • Obsolesce mass media (print, radio, TV, cinema, whatever)
  • Retrieve personal agency (the ability to act with effect in the world)
  • Reverse into isolation (also into lost privacy through exposure to surveillance and exploitation)

In the same graphic, it looks like this:

But why listen to me when the McLuhans were on the case almost three decades ago? This is from Gregory Sandstrom‘s “Laws of media—The four effects: A McLuhan contribution to social epistemology” (SERCC, November 11, 2012)—

The REVERSES items might be off, but the others are right on. (Whoa: cameras!)

The problem here, however, is the tendency we have to get caught up in effects. While those are all interesting, the McLuhans want us to look below those, to causes. This is hard because effects are figures, and causes are grounds: the contexts from which figures arise. From Marshall and Eric McLuhan’s Media and Formal Cause (Neopoesis, 2011): “Novelty becomes cliché through use. And constant use creates a new hidden environment while simultaneously pushing the old invisible ground into prominence, as a new figure, clearly visible for the first time. Every innovation scraps its immediate predecessor and retrieves still older figures; it causes floods of antiquities or nostalgic art forms and stimulates the search for ‘museum pieces’.”

We see this illustrated by Isabelle Adams in her paper “What Would McLuhan Say about the Smartphone? Applying McLuhan’s Tetrad to the Smartphone” (Glocality, 2016):


Laws of Media again: “The motor car retrieved the countryside, scrapped the inner core of the city, and created suburban megalopolis. Invention is the mother of necessities, old and new.”

We tend to see it the other way around, with necessity mothering invention. It should help to learn from the McLuhans that most of what we think we need is what we invent in order to need it.

Beyond clothing, shelter and tools made of sticks and stones, all the artifacts that fill civilized life are ones most of us didn’t know we needed until some maker in our midst invented them.

And some tools—extensions of our bodies—don’t become necessities until somebody invents a new way to use them. Palm, Nokia and Blackberry all made smartphones a decade before the iPhone and Android. Was it those two operating systems that made everybody suddenly want one? No, apps were the inventions that mothered mass necessity for mobile phones, just as it was websites that made us need graphical browsers, which made us need personal computers connected by the Internet.

All those things are effects that the McLuhans want us to look beneath. But they don’t want us to look for the obvious causes of the this-made-that-happen kind. In Media and Formal Cause, Eric McLuhan writes:

Formal causality kicks in whenever “coming events cast their shadows before them.” Formal cause is still, in our time, hugely mysterious. The literate mind finds it is too paradoxical and irrational. It deals with environmental processes and it works outside of time. The effects—those long shadows—arrive first; the causes take a while longer.

Formal cause was one of the four causes first listed by Aristotle:

  • Material—what something is made of.
  • Efficient—how one thing acts on another, causing change.
  • Formal—what makes the thing form a coherent whole.
  • Final—the purpose to which a thing is put.

In Understanding Media, Marshall McLuhan writes, “Any technology gradually creates a totally new human environment”, adding:

Environments are not passive wrappings but active processes…. The railway did not introduce movement or transportation or wheel or road into society, but it accelerated and enlarged the scale of previous human functions, creating totally new kinds of cities and new kinds of work and leisure.

Thus railways were a formal cause that scaled up new kinds of cities, work and leisure.  “People don’t want to know the cause of anything”, Marshall said (and Eric quotes, in Media and Formal Cause). “They do not want to know why radio caused Hitler and Gandhi alike. They do not want to know that print caused anything whatever. As users of these media, they wish merely to get inside, hoping perhaps to add another layer to their environment….”

In Media and Formal Cause, Eric also sources Jane Jacobs:

Current theory in many fields—economics, history, anthropology—assumes that cities are built upon a rural economic base. If my observations and reasoning are correct, the reverse is true: that rural economies, including agricultural work, are directly built upon city economies and city work….Rural production is literally the creation of city consumption. That is to say, city economics invent the things that are to become city imports from the rural world.

Which brings us back to Houston. What forms will it cause as we repair it?

(I’m still not done, but need to get to my next appointment. Stay tuned.)


Who Owns the Internet? — What Big Tech’s Monopoly Powers Mean for our Culture is Elizabeth Kolbert‘s review in The New Yorker of several books, one of which I’ve read: Jonathan Taplin’s Move Fast and Break Things—How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy.

The main takeaway for me, from both Elizabeth’s piece and Jon’s book, is that Google and Facebook are at the heart of today’s personal data extraction industry, and that this industry defines (as well as supports) much of our lives online.

Our data, and data about us, is the crude that Facebook and Google extract, refine and sell to advertisers. This by itself would not be a Bad Thing if it were done with our clearly expressed (rather than merely implied) permission, and if we had our own valves to control personal data flows with scale across all the companies we deal with, rather than countless different valves, many worthless, buried in the settings pages of the Web’s personal data extraction systems, as well as in all the extractive mobile apps of the world.

It’s natural to look for policy solutions to the problems Jon and others visit in the books Elizabeth reviews. And there are some good regulations around already. Most notably, the GDPR in Europe has energized countless developers (some listed here) to start providing tools individuals (no longer just “consumers” or “users”) can employ to control personal data flows into the world, and how that data might be used. Even if surveillance marketers find ways around the GDPR (which some will), advertisers themselves are starting to realize that tracking people like animals not only fails outright, but that the human beings who constitute the actual marketplace have mounted the biggest boycott in world history against it.

But I also worry because I consider both Facebook and Google epiphenomenal. Large and all-powerful though they may be today, they are (like all tech companies, especially ones whose B2B customers and B2C consumers are different populations—commercial broadcasters, for example) shallow and temporary effects rather than deep and enduring causes.

I say this as an inveterate participant in Silicon Valley who can name many long-gone companies that once occupied Google’s and Facebook’s locations there—and I am sure many more will occupy the same spaces in a fullness of time that will surely include at least one Next Big Thing that obsolesces advertising as we know it today online. Such as, for example, discovering that we don’t need advertising at all.

Even the biggest personal data extraction companies are also not utilities on the scale or even the importance of power and water distribution (which we need to live), or the extraction industries behind either. Nor have these companies yet benefitted from the corrective influence of fully empowered individuals and societies: voices that can be heard directly, consciously and personally, rather than mere data flows observed by machines.

That direct influence will be far more helpful than anything they’re learning now just by following our shadows and sniffing our exhaust, mostly against our wishes. (To grok how little we like being spied on, read The Tradeoff Fallacy: How Marketers are Misrepresenting American Consumers and Opening Them Up to Exploitation, a report by Joseph Turow, Michael Hennessy and Nora Draper of the Annenberg School for Communication at the University of Pennsylvania.)

Our influence will be most corrective when all personal data extraction companies become what lawyers call second parties. That’s when they agree to our terms as first parties. Those terms are in development today at Customer Commons, Kantara and elsewhere. They will prevail once they get deployed in our browsers and apps, and companies start agreeing (which they will in many cases because doing so gives them instant GDPR compliance, which is required by next May, with severe fines for noncompliance).

Meanwhile new government policies that see us only as passive victims will risk protecting yesterday from last Thursday with regulations that last decades or longer. So let’s hold off on that until we have terms of our own, start performing as first parties (on an Internet designed to support exactly that), and the GDPR takes full effect. (Not that more consumer-protecting federal regulation is going to happen in the U.S. anyway under the current administration: all the flow is in the other direction.)

By the way, I believe nobody “owns” the Internet, any more than anybody owns gravity or sunlight. For more on why, see Cluetrain’s New Clues, which David Weinberger and I put up 1.5 years ago.
