Last month I went to Boston to be part of the Public Media conference, which I described to everyone I saw there as the NPR conference, even though most of the people there didn’t work for NPR and I knew it.
I was actually trying to make a point, one that otherwise would have taken a lot of words to express, but could be said simply if I was willing to look a little inept and uninformed. The point is this — the distinctions between the different parts of the public media ecosystem are lost on people outside the ecosystem. I tend to think of it all as “public radio” — more today than in the past — and eventually, I think they will too.
Before the Internet, I listened to KQED. That meant listening to shows I wasn’t interested in, like Pacific Time or Latino USA. Now, after having lived in Seattle, Boston and Florida, I’m an NPR listener. I found shows on WBUR that KQED doesn’t carry. My favorite show comes from WNYC. I’m a fan of Diane Rehm, who does her work at WAMU, but I first heard her on WJCT. I still listen to Fresh Air from WHYY, but I only listen to the podcast, and only when the program interests me.
In a few years, the transition to the Internet will be so complete that the link between the call letters and a local area will be meaningless. The stations won’t even broadcast. Then someone at NPR will swallow the hard truth that the distinctions mean so little to anyone outside their industry that they might as well just collapse it down and call the whole thing NPR.
Which brings me around to the lecture that my friends and colleagues in the blogosphere have tried, over the last 24 hours, to deliver to Mr. Zell, the new owner of a bunch of big important newspapers.
It could be that Zell is brilliant, and is saying something that simplifies the truth to make a bigger point, and he doesn’t mind if you think he’s inept as long as some people get the bigger picture — which is that he thinks of the Internet and Google as being the same thing. And you know what, I bet a lot of other people do too, and they have a point. Like the public radio stations, maybe we’re fooling ourselves if we think we’re not writing for Google, just as they are fooling themselves into thinking they’re not creating for NPR. We want to cling to our theory that each of us is independent of the others, but what if he’s right, and it’s us vs. them? What if his friends in the newspaper business decide they want to compete against us directly? What if my pointers into the LA Times and the NY Times stop working? Or what if he offers you a job to come write for his company so your pointers do work?
So stop and think a bit before you stop listening, and try to get beyond your impulse to dismiss him just because he said something that’s technically inaccurate. He could be smart as a fox.
It was so great to see new episodes of both shows tonight. I missed Tony and Carmen, Bobby and Tony’s sister (esp the story about her boyfriend, heh). I missed Eric, Turtle, Drama, and I hope Vince gets to play Pablo Escobar, and would you guys just forgive Ari! Lovely lovely lovely. I missed the whole thing. Can’t wait for new episodes of Big Love and my absolute fave, The Wire. I’m a total fan. Love, Dave
One of the most intriguing comments came from Paul Ding, who suggests that the overhead of htaccess files may be too large a burden to bear and says that one could (clever!) use the file system to do what I was trying to do with the htaccess file. That may be true, but I want to know if Apache really reads and parses the htaccess file for every access. Is it not optimized to store the commands in an internal format and then check the mod date before re-loading and parsing the file? Either way, it doesn’t seem to make a difference on my server, whose performance monitor hovers near the baseline even with lots of commands in various htaccess files.
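For what it’s worth, the Apache documentation answers the question: when AllowOverride permits them, .htaccess files are looked up and parsed on every request, for every directory along the path to the requested file — they aren’t cached between requests. The usual workaround is to move the per-directory rules into the main server config, which is parsed once at startup. A sketch, with a made-up directory path for illustration:

```apache
# In httpd.conf -- directives here are parsed once at startup,
# not re-read on every request the way .htaccess files are.
<Directory "/var/www/example-site">
    # AllowOverride None also stops Apache from even looking for
    # .htaccess files anywhere in this tree, saving the stat calls.
    AllowOverride None

    # The same directives that lived in the .htaccess file can
    # usually move here verbatim.
    DirectoryIndex index.html
    Options -Indexes
</Directory>
```

The tradeoff, of course, is that changing the main config requires a server restart, while .htaccess edits take effect immediately — which is exactly why shared hosts lean on .htaccess despite the cost.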
It’s been a really interesting weekend, most of it out of view of blogging. I have been continuing the project that involved Apache. I’m doing a static rendering of the Harvard site that hosts the RSS 2.0 spec. It’s going well, with the help of the community. It occurs to me that one of the things Scripting News could be is an online tech support workgroup for Apache.
I think we must all go through this rite of passage; the docs for Apache are so cryptic and inadequate. The design of Apache itself is weak. But it is workable; you know that eventually you’ll puzzle it out, and if you can find the right people to help, they can show you how to do what you need to do quickly and surely.
I’m lucky because the techies who read this site really know their stuff. I know how good they are, because when I’m hunting for answers to Apache questions, the best resources are discussion threads scattered around the web, where people like me asked questions of people like you, and got good answers. But here I got more thorough and informed answers than I saw anywhere else, and most important, they explained the theory behind the solutions, so I could in turn pass on my knowledge later.
The cool thing about Scripting News has always been how smart these people are, how good-natured they are, and how they like to show off what they know! That’s a very useful combination of qualities.
Anyway, as of yesterday I had completed exporting the named pages on the site; they’re all linked into the index page on the new static site. These are, generally, the spec itself, the pages that link from the spec, the example files, and various documents announcing the transition of the spec from UserLand to Harvard ownership. This morning I’m working on exporting the blog posts. Then we come to the comments, and I think I’ll stop there, because there has been so much comment spam on this site that after the technical work is done comes the editorial work of deciding what’s spam and what’s not, and I’ve been very carefully avoiding questions that involve editorial judgement. My goal has been to turn over the content of the site, so the new rendering will be as future-safe as we know how to make a site in 2007. It’s been an incredible learning experience!
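One detail worth flagging for anyone doing the same kind of static rendering: if the old site served its pages through dynamic URLs, the export alone won’t keep inbound links working. A mod_rewrite rule can bridge the old addresses to the new static files. This is purely an illustration — the URL pattern and target paths below are hypothetical, not the actual layout of the Harvard site:

```apache
# Hypothetical example only -- the pattern and paths are made up.
RewriteEngine On

# Map an old dynamic-style URL like /read$123 to /posts/123.html,
# with a permanent redirect so bookmarks and search engines catch up.
RewriteRule ^read\$([0-9]+)$ /posts/$1.html [R=301,L]
```

Rules like this can live in the main server config or an .htaccess file; given the performance discussion above, the main config is the cheaper home for them.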