Export Mailman Subscriber Address Lists Without SSH/Shell Access

I recently needed to export the contents of a GNU Mailman-based mailing list for WordCamp Philly, so I poked around the web-based Mailman admin interface, hoping to find an easy way to export a CSV or at least a list of names, to no avail. The only supported, native way is to use a couple of Mailman shell scripts from the command line. Alas, since this particular list is hosted on a shared hosting account, I lacked the access needed to pursue that route.
There are a few ways to get around this, but they involve

  1. overly-complicated internal Mailman commands emailed directly to the listserv WITH THE ADMIN PASSWORD CONTAINED IN THE EMAIL ITSELF or
  2. use of third-party Python scripts with external dependencies or
  3. cutting-and-pasting by hand.

Perhaps you heard that laughter from where you sit? That was me. Laughing maniacally at those “solutions”.
I needed simple, I needed quick, so after a bit of Google scouting, I came across the following (ever-so-slightly modified/generalized):

wget -O - --post-data 'adminpw=${admin password}'  http://${listserv domain name}/admin.cgi/${list name}/members | egrep "_realname" | sed 's/^.*value="\([^"]*\)".*value="\([^"]*\)".*$/\1,\2/' | sed 's/%40/@/' > maillist.csv

This one-liner should work on any Linux/Mac/BSD/etc. box with a reasonably up-to-date version of wget installed, and it should dump a nice comma-separated list into a file, maillist.csv. Let's break it down a bit so that you can see how it works its magic.

wget -O - --post-data 'adminpw=${admin password}'  http://${listserv domain name}/admin.cgi/${list name}/members

Here, ${admin password} is the actual Mailman list admin password, ${listserv domain name} is your actual Mailman server’s DNS name (in my case, it was lists.phillywp.org, but yours will obviously differ) and ${list name} is the actual list slug.
NOTE: Some Mailman setups have pretty URLs turned on, in which case you may need to use admin/ instead of admin.cgi. The fastest way to determine the proper URL is to simply visit your Mailman list admin page and take note of the full URL.
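To make that concrete, here are the two URL styles side by side, using a hypothetical lists.example.org domain and a hypothetical list called wcphilly (substitute your own values):

http://lists.example.org/admin.cgi/wcphilly/members
http://lists.example.org/admin/wcphilly/members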
This wget portion retrieves the contents of the ${list name} membership roster by essentially screen-scraping the entire admin page. It's a step in the right direction, but obviously not very human-readable: the output is littered with HTML markup that makes picking the subscribers' needles out of that particular haystack a bit of a chore. The next two bits help on that count.

| egrep "_realname"

This bit strips all lines except those containing the subscribers’ real names. Again, useful, but still a bit too much info for our purposes.

| sed 's/^.*value="\([^"]*\)".*value="\([^"]*\)".*$/\1,\2/'

Here we look for the two values contained in each of those lines, specifically the subscriber's real name and email address, and output them as a comma-separated pair: real name, then email address.
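To see the extraction in action, you can feed a sample line through that sed by hand. The exact HTML Mailman emits varies a bit from version to version, so the line below is a made-up approximation of the rough shape the pipeline expects (a realname text input followed by a second value attribute holding the encoded address):

echo '<INPUT name="jane%40example.org_realname" type="TEXT" value="Jane Doe" size="25"><INPUT name="user" type="HIDDEN" value="jane%40example.org">' | sed 's/^.*value="\([^"]*\)".*value="\([^"]*\)".*$/\1,\2/'

That prints Jane Doe,jane%40example.org, which is exactly the raw material the final sed below cleans up.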

| sed 's/%40/@/'

There's one small problem with the comma-separated values we produced in the step above: each @ symbol is still URL-encoded as %40. Running everything through this final sed turns %40 back into the universally-understood @.
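Putting it all together, the finished pipeline might look something like this with the placeholders filled in (the domain, list name and password here are made-up examples, not real credentials):

wget -O - --post-data 'adminpw=s3cr3t' http://lists.example.org/admin.cgi/wcphilly/members | egrep "_realname" | sed 's/^.*value="\([^"]*\)".*value="\([^"]*\)".*$/\1,\2/' | sed 's/%40/@/' > maillist.csv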
So, when all is said and done, you should have a nice, concise, comma-delimited text file containing a complete list of your Mailman subscribers, one per line. I hope someone else finds this to be as time-saving a proposition as I did.

From The Dep’t Of Unintentional Hilarity

Red Hot And Hate
I don’t think the hiring committee really knows what it’s asking for on this one.
“Fear leads to anger. Anger leads to hate. Hate leads to suffering. RedHate leads to a general dislike of humanity and root access.”

…To The Dump, To The Dump, To The Dump Dump DUMP!

That’s right, folks, it’s time for another browser-clearing session ’round these parts, so strap in and enjoy.
What’s worse than finding out that a former classmate was published in Linux Journal? Finding out that he works for Google and races BMWs on the weekends, of course. Hi, Laz!
The world's oldest profession? Not recession-proof. How did I know about this? CNN Headline News decided it was newsworthy enough for a noontime story slot. I think we're going to need a smaller violin, folks.
The creator of that Big Picture blog I mentioned the other day was interviewed over at Waxy.org. Pretty interesting stuff. And, if you’re interested in seeing ground-level shots from the Midwest flooding, the Red Cross started a category on their WordPress blog devoted to galleries from their on-the-scene staff. There are some very cool and moving shots among those posted, so give it a look.

Just Call Me MacGyver: Systems Administrator


I’m working a bit late today as I had to schedule some downtime for a critical piece of hardware that needed to be powered down, lightly disassembled, have a part replaced and then powered back up. As I unscrewed the part to be replaced, I thought to myself “Boy, it’d sure stink if I dropped this case screw down into the bottom of the case, what with this rack-mounted system being without true rails and all and thus really hard to get into.” Bet you can guess what happened next.
Cursing my luck, I sprinted back to my desk, grabbed my flashlight and the closest thing to a mirror I had on hand — a Dell system recovery CD. Back to the rack I went and, using the CD as an adjunct periscope, I located and managed to fish out the screw, no fuss, no muss.
I’m so proud of me.

First-Rate Geeky Command Line Head-Smackage

BASH - in the flesh.
WARNING: GEEKY CONTENT AHEAD
If you have no desire to read about login shells, Linux, source code management or other similarly geeky content, you’d best be skipping this one. -ed.
Have you ever allowed a nuisance to go on for literally years simply because you couldn’t be bothered to do enough research to effectively nip it in the bud? I personally had two such nuisances (of a particularly geeky variety) come crashing down this past week.

BASHing My Head In

I, like many UNIX users who spend a good deal of time in a command line environment, prefer to customize my environment so that I can save myself keystrokes, work and headaches. Through judicious use of environment variables, aliases and custom shell prompts, I have made it easy to tell at a glance where in a filesystem I am, to run commands from any number of frequently-accessed binary directories, to ssh to my various and sundry boxes, etc. I have done this on every UNIX box I have spent any considerable amount of time on, going back at least to my early days in college, and, as a dyed-in-the-wool BASH user, I have always stored my preferences in a file called .bashrc that sits in the root of my home directory. While at Lehigh, having a .bashrc was sufficient to automatically customize my environment every time I logged in. However, ever since joining my current firm, I have been unable to get any of the UNIX boxes at work to recognize my configuration file automatically. Instead, I have had to type bash each time I logged in just to pick up my customizations.
Two days ago, I had a brainstorm – I realized that some users were known to squirrel their preferences away in a file called .bash_profile and, in a fit of pique, I symbolically linked my .bashrc to ~/.bash_profile, then logged in to a random UNIX box. Lo and behold, I was immediately presented with my fully-customized shell. I was at once elated and furious – I have, over the past six years or so, typed "bash" countless times, meaning that I could have saved myself and my fingers 4 x countless keystrokes, wear and tear and keyboard mileage. Grrrr.
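For anyone fighting the same battle: bash reads ~/.bash_profile (or ~/.profile) for login shells and ~/.bashrc only for interactive non-login shells, which is why typing bash by hand "fixed" things all those years. The symlink does the trick, but a common alternative is a tiny ~/.bash_profile that simply sources ~/.bashrc; here's a minimal sketch:

# ~/.bash_profile -- read by bash login shells
# Pull in ~/.bashrc so login and non-login shells share the same
# aliases, prompt and environment tweaks.
if [ -f ~/.bashrc ]; then
    . ~/.bashrc
fi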

Subversive Behavior

I update all of the installations of WordPress that I maintain via Subversion and have largely automated the process via a shell script, although I have left a few of them out of the script so that I can update them more and/or less frequently as situations require. Both at the command line and in my scripts, I traverse into the base directory of each blog and run a Subversion update; in other words, `cd [blog directory];svn up`. I was goofing around a couple of days ago and decided to actually pass the directory as an argument to the Subversion update, so I ran a test `svn update [blog directory]` from the base of my Dreamhost home directory. Et voilà, it worked like a charm. To date I have thus effectively wasted thousands of keystrokes and CPU cycles traversing my directory tree instead of simply running a single command.
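In case it saves anyone else the same tree-walking, here's a minimal sketch of what that consolidated update might look like; svn update happily accepts multiple working-copy paths, and the directory names below are just made-up examples:

# Update several WordPress checkouts in one shot; no cd required.
# (The paths are hypothetical; substitute your own blog directories.)
svn update ~/blog.example.com ~/photos.example.com

Or, if you prefer a loop you can grow over time:

for blog in ~/blog.example.com ~/photos.example.com; do svn update "$blog"; done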
I share these insights in the hopes that they will save someone, somewhere some measure of blood, sweat, tears, effort and tedious manpage reading.

Burnination, The Follow-Up

With apologies to Trogdor: Shortly after I posted my review of Toast, I stumbled across a comparison review of a few of the top Mac burning software packages. Toast makes an appearance, as do Burn and two others, as well as a fifth (LiquidCD) suggested in the review's comments, which got me thinking about other media-related downloads worth your time.
First up, Windows users looking to convert their media over to handheld-appropriate formats ought to look into Videora which handles the conversion tasks for the Microsoft-addled. Next up is Democracy, an incredible video aggregator with support for RSS “channels” and BitTorrent downloads. It’s available for Mac, Linux and Windows, so platform concerns should be nil. Mac users looking to correctly tag their iPod-ready videos so that they show up correctly in iTunes should look into Lostify, your one-stop-shop for all your video tagging needs.
Last of all, those of you looking to get caught up on TV shows you missed should check out ShareTV, a site that looks to centralize torrents for a lot of the top-flight shows currently on TV in one easily-accessible website. Be sure to give it a look.

Computing News And Notes

A few small computer-related things have popped up over the last few days and I thought I'd take a second or two to jot down my observations for the benefit of all you fellow Intertron users out there.

  1. Apple's recently-released 10.4.8 update to Mac OS X added a nifty "zoom using scroll wheel" feature, allowing users to use their mouse wheel (or, in my case, a two-fingered scroll on the touchpad of my MBP) to zoom in on the area of the screen directly below the mouse cursor. Very nice for on-the-fly graphic design and handy for those with poor eyesight too, I'd imagine.
  2. On a second Apple note, I spent around 6 hours of my work day yesterday cursing Apple's very name, as a firmware update for the newly-arrived Mac Pro (which will function as an OpenDirectory server until we are able to get our hands on some new Intel Xserves, at which point the Pro will most likely become my primary workstation. W00t!) adamantly refused to apply. I followed Apple's instructions exactly time and again and was ultimately frustrated in my attempts to apply the EFI update. I had nearly exhausted my Google Fu when I happened across a random comment on a blog entry (I've since misplaced the actual search result) stating that the EFI update won't run from a RAID array, as Mac OS X doesn't actually support booting from a RAID setup on Intel boxes. Nice of Apple to tell me this, as I had (you guessed it) been attempting this update process from a nice 500GB RAID 0 array. Crikey. I threw a 250GB disk I had lying about into the box, installed the non-server OS X 10.4.7 from the DVDs that came with the Pro, ran Software Update and managed to update the firmware a mere 5 minutes after finishing the install. Arrrgh!
  3. If you're a Red Hat Enterprise Linux user and you're considering obtaining an Alienware Aurora desktop on which to use your chosen operating system, I have one simple piece of advice: Don't. Buy. An Alienware. 'Least not an AMD64-based one. I bought an Aurora SLI for work with the notion that it would be a screamer; instead, it has been a nigh-unending pain in the butt. The sky2 driver apparently freaks out every once in a while, bringing the machine to its knees and forcing a hard reboot, the onboard soundcard is really not an option, and NVIDIA's Linux SLI drivers are prone to occasional lockups. This, combined with the fact that Alienware's customer support stinks, would suggest to me that RHEL users (and perhaps Whitebox/CentOS users, by extension) would be wise to avoid the Aurora. I make no claims for Mandriva, Fedora, Ubuntu, Gentoo, etc. users. Caveat emptor is the moral, I guess. I look forward to moving to the Mac Pro as my full-time workstation soon.

U.N.helpful

Jules Crittenden of the Boston Herald really took it to the French in re: their hypocrisy in Lebanon in recent days. To wit:

In recent weeks, France stepped forward to act as a broker of peace in Lebanon. “Act” is the key verb in that last sentence, as it now would seem that the only other verifiable part of the sentence is “in recent weeks.”
To correctly parse that sentence, one must understand that when France suggested it wanted to broker peace in Lebanon, it did not necessarily mean “broker” or “peace” or “Lebanon” in the way we might understand those words. The same is true when France further suggested it wanted to “lead” a “strong” “multinational” “force” there.

Heh. Go and read the whole thing – it’s a biting take on the folly of “international” “action” in the Middle East.