RabbitMQ Queue Auto-Delete

Breadcrumbs for myself.

Queue config options:

  • exclusive: deleted when the declaring connection closes. No other
    connection can even access it. Only queues can be exclusive.

  • auto-delete: deleted when the last downstream thing (i.e. consumers for
    a queue, or bound queues for an exchange) goes away. Note that this
    isn't tied to channels or connections at all.

  • exclusive is more frequently useful than auto-delete

More about auto-delete:

  • auto-delete queues will delete themselves when their consumer count
    drops to 0 from a higher number. Otherwise they would delete themselves
    as soon as they were declared...

  • The downside of exclusive queues is that they can only have consumers on the connection that created them. So this doesn't work for shared queues, or queues created by some app that are subsequently to be consumed from by some other app.

  • Auto-Delete queues not getting deleted
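In Node.js terms, the two options map onto the flags you pass to amqplib's channel.assertQueue(). A minimal sketch, not from any particular app of mine — the queue names and localhost URL are made up:

```javascript
// Queue options as amqplib's channel.assertQueue(name, opts) takes them.
var exclusiveOpts  = { exclusive: true };   // deleted when the declaring connection closes
var autoDeleteOpts = { autoDelete: true };  // deleted when consumer count drops back to 0

// Live usage (requires `npm install amqplib` and a broker on localhost):
// var amqp = require('amqplib');
// amqp.connect('amqp://localhost').then(function (conn) {
//   return conn.createChannel().then(function (ch) {
//     return Promise.all([
//       ch.assertQueue('replies', exclusiveOpts),   // private reply queue
//       ch.assertQueue('events', autoDeleteOpts)    // goes away with its last consumer
//     ]);
//   });
// });
```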

Other options:

Fragile Tools

Not as much fun as Fraggle Rock.

I'm currently dependent on the following fragile tools:

  1. Twitter
  2. Google Reader
  3. To a lesser extent, Firefox and Chrome

What do I mean by "fragile tools"? Consider Twitter: It started as a fun ecosystem for development, where anyone could whip up an interesting alternative way to interact with the service. Now? Not so much. Remember "track"? Yeah.

And Google Reader seems to have only one developer -- just babysitting it. The API is not documented and can change at any moment, without notice. Plus, Google has been sunsetting services left and right. Having killed off virtually all competitors, when will Google shut down Reader?

Firefox and Chrome are both moving to narrow their feature sets. Both seem incredulous that I might want to enter the URL of an XML file and view that file in my browser window. On the plus side, both are light-years ahead of Internet Explorer or Safari in usability and reliability.

Any of these tools could very quickly stop offering key features or go away entirely. The precariousness of my position as a user of those tools leaves me in a constant state of anxiety.

P.S. This post is primarily intended to test one of the plugins I'm using. So, it's really just a bunch of conclusory statements without any references or argument.

Making FiOS Actiontec Router and Apple Bonjour Services Play Together

I have Verizon FiOS at home. It's very fast, but if you want optimal speed and all the features the service offers, you must use the router supplied by Verizon. This week, Verizon replaced my old router with a new one: Actiontec MI424WR-GEN3I. Unlike the older model it replaced, the new router is a Gigabit LAN router and an 802.11n wireless router. That's great! But with the new speed also came a new problem: Bonjour services weren't working properly on my home network. Specifically, my MacBook (using WiFi) could not find other computers and printers on the network.

Apparently, this is a common problem.

First, I tried this supposed solution: disable 802.11b mode. No luck.

Then, I saw this suggestion: create an ACL whitelist entry for 224.0.0.251/255.255.255.255 in the IGMP proxy settings on the Actiontec router. That sounded ridiculous, so I kept looking.

Finally, I came across this: disable the IGMP proxy on the Actiontec router. I am loath to disable a default setting that I don't understand, but this fixed my problem instantly, and I have yet to see any negative consequences.

Using watch with a bash alias

I love the Unix watch command. On OS X, you can install it easily with Homebrew:

brew install watch

Something I didn't realize until 10 minutes ago is that if you want to watch the output of something in your bash aliases, watch will complain because it cannot find the command. This is because watch evaluates the command you pass to it with 'sh -c', which does not expand aliases. However, if you also create an alias for watch itself, aliases will work. So, you can add the following to your .bashrc:

alias watch='watch '

Note the trailing space inside the quotation marks.

Link:

Fixing Node.js v0.8.2 Build on Linux

There's a nasty gcc bug on Red Hat (RHEL 6), CentOS, and related Linux distributions that gets triggered when you try to build Node.js v0.8.2: "pure virtual method called".

Solution: Instead of just make install, run:

make install CFLAGS+=-O2 CXXFLAGS+=-O2

More info:

Mongoose Indexes and RAM Usage

If you're using Mongoose, you've changed your indexes, and you're wondering why you've run out of RAM, go into the Mongo shell and manually drop any indexes you are no longer using. Mongoose has no method for deleting indexes you're not using any more, so they accumulate, gobbling up RAM.
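Roughly, the session looks like this (database, collection, and index names here are made up — yours will differ):

```
$ mongo mydb
> db.posts.getIndexes()              // list every index currently on the collection
> db.posts.dropIndex("oldField_1")   // drop the ones Mongoose no longer declares
```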

Now that you've cleaned out those unused indexes, restart Mongo. After the cache warms up (and depending on how many indexes you deleted), you could see a dramatic decrease in RAM consumption.

A Gotcha Using Node.js + Request In a Daemon

I have a Node.js program running as a daemon on a Linux VPS. Periodically, it polls a list of URLs using request. When it first starts, everything runs smoothly. But after running for a while, it starts getting 400 errors, and the longer it runs, the more URLs return 400 errors.

I could not understand what was going on. My code was basically structured like this:
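The snippet itself didn't survive here, but the structure was roughly this — a reconstruction, with names like buildReq and options being my guesses, and request being the npm module:

```javascript
// A fresh req object is built on every call, so nothing *here* can
// accumulate state between polls.
function buildReq(url, options) {
  return { url: url, timeout: options.timeout };
}

// Live usage (requires `npm install request`):
// var request = require('request');
// function poll(url) {
//   request(buildReq(url, { timeout: 5000 }), function (err, res, body) {
//     // handle response; res.statusCode is where the 400s showed up
//   });
// }
```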

Given that code, we know the req object is initialized with each function call. So, how could this script degrade over time?

Well, I finally tracked it down: COOKIES!

Yup, request has cookies enabled by default. So, I think what was happening was that cookies were being set (presumably, top domain-level cookies having the same name at different URLs or subdomains on the same domain) but the values in request's cookie jar were not being returned properly. That means the remote host was getting invalid cookies -- hence the 400 response for a "Bad Request."

I haven't yet spent the time to figure out if this is a bug in request. It's on my TODO list.

In the meantime, I've disabled cookies in the req object:
var req = { url: url, timeout: options.timeout, jar: false };

It's now working as expected.

Current Projects

A couple of fun things I've been working on to learn some new programming skills, namely Node.js and MongoDB.

1. News Bit -- Remember Share Your OPML? Me too!

2. Linkblog -- A super easy linkblog tool.

I plan to open source the linkblog code soon. It also uses a shorturl tool I'm working on -- shorturl is open source. I still need to hook the hit stats (collected by the shorturl tool) into the linkblog tool, but other than that the tools work well!