Rails 2.2 Integration Tests Always Use the Cookie Store

If you’re upgrading an existing Rails app to 2.2 or later, and you aren’t already using the cookie session store, you will likely find that your integration tests are failing. The symptom will be a “500 Internal Server Error” mixed in with “E”s for each failing test. The stack trace at the end will start with:

NoMethodError: You have a nil object when you didn't expect it!
You might have expected an instance of ActiveRecord::Base.
The error occurred while evaluating nil.[]=
.../vendor/rails/actionpack/lib/action_controller/integration.rb:294:in `process'

If you dig into integration.rb, you’ll find it’s trying to access the @header hash.

The cause of this problem is that integration tests are now hard-coded to use the cookie session store. If you aren’t using it, there’s a good chance you haven’t defined a secret for it. To fix the failures, add this to your environment.rb:

  config.action_controller.session = { :session_key => '_session_key',
    :secret => 'long unguessable string' }

Get your own “long unguessable string” by running rake secret.
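If you’re curious what rake secret actually gives you: it just generates a long random hex string. Here’s a rough stand-in in plain Ruby (a sketch using the securerandom standard library, not the actual rake task):

```ruby
# Roughly what `rake secret` produces: 64 random bytes rendered as a
# 128-character hex string, suitable for the :secret option above.
require 'securerandom'

secret = SecureRandom.hex(64)
puts secret
```

Any sufficiently long, unguessable string works; the rake task is just a convenience.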

Code the Phone

I’ve started a new blog, Code the Phone, with a friend of mine. It covers all things related to application development and the business of selling apps on smartphones. Not surprisingly, we aren’t talking about anything other than the iPhone at the moment.

Subversion vs. Git

If you don’t like making up your own mind and prefer to ride the wave of whatever happens to be popular right now, you might think that, when it comes to source control, there are two choices: Subversion and Git, and no middle ground.

(Yes, there are others, but I’m trying to keep this simple.)

This is just not true. Daniel Jalkut addresses this from one perspective in Zealotry For Good And Evil, but I’d like to add to his arguments from the perspective of a single developer or small team.

Subversion works in mostly the same way that source control has worked for decades, so for anyone who’s been in the industry for a while, there are few surprises. This means that, for the most part, the source control system stays out of the way and does the jobs we need it to do.

Git was built to solve a different problem: how to provide source control to a globally distributed team, where one centralized, master repository was viewed as a bug, not a feature. Individuals or small teams don’t have this problem.

Fundamentally, though, rather than having arguments about which source control system is “better,” recognize that they are tools, with different strengths, and use the one that feels the best to you and lets you get your work done efficiently.

Personally, I’m experimenting with using Git to version control projects for clients where I expect to make changes in many places, but I don’t expect the project to last very long. For these, it seems more trouble than it’s worth to carve out a place in my central Subversion repository, especially because I won’t need to provide outside access to it. Git’s placement of the repository within my working tree is a feature here.

PostgreSQL and Ruby on Rails

The current state of PostgreSQL on Rails (early 2009) is a bit of a muddy mess. There are three different gems you can use to connect ActiveRecord to PostgreSQL, and no guidance I can find about which of the two native adapters to use. I’m going to try to clear this up.

postgres-pr is a pure Ruby adapter. It doesn’t require native libraries and should work in most situations. However, because it isn’t native, it’s the low performer and likely won’t offer access to all of PostgreSQL’s features.

postgres is the old adapter. It appears to be maintained now by Jeff Davis, who forked it from Dave Lee. If you are compiling against PostgreSQL 8.3, you must use Jeff’s version (currently 0.7.9.2008.01.28), which includes build fixes for 8.3.

pg is the new adapter, also maintained by Jeff Davis. He says this one has a better design than postgres and offers more features. It does not work prior to Rails 2.1, however. Rails 2.1 and later will attempt to load this driver first and fall back to postgres. If you use any plug-ins that monkeypatch the database driver, you might have problems with pg (my own sql_logging, for one, is broken — I’ll fix this shortly).

So:

  • If you can build native extensions and are on Rails 2.1 or later, use pg.
  • If you can build native extensions, but are on Rails 2.0 or earlier, use postgres.
  • If you cannot build native extensions, use postgres-pr.
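The fallback order Rails uses can be sketched in a few lines (illustrative only; this is not Rails’ actual connection-adapter code):

```ruby
# Try the adapters in the order Rails 2.1+ prefers them: 'pg' first,
# then the older 'postgres', and finally the pure-Ruby 'postgres-pr'.
driver =
  begin
    require 'pg'
    'pg'
  rescue LoadError
    begin
      require 'postgres'
      'postgres'
    rescue LoadError
      'postgres-pr'  # pure Ruby; no native build required
    end
  end

puts "ActiveRecord would use: #{driver}"
```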

Building native extensions on OS X can be tricky, though. Here’s what I use on an Intel Mac, using the Ruby and Rubygems that ship with OS X and PostgreSQL 8.3 from MacPorts. (Unlike Robby’s guide, I do not advocate moving the system’s Ruby out of the way. You’re likely to break other stuff if you do so.)

sudo env ARCHFLAGS='-arch i386' gem install pg --remote -- --with-pgsql-include=/opt/local/include/postgresql83 --with-pgsql-lib=/opt/local/lib/postgresql83

If you use a different version of PostgreSQL than 8.3, make the appropriate substitution. If you’re still on PowerPC, change the ARCHFLAGS to -arch ppc. If you want to use postgres instead of pg:

sudo env ARCHFLAGS='-arch i386' gem install postgres --remote -- --with-pgsql-include=/opt/local/include/postgresql83 --with-pgsql-lib=/opt/local/lib/postgresql83

Generate wsesslog Workloads for httperf

Over the last couple of days I’ve been bringing up an isolated test environment for a customer’s new site. (As an aside, one of the great things about moving to an Intel Mac is that I can run nearly any OS I want under VMware Fusion at near native speeds. You can’t beat testing in an identical environment, and I can throw pretty respectable virtual hardware at it, too: up to a 4-core with gigs of memory. If only Apple would let me virtualize OS X client.)

I’m using httperf to simulate client load on the test server and quickly decided that --wsesslog looked like the best choice for simulating an actual browser’s effect on the server.

A problem: how to generate those session workloads? I certainly don’t want to do this by hand for even one page. I want to generate a hit on every file referenced by the target page, but ignore anything hosted elsewhere.

A solution:

#!/usr/bin/env ruby
 
require 'rubygems'
require 'hpricot'
require 'open-uri'
 
if ARGV.length < 1
  $stderr.puts "usage: #{$0} url
 
  'url' must include the protocol prefix, e.g. http://"
  exit 1
end
 
url = ARGV.shift
if url =~ %r{^(https?://)([-a-z0-9.]+(:\d+)?)(.*/)([^/]*)$}i
  $protocol = $1
  $host = $2
  $document_dir = $4
  document_url = $5
else
  $stderr.puts 'Could not parse protocol and host from URL'
  exit 1
end
 
doc = Hpricot(open(url))
 
# Emit an indented workload line for a URI, but only if it lives on the
# target host; URIs hosted elsewhere are skipped.
def puts_link(uri)
  return if uri.nil?
 
  if uri =~ %r{^#{$protocol}#{$host}(.*)$}
    puts "    #{$1}"
  elsif uri !~ %r{^https?://}
    if uri =~ %r{^/}
      puts "    #{uri}"
    else
      puts "    #{$document_dir}#{uri}"
    end
  end
end
 
puts "# httperf wsesslog for #{url} generated #{Time.now}"
puts
 
puts "#{$document_dir}#{document_url}"
 
(doc/"link[@rel='stylesheet']").each do |stylesheet|
  puts_link stylesheet.attributes['href']
end
 
(doc/"style").each do |style|
  # match @import 'uri'; or @import "uri"; with either quote style
  style.inner_html.scan(/@import\s+(['"])(.+?)\1\s*;/).each do |match|
    puts_link match[1]
  end
end
 
(doc/"script").each do |script|
  puts_link script.attributes['src']
end
 
(doc/"img").each do |img|
  puts_link img.attributes['src']
end
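As a sanity check on the URL-splitting regex at the top of the script, here’s what it extracts from a made-up URL (example.com and the path are purely illustrative):

```ruby
# Split a URL into the pieces the script stores in its globals:
# protocol, host (with optional port), directory, and document name.
url = 'http://www.example.com:8080/shop/cart/index.html'

if url =~ %r{^(https?://)([-a-z0-9.]+(:\d+)?)(.*/)([^/]*)$}i
  protocol, host, dir, document = $1, $2, $4, $5
  puts [protocol, host, dir, document].join(' | ')
end
# prints: http:// | www.example.com:8080 | /shop/cart/ | index.html
```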

Digital TV Transition

The transition to all-digital transmissions for over-the-air television might be put off until June if Congress has its way, but this is one of those things that just needs to be done and gotten over with.

The pros and cons of the switch have been reported enough already, so I want to talk about something I think will be a big deal to almost everyone who gets their TV over an antenna.

The way a digital converter box works is to take the place of the analog tuner in your TV or VCR. In a way, it’s helpful to think of it sort of like a VCR. When you play a tape, you tune your TV to channel 3, then do everything on the VCR. The converter box works the same way. It comes with its own remote and you tune the TV channel on the box, not the TV. So, first annoyance: new remote.

Here’s the worst part, though: taping shows. I think the people most likely to be affected by the transition are the people who still record shows using timed recordings on an old VCR. This won’t work any more. The VCR can’t tune the channel because it has an analog tuner inside it. That means it needs a converter box, but the VCR can’t tell the converter box to change the channel. You’d need to do that, by hand, before walking away. Forget taping two shows on the same night, back-to-back, on different channels. The original TiVo had a little IR transmitter you hung in front of your cable or satellite box to solve this problem (it transmitted the remote codes to change the channel), but no VCR I know of has one.

Want to watch one show while taping another (assuming you can live with the above problem)? Get two converter boxes: the VCR needs one for the show it’s taping and you’ll need a second one for the channel you want to watch on your TV. They’d better be different brands, too, or else you’ll need a way to tell converter box #1 to ignore the remote for box #2.

My mom recently bought a Panasonic VCR/DVD-R combo unit with a digital tuner, but it refuses to record digital channels on the VCR side, so it’s not a great solution for people used to tapes.

February or June? I’m not sure it matters. The full impact of the transition won’t be known until people are forced to live in a digital-only world and they discover what doesn’t work anymore.

Announcing In Season 1.0

I am pleased to announce the immediate availability of In Season 1.0 for the iPhone and iPod touch. In Season is a produce shopping guide, inspired by a couple of recent books that address the problem of missing flavor in most produce found in American markets.

Here in America, we’ve become rather used to the idea that fresh fruits and vegetables are available whenever we want them. What we don’t always realize is that this convenience comes at the cost of flavor and price. Plants grow according to a schedule, and while we can force things somewhat (hothouse tomatoes, for example), if you want peaches in the dead of winter, they aren’t coming from the northern hemisphere.

Produce shipped from South America makes a long trip to an American market, though, so food has to be bred to survive the journey, and someone (the consumer) has to cover the cost of that travel.

There are even disadvantages to strawberries grown and sold in season. Strawberries are so fragile that growers have had to breed them exclusively for shipment, resulting in a berry that has only a pale shadow of true strawberry flavor left.

I created In Season to help my own family, and hopefully others, with this problem. Food tastes better when it’s grown according to its natural schedule, and even more so if you can find a local farmer supplying produce to your market. It will take less effort to grow it and supply will be higher, so you will pay less. Locally-grown produce also means less fuel is burned bringing that food to market, further bringing prices down and reducing your carbon footprint at the same time.

The economics of the App Store being what they are, version 1.0 is a toe in the water. If it is well received and I can justify further development, I have some great ideas to make it an extremely useful and educational app.

If you are interested in those books I mention above, they are terrific reads: How to Pick a Peach: The Search for Flavor from Farm to Table, by Russ Parsons and Animal, Vegetable, Miracle: A Year of Food Life, by Barbara Kingsolver.

It’s Just Software

This is a post I’ve been meaning to write for a very long time, but I’ve always had other, more pressing work to do, and so I never wrote it.

Rands posted his version today, though he’s using my point more as an introduction to an Election Day moral than as a post about software development itself.

Still, the first half of his post is spot-on. In software, given enough time, it’s possible to do just about anything. There’s a classic joke in software development circles: fast, cheap, and good: pick two. With enough time and money, it’s not a question of if, it’s a question of when.

My customers don’t always understand this. A seemingly minor change can have wide-reaching ramifications, perhaps doubling the time (and cost) of a project.

It’s great to have an idea of what you want to build, but it’s just as important to listen to an expert when it comes to implementation. “Can we do this?” is not a good question. The answer is almost always “yes.” “I’d like to do this, but I only want to spend this much (or, I’d like it by this date)” is far better.

Apple Sanctioned Developer Forums

Hallelujah: Apple is now offering official developer forums for the members of the iPhone Developer Program. It’s tied to your Apple ID, so they know who’s in and who’s not, and part of the usage agreement implies that they may provide forums for the discussion of pre-release software.

I hope they extend this to Mac development, too. The Apple mailing lists are great, but up until Leopard was released, the moderators on cocoa-dev were constantly fighting the tide of questions about Leopard-only APIs.

(Seen on Daring Fireball.)