My Geek Project
I’ve begun contributing posts at My Geek Project. This site is mostly about the server administration part of IT, rather than software development, but there is a good mix of contributors.
The current state of PostgreSQL on Rails (early 2009) is a bit of a muddy mess. There are three different gems you can use to connect ActiveRecord to PostgreSQL, and no guidance I can find about which of the two native adapters to use. I’m going to try to clear this up.
postgres-pr is a pure Ruby adapter. It doesn’t require native libraries and should work in most situations. However, because it isn’t native, it’s the low performer and likely won’t offer access to all of PostgreSQL’s features.
postgres is the old adapter. It appears to be maintained now by Jeff Davis, who forked it from Dave Lee. If you are compiling against PostgreSQL 8.3, you must use Jeff’s version (currently 0.7.9.2008.01.28), which includes build fixes for 8.3.
pg is the new adapter, also maintained by Jeff Davis. He says this one has a better design than postgres and offers more features. It does not work prior to Rails 2.1, however. Rails 2.1 and later will attempt to load this driver first and fall back to postgres. If you use any plug-ins that monkeypatch the database driver, you might have problems with pg (my own sql_logging, for one, is broken — I’ll fix this shortly).
So: if you’re on Rails 2.1 or later, use pg; if you’re on an earlier version of Rails, use postgres; and fall back to postgres-pr only if you can’t build native extensions at all.
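Whichever native gem ends up installed, the Rails side looks the same: ActiveRecord is told to use the postgresql adapter and loads whichever driver it can find. Here is a minimal sketch of a standalone connection; the host, database name, and credentials are placeholders, not anything from a real app.

require 'rubygems'
require 'active_record'

# Rails 2.1 and later will try the pg gem first and fall back to postgres;
# the adapter name is 'postgresql' in either case.
ActiveRecord::Base.establish_connection(
  :adapter  => 'postgresql',
  :host     => 'localhost',
  :database => 'myapp_development',
  :username => 'myapp',
  :password => 'secret'
)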
Building native extensions on OS X can be tricky, though. Here’s what I use on an Intel Mac, using the Ruby and Rubygems that ship with OS X and PostgreSQL 8.3 from MacPorts. (Unlike Robby’s guide, I do not advocate moving the system’s Ruby out of the way. You’re likely to break other stuff if you do so.)
sudo env ARCHFLAGS='-arch i386' gem install pg --remote -- --with-pgsql-include=/opt/local/include/postgresql83 --with-pgsql-lib=/opt/local/lib/postgresql83
If you use a different version of PostgreSQL than 8.3, make the appropriate substitution. If you’re still on PowerPC, change the ARCHFLAGS to -arch ppc. If you want to use postgres instead of pg:
sudo env ARCHFLAGS='-arch i386' gem install postgres --remote -- --with-pgsql-include=/opt/local/include/postgresql83 --with-pgsql-lib=/opt/local/lib/postgresql83
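Either way, a quick sanity check that the native extension built and loads cleanly (substitute postgres for pg if that’s the gem you installed):

ruby -e "require 'rubygems'; require 'pg'" && echo OK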
Over the last couple of days I’ve been bringing up an isolated test environment for a customer’s new site. (As an aside, one of the great things about moving to an Intel Mac is that I can run nearly any OS I want under VMware Fusion at near native speeds. You can’t beat testing in an identical environment, and I can throw pretty respectable virtual hardware at it, too: up to a 4-core with gigs of memory. If only Apple would let me virtualize OS X client.)
I’m using httperf to simulate client load on the test server and quickly decided that --wsesslog looked like the best choice for simulating an actual browser’s effect on the server.
A problem: how to generate those session workloads? I certainly don’t want to do this by hand for even one page. I want to generate a hit on every file referenced by the target page, but ignore anything hosted elsewhere.
A solution:
#!/usr/bin/env ruby
# Generate an httperf --wsesslog session file for a page and its local assets.

require 'rubygems'
require 'hpricot'
require 'open-uri'

if ARGV.length < 1
  $stderr.puts "usage: #{$0} url"
  $stderr.puts "'url' must include the protocol prefix, e.g. http://"
  exit 1
end

url = ARGV.shift
if url =~ %r{^(https?://)([-a-z0-9.]+(:\d+)?)(.*/)([^/]*)$}i
  $protocol = $1
  $host = $2
  $document_dir = $4
  document_url = $5
else
  $stderr.puts 'Could not parse protocol and host from URL'
  exit 1
end

doc = Hpricot(open(url))

# Emit a URI as part of the current burst, but only if it is hosted on the
# same server as the target page; anything hosted elsewhere is ignored.
def puts_link(uri)
  return if uri.nil?
  if uri =~ %r{^#{$protocol}#{$host}(.*)$}
    puts " #{$1}"
  elsif uri !~ %r{^https?://}
    if uri =~ %r{^/}
      puts " #{uri}"
    else
      puts " #{$document_dir}#{uri}"
    end
  end
end

puts "# httperf wsesslog for #{url} generated #{Time.now}"
puts
puts "#{$document_dir}#{document_url}"

(doc/"link[@rel='stylesheet']").each do |stylesheet|
  puts_link stylesheet.attributes['href']
end

(doc/"style").each do |style|
  style.inner_html.scan(/@import\s+(['"])([^\1]+)\1;/).each do |match|
    puts_link match[1]
  end
end

(doc/"script").each do |script|
  puts_link script.attributes['src']
end

(doc/"img").each do |img|
  puts_link img.attributes['src']
end
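Once the script’s output is saved to a file, feeding it to httperf looks something like this. It’s a sketch: the script name, hostname, request rate, and session count below are all placeholders, not recommendations.

ruby wsesslog_gen.rb http://testserver.example.com/store/index.html > index.sess
httperf --server testserver.example.com --port 80 --rate 5 --wsesslog=200,1,index.sess

The three values passed to --wsesslog are the number of sessions to run, the default think time between bursts in seconds, and the session file generated above.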
The transition to all-digital transmissions for over-the-air television might be put off until June if Congress has its way, but this is one of those things that just needs to be done and gotten over with.
The pros and cons of the switch have been reported enough already, so I want to talk about something I think will be a big deal to almost everyone who gets their TV over an antenna.
A digital converter box takes the place of the analog tuner in your TV or VCR. It’s helpful to think of it a bit like a VCR: when you play a tape, you tune your TV to channel 3, then do everything on the VCR. The converter box works the same way. It comes with its own remote, and you tune the channel on the box, not the TV. So, first annoyance: new remote.
Here’s the worst part, though: taping shows. I think the people most likely to be affected by the transition are the people who still record shows using timed recordings on an old VCR. This won’t work any more. The VCR can’t tune the channel because it has an analog tuner inside it. That means it needs a converter box, but the VCR can’t tell the converter box to change the channel. You’d need to do that, by hand, before walking away. Forget taping two shows on the same night, back-to-back, on different channels. The original TiVo had a little IR transmitter you hung in front of your cable or satellite box to solve this problem (it transmitted the remote codes to change the channel), but no VCR I know of has one.
Want to watch one show while taping another (assuming you can live with the above problem)? Get two converter boxes: the VCR needs one for the show it’s taping and you’ll need a second one for the channel you want to watch on your TV. They better be different brands, too, or else you need a way to tell converter box #1 to ignore the remote for box #2.
My mom recently bought a Panasonic VCR/DVD-R combo unit with a digital tuner, but it refuses to record digital channels on the VCR side, so it’s not a great solution for people used to tapes.
February or June? I’m not sure it matters. The full impact of the transition won’t be known until people are forced to live in a digital-only world and they discover what doesn’t work anymore.
I am pleased to announce the immediate availability of In Season 1.0 for the iPhone and iPod touch. In Season is a produce shopping guide, inspired by a couple of recent books that address the problem of missing flavor in most produce found in American markets.
Here in America, we’ve become rather used to the idea that fresh fruits and vegetables are available whenever we want them. What we don’t always realize is that this convenience comes at the cost of flavor and price. Plants grow according to a schedule, and while we can force things somewhat (hothouse tomatoes, for example), if you want peaches in the dead of winter, they aren’t coming from the northern hemisphere.
Shipping produce from South America to an American market is a long trip, though, so the food has to be bred to survive the journey and someone (the consumer) has to cover the cost of that travel.
There are disadvantages even to strawberries grown and sold in season. Strawberries are so fragile that growers have had to breed them exclusively for shipment, resulting in a berry with only a pale shadow of true strawberry flavor left.
I created In Season to help my own family, and hopefully others, with this problem. Food tastes better when it’s grown according to its natural schedule, and even more so if you can find a local farmer supplying produce to your market. It will take less effort to grow it and supply will be higher, so you will pay less. Locally-grown produce also means less fuel is burned bringing that food to market, further bringing prices down and reducing your carbon footprint at the same time.
The economics of the App Store being what they are, version 1.0 is a toe in the water. If it is well received and I can justify further development, I have some great ideas to make it an extremely useful and educational app.
If you are interested in those books I mention above, they are terrific reads: How to Pick a Peach: The Search for Flavor from Farm to Table, by Russ Parsons and Animal, Vegetable, Miracle: A Year of Food Life, by Barbara Kingsolver.
This is a post I’ve been meaning to write for a very long time, but I’ve always had other, more pressing work to do, and so I never wrote it.
Rands posted his version today, though he’s using the point more as an introduction to an Election Day moral than as a post about software development itself.
Still, the first half of his post is spot-on. In software, given enough time, it’s possible to do just about anything; it’s not a question of if, it’s a question of when. There’s a classic joke in software development circles: fast, cheap, and good; pick any two.
My customers don’t always understand this. A seemingly minor change can have wide-reaching ramifications, perhaps doubling the time (and cost) of a project.
It’s great to have an idea of what you want to build, but it’s just as important to listen to an expert when it comes to implementation. “Can we do this?” is not a good question. The answer is almost always “yes.” “I’d like to do this, but I only want to spend this much (or, I’d like it by this date)” is far better.
Hallelujah: Apple is now offering official developer forums for the members of the iPhone Developer Program. It’s tied to your Apple ID, so they know who’s in and who’s not, and part of the usage agreement implies that they may provide forums for the discussion of pre-release software.
I hope they extend this to Mac development, too. The Apple mailing lists are great, but up until Leopard was released, the moderators on cocoa-dev were constantly fighting the tide of questions about Leopard-only APIs.
(Seen on Daring Fireball.)
Surfin’ Safari has a great post talking about some recent improvements to the Web Inspector in the latest WebKit nightly builds. Safari’s Web Inspector has long been a really great tool for examining the DOM and simple profiling of a requested page and its resources, but these new updates make it a serious tool for web developers.
If you’ve never tried a WebKit nightly build before, it is easy (on a Mac, anyway) to run it side-by-side with Safari. You don’t have to give up your stable, released browser to check out these new features. Still, it will be very nice when these improvements make it into an official release.
For Firefox users, YSlow is a very nice extension built on top of Firebug that provides a detailed report card on page load performance, including suggestions for improving it.
This evening I finally received an email from Apple informing me that our first application for the iPhone and iPod touch is ready for sale. I’m pleased to announce Pat Counter, a simple way to keep track of a running count without having to hold something in your hand or look at a screen to tap a button.
It’s a simple application, but it was an interesting experience in developing and shipping an application for a new platform. Rands was right about 1.0: it’s amazing how much work and time it takes to take care of all the little details, even in something as simple as Pat Counter. Valley start-ups really do go out of business because it can be so hard. (Aside: Rands’s book Managing Humans is quite good.)
I wasn’t able to find any data points out there about lead times, but for me it was five days between submission of the binary to Apple and approval for sale. That includes a weekend, but it seems like apps are still approved then, just at a slower rate.
Finally, a word of advice: make sure your binary is right the first time. I goofed my initial build and it took seven days before Apple told me about it, by which time I’d already noticed it myself. Replacing the binary was effectively the same as the initial submission; I went to the back of the line. Read the instructions for a distribution build closely and follow them exactly.
Update: It’s now available on the App Store with a direct link, but isn’t in the search results.
Update 2 (Sep. 15): It now shows up in search results, a mere five days after approval.
Apple Inc. announced Mac OS X 10.6 “Snow Leopard” at WWDC’08 on Monday, but other than a press release and a page each for 10.6 Client and Server, very little information was given that isn’t covered under the WWDC NDA.
Still, one “feature” is interesting, especially in the face of (mostly confirmed) rumors that 10.6 will drop support for the PowerPC:
Snow Leopard dramatically reduces the footprint of Mac OS X, making it even more efficient for users, and giving them back valuable hard drive space for their music and photos.
Dropping PowerPC will reduce the size of executable code by around 50%. Here is the output of otool -f for iTunes:
Fat headers
fat_magic 0xcafebabe
nfat_arch 2
architecture 0
    cputype 18
    cpusubtype 0
    capabilities 0x0
    offset 4096
    size 17345088
    align 2^12 (4096)
architecture 1
    cputype 7
    cpusubtype 3
    capabilities 0x0
    offset 17350656
    size 17053824
    align 2^12 (4096)
The PowerPC’s CPU type is 18 (look in /usr/include/mach/machine.h), so the PowerPC code in the main iTunes executable accounts for 50.4% of the total. I’m going to assume that this ratio will mostly hold true for frameworks, too.
The entire iTunes bundle is 122 MB, though. Trimming the 16.5 MB of PowerPC code in the main executable, plus roughly 1 MB more in the bundled frameworks, off of that only saves about 14%; not what I’d call a “dramatic” footprint reduction. What else might they have planned?
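The arithmetic, as a quick Ruby sketch using the byte counts from the otool output above (the 122 MB bundle size and the rough 1 MB of framework savings are my own measurement and estimate, not anything Apple has published):

ppc   = 17_345_088.0   # architecture 0 (cputype 18, PowerPC)
intel = 17_053_824.0   # architecture 1 (cputype 7, Intel)

puts "PowerPC share of executable code: %.1f%%" % (100 * ppc / (ppc + intel))   # => 50.4%

bundle_mb  = 122.0                # total size of the iTunes bundle
savings_mb = ppc / 2**20 + 1.0    # main binary (~16.5 MB) plus ~1 MB of frameworks
puts "Savings across the bundle: %.1f%%" % (100 * savings_mb / bundle_mb)       # => ~14%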
Another couple of options are transparent compression at the filesystem layer and conversion of image resources to a vector format.
Apple’s own page says that read/write ZFS will be in 10.6 Server, so it’s reasonable to expect it will be in Client, too, perhaps with some of the more advanced features disabled. ZFS includes an option to transparently compress data. NTFS on Windows has had this for years, and third-party products did it on FAT years before that, so it’s entirely reasonable to expect that Macs will finally get this, too. 71% of the iTunes bundle is resources, mostly localizations, and inside those, the largest directories are for the in-app help, which is HTML.
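If the stock ZFS tools come along for the ride, compression is just a per-dataset property. This is only a sketch of how it works in ZFS today, and the dataset name is a placeholder; nobody outside Apple knows how (or whether) it will be exposed in 10.6:

zfs set compression=on tank/Users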
Support for resolution independence has been rumored for years, and while Apple has supported it to some degree since 10.4, it hasn’t really caught on. Might they finally be converting all of the system image resources to a vector format? I don’t have any numbers on it, but a vector graphic is certain to take up much less space than a 512×512 PNG.