
The Alma Mater in the news… again

This time the news isn’t good:

The death of a man found Thursday in a stairwell in a Rensselaer Polytechnic Institute building has been ruled a suicide.

Read it yourself

Templating languages, templating languages… and yet more of them

In the realm of web development, any time you start using a software platform to deal with the tedium of writing web pages by hand, it quite often leads to some sort of templating system.

While nice in theory, and while they promise many things (separation of logic from presentation, blah blah blah), from the standpoint of someone trying to use these systems it’s more like being thrown a new language every week.

As a personal anecdote, when I started learning Zope there was the promise that DTML would be the tool to solve all one’s problems when working with Zope. However, more often than not there were these small things that got in the way, like:

  1. Documentation never seemed up to date
  2. Documentation never seemed complete
  3. I was too stupid to figure out how to handle what I wanted to do
  4. The W3C released another what?? Does DTML handle that even???

So in the middle of it all I learned about a newfangled and shinier system called Zope Page Templates (ZPT). I never really learned it that well, as I was finally coming to terms with using DTML without being a complete newbie. This ended up being a major hindrance to understanding Plone, a nice CMS package built on top of Zope, but after that I moved on and didn’t bother much with Zope.

So how much of that built-up knowledge did I get to take with me? About zero. Sure, I could pick up yet another templating system that came along, but that required once again going through the docs to find out how to write something as basic as an if, since it was never EXACTLY the same (see the sketch below). Anyway, my current view is that learning more template languages is not one of my higher priorities these days unless it gets me something I want and is rather portable. So seeing something like Hobo, and something called Liquid that is supposedly tied in somehow with some Rails blog app called Mephisto, isn’t making me that enamored with some parts of Rails.
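To make the complaint concrete, here’s the same trivial conditional in a few of these languages (the variable name is made up for illustration):

DTML:    <dtml-if expr="user_is_admin">Admin tools</dtml-if>
ZPT/TAL: <div tal:condition="user_is_admin">Admin tools</div>
Liquid:  {% if user_is_admin %}Admin tools{% endif %}
ERB:     <% if user_is_admin %>Admin tools<% end %>

Same idea, four spellings; almost none of the knowledge transfers.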

DHH captures it very well in his blog post on templating languages:

The pursuit of “no code”-templates reminds me of the search for the holy grail of the MDA camp with “no code”-programs. It’s a mirage, but it’s also a play on words of the “a rose by any other name…” variety.

Amen.

A sad tale of technology dependence

I love technology misery stories for some reason. I guess working at a job where it’s your duty to expect things to break in weird ways does that to you… I guess we can reword that as

I LOVE corner cases

A sad tale of technology dependence

I made sure nobody was ever going to guess this password. Not even me.

And then I immediately forgot it.

To quote Simson Garfinkel:

Authentication systems frequently fail because they are actually based on something that you have forgotten, something that you have lost, or something that you no longer are. Performance-based biometrics (e.g. keystroke dynamics) fail when they are based on something that you could once do well but can no longer do, or something that other people can do consistently, but you do erratically.

Thanks to Mr. Buu Nguyen for this link.

Finally a gem release of ruby-opengl

Just wanted to let folks know that I’ve finally figured out how to get the build system in place for ruby-opengl to:

  • Gemify itself
  • Build native extensions during Gem installation using mkrf

Which means (I hope) that there should be an easier way to get OpenGL working with Ruby. Currently it should support installation on Linux and OS X. Installation should be as difficult as:

gem install -y ruby-opengl

Deinstallation should be similar.
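Since mkrf is doing the native-extension work, here’s roughly what such a build script looks like. This is a sketch only, not ruby-opengl’s actual build file; the extension name and checks are illustrative:

# Sketch of an mkrf-style build script. RubyGems runs a script like this
# during gem installation; mkrf then generates a Rakefile that compiles
# the C extension, mirroring mkmf's have_header/have_library checks.
require 'rubygems'
require 'mkrf'

Mkrf::Generator.new('gl') do |g|
  g.include_header 'GL/gl.h'             # fail early if OpenGL headers are absent
  g.include_library 'GL', 'glVertex3d'   # check for and link the OpenGL library
end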

For win32 users, I’d suggest using the old bindings provided with the all-in-one installer, although I’d like to get a gem built for win32 so it can hopefully be included in the all-in-one installer.


Messages like “Warning: require_gem is obsolete. Use gem instead.” driving you nuts?

If you’re seeing tons of those messages like I’ve been, and wondering what in the world they’re all about: after Googling around I finally found a definitive answer on Jason Young’s blog.

What is it

Since rubygems 0.9.0 the command require_gem has been deprecated in favor of just plain gem. Part of the reason it went the way of the dodo is tied to some problems with autorequire. I don’t know any of the deep voodoo that powers rubygems but uh.. okay.. sure.
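In code terms, the change looks like this (the gem name here is just for illustration):

require 'rubygems'

# Old style -- triggers the deprecation warning under rubygems 0.9.0+
require_gem 'rake', '>= 0.7.0'

# New style -- gem only activates the version; you require the library yourself
gem 'rake', '>= 0.7.0'
require 'rake'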

How to fix it

If you’re still getting these warnings, the recommendation is to re-install any gems that were installed under rubygems 0.9.0 or earlier.
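Something along these lines should take care of it (assuming your gem command is recent enough to update itself):

$ gem update --system   # upgrade RubyGems itself first
$ gem update            # pick up newer builds of your installed gems

Happy Ruby Hacking.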


Optimized for more memory than yesterday’s big iron servers

Here’s something I never want my operating system to be asking of me:

users should consider 4GB of RAM if they really want optimum Vista performance

Migrating instiki from one database type to another

The Rant

Instiki is one of the premier wikis for Ruby on Rails, which is another way of saying that the other Rails-based wikis I’ve seen so far don’t look that great. Here is an excerpt from the instiki website:

1. Download
2. Run “instiki” with Ruby 1.8.4 or greater. (on windows, that’s instiki.cmd)
3. Chuckle… There Is No Step Three™! 🙂 (See note)

Yeah, nice and simple… to install. Here are some facts that instiki DOESN’T tell you without some digging around:

  • It uses sqlite as its database backend to store all wiki information (except images and file uploads).
  • It’s NOT Rails 1.2.x ready
  • It was written BEFORE ActiveRecord and was migrated to ActiveRecord later
  • You can export your data but have no easy way to import it back (more on this later)
  • The documentation from a USER perspective is pretty sparse
  • Active development seems to be moving at a snail’s pace

So what does this all mean? Well, if you want to migrate your instiki instance to a different instance you’d like to think you have a couple of options.

  1. Just copy the whole application to the new server, kickstart it, done (this works)
  2. Dump out the wiki in some export friendly format and re-import it back in

Option #1 works quite well, but if you want to switch database backends you’re stuck with option #2. So save yourself hours of time (I didn’t) and DON’T try the following (unless you’re a Rails god, in which case why are you reading this?):

  • Try exporting the wiki in Textile format and importing it… oops, import is busted. A stub is there (http://localhost:2500/wiki/import) but it is definitely broken
  • Dump the sqlite database with sqlite utilities, then re-import it into the target database. This doesn’t work with MySQL as far as I know. I bet it’s broken with PostgreSQL too
  • Look at the source code to find where the issue might be. Don’t worry, even the changelogs say import is busted, and there is no other sign of a way to get data in easily
  • Write some tool to import Textile. That would require reading the APIs, right? Bleh
  • Write some sort of ActiveRecord translator? Well, sort of… but it’d be nice if such a tool were available. And if you’re not a Rails guru, you’re sort of screwed in this case.

Now I’m sure DHH (the creator of instiki) will grumble something about this not being ‘appropriate’ for its original intentions, blah-blah. Gee, that’s nice. Maybe put that down in the caveats before I commit a whole bunch of data into it.

The Fix

Okay, enough ranting about half-baked Open Source Rails apps. Here’s the fix.

1. Add plugin script to instiki app

Most normal Rails apps include a file called plugin at $RAILS_ROOT/script/plugin. Unfortunately, instiki doesn’t include it. Luckily the script is really small, since it just calls out to the Rails framework to handle most of the hard work. So try this:

$ cat > $INSTIKI_HOME/script/plugin <<'EOF'
#!/usr/bin/env ruby
require File.dirname(__FILE__) + '/../config/boot'
require 'commands/plugin'
EOF
$ chmod +x $INSTIKI_HOME/script/plugin  # Make it executable
2. Install the manage_fixtures plugin

I’m afraid I don’t have time to explain what a Rails fixture is in this post; just think of it as a portable way of representing the data going into a database for Rails. It works well, but don’t try it with hundreds of megabytes of data.
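For the curious, a fixture is just a YAML file mapping record labels to column values. A purely illustrative entry for instiki’s pages might look like this (the actual columns depend on instiki’s schema):

home_page:
  id: 1
  web_id: 1
  name: HomePage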

I also won’t go much into Rails plugins, except to say that a plugin is a way of extending a Rails application with more functionality (probably aimed at a Rails developer rather than an end user of the application). We’re going to use a really sweet plugin called manage_fixtures, which allows us to move data between different databases with ease (assuming the Rails app isn’t doing weird SQL crap).

$ cd $INSTIKI_HOME
$ script/plugin discover    # Hit enter a gazillion times
$ script/plugin install manage_fixtures
3. Export the data into fixtures

The following will export all your db entries as fixtures into $INSTIKI_HOME/test/fixtures. You need to set RAILS_ENV to production because by default it will try to use the development database.

$ cd $INSTIKI_HOME         # You SHOULD be here anyways
$ RAILS_ENV='production' rake db:fixtures:export_all
4. Change the database connection to desired

Now change the production entry in $INSTIKI_HOME/config/database.yml from:

production:
  adapter: sqlite3
  database: db/production.db.sqlite3

to the following (adjust the MySQL parameters to suit your setup):

production:
  adapter: mysql
  host: localhost
  database: instiki
  username: root
  password:
5. Create database and the appropriate tables

The following commands will create a database named instiki (or whatever you put in database.yml) and make sure the schema in the database matches instiki’s schema.

$ echo "create database instiki" | mysql -u root
$ RAILS_ENV='production' rake migrate
6. Finally re-import the data from fixtures

Whew! Finally we can re-import the data into MySQL (or whatever) and get on with using instiki.

$ RAILS_ENV='production' rake db:fixtures:import_all

A quick review on routers for an ISP

OpenBSD gets high marks even though its OpenBGPD offering is rather new.

Read it yourself

Public Key Infrastructure gone wrong (or why you MUST have Windows to function in S. Korea)

Gen Kanai writes about S. Korea’s dependence on Internet Exploder due to a poorly planned Public Key Infrastructure:

South Korean legislation did not allow 40 bit encryption for online transactions (and Bill Clinton did not allow for the export of 128 bit encryption until December 1999) and the demand for 128 bit encryption was so great that the South Korean government funded (via the Korean Information Security Agency) a block cipher called SEED. SEED is, of course, used nowhere else except South Korea



one legacy of the fall of Netscape is that Korean computer/Internet users only have an ActiveX control to do any encrypted communication online

Read it yourself

Reviews on the Nokia N800

Eugenia of OSNews has written a nice, lengthy review of the Nokia N800. It has many improvements compared to its predecessor. Some things they got right:

  • Good Battery life (10-15 hrs standby, 3-5 hours in actual usage)
  • Faster processor
  • Great Wi-Fi reception
  • Some support for VoIP (GoogleTalk, Gizmo?)
  • Future Skype Support
  • Opera Web Browser version handles Flash now
  • Support for SD cards >1GB
  • Reasonable price (~$400 USD)

However there are some flaws:

  • They broke backwards compatibility with the previous N770 apps
  • Flash can’t handle Google Videos or Youtube
  • Craptastic MPEG-4 support

I have to say that no YouTube and crappy MPEG-4 support is a dealbreaker for me. There are a gazillion of these small devices that play back video, but none of them handle ENOUGH video formats to be compelling. The N800 comes much closer this time, though. I guess I’ll wait for the N900 to roll around.

Other Reviews

There are other reviews floating around the net… here’s a list. I didn’t bother reading them, as Eugenia’s was thorough enough to spare me from sifting through tons of probably useless fanboy gushing to find out how it stacks up.

Linux and the MSI-7265 Motherboard

In an earlier post I wrote about the pains of Core 2 Duo motherboards and Linux support. Since September, there has been quite a bit of progress in the Linux community to support the JMicron SATA/PATA controller found on Intel P965-based motherboards. However, I’ve found that things are STILL not all rosy with my MSI-7265 (aka the P965 Neo F model) motherboard. After looking around I found an interesting review of the MSI-7265 motherboard. The Register Hardware writes:

The P965 Neo is a simple, no fuss but no frills motherboard with a decent layout

What this means is that the MSI-7265 does not offer very good access to many of the devices via the BIOS compared to some other P965-based motherboards, such as the Asus P5B. After a bit of wrangling with my Gentoo installation I was able to get Linux installed, upgrade the kernel to 2.6.19-*mumble some Gentoo patch*, and boot properly.

What’s working

  • The SATA AND PATA controllers on the JMicron chipset (allows using a DVD drive and the hard drive)
  • Onboard Audio
  • The onboard Realtek 8110SC-based ethernet port (requires getting a driver from the Realtek website)
  • The USB ports

What’s NOT working

  • Any of the Serial ATA ports off the ICH8 controller
  • Suspend (It sleeps but never comes back up)

References

  • The Reg Hardware review on the MSI 7265 motherboard
  • Ubuntu thread on MSI P965 Neo install
  • An older Ubuntu thread on MSI P965 compatibility
  • Linux SATA driver status page (really useful!)
  • A Japanese post on trying to get the MSI P965 working (they failed)

Rubygems gotcha: dependencies won’t work if you install from a local gem file

The Problem

Dependencies won’t install when I try to install from a local gem. For example, if you have my-cool-project, which needs your-awesome-lib (available via Gems), and you install it with a command like:

gem install pkg/my-cool-project-1.0.gem

It WON’T pull in your-awesome-lib. This is documented in this email post on the Ruby Talk ML
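For context, the dependency in question would be declared in the gemspec something like this (a sketch using the made-up names from above):

Gem::Specification.new do |s|
  s.name    = 'my-cool-project'
  s.version = '1.0'
  s.summary = 'Example project'
  # Honored for remote installs, but silently skipped when
  # installing from a local .gem file
  s.add_dependency 'your-awesome-lib'
end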

Workarounds

At present the problem still exists, and I can’t seem to find a bug for it on RubyForge (perhaps one should be submitted?), so you can get around it by installing your dependencies first and then installing your local gem:

gem install -r your-awesome-lib
gem install pkg/my-cool-project-1.0.gem

A discussion on Mercurial’s repository format (Good or Bad?)

Keith Packard of X11 fame wrote a blog post on revision control repository formats. In it he describes why he chose git as the new system to manage the source code for Xorg, and he had some choice comments on Mercurial’s repository format:

Mercurial uses a truncated forward delta scheme where file revisions are appended to the repository file, as a string of deltas with occasional complete copies of the file (to provide a time bound on operations). This suffers from two possible problems—the first is fairly obvious where corrupted writes of new revisions can affect old revisions of the file. The second is more subtle — system failure during commit will leave the file contents half written. Mercurial has recovery techniques to detect this, but they involve truncating existing files

However, some people who use Mercurial have answered back to those criticisms.
RVBurke writes:

Mercurial uses a compact representation of data with separate revlog files for each tracked file, manifest and changelog, which are all append-only. Due to the append-only nature of those writes, the changes in each new revision don’t affect previous revisions. You are that way as safe as you can be in any other system with respect to writes and the space usage is very good.

To achieve similar space efficiency git needs to pack the repository data. This is done by rewriting the repo, and the operation has to be done from time to time (repack).

IF the atomic append-only writes to the manifest and revlog files in Mercurial can be considered dangerous, then repacking is even more so, as it forces a rewrite of all the repo data, multiplying the chance of a failure.

So, if any corruption can happen on a faulty write it will hit git (unpacked) or Mercurial in the same way, but any time you pack your repo in git you’re risking your data: if the write fails you can corrupt the repository.

and Matt Mackall, in an email to the Mercurial mailing list, had the following reply:

Mercurial files are append-only. New revisions (corrupted or not) do not affect old revisions. If a hardware or operating system bug causes a write to scribble over the wrong part of a file, all bets are off of course. But such bugs could just as easily scribble over any other part of the disk, so I think this is no more of an issue than it is for any other possible system.
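To make the append-only argument concrete, here’s a toy Ruby sketch of the idea (an illustration of the principle only, NOT Mercurial’s actual revlog format):

# New data is only ever appended, so a crash mid-write cannot touch
# revisions already on disk; at worst the trailing record is torn,
# which is what a recovery pass would truncate away.
File.open('toy.revlog', 'ab') do |log|
  delta = 'stand-in for a delta against the previous revision'
  log.write([delta.length].pack('N'))  # length-prefixed record
  log.write(delta)
  log.fsync
end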

Personally, I’m a bit confused. I’m not an expert on repository formats, so I don’t fully understand whether the criticism or the defense is stronger. However, I have chosen to use Mercurial for most of my own personal projects because it has one feature for sure that git does not: Windows compatibility. While many will denigrate native Windows compatibility, it’s good to remember that a huge portion of the machines on this planet still run Windows. The fact that there’s little information on running git on Windows besides ‘install Cygwin’ makes it hard to evaluate whether git is ‘good enough’ on Windows or not.

Wireless drivers and Open Source

The Jem Report has a great article describing the intricacies and difficulties behind why wireless drivers suck under Linux/FreeBSD and most other free operating systems out there. A choice quote from an Atmel representative is extremely enlightening as to why some companies are far more open to OSS drivers:

You’ve only got three real chances for success: you can be first to market with a technology, or you can have valuable and unique features that no one else has and the market wants, or you can have the lowest price.

Atmel wasn’t first, didn’t have any new unique features, and wasn’t the cheapest, either. With the PC and OEM markets being somewhat locked out, we repositioned to focus on the embedded space where the market was experiencing and predicting large growth. In the embedded market, if you don’t get documentation to developers, then you both fail.

Basically, the players that are friendly to OSS lost the market initiative, so OSS drivers are a way to grab a certain sub-segment of the market. Kind of sad to hear, to tell you the truth, but understandable.

Read it yourself

Island off India claimed by Global warming?

At least the Independent seems to think so. While I don’t deny global warming, it’d be nice if they had more information on how long these islands have been above water…