You are viewing a read-only archive of the Blogs.Harvard network.

Ruby and web applications: surfing the nerd’s waves

My life is stop and go. My latest: reading all about household employees’ tax. It took me two days, but I think I got it. The forms to send, the amounts owed… First there is Social Security and Medicare, which kicks in at $1,400/year; then federal and state tax, which kicks in at $1,000 per quarter; then there is workers’ comp, which kicks in at 16 hours/week. As I am getting more help for my mother, there is more and more to do beyond paying the person.

I got the Agile Rails book, so I finally had a chance to look at it. Luckily, I had tried to install a wiki before, so I already had PHP and MySQL installed. All I had to do was reinstall MySQL because I could not find the passwords. Then I found a Rails installation incantation which worked for my computer, and I was able to follow the examples, more or less. This is when I found out that unlike Ruby, which folks who know a computer language can understand immediately, Rails is for web-based computer experts only. Like RubyCocoa, it is a giant body of pre-written code which is way beyond my means to fathom.

OK, so I can’t run a big powerboat, but I found Ruby, and it can still lead somewhere. After all, the reason I got started with Ruby was Rails, and the reason I wanted Rails was that I wanted to know how to run our experiments off the web without learning Java, Perl, or Python. Ruby, it turns out, is enough to write web applications with something called CGI.
There is even a book written on how to write experiments with cgi, “How to Conduct Behavioral Research over the Internet: A Beginner’s Guide to HTML and CGI/Perl”.
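To see what that might look like in practice, here is a sketch of a CGI experiment script in plain Ruby, using only the standard cgi library. The field names ("subject", "answer") and the offline defaults are invented for the sketch; a real script would live in the web server's cgi-bin and log the response to a data file.

```ruby
#!/usr/bin/env ruby
# Minimal sketch of a CGI script for a web experiment, using only Ruby's
# standard cgi library. Field names ("subject", "answer") are made up.
require 'cgi'

# Defaults so the sketch can also be run offline, outside a web server.
ENV['REQUEST_METHOD'] ||= 'GET'
ENV['QUERY_STRING']   ||= 'subject=s01&answer=left'

cgi = CGI.new
subject = cgi['subject']
answer  = cgi['answer']

# A real experiment would append subject/answer to a data file here;
# the sketch just sends a confirmation page back to the browser.
page = cgi.header +
       "<html><body><p>Recorded #{CGI.escapeHTML(answer)} " \
       "for subject #{CGI.escapeHTML(subject)}.</p></body></html>"
print page
```

The server runs the script on each request and sends whatever it prints (headers, then HTML) back to the participant's browser; that is the whole CGI contract.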

Of course learning Ruby CGI may not be easy: “A lot of the focus on Ruby Web development these days is on the Rails platform (http://www.rubyonrails.com/), so it can be hard to find much for plain old CGI. There’s a very simple article (http://coolnamehere.com/geekery/ruby/web/cgi.php), the Ruby/Web chapter from “Programming Ruby 1st Edition” (http://ruby-doc.org/docs/ProgrammingRuby/html/web.html). I had trouble finding anything else when I dug around on Google.”

And then there is the matter of the web server. Apparently, Tiger ships with Apache. But that is not the only choice: “lightTPD is the BEST webserver out there, much faster than apache and much much much more flexible/configurable/secure than tux, and its fcgi php support is simply unbeatable”. Well, remember I am surfing the waves of the pros, and Apache is installed on my Tiger, while lighttpd isn’t. But already it is suggested that while Apache with Ruby CGI is probably the way to go, I may get better results with lighttpd, Ruby, and FastCGI… another install delight on the horizon.

And then there are other web servers, Mongrel and WEBrick, which are compared to CherryPy for Python, and Lua!!! and also to Apache and lighttpd. Again, I am getting too far into the wave: “Zed and I worked over the weekend on smoothing out the divide between Camping (the 4k web framework) and Mongrel (the slim new Ruby web server mentioned last week.) In just a few days, Mongrel has caught the scent and is totally Campnivorous. Development gems await you.” This web server was written this weekend! On the internet, you can get lost because you are off the beaten path and there is not enough information to get going, or because you are too close to the beta crest and even the installation instructions are Greek.

How strange to immerse myself in this noisy world of nerds. There is a physical web, with emails going from computer to computer in search of their destination. And there is the verbal web, with advice and information thrown into the air and connections established between folks who understand each other, while the rest falls back into nothingness. Folks like me get sucked in by mirages that shimmer out of reach, but our personal disappointments and setbacks are but a day on the beach surfing the waves, while the water carves out our future landscape!?

Digital sculpting: is there a bridge between computers and sculpture?

I started with a well defined project which is on a back burner until I master the many skills required: RubyCocoa for the window interface, multithreading for the real-time processing of data, and the HID interface.

This led me to object oriented programming, and Ruby, the only sane programming language. Objective-C and Java are too formal, too in-your-face in their requirement of voodoo incantations. I like Ruby. I have written a program to read data at least three different ways, gained some confidence, and I am eager for more.
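For instance, here are three of the ways Ruby can read a small data file (the file and its contents are invented for the example):

```ruby
# Three ways to read a small data file in Ruby. The file and its
# contents are made up for the sketch.
require 'tmpdir'

path = File.join(Dir.mktmpdir, 'readings.txt')
File.write(path, "7\n3\n5\n")

# 1. Slurp the whole file at once.
all_at_once = File.read(path).split.map(&:to_i)

# 2. Stream it line by line.
line_by_line = []
File.foreach(path) { |line| line_by_line << line.to_i }

# 3. Grab an array of lines in one call.
as_lines = File.readlines(path).map(&:to_i)
```

All three give the same numbers back; the streaming form is the one that scales to files too big to hold in memory.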

And then my nephew showed me some 3D head rendering he made in LightWave. He claims that working with 3D and interacting with the shape/aspect has helped his drawing ability. It occurred to me that digital sculpting might be a way for me to work on modeling, that is, on seeing a simplified structure underlying the figure or the head.

Here is an example of what can be done.
http://3dny.org/?p=11

As far as the figure is concerned, 3D realistic programs like Poser are not for me. What folks do is model a bone, and then model a muscle on top of the bone. The animation gives no information on how a real human being’s butt deforms as they walk. It is moving bones; the muscle information on top goes along as cosmetics. Even if they corrected every so many frames, what you get is one person’s understanding of anatomy, not the physical reality.

Bone animation can still be interesting. “Introducing Maya for Beginners” has a skeleton with joints, and I thought this would do as well as, or perhaps better than, the skeleton dangling in the studio. At least it would be possible to set it in all kinds of impossible gravity situations.

The next question is whether to move the skeleton manually with a mouse, or whether to animate it with a script. The advantage of the script is that you can reproduce everything exactly, the disadvantage is that the script required to go from one position to another may not be obvious. With some experience, moving the body manually in Maya may not take long, but Maya is not cheap!
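To get a feel for what such a script might involve, here is a Ruby sketch (joint names and angles are hypothetical) of moving a skeleton from one pose to another by linear interpolation, which is the simplest version of "going from one position to another" in code:

```ruby
# Sketch: interpolating a skeleton between two poses. The joint names
# and angles (in degrees) are invented for illustration.
POSE_A = { elbow: 10.0, knee: 0.0 }
POSE_B = { elbow: 90.0, knee: 45.0 }

# Blend two poses: t = 0.0 gives pose a, t = 1.0 gives pose b.
def blend(a, b, t)
  a.merge(b) { |_joint, va, vb| va + (vb - va) * t }
end

# Five frames from POSE_A to POSE_B.
frames = (0..4).map { |i| blend(POSE_A, POSE_B, i / 4.0) }
```

The appeal of the script is exactly the reproducibility: run it again and you get the identical sequence of poses, which mouse-dragging can never promise.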

This led me to OpenGL, which seems to be the base of computer graphics anyway. If I could transfer the skeleton from Maya or elsewhere to OpenGL and find out how to move the joints in code or with real-time mouse interaction, I could have a nice dedicated tool to teach sculpting. I think that is a great idea for an inexpensive piece of software, but I have not seen it out there.

Luckily I found that one can program OpenGL with Ruby, so now I have Ruby, ruby-opengl and RubyCocoa to learn, along with Ruby on Rails of course, which requires MySQL, which I installed to try to set up a wiki. And Maya or ZBrush, to see what I can figure out about modeling the head.

Why do I suddenly have so much to learn, when for years it was routine? Why the explosion of technology I want to master? Sure, the prospect of looking for work is a great incentive, but it just seems to me that things are finally moving and coming into sync. For years it was C++, and then C++ and Java, and that was all I knew of.

I think things are moving because of the internet and the many folks who help others on dedicated discussion sites. Without them, I could never have found the instructions or the help to install the various things I have tried to install. In each case it took me days, but I succeeded. I think this means the internet is helping many moderately techie “newbies” like me get on the wagon and acquire skills which before were unreachable except through a structured class.

Wouldn’t it be amazing if my knowledge of computers and my love of clay and sculpting the figure could come together in one activity, and all that mostly at home, where I can watch my mother and make sure she is OK! Who knows, maybe in a year or two!

I found a Ruby class on the internet!

Ruby is an object oriented programming language which can power web applications through Ruby on Rails. There are books on Ruby and on Ruby on Rails, so it is not very new, but it is still new enough that some people want to learn it.

There is an announcement on
http://www.linuxchix.org/

You have to sign up for the courses list
http://mailman.linuxchix.org/mailman/listinfo/courses
and then the threads are found here
http://mailman.linuxchix.org/pipermail/courses/2005-November/date.html#2026

It is really excellent. Everybody I talked to had a positive impression of Ruby. This course will give me the push I need to really look into Ruby instead of just talking about it. My own post hasn’t shown up on the thread yet, but hopefully it will eventually, and I look forward to having other people to ask questions of, and to seeing what they are learning.

On the install, my main difficulty was how to install Ruby from source.
Luckily, I found a blog that saved me:
http://www.downes.ca/cgi-bin/page.cgi?post=31620
I went into the directory in the terminal window,
ls …
then I think I ran
./configure
make
sudo make install
RubyCocoa had an installation package which made it easy to install.

I am working my way through
http://www.macdevcenter.com/pub/a/mac/2004/10/05/cocoa.html?page=1
and although the Cocoa part (the nib, connecting outlets, etc.) is very similar to what I did with Objective-C, the code has no reference to memory so far, which was the part that really annoyed me.

With Ruby, I can learn Cocoa and object oriented programming, so it is not such a big detour.
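A made-up example of what the object oriented part looks like in Ruby: a whole usable class, with no header file, no alloc/init, and no memory management in sight.

```ruby
# A tiny, self-contained Ruby class -- no header file, no retain/release.
class Counter
  def initialize
    @count = 0          # instance variables spring into existence on assignment
  end

  def tick
    @count += 1
    self                # returning self lets calls chain
  end

  attr_reader :count    # generates the reader method for us
end

c = Counter.new
c.tick.tick.tick
```

Compared with the Objective-C rituals below, everything here is just method definitions and assignments.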

The joys of Cocoa and Objective-C programming

In earlier blogs, I must have complained about the lack of Apple support, because figuring out how to talk to a USB device seemed so hard. But now I think it is amazing that I was able to use some Apple code without any understanding of its structure, and read my data both synchronously and asynchronously over USB. (I have not tackled the HID side of the story yet.)

This is not the first time I have “lifted” a piece of code (assembler code, C functions, etc.) because that code does the job, without worrying too much about how I could write that bit of code myself.

But this time, I backed off from the Lightstone project and I am taking the time to learn Cocoa and Objective-C. Aïe aïe aïe. Mamma mia! This is not at all like learning a programming language. It is like learning a high-level language like MATLAB or SAS, except that it is NOT user friendly; it is programmer friendly, maybe. I have been trying to figure out why learning this “language” blocks me.

A simple example from chapter 14 of Hillegass’s book. This example is supposed to dazzle us with the ease with which we can create a graphic window and draw on it. This ease is achieved by having all the work of creating windows done for us by functions already written by Apple.

But unlike a program like MATLAB, which hides the bells and whistles from beginners, Cocoa requires you to grab 20 different threads and weave them together. For someone like me who functions on understanding and not on memory, this is very taxing. Not unlike a cooking recipe, except you don’t choose where the ingredients are kept. Grab a project from the “File” menu, create a new class from the class menu on “MainMenu.nib”, then grab an icon from “Cocoa Containers” and drag it onto “Window”, find the “Info” option on the “Tools” menu in Interface Builder… On and on! And I am supposed to remember all this for all the different kinds of interfaces??? Yuck.

The other thing that blocks me is the arbitrariness of the code, which is nicely swept under the carpet in the book. Page 224 instructs us to enter the code

- (void)drawRect:(NSRect)rect
{
    NSRect bounds = [self bounds];

    [[NSColor redColor] set];

    [NSBezierPath fillRect:bounds];
}

Voilà: no explanation as to what this code does, how it is structured, or why this particular code.

The miracle tool on the Mac is AppKiDo 0.94. In the kingdom of the blind, it is the stick that may just save you from asphyxiation and death from despair.

Type in the method “bounds”, and you find it belongs to NSView. It returns the view’s size and coordinates.

Type in the method “redColor”, and you find it is a class method which belongs to NSColor.
It is a class method because it is not applied to an instance.
OK, so you have an RGB code, but what does “set” do? Too many possibilities; better to go to the window, click on NSObject, click on NSColor, look under all class methods and all instance methods, and there is “set”, under instance methods. So apparently redColor applies to a class but returns an instance, and so “set” is an instance method that says: use this color for future drawing. Clear as mud, this beautiful code, no?

NSBezierPath is a class, so fillRect is a class method. It fills the rectangle with the above color. Note that NSArray is another class under NSObject. Now with NSArray, you created an instance of the class with alloc and init before you used it. Why, in this case, would we act on a class to draw in a window instance? Well, that is the arbitrariness I am talking about.

At the level of the programmers, it is all very clear and beautiful and grammatical, built with simple rules all developers can use to create these beautiful programs the user craves for her/his new Mac OS X platform. But at the level of the beginning user, it is all a medley of contradictory invocations best learned by memory. How can such a language replace basic C? Yuck.

Asynchronous threads, or “runloops” to monitor a “source”

Once I had the synchronous read, it was just a matter of continuing with the example and setting up an asynchronous read (which does not lose data the way the synchronous read does).

I will take out the declarations and error checks which are in the USBSimple Example code.

To start with, I declared gBuffer to be of size 320000 instead of 8 (to capture 5 minutes of data) and put that declaration at the head of the program (so it is global).

In the synchronous case, all the instructions execute one after the other. Nothing can be done while the data is read. In the asynchronous case, the data is read from the input pipe at the same time as the main program is executing its tasks (on a single processor, the two codes alternate).

One way to set up two pieces of code to execute simultaneously is to start a “thread”.

But if the thread needs to remain running so that it can process requests at a future time, one uses a thread’s run loop. The run loop monitors input sources attached to the thread and dispatches the events it receives to the thread’s installed handler functions. After the handlers finish processing the event, the run loop returns the thread to its idle state and waits for more events.

Event-driven applications receive their events in a run loop. A run loop monitors sources of input to the application and dispatches control when sources become ready for processing. When processing is complete, control passes back to the run loop which then waits for the next event.

With the command CFRunLoopRun, control is passed to the run loop until CFRunLoopStop is called.

And this is more or less what happens in the transferData function: create a source (CreateInterfaceAsyncEventSource), add the source to the run loop (CFRunLoopAddSource), request the input of data for a certain number of bytes (ReadPipeAsync), and start the run loop so it can monitor the source and read the data as it becomes available.
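To convince myself I understood the pattern, here is a toy Ruby imitation of it (illustrative only, nothing to do with the real CFRunLoop API): sources are polled, each ready event is dispatched to an installed handler, and the handler can stop the loop.

```ruby
# Toy imitation of a run loop: poll sources, dispatch events to handlers,
# keep going until something calls stop. (Not the CFRunLoop API.)
class ToyRunLoop
  def initialize
    @sources = []
    @running = false
  end

  # A source is anything that responds to call and returns an event or nil.
  def add_source(source, &handler)
    @sources << [source, handler]
  end

  def stop
    @running = false
  end

  def run
    @running = true
    while @running
      @sources.each do |source, handler|
        event = source.call
        handler.call(event) if event   # dispatch to the installed handler
      end
    end
  end
end

# Pretend three chunks of data arrive on a pipe, then a final marker.
pending = [:chunk, :chunk, :done]
runloop = ToyRunLoop.new
received = []
runloop.add_source(-> { pending.shift }) do |event|
  received << event
  runloop.stop if event == :done       # like CFRunLoopStop in the callback
end
runloop.run
```

The real run loop blocks instead of polling, but the shape is the same: the handler fires when data is ready, and the callback decides when control returns to the main program.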

Next I need to read about the notion of a callback function:

void
MyCallBackFunction(void *dummy, IOReturn result, void *arg0)
{
    printf("MyCallbackfunction: %d, %d, %d\n", (int)dummy, (int)result, (int)arg0);

    CFRunLoopStop(CFRunLoopGetCurrent());
}

void transferData(IOUSBInterfaceInterface **intf, UInt8 inPipeRef)
{
    numBytesRead = sizeof(gBuffer);

    err = (*intf)->CreateInterfaceAsyncEventSource(intf, &cfSource);

    CFRunLoopAddSource(CFRunLoopGetCurrent(), cfSource, kCFRunLoopDefaultMode);

    err = (*intf)->ReadPipeAsync(intf, inPipeRef, gBuffer, numBytesRead, (IOAsyncCallback1)MyCallBackFunction, (void*)(UInt32)inPipeRef);

    CFRunLoopRun();

    // When the program reaches this line, the transmission of the data
    // is over, and one can read the data in gBuffer.
}

Reading data from a USB input interrupt pipe (endpoint)

With the kext file taken out, the device was treated as an HID device. I have found two Xcode projects which work, HID Explorer and HID Examples, but I am stuck right now, because these HID projects talk about elements, while USB Prober talks about endpoints. I have not yet found what the link between endpoints and elements is.
Putting the kext file back in, I found several Xcode projects for USB: USBPrivateDataSample and SimpleUserClient (which still give me error messages), and USBSimpleExample, which I got to work: I put in the vendor and product ids, looked for interrupt pipes instead of bulk pipes, and added a synchronous ReadPipe command.

char gBuffer[8];

UInt32 numBytesRead;

numBytesRead = sizeof(gBuffer);

err = (*intf)->ReadPipe(intf, inPipeRef, gBuffer, &numBytesRead);

if (err) printf("Unable to perform interrupt read %2x %4x %4x\n", err_get_system(err), err_get_sub(err), err_get_code(err));

for (i = 1; i < numBytesRead; i++) printf("Read %x (%ld bytes) from interrupt endpoint\n", gBuffer[i], gBuffer[0]);

(the first byte of data is a number, 7 usually, the rest are characters)

The key to success was being able to interpret the error message! I read how to print and interpret the error code 0x38 0x00 0x2e8

http://developer.apple.com/qa/qa2001/qa1075.html

Looking it up in IOReturn.h, 0x2e8 is data overrun!

#define kIOReturnOverrun iokit_common_err(0x2e8) // data overrun
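The decoding itself is just bit-shifting. Here is a Ruby sketch of what the err_get_system/err_get_sub/err_get_code macros do, using the Mach error encoding (6-bit system field, 12-bit subsystem, 14-bit code):

```ruby
# Decode a Mach/IOKit return code into its system, subsystem, and code
# fields -- a Ruby rendering of the err_get_* macros.
def decode_ioreturn(err)
  {
    system: (err >> 26) & 0x3f,    # top 6 bits
    sub:    (err >> 14) & 0xfff,   # next 12 bits
    code:    err        & 0x3fff,  # bottom 14 bits
  }
end

overrun = decode_ioreturn(0xE00002E8)  # the code behind kIOReturnOverrun
```

Feeding it 0xE00002E8 gives back system 0x38 (IOKit), sub 0x00 (common), code 0x2e8, which is exactly the 0x38 0x00 0x2e8 that the printf above produced.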

The other key to success was the Apple USB mailing list, which helped with the error code

http://lists.apple.com/archives/Usb

where I was told that I should try reading at least maxPacketSize bytes, where maxPacketSize is a USB Prober info field on the endpoint pipe. So I tried with a buffer size of 8, instead of 64 or 2, and bingo, the error message went away and I was reading data!

Of course it helps that I have the PC open-source code to figure out what to do with the data!!!!

So one bright day, I talked to my device. Now I have to learn how to access it asynchronously.

Mac OS X: using a Unix command to look for a kext file

Apple’s tech help on the USB discussion list helped me out:

I have a kernel extension matching to the IOUSBInterface
of my device and preventing the IOUSBHIDDriver from matching to the
device.

To search for it

find /System/Library/Extensions -name "Info.plist" -exec grep -H 1155 "{}" \;

This searches all the Info.plist files for one that contains
“1155”, the idVendor of the device. Judging from the IOProbeScore
of 100000 and the values in the ioregistry, the kext in question is
doing a vendor specific match by specifying idVendor/idProduct/
bcdDevice/bConfigurationValue/bInterfaceNumber.

The result of the search is
/System/Library/Extensions/STMicroTestDriver.kext/Contents/Info.plist:
1155

When I remove the file STMicroTestDriver.kext, the Wild Divine can no longer find the device, but HID Explorer lists the Lightstone along with everything else.

The WD software needs access to the device, but without the kext, the HID driver will open the device and the WD software can’t.

If the HID driver doesn’t match to the device, I can’t use the HID Manager to talk to it.

So I need to go two incompatible ways.

1. Remove the kext file (disabling WD), so I can use HID Explorer and MATLAB.

2. Keep the kext file in place, and learn to use IOUSBInterfaceInterface calls to talk to the device (as long as the WD driver/app isn’t running), so I can send/receive USB requests.

HID Explorer

My tech support told me about HID Explorer.
http://developer.apple.com/samplecode/HID_Explorer/HID_Explorer.html

It is a complete project with C code which does something when I run it: it provides information on all HID devices. So 1. I can find out what I have, and 2. I can see an example of a program I can run in Xcode, debug, etc.

Unfortunately, something is off with my device: HID Explorer does not see it as an HID device. This was confirmed by running

ioreg -lw 0

2.7 MB of information! The main finding being that there is no HID info in there at all for my device.

What next! This is like learning a language. There are folks out there who know it! Human beings have an amazing ability for languages. Hélas!!! Not all of us!

The Mac Console and USB drivers

Today, I feel pretty optimistic about this project, even though I am not any closer to the goal:
“many devices these days [including this sensor] are made as generic HID (Human Interface Devices) that comply with the USB specifications, so you can enumerate and create a handle to the device (treat it like a file) within c/c++.”

Generic is good. I don’t have to write a driver. C is good, I don’t need to learn about the kernel. Ok, so it can be done, but how?

What I learned about my USB device and its driver

In Applications, there is a utility called Console which records error messages and background processes. Clicking on the log icon shows that there are several logs going on. But only the default one shows any activity when I start the game.

USBSimpleExample: Starting

Found device 0x000091ab

dealWithDevice: found 1 configurations

found interface: 0x000092db

dealWithInterface: found 2 pipes

dealWithPipes: grabbing BULK IN pipe index 1, number 1

dealWithPipes: grabbing BULK OUT pipe index 2, number 2

It seems to me that the game called a generic USB driver called USBSimpleExample which sets up the pipe communications for the device through the hierarchy described below:

The device is associated with pipes, which are connections from the host controller to a logical entity on the device called an endpoint. The pipes are synonymous with byte streams.
Each pipe is uni-directional, either into or out of the device. Endpoint 0 is used to control the device on the bus.

To access an endpoint, a hierarchical configuration must be obtained.

1. The device connected to the bus has one (and only one) device descriptor

2. The device descriptor has one or more configuration descriptors. These configurations often correspond to states, e.g. active vs. low-power mode.

3. Each configuration descriptor in turn has one or more interface descriptors, which describe certain aspects of the device, so that it may be used for different purposes: for example, a camera may have both audio and video interfaces.

4. Each interface descriptor in turn has one default interface setting and possibly more alternate interface settings.

5. Each interface setting has an endpoint descriptor, as outlined above. An endpoint may however be reused among several interfaces and alternate interface settings.

Devices that attach to the bus can be full-custom devices requiring a full-custom device driver, or they may belong to a device class. The same device driver may be used for any device that claims to be a member of a certain class. HID is one such device class.
http://en.wikipedia.org/wiki/Universal_Serial_Bus
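The five-step hierarchy above can be sketched as nested data. Here it is as Ruby hashes; the ids, endpoint numbers, and packet sizes are invented for illustration:

```ruby
# The USB descriptor hierarchy as nested Ruby hashes. All the values
# (ids, endpoint numbers, packet sizes) are invented for the sketch.
device = {
  id_vendor: 1155,
  configurations: [
    {
      value: 1,
      interfaces: [
        {
          number: 0,
          settings: [
            {
              alternate: 0,
              endpoints: [
                { number: 1, direction: :in,  max_packet_size: 8 },
                { number: 2, direction: :out, max_packet_size: 8 },
              ],
            },
          ],
        },
      ],
    },
  ],
}

# Walking the hierarchy: collect every IN endpoint on the device.
in_endpoints = device[:configurations]
  .flat_map { |c| c[:interfaces] }
  .flat_map { |i| i[:settings] }
  .flat_map { |s| s[:endpoints] }
  .select  { |e| e[:direction] == :in }
```

Finding a pipe to read from is exactly this kind of walk: down from the device descriptor, through configuration, interface, and setting, to the endpoint.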

So there is a generic USB driver. Somehow you tell it you want to talk to your particular device, it finds the device, talks to the device, finds out from the device how many pipes to set up, in this case one for in, one for out.

So I can learn how to access a generic USB device, or how to access an HID-compliant device. A third possibility is to see if I can access my sensor through MATLAB and the Psychtoolbox. Quite a few ways to go.
http://psychtoolbox.org/usb.html

Boredom and the Wild Divine biofeedback Lightstone

I have decided to try to talk to my Wild Divine hardware. This means diving into the computer world, and getting way over my head. By the look of it, I will never make it. But if I do make it, I had better leave a trail behind for any lost soul out there who finds the earth is not complex enough as it is, and wants to explore humanity’s ephemeral technical world.

So the problem is simple enough. I have a biosensor which measures heart beat and electrical conductance, or some such. This sensor talks through a USB port to a software game, the Wild Divine. How can I write a program that accesses the sensor and plots the data in real time?

The net says an application cannot access a hardware device directly. It needs to talk to a “Device Manager” or “I/O Kit” or else talk to a manager which will talk to this device manager, and the device manager talks to a “device driver” which talks to the “hardware device”.

I have the game installed on my computer. So although I have no idea where the driver file is, or whether it is readable, there should already be a device driver for the sensor.

Also, I am not starting from scratch. Someone created open-source software for the PC which is supposed to do what I want to do: http://sourceforge.net/projects/lsm/ It has about 100 files, which is not really encouraging, but there is a file with USB and lightstone in the name that seems to read information from the sensor, so that file may be the meat of the matter (the driver) for the PC.

Through a couple of FAS computer help desks, I have located two sources of help. There is a group of employees at Harvard called ABCD, which gathers people into email lists around various hardware/software interests. There is also a computing society linked through the computer department.

Through ABCD I got a response from BU about a tutorial in Linux on how to write a device driver:
http://www.linuxjournal.com/article/7353
This gives no information on what language/compiler etc. is used to create, build and load the driver, or to bind it to hardware, so it is not a let’s-start-from-scratch tutorial, but at least it suggests that a device driver can be written.
