
Cloud Delivery Platform — Thought Model

I was asked to take a look at a thought model for a Cloud Delivery Platform. I’m not going to post that graphic, because it wasn’t ready for prime time. It was confused, and I’m not really sure what it was trying to show. So I created a bit of a diagram showing what I thought was important. (Just click on it; it doesn’t seem to fit right, and the next size down is too small to read.)

[Figure: cloud_delivery_platform diagram]

Halfway through I realized I’m not terribly good at making these things. I’ll get better. But there are a few things I wanted to draw attention to in this graphic.

The Roles

The most important thing to me is to highlight the crossover of roles at each step. There are few steps in the process that should be performed by a single role. DevOps is about communication.

Local Development

Coding, building, testing, and packaging are all things that should be done at the same time. You don’t code 1,000 lines and then build; you build with every small change. In the same way, you should write tests alongside those small increments. And that extends to “packaging”.
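To make that rhythm concrete, here’s a minimal sketch: a tiny increment of application code and the tiny test written alongside it. The function and spec here are purely illustrative (they aren’t from any real project), and the test uses Jasmine-style syntax, which is assumed rather than prescribed.

    // A small increment of application code...
    function formatVersion(major, minor, patch) {
        return major + '.' + minor + '.' + patch;
    }

    // ...and the small test written at the same time (Jasmine-style syntax).
    describe('formatVersion', function () {
        it('joins the parts with dots', function () {
            expect(formatVersion(1, 2, 3)).toBe('1.2.3');
        });
    });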

Packaging

I’m probably missing a very important word in my vocabulary here, but I’m going to go easy on myself; it’s late. What I mean here is the process of setting up an image, or a deployable product: the configuration that will collect external dependencies and run tests when it gets to the deployed environment. This should all be done locally first, and it should all be put into version control, separate from the application code. And it needs to be done in collaboration between the developer, who knows the application, and operations (or, as they’ve been renamed in my organization, “Devops Engineers”), who know how to package things appropriately.

Local Concurrence vs Cloud Linearity

Why isn’t spell check underlining “linearity”? That can’t be a real word. As I mentioned before, the things done locally are all done at the same time (build, test, package). That’s all just development. By the time it gets to the cloud, it’s a finished product. Nothing should be further developed there. Nothing should be getting reworked. Everything needs to come entirely from version control, with no manual finagling. (Obviously?) So the cloud portion is linear: everything happens in order. If it fails at any point, it’s sent back to local development and fixed there.
Maybe this is all so obvious, it doesn’t need to be said. Maybe I’d feel better if it were just said anyway.


Integrating Jasmine with Travis CI

One of the things I’ve been wanting to automate with our Harmony Lab project is the JavaScript test suite, so that it runs via Travis CI. I tried once before, but hit a wall and abandoned the effort. I recently had the opportunity to work on this as part of a professional development day in ATG, which is an excellent practice that I think all departments should embrace, but that’s a topic for another day. If you’re not familiar with Travis, it’s a free continuous integration service that provides hooks for GitHub so that whenever you push to a branch, it can run some tests and report pass/fail (among other things). Getting this to work with a suite of Python unit tests is easy enough according to the docs, but incorporating JavaScript tests is less straightforward.

Harmony Lab JS tests use the Jasmine testing framework, and all of the unit tests are created as RequireJS modules. This is nice because each unit test, which I’ll call a spec file, independently defines the dependencies it needs and then loads them asynchronously (if you’re not using RequireJS, I highly recommend it!). Running the Harmony Lab tests locally is a simple matter of pointing your browser to http://localhost:8000/jasmine. This makes a request to Django, which traverses the spec directory on the file system, finds all spec files, and then returns a response that executes all of the specs and reports the results via Jasmine’s HTML reporter. But for headless testing, we don’t want to be running a Django server unless it’s absolutely necessary. It would be nice if we could just execute some JavaScript in a static HTML page.
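For reference, a spec file in this style looks roughly like the sketch below: a RequireJS module that declares its own dependencies and wraps a Jasmine describe block. The module path, the Chord constructor, and the behavior being tested are made up for illustration; the real specs live in the Harmony Lab repo.

    // spec/ExampleSpec.js -- hypothetical spec file, for illustration only
    define(['app/models/chord'], function (Chord) {
        'use strict';

        describe('Chord', function () {
            it('starts out with no notes', function () {
                var chord = new Chord();
                expect(chord.getNotes().length).toBe(0);
            });
        });
    });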

It turns out, we can! The result of integrating Jasmine with PhantomJS and Travis CI is Harmony Lab Pull Request #39. You can check out the PR for all the nitty-gritty details. The main stumbling block was getting RequireJS to play nicely with PhantomJS and getting the specs to load properly. The PhantomJS control script, that is, the JavaScript that drives the headless browser, was the simplest part, since it only needed to listen for the final pass/fail message from Jasmine via console.log and then propagate that to Travis.
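The heart of that control script is small enough to sketch here. This is an approximation, not the exact code from the PR: it assumes a static runner page at jasmine/SpecRunner.html and a Jasmine console summary of the form “N specs, M failures”, both of which may differ in the real project.

    // run-jasmine.js -- rough sketch of a PhantomJS driver for the Jasmine suite.
    // Run with: phantomjs run-jasmine.js
    var page = require('webpage').create();

    // Relay console output from the page and watch for Jasmine's final summary.
    page.onConsoleMessage = function (msg) {
        console.log(msg);
        // Assumed summary format: "N specs, M failures" -- adjust to your reporter.
        var match = /(\d+) failures?/.exec(msg);
        if (match) {
            // A non-zero exit code is what tells Travis the build failed.
            phantom.exit(parseInt(match[1], 10) > 0 ? 1 : 0);
        }
    };

    page.open('jasmine/SpecRunner.html', function (status) {
        if (status !== 'success') {
            console.log('Could not load the spec runner page');
            phantom.exit(1);
        }
        // Otherwise, wait for onConsoleMessage to see the summary and exit.
    });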
