
Cloud Delivery Platform — Thought Model

I was asked to take a look at a thought model for a Cloud Delivery Platform. I’m not going to post that graphic, because it wasn’t ready for prime time. It was confusing, and I’m not really sure what it was trying to show. So I created a bit of a diagram showing what I thought was important. (Just click on it; it doesn’t seem to fit right here, and the next size down is too small to read.)

[Diagram: cloud delivery platform]

Halfway through I realized I’m not terribly good at making these things. I’ll get better. But there are a few things I wanted to draw attention to in this graphic.

The Roles

The most important thing to me was to highlight the crossover of roles at each step. Few steps in the process should be performed by a single role. DevOps is about communication.

Local Development

Coding, building, testing, and packaging are all things that should be done at the same time. You don’t code 1000 lines and then build. You build with every small change. In the same way, tests should be written alongside those small increments. And that extends to “packaging”.

Packaging

I’m probably missing a very important word in my vocabulary here, but I’m going to go easy on myself; it’s late. What I mean here is the process of setting up an image, or a deployable product: the configuration that will collect external dependencies and run tests once it reaches the deployed environment. This should all be done locally first. It should all be put into version control, separate from the application code. And it needs to be done in collaboration between the developer, who knows the application, and operations (or, as they’ve been renamed in my organization, “DevOps Engineers”), who know how to package things appropriately.

Local Concurrence vs Cloud Linearity

Why isn’t spell check underlining “linearity”? That can’t be a real word. As I mentioned before, the things done locally are all done at the same time (build, test, package). That’s all just development. When it gets to the cloud, it’s a finished product. Nothing should be developed further there. Nothing should be getting reworked. Everything needs to come entirely from version control, with no manual finagling. (Obviously?) So the cloud portion is linear. Everything happens in order. If it fails at any point, it’s sent back to local development and fixed there.
Maybe this is all so obvious, it doesn’t need to be said. Maybe I’d feel better if it were just said anyway.


Jira vs Github :: Agile vs Open Source

A few years ago, my team got into Open Source. Specifically, we started writing all of our apps on github (as opposed to our SVN). We wanted to do this because we wanted to invite scrutiny. We never expected people to look at our stuff; we just felt that by putting it out in the open, we’d push ourselves to do better internally.

We went whole-hog. Organized stories with issues, organized sprints with milestones. It was pretty hot stuff. And it was all in the open. Potentially, someone could come by, see what we’re doing and offer to take a story from the backlog of issues. That’s open source. ish. We have a lot more we can do. A lot of growth to do.

Then we started using Jira. The board system within Jira Agile was excellent. It allowed for better organization, reporting(!), and visual representations of work. It’s great. It’s Agile. But it also replaced what we were doing with Github issues.

We essentially traded Open Source for Agile. Our organization is great, we’re keeping track of things fantastically, but we’re no longer open. We no longer have transparency into what we’re working on. People can’t potentially help. Our code is out there, but we’re not inviting. Our process is no longer out there.

So what’s our solution? We don’t have one yet. But what /can/ we do?

We need to put our vision statement out there. We need to put our plans out there. We need to expose what it is we’re doing. We also need to stay agile, keep our tools intact, keep our reporting.

This means we probably need to duplicate efforts. Open Source and Agile both take hard work and organization. That they can’t line up and be the same effort is not a blocker, just an “oh well”.


Integrating Jasmine with Travis CI

One of the things I’ve been wanting to automate with our Harmony Lab project is the javascript test suite so that it runs via Travis CI. I tried once before, but hit a wall and abandoned the effort. I recently had the opportunity to work on this as part of a professional development day in ATG, which is an excellent practice that I think all departments should embrace, but that’s a topic for another day. If you’re not familiar with Travis, it’s a free continuous integration service that provides hooks for github so that whenever you push to a branch, it can run some tests and return pass/fail (among other things). Getting this to work with a suite of python unit tests is easy enough according to the docs, but incorporating javascript tests is less straightforward.

Harmony Lab JS tests use the Jasmine testing framework and all of the unit tests are created as RequireJS modules. This is nice because each unit test, which I’ll call a spec file, independently defines the dependencies it needs and then loads them asynchronously (if you’re not using RequireJS, I highly recommend it!). Running the Harmony Lab tests locally is a simple matter of pointing your browser to http://localhost:8000/jasmine. This makes a request to Django which traverses the spec directory on the file system and finds all spec files, and then returns a response that executes all of the specs and reports the results via jasmine’s HTML reporter. But for headless testing, we don’t want to be running a django server unless it’s absolutely necessary. It would be nice if we could just execute some javascript in a static html page.

It turns out, we can! The result of integrating jasmine with phantomjs and travis CI is Harmony Lab Pull Request #39. You can check out the PR for all the nitty-gritty details. The main stumbling block was getting requirejs to play nicely with phantomjs and getting the specs to load properly. The phantomjs javascript code, that is, the javascript that controls the browser, was the simplest part since it only needed to listen for the final pass/fail message from jasmine via console.log and then propagate that to travis.


Basic LTI Tutorial Using PHP

Introduction

This tutorial will get you up and running with a development environment, complete with a virtual machine running an Apache 2 server with PHP, a basic LTI library written in PHP, and a simple basic LTI-compliant LMS. By the end of the tutorial, you will have written a simple “Hello, World!” LTI tool, and you will be ready to delve into the world of LTI coding. Please note that we assume you already have the following knowledge:

  • An understanding of directories and files, and how to navigate through them using a terminal window on your operating system;
  • How to install software packages on your operating system;
  • An understanding of fundamental web concepts, such as HTML, CSS, HTTP requests, POST requests, etc.;
  • Some experience in setting up web servers;
  • Basic knowledge of PHP;
  • An understanding of basic LTI concepts and, in particular, the Basic LTI (v1.0) specifications.

Ready to begin? No? Then grab some coffee or another refreshing beverage. We’ll wait…

Ready now? We hope so! Because we’re getting started.
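To give you a sense of where we’re headed, here is a minimal sketch of the kind of “Hello, World!” tool endpoint you’ll end up with. It assumes the basic LTI library (covered later in the tutorial) has already verified the OAuth signature of the launch request; the parameter names come straight from the Basic LTI 1.0 specification, but the file name and the greeting are made up for illustration.

<?php
// hello.php -- a minimal Basic LTI tool endpoint (sketch only).
// Assumes the OAuth signature of the launch has already been
// verified by the basic LTI library before this code runs.

// A Basic LTI launch arrives as an HTTP POST.
$messageType = isset($_POST['lti_message_type']) ? $_POST['lti_message_type'] : '';
if ($_SERVER['REQUEST_METHOD'] !== 'POST' || $messageType !== 'basic-lti-launch-request') {
    header('HTTP/1.1 400 Bad Request');
    exit('This endpoint only accepts Basic LTI launch requests.');
}

// Standard launch parameters defined by the Basic LTI 1.0 spec.
$name   = isset($_POST['lis_person_name_full']) ? $_POST['lis_person_name_full'] : 'World';
$course = isset($_POST['context_title']) ? $_POST['context_title'] : 'your course';

header('Content-Type: text/html; charset=utf-8');
echo '<p>Hello, ' . htmlspecialchars($name) . '! Welcome from ' . htmlspecialchars($course) . '.</p>';

The point to take away is that a launch is nothing more than a signed POST from the LMS; everything else in the tutorial builds on that.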




Mobility Workshop

A couple of months ago, we had a very successful presentation on vagrant/puppet. Given by a developer, it sparked motivation in some devops enthusiasts to give a workshop. It went really well. (Vagrant Tutorial)

The format they used for their workshop seemed really effective. They were super enthusiastic, which translated into being super prepared. They had created a WordPress post with very detailed, step-by-step instructions on getting started.

The interest and effectiveness of this sparked motivation to do “other topics”. So I put together this tutorial on mobile development, “featuring jQuery Mobile”.

I thought the idea of having the material prepared beforehand and just being able to say “go” was a great place to start. This shouldn’t be a presentation; it should be an opportunity for people to get their hands dirty.

I was running through tutorials and found a lot of them had put their code on github. At some point I had the idea to use branches to “step” through the tutorial. The way I envisioned the workshop going was to start with nothing, and build out a mobile app through logical steps — as a lot of tutorials do. Do a header, do a menu, do a list, do a link, do a transition, do a search. Coming up with a list of “things to do” was easy.

Putting it together with github also meant I could just put the directions in the base readme and it would be a completely self-contained tutorial. (Not to mention, having it in git allowed me to force people who I know don’t want to make the switch out of SVN to use a VCS that is so much nicer.)

Anyway, the finished draft took a long time. It was a lot of easy stuff, but time consuming.

(Mobility Workshop)

The most important part came after the draft was finished. I gathered some team members and some cross-team members (basically whoever would come sit with me) and had them run through it to see if it made sense. They were brutal. It was great. A lot of this work had happened after hours, so the language was, at times, very stream-of-consciousness. Having people with varied familiarity with the topics covered allowed for some invaluable revisions.

Overall, putting the workshop together proved to be a wonderful exercise in collaboration and teamwork, and regardless of how the actual workshop goes, it has left me better than I started, so that’s good enough.


The Newbie: How to Set Up SSHFS on Mac OS X

Recently, I wanted to find a simple way of mounting a remote Linux file system from my Macintosh laptop. And by “simple,” I wanted the procedure to consist mostly of downloading and installing a tool, running a command, and not having to delve too deeply into editing configuration files. Fortunately, I was able to figure this out without too much trouble, and thought I would record my experience here. The procedure involves two applications, FUSE for OS X and SSHFS, both of which can be found on the FUSE for OS X web site. FUSE for OS X is a library/framework that lets Mac OS X developers implement file systems in user space; SSHFS is an application built upon the FUSE framework that mounts a remote directory over an SSH connection.

First, let’s establish some terminology. We’ll refer to the remote server that I wanted to connect to as the “Linux server” (at the domain “remoteserver”) and to my local machine as simply “my laptop.” We’ll call the directory that I wanted to access on the Linux server “/webapps”. In essence, I wanted to be able to access the folder “/webapps” on the Linux server as if it were a folder sitting on my laptop.

I’ll also note that I had already set up my SSH keys on my laptop and the Linux server. That needs to be accomplished before anything else. If you need guidance on that, here’s a simple tutorial.

After SSH had been set up:

  1. I downloaded the latest version of FUSE for OS X at the FUSE for OS X web site.
  2. I installed FUSE for OS X on my laptop by double-clicking the disk image, then double-clicking on the installation package. This is pretty standard Mac OS X stuff; it went without a hitch.
  3. I downloaded the latest version of SSHFS for OS X at the FUSE for OS X web site.
  4. I installed SSHFS by double-clicking on the downloaded file. I ran into an issue here where Mac OS X refused to install the package because SSHFS comes from an “unidentified developer.” To get around this, you need to override the Gatekeeper in Mac OS X, which can be as simple as right-clicking on the package and selecting “Open” from the context menu.
  5. Both FUSE for OS X and SSHFS were now installed.
  6. Next, I needed to create a new folder on my laptop which would serve as the mount point. Let’s call that folder “~/mountpoint.”
  7. Now, it was a matter of learning how to invoke the appropriate command to have my laptop mount the Linux server. The command I used was:

sshfs -p 22 username@remoteserver:/webapps/ ~/mountpoint -oauto_cache,reconnect,defer_permissions,noappledouble,negative_vncache,volname=myVolName

Using the above steps, I was able to successfully mount the Linux server. Unmounting is a piece of cake:


umount ~/mountpoint

 

Additional notes:

The SSHFS command used to mount the remote server is lengthy; indeed, it is filled to the brim with arguments that I cut and pasted. If you would like to know what each argument does, there is a helpful guide that describes them.

Online Learning About Learning Online

As I continue to tentatively wade back into development waters, I’ve started taking advantage of the many online learning opportunities that are out there. The reasoning is twofold: (1) the obvious reason is that I want to learn (or, as the case may be, re-learn) some new languages and frameworks, and (2) as someone who works as an educational technologist, I ought to be current on these online opportunities anyway.

In particular, I’ve been refreshing myself on JavaScript, teaching myself Python, becoming more familiar with Joomla!, and I’m also interested in getting some game development underway with HTML5. For my refresher on JavaScript and my dive into Python, I’ve been using Codeacademy; for Joomla!, I’ve been taking advantage of lynda.com; and for game development in HTML5, I attempted to participate in a Udacity MOOC. I’ve summarized my (ongoing) experiences below:

Codeacademy

This is a fantastic (and free!) resource for both beginner and advanced programmers, though advanced programmers may find the hand-holding approach a tad slow. Clearly aimed at introducing the newbie to the world of programming, each course re-introduces the fundamentals (syntax, variable assignment, conditionals and control flow, functions, objects, etc.). Each course is divided into sections, and each section into a series of lessons. Lessons build upon one another, as do sections, and most importantly, several sections are reserved for implementing a simple application based on the concepts learned (I especially enjoyed “Pyglatin”: implementing a pig latin generator in Python).

The interface consists of a panel on the left that introduces a particular concept, and then instructs the user to write some code based on the concept. Therein lies the genius of Codeacademy: unlike a book, you are not only reading about a subject, you are forced to actually sit down and implement it before moving on. And so far, I’ve discovered their console works amazingly well at detecting errors, giving hints if things don’t go well, and just generally getting things right.

In essence, Codeacademy was designed to teach code and coding practices–nothing else. It does so with simplicity–no fancy videos or multimedia, just well-written text and a console–which is what coding should be all about. Currently it offers courses in JavaScript, Python, HTML/CSS, PHP, Ruby, and APIs. It also offers Codeacademy labs where you can experiment with some of the new languages that you have learned.

Lynda.com

The amount of subject matter on lynda.com is staggering. From project management to 3D modeling, lynda.com offers courses on just about any popular technical concept out there. I typed in “Joomla” and received no fewer than 13 tutorials (granted, only three pertained to the most recent version of the system). I’m about a quarter of the way through “Joomla! 3 Essentials” and thus far, my experience has been a positive one.

The course on Joomla! 3 takes what I consider to be the “traditional” approach to online learning: Divide a course into a series of sections, divide each section into a series of lessons, with each lesson consisting of a video and downloadable content to perform the described exercises. Like Codeacademy, lynda.com understands that for most users, learning is the equivalent of doing. This particular course hand-holds the user through downloading and installing Joomla on one’s laptop, then stepping through a series of exercises based on downloadable material. The course sometimes encourages “homework” in between its lessons–that is, if you don’t complete the exercises after a lesson has finished, the next lesson will be tougher, if not impossible, to follow.

The videos for this particular course are professional and well-paced, though, again, for advanced users, the hand-holding might be a tad slow. Nevertheless, with just a quarter of the course behind me, I feel confident enough to go into any Joomla! environment and be able to decipher the basic structure of the site.

Perhaps the only downside to lynda.com is that it’s not free. Although you can get by with paying $25/month, you really need to download the exercise files to fully experience a course, which ups the price to about $40/month. I’m fortunate that my institution offers lynda.com as a perk; if your institution doesn’t, I strongly encourage you to encourage them to invest in it.

Udacity

I won’t dwell too much on my first experience with MOOCs; suffice to say, I wasn’t impressed. I eagerly signed up for “HTML5 Game Development” when it started being offered, but gave up after the first lesson or two.

Like most MOOCs that I have seen, the course was divided up into a series of lessons, each lesson a series of videos, with each video followed by a “quiz” that could be automatically graded. This is where everything fell apart. The quizzes expected code to be inputted (in this case, Ajax code), and this code would then be “graded” as either correct or incorrect. The problem is that the Udacity grading engine (or whatever they were using behind the scenes) wasn’t able to grasp the concept that with coding, “there is more than one way to do it”.  Although a user could enter code that gave the correct result, the engine seemed to require that the code follow an exact syntax. And in following the discussion forums of each quiz (and some of the apologetic emails I received from instructors), it was clear I wasn’t the only one having difficulties. As I said, I gave up after a bit. Perhaps I’ll return some day.

Maybe I chose the wrong course, or maybe I was wrong in choosing Udacity; regardless, the experience seemed less professional and less reliable than either Codeacademy or lynda.com. Maybe it’s because MOOCs are in their infancy… or perhaps it’s because they’re being run by academics rather than solid business professionals. Regardless, if the experience I had is any indication of how MOOCs are, in general, being run, I don’t see them as viable competitors to other online learning platforms.

 

 


Documenting Documentation

I recently waded back into simple web application development (more on how that feels later), and one of the many aspects of coding that I’ve been refreshing myself on is how best to document what I’ve written, so my future self doesn’t get too confused over what I implemented or why I implemented it the way I did. The application is dead simple, so simple that I hesitate to call it an application. The web page contacts a PHP script via an Ajax call, and the PHP script does its thing, sending back a JSON-encoded object. The client subsequently uses the object to display a message of success or failure.

As I said, dead simple.
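For reference, the server side of that exchange looks roughly like this; it’s a sketch with made-up field names, and the real script obviously does its thing before deciding success or failure.

<?php
// respond.php -- sketch of the Ajax endpoint described above.
// The real script does real work first; this stand-in only shows
// the JSON-encoded object that gets sent back to the client.

header('Content-Type: application/json');

$result = array(
    'success' => true,                        // did the operation work?
    'message' => 'Request handled normally.'  // text for the client to display
);

echo json_encode($result);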

Nevertheless, as simple as the application is, I’ve been researching how best to document PHP and JavaScript. For PHP, the definitive answer appears to be phpDocumentor 2. For JavaScript, there is JSDoc. Here are some additional links that I found useful:

phpDocumentor

JSDoc

Note that I haven’t actually tried generating documentation with either toolset; that’s a completely different challenge. I’ve mostly been following the format so that my documentation can be printed/generated if somebody (aka me!) wishes. And what I’ve come to understand is that learning how to document a language feels almost as complicated as learning the language itself.
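As an illustration of the format I’ve been following, a phpDocumentor 2 docblock on a function in this little app might look like the following (the function and its parameters are made up; the @-tags are the standard ones). The JSDoc comments on the JavaScript side use nearly identical tags.

<?php
/**
 * Handles an Ajax request and reports success or failure.
 *
 * @param string $action Name of the operation to perform.
 * @param array  $params Parameters supplied by the client.
 *
 * @return array JSON-serializable result with 'success' and 'message' keys.
 *
 * @throws InvalidArgumentException If the action is not recognized.
 */
function handleRequest($action, array $params)
{
    // ...
}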

 


Getting Travis-CI to work with a PHP Project

First off, Travis CI is only usable through github. It’s a great service that allows tests to be run before pull requests are merged, helping to ensure code stability. It’s a very new service, and PHP support is in flux; information changes almost daily, so I fear this will be outdated tomorrow. Regardless, I wanted to get this written down.

The magic all happens in the .travis.yml file. It’s a config file that tells travis what to run. It only has a few sections, but they can be tricky. Here is the example .travis.yml for PHP:

language: php

# list any PHP version you want to test against
php:
  # using major version aliases

  # aliased to 5.2.17
  - 5.2
  # aliased to a recent 5.3.x version
  - 5.3
  # aliased to a recent 5.4.x version
  - 5.4

# optionally specify a list of environments, for example to test different RDBMS
env:
  - DB=mysql
  - DB=pgsql

# execute any number of scripts before the test run, custom env's are available as variables
before_script:
  - if [[ "$DB" == "pgsql" ]]; then psql -c "DROP DATABASE IF EXISTS hello_world_test;" -U postgres; fi
  - if [[ "$DB" == "pgsql" ]]; then psql -c "create database hello_world_test;" -U postgres; fi
  - if [[ "$DB" == "mysql" ]]; then mysql -e "create database IF NOT EXISTS hello_world_test;" -uroot; fi

# omitting "script:" will default to phpunit
# use the $DB env variable to determine the phpunit.xml to use
script: phpunit --configuration phpunit_$DB.xml --coverage-text

# configure notifications (email, IRC, campfire etc)
notifications:
  irc: "irc.freenode.org#travis"

Libraries that aren’t included

Currently this will not work: their phpunit requires libraries that aren’t included in the VM that gets created. You need to install them, regardless of whether your project actually uses them:

before_script:
  # everything after this point is needed to just use phpunit
  - pear channel-discover pear.phpunit.de
  - pear install phpunit/PHP_Invoker
  - pear install phpunit/DbUnit
  - pear install phpunit/PHPUnit_Selenium
  - pear install phpunit/PHPUnit_Story

Database usage

I use mysql currently. The problem I hit was that you can create users, but you can’t grant them privileges. So you HAVE to use the default root user with no password.

env:
  - DB=mysql

# execute any number of scripts before the test run, custom env's are available as variables
before_script:
  - mysql -e 'CREATE DATABASE `quizmo_dev`;'
  # The following is fine, but travis won't allow granting privileges
  # - mysql -e "CREATE USER 'quizmo_dev'@'localhost' IDENTIFIED BY 'quizmo_dev';"
  # - mysql -e "GRANT ALL PRIVILEGES ON *.* TO 'quizmo_dev'@'localhost' WITH GRANT OPTION;"
  # migrating adds all tables
  - quizmo/protected/yiic migrate --interactive=0

Notifications don’t work

I’m assuming this is something they’ll fix soon. I’m just trying to use email notifications — but it never sends an email.


Continuous Integration with PHP on Travis CI and Github

People on high have been preaching the wonders of continuous integration for a while now. It’s been all about Jenkins forever. Jenkins is still the #1 choice for most people, but I recently ran into Travis CI, and at least in the short term, this is going to be the solution for our shop.

What is Continuous Integration for PHP?

To most people, CI involves building the code and running integration tests. PHP doesn’t have a build step, but a good PHP project still has unit tests / integration tests / functional tests — so CI for PHP means running those tests before code merges.
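In practice, that means Travis just runs phpunit against tests along these lines on every push. This is a trivial, self-contained sketch using the old PHPUnit_Framework_TestCase base class that the pear-installed phpunit provides; the class and method names are made up.

<?php
// ExampleTest.php -- the kind of test Travis ends up running on every push.

class ExampleTest extends PHPUnit_Framework_TestCase
{
    public function testStringsAreJoinedWithASeparator()
    {
        // implode() is a PHP builtin, so this test needs no other code.
        $greeting = implode(', ', array('Hello', 'World'));
        $this->assertEquals('Hello, World', $greeting);
    }
}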

What makes Travis CI good?

Probably ease of use. There is no “travis server” to set up; it’s a service that they run. You hook it up to a repo you have access to, set up a config file, and it’s good to go.

But this only works with github?

That’s probably the biggest drawback. It’s currently built exclusively for use with github, which is awesome for github projects, but not every project can be on github. We don’t always have control over where our repos are — and not everyone is an open source person.
