Working on Wednesday #6-#7 : Correctly installing Ruby on Rails

This post spans two weeks because I couldn't manage to get a clean Ruby/Rails install on my first try. I read a lot, installed Ruby using various methods, and finally managed to get it working correctly.

Cleaning up

First of all, you have to remove any ruby version you might have already installed, just to be sure.

sudo apt-get remove ruby && sudo apt-get autoremove

Installing RVM

Then, you have to install RVM before installing Ruby. My biggest mistake in my various attempts at installing Ruby was to install RVM last.

RVM is a very important part of the whole Ruby process. It's a little piece of genius that allows you to create Ruby sandboxes. You can install various Ruby versions side by side, even various gem versions, and simply tell RVM which sandbox you want to use.

If you are absolutely positive that you will never ever work on more than one Ruby project in your entire life, you can skip installing RVM and simply install Ruby globally on your system. But you know that this will never happen, so avoid future trouble and install RVM first.

To install RVM, simply execute the following command

bash < <(curl -s https://rvm.beginrescueend.com/install/rvm)

This will download and execute the install script. Once it's finished, edit your .bashrc or .zshrc to include the rvm config file whenever a shell is launched.

[[ -r $HOME/.rvm/scripts/rvm ]] && source $HOME/.rvm/scripts/rvm

Just to be sure I had the latest version, I also ran

rvm get head
rvm reload

Updating your system

RVM depends on some binaries to work, so be sure to install them all. They are listed when running rvm notes, but at the time of writing, this was the list for me :

sudo apt-get install build-essential bison openssl libreadline6 libreadline6-dev curl git-core zlib1g zlib1g-dev libssl-dev libyaml-dev libsqlite3-0 libsqlite3-dev sqlite3 libxml2-dev libxslt-dev autoconf libc6-dev ncurses-dev

Installing Ruby

Once RVM is installed, installing the latest Ruby version (1.9.2 at the time of writing) is as easy as :

rvm install 1.9.2

This will take some time, downloading and compiling Ruby. Next, tell RVM that this is the version we are gonna use.

rvm use 1.9.2

You can always switch back to your system-wide ruby install by doing

rvm use system

Creating a gemset

Plugins in the Ruby world are named gems. They can easily be installed in or removed from a project to provide advanced features. Rails itself is a gem.

The traditional way of managing gems is to simply use the RubyGems command gem.

When using RVM and its sandboxed mode however, the best way is to create a gemset and install gems in that gemset. This will allow us to easily switch between multiple gemsets in the future.

I suggest creating a gemset for each project you start. You can also install gems in the global gemset so they are available to every project. As I'm new to the Ruby world and still don't really know which gems are "must-have", I'll skip this part for now.

Let's create a new gemset for our new project. I'll name mine pixelastic, but change the name to fit your project name

rvm gemset create pixelastic
rvm gemset use pixelastic 

You'll now be using the pixelastic gemset. You can list all available gems in your current gemset by doing

gem list

Or list all the available gemsets by doing :

rvm gemset list

The one you are currently using will be prefixed by =>

What is Rake ?

You might have noticed that your new gemset contains only one gem, named Rake. You don't need to spend too much time on it. You simply have to know that Rake is more or less Ruby's equivalent of make : a build tool that lets you define and run named tasks (migrations, tests, and so on) for your app.
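To give a rough idea of what Rake does (the :greet task name is just a made-up example ; in a real project you'd define tasks in a Rakefile with the task keyword) :

```ruby
# A minimal sketch of Rake : define a named task, then run it.
# The task name :greet is purely illustrative.
require 'rake'

Rake::Task.define_task(:greet) do
  puts 'Hello from Rake'
end

Rake::Task[:greet].invoke  # prints "Hello from Rake"
```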

Installing Rails

As I said above, Rails is a gem like many others, so you can simply install it by doing :

gem install rails

Note that because we are using RVM, the gem will only be installed in this gemset and not globally. If you switch gemsets, Rails will no longer be available.

If we weren't using RVM, the gem would have been installed globally. RVM actually wraps the gem command to sandbox it inside the current gemset.

Handling dependencies with Bundler

Installing Rails will install a bunch of other gems. One of them is Bundler.

Bundler is a gem dependency handler (shipped with Rails, but not specific to it). Its features seem to overlap those of RVM. At the time of writing, I haven't used it yet, but its main feature seems to be its Gemfile.

The goal of a Gemfile (located in your project directory) is to list all the gems your project needs (along with their respective versions, if provided). Then, whenever you drop your project into a new environment, Bundler will be able to download and install your gems for you.

If said environment uses RVM, the gems will be saved in the gemset ; if not, they will be installed globally. Bundler is absolutely not tied to RVM and can be used independently.

The syntax of a Gemfile will not be discussed in detail here as I have no previous experience with them, but the command to read the Gemfile and update the project accordingly is :

bundle install
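Just for reference, a minimal Gemfile looks something like this ; the gem names and version numbers below are purely illustrative :

```ruby
# Gemfile -- illustrative sketch ; names and versions are examples only
source 'http://rubygems.org'

gem 'rails', '3.0.3'   # the framework itself, pinned to a version
gem 'sqlite3-ruby'     # driver for the default development database
```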

Automatically switching gemsets

One nice bit of RVM is that it can automatically detect the gemset to use on a per-project basis. You simply have to create a .rvmrc file in a project directory, and RVM will execute it whenever you enter that directory.

For example, to use my pixelastic gemset and Ruby 1.9.2, simply add the following to your .rvmrc

rvm use 1.9.2@pixelastic

References

I read a lot on the subject before finally getting it right. Here are the various sources :

Update : Sqlite3

If you get errors complaining about Sqlite3 missing, just run

sudo apt-get install libsqlite3-0 libsqlite3-dev

Testing Facebook Credits in a local environment

Testing the Facebook API has always been a pain for me. Their documentation is still crappy and the examples are wrong and/or outdated. Things don't work exactly the same way when you're testing with "Test accounts" versus real accounts.

One of the things that helped me a lot was to create a test application and set its canvas url to a local server. As the iframe call is done by the client, I can test local code without needing to upload anything to an online server.

I could even test Requests and Streams through this method as it's the Javascript SDK that does all the AJAX calls. The only tiny issue was with Stream images : as Facebook requests them server-side before displaying them, I had to put some placeholder images online.

But today, with the testing of the new Facebook Credits API, new horizons of pain arose. Facebook makes no less than three calls to one of my server callback urls, and does not make them client-side.

I still don't want to upload my code to a debug server just to test this feature, so I decided to put in place a little IP/port forwarding. Thanks to tips from my colleague Léo, this was done in a matter of minutes.

Setting up the DNS/port forwarding

First, we'll need a url that Facebook can call. I want this url to point to my local server, so all I had to do was create a simple DNS redirect at dyndns.com that points to my IP.

Let it be http://customname.dyndns-office.com/

Paste this url in the "callback url" field of your Facebook Credits config page.

Then, I assigned a fixed internal IP to my computer so that it won't change on each reboot. My router can do that just fine, by assigning a fixed internal IP based on the MAC address.

Let it be 192.168.0.51

Now, we'll redirect every call reaching the router on port 80 (http) to that internal IP. My router admin panel can also do that just fine, in its DHCP configuration.

Finally, we'll have to update the server virtualhost config to point all incoming requests matching customname.dyndns-office.com to the server files.

A side note

There is one last little gotcha to be aware of.

It does not seem to be possible to access one of your network's computers through its external IP (as we just configured) FROM a computer on that same network.

In other words, if you would like to test your config, do not type http://customname.dyndns-office.com/ in your browser on one of the computers sharing the same network as 192.168.0.51.

Instead, use a free wifi connection, an ssh tunnel, or curl from an external server you own.

In my case, locally testing http://customname.dyndns-office.com/ always brought me to my router admin panel and did not forward the port correctly. Doing a curl http://customname.dyndns-office.com/ from one of my online servers correctly returned what I wanted.

Back to Facebook

Back to our Facebook example : you still won't be able to see any output from the calls Facebook makes to your app. Your best bet is to have a logging system that writes to disk or to the DB so you can track the calls.

Also note that you have to load the page in the iframe canvas, even for your tests. You can't simply load an html page and call FB.ui({method:"pay"}) : this will result in error 1151. Always load the whole FB page.

Working on Wednesday #5 : Rails documentation and Zombies

Today I continue with my Rails for Zombies learning. Actually, I woke up kind of late (11am), then had to deal with noisy neighbors and do some shopping before really being able to start working. It's 2pm now, and I just opened my browser.

Back to Zombies

I like the clean syntax Ruby provides. I like being able to pass custom parameters without having to care about their order. This could be achieved in cakePHP using arrays for options, but it's much cleaner the Ruby way : truncate(zombie.graveyard, :length => 10, :omission => '')
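As a rough plain-Ruby sketch of how such a trailing options hash works (this mimics the interface of the Rails helper, not its actual implementation) :

```ruby
# A trailing hash argument acts like named, order-independent parameters.
def truncate(text, options = {})
  length   = options[:length]   || 30     # defaults when the key is absent
  omission = options[:omission] || '...'
  return text if text.length <= length
  text[0, length - omission.length] + omission
end

puts truncate('A zombie walks into a graveyard', :length => 10, :omission => '')
# => "A zombie w"
```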

Also, the link_to helper and the new_{Model}_path and edit_{Model}_path methods are clever and allow easy access to the links you always use. This forces you to logically organise your app.

The way Rails controllers pass vars to the view (using @ instance variables) is also cleaner than the cakePHP set method. I love those little things the language permits.
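As an illustration of the idea in plain Ruby with ERB (not Rails' real machinery ; the class and variable names are made up) :

```ruby
require 'erb'

# Sketch : a template evaluated in the controller's context can read
# the controller's @ instance variables directly, no explicit set() call.
class ZombieController
  def show
    @name = 'Ash'
    ERB.new('Zombie name : <%= @name %>').result(binding)
  end
end

puts ZombieController.new.show  # prints "Zombie name : Ash"
```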

before_filter also seems more powerful than in the cake world : you can define several of them and restrict them to certain actions. It could (I guess) be extended in an ApplicationController that checks on show, edit, etc. that the specified id exists, and displays an error message if not.

I didn't quite get the various respond_to formats for JSON and XML. Why do I have to pass the @tweet while I don't have to for the html view ?

Also, the Rails routing system is more appealing to me than its cakePHP counterpart. I would have loved to have such a nice tutorial for cake when I learned it. Routing is very well explained in Rails for Zombies.

After completing the Zombie tutorial, I headed to the famous blog tutorial every web language should have. Once again, I started reading the doc and a few things caught my attention as very promising :

Command line interface

I like the way one can create a new app with a single command line. Such a feature is also provided by cakePHP, but I never managed to make it work the way I wanted. That might have been influenced by the fact that I was working on Windows at that time.

Directory structure

I also note that a Rails application seems well structured : there are places defined for documentation, tests, database migrations, dependencies, logs, deployment scripts and so on.

Databases

It's great that Rails directly provides distinct DB configurations per environment (development, test and production). I will no longer have to do it myself.

Rails also uses SQLite3 as the default database for development. As I wasn't very familiar with this technology, I did some research. It turns out that SQLite is a very simple DB system, perfectly suited for development as it does not require a DB server.

Using environment vars in lighttpd config files

Our game is hosted on a farm of servers behind a load balancer. All servers are identical except for their names (prod-01, prod-02, etc) and virtual IP addresses.

In PHP, if I try to access $_SERVER['SERVER_NAME'], I only get the domain name "prod.game.com". Actually, it holds exactly the same value as $_SERVER['HTTP_HOST'].

For logging purposes, I needed to know the name of the server that my script was currently running on. So I updated my lighttpd.conf

Lighty has a feature called include_shell that you can use in its config files. It basically runs a shell script and adds its output to your config file.

So I wrote a simple shell script to define a var.serverName (this is a custom value, name it as you want, but keep the var prefix) that I can then re-use when needed.

#!/bin/bash
echo 'var.serverName="'$(uname -n)'"'

Then, I included it in my lighttpd.conf file using include_shell "/etc/lighttpd/scripts/serverName.sh"

To define the PHP SERVER_NAME value :

setenv.add-environment = (
  "SERVER_NAME" => var.serverName
)

To add it as a Server: response Header :

server.tag = var.serverName

Note that include_shell directives are only called when you start lighty and not on every request.

Incorrect MySQL date

Several in-game time calculations are based on Paris time. Some weeks ago, we decided to make a pass on all our servers to always use Paris time (GMT+1).

Today, I spotted that the logs we save in the DB have some date inaccuracies. It appears that our mysql servers and instances weren't all updated to the correct date. Some hours later, here is what I learned :

Finding and updating MySQL date

You can tell what timezone mysql should use when you start the service. If you don't specify anything, it will use the system time. Once loaded, you can get its time by running SELECT NOW().

This is the easiest way to spot errors.

To know the defined timezone, run SELECT @@global.time_zone. If not defined, you'll read SYSTEM, which is not very helpful.

Note here that even if you changed the system date AFTER you started mysql, mysql will still use the date that was in effect when you first launched it.

It means that changing your server time will not affect running mysql processes. You'll have to restart mysql for that : sudo /etc/init.d/mysql restart

Finding and updating the server date

Even after restarting mysql on some servers, the mysql date was still incorrect. After connecting to the sql server, I found that it was the server time itself that was incorrectly set (I just typed date).

To update the current time zone, I had to call sudo dpkg-reconfigure tzdata (I'm using ubuntu) and choose the correct city

Updating mysql running through ndb_mgm

I'm no server expert, so this part was a little trickier. Some of our databases are using ndb cluster for replication. Reloading those configurations was harder.

First, I had to connect to the server running the ndb management and call ndb_mgm. In the resulting prompt, I typed show and this got me the list of all servers currently managed.

I then shut them down by typing shutdown.

Then, I reloaded the management and the node running on this server by doing sudo /etc/init.d/mysql-ndb-mgm restart and sudo /etc/init.d/mysql-ndb restart

Finally, I had to connect to all the servers I saw earlier (with the show command) and run sudo /etc/init.d/mysql-ndb restart on each of them.