CloudFlare, Apache, WordPress and IP address logging


If, like me, you use the very useful CloudFlare service to speed up & protect your site(s), you may have noticed that since enabling CloudFlare, your access logs seem to be full of visits from a very narrow range of IP addresses. This is because CloudFlare acts as a reverse proxy, so the IPs you are seeing belong to CloudFlare’s network.

This is a bit sucky for analytics, since those IPs aren’t those of the actual visitors to your site(s). The original IP is still present in the HTTP request headers when CloudFlare is enabled, though. A sample request looks something like this:

GET /blog/feed/ HTTP/1.0
Host: www.normyee.net
Accept-Encoding: gzip
CF-IPCountry: US
Connection: close
Set-Keepalive: 0
Accept: */*
From: googlebot(at)googlebot.com
User-Agent: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

CloudFlare inserts a CF-Connecting-IP header containing the original requester’s IP. In this case, the IP belongs to Google’s web crawler paying me a visit, although the request was logged as coming from one of CloudFlare’s IPs. We of course want the original IP logged, not CloudFlare’s. Fortunately there are quick solutions for both Apache and WordPress.
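If you want to read the header in your own PHP code (outside of the Apache and WordPress fixes below), a minimal sketch looks like this. The function name is mine, and a real deployment should only trust the header when REMOTE_ADDR is actually one of CloudFlare’s published IP ranges, since anyone hitting your server directly can forge it:

```php
<?php
// Return the real client IP: prefer CloudFlare's CF-Connecting-IP
// header, falling back to REMOTE_ADDR for direct requests.
// NOTE: in production, only trust the header when REMOTE_ADDR is
// within CloudFlare's published IP ranges (otherwise it can be forged).
function real_client_ip(array $server)
{
    if (isset($server['HTTP_CF_CONNECTING_IP']) && $server['HTTP_CF_CONNECTING_IP'] !== '') {
        return $server['HTTP_CF_CONNECTING_IP'];
    }
    return isset($server['REMOTE_ADDR']) ? $server['REMOTE_ADDR'] : '';
}
```

You’d call it as `real_client_ip($_SERVER)` from a request handler.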

For Apache, CloudFlare provides a module, mod_cloudflare, which you’ll need to compile from source for your system. You can get more info and instructions here & view the source on GitHub here (it’s linked from the previous link as well). It’s pretty straightforward, assuming you have shell access and the ability to run apxs (the APache eXtenSion tool).
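Once you’ve grabbed the module source, the compile-and-install step is the usual apxs invocation (the filename and restart command below are the typical ones; adjust paths for your system):

```shell
# -c compiles mod_cloudflare, -i installs the resulting .so,
# -a adds a LoadModule line to httpd.conf; then restart Apache.
apxs -i -a -c mod_cloudflare.c
apachectl restart
```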

For WordPress, you can simply install the CloudFlare WordPress plugin from wordpress.org to get the correct IPs back in WordPress. CloudFlare has a wiki page for the plugin as well, but the WordPress.org plugin page has all the info you need.

Debugging JavaScript in Adobe AIR apps


Several months back I switched from Notepad++ to the Eclipse-based Aptana Studio to develop an Adobe AIR desktop client for work, and I’ve been pretty happy with Aptana aside from the huge amount of memory it gobbles up. Late last week the Adobe AIR Development Plug-In for Aptana came out of beta, which is great news for anyone developing HTML/JavaScript-based AIR apps (as opposed to Flash/Flex-based ones).

One of the drawbacks of developing HTML/JavaScript-based Adobe AIR applications is the lack of debugging tools. It doesn’t help that when a JS error occurs, AIR will often output an “undefined at undefined” error message to the system console without any stack trace, leaving you to guess where the error actually occurred. That’s fine for a small application, but when you have tens of thousands of lines of JavaScript code you’ll often be left banging your head on the desk trying to track down the offending line.

This screencast from Aptana better illustrates the new features. Be sure to check it out if you’re using Aptana Studio and/or were looking to give it a try!

jQuery tips, migrating from jQuery’s tablesorter to Ext JS’ GridPanel


I ran across this post which has some good jQuery tips.

Speaking of JavaScript frameworks, I’ve spent the last couple of weeks ramping up on Ext JS since it comes with a very slick grid component (actually, all of the components it comes with are very, very slick). I’ve been using the jQuery tablesorter plugin for a grid layout in one of my projects, but it’s missing a couple of features I need, such as grouping & resizable columns, which the Ext JS grid component has.

Migrating to the Ext JS grid has been interesting since I’ve had a chance to see the pros and cons of each framework and the different approaches taken to build a grid. The jQuery tablesorter plugin operates on an existing HTML table (or one you create in JavaScript and append to the DOM, which is my case) and converts it to a sortable grid, whereas the Ext JS grid uses a datasource such as a JavaScript array, JSON or XML file, etc. and uses that to build the grid.

Ext JS is quite verbose compared to jQuery (unless you want the verbosity; note: if it’s not obvious, that is an April Fool’s joke :) ), but I do like how Ext JS is a complete framework with a lot of included widgets that all work well together, no doubt because they are all part of the same distribution.

jQuery on the other hand is very compact and the syntax is similarly compact (which I love). The core distribution is kept as tiny as possible, and additional functionality is added via plugins such as tablesorter.

The existing implementation of my grid uses jQuery’s tablesorter, contextmenu, blockui, cluetip, and metadata plugins, whereas to achieve the same functionality in Ext JS and to add new features such as grouping & resizable columns, I am using the following Ext JS objects: Ext.grid.GridPanel, Ext.grid.GroupingView, Ext.data.GroupingStore, Ext.data.JsonReader, Ext.QuickTip, Ext.menu.Menu, Ext.MessageBox, Ext.Viewport.

It’s not an apples-to-apples comparison, since some of the Ext JS objects listed above are used to load the JSON data into something the grid can use, whereas previously the JSON data was a JS object literal that I manually iterated over to create my table, which tablesorter then converted into a grid. Grouping (via Ext.grid.GroupingView) is a new feature I wanted to implement that wasn’t available in tablesorter or any other jQuery plugin I could find at the time. If I were to break it down to just replicating my jQuery grid’s functionality in Ext JS, it’d be the following objects: Ext.grid.GridPanel, Ext.QuickTip, Ext.menu.Menu, Ext.MessageBox.

Each framework has its pros and cons, but I’m glad they both work well together (both were designed with this in mind). That’s a good thing, as I’m using both: I’ve migrated the grid to Ext JS’ GridPanel, but am still using jQuery for a couple of manipulations on the grid. I can have my cake and eat it, too (though I will see if I can migrate all of the grid functionality over to Ext JS to keep things simpler)!

Watching an app grow outside the U.S.


It’s interesting watching the growth of Describe Me. When a user first adds the app, he or she is automatically tagged by one of the developers as “cool,” in order to illustrate how the app works.

Recently I became the person tagging new users, and as a result I get a ton of friend requests, pokes, or just “who the hell are you” emails from random people. At first it was mainly people in the U.S., but then I started getting requests and pokes from South Africa, Sweden, Hong Kong, Indonesia, Egypt, the United Arab Emirates, Singapore, Malaysia, and London, to name a few of the more common locations (or rather, what they’ve indicated as their “network”).

Pretty neat to see it grow beyond the U.S., since it was not something we considered when the app was being built. Thankfully the db runs in UTF-8 to handle the various character sets; I’ve got at least one tag in Chinese which I can’t read, but at least it renders correctly, heh.

Oh, and as of this post, Describe Me has 80,249 users. Can’t wait until 100k! We’re gonna need to throw a celebration or something. :)

PHP’s curl_multi to the rescue


One of my Facebook apps hits Amazon’s E-Commerce Service (ECS) for item information via REST queries. I needed to run 19 separate queries searching for a title (basically searching each of 19 “browse nodes”) and render that data on the Facebook canvas.

Simple enough: I could just do a foreach loop making each REST request in turn. The only problem was that each iteration took roughly 500 milliseconds, which isn’t surprising when you consider all the steps: DNS resolution, sending the REST query, waiting for and receiving the response, and then parsing the XML (in this case, using the SimpleXML extension). At 19 requests, that’s 9.5 seconds, which is way too slow, not to mention Facebook times out the request as well and returns a lovely error page.

For the test REST query I was benchmarking, the entire PHP script averaged 9.65 seconds to complete (performing all 19 REST queries and then formatting the output). Simply switching from file_get_contents() to PHP’s cURL functions dropped the average to 7.32 seconds, roughly a 24% improvement. Great, but still too slow, and Facebook was still timing out the pages.

The root of the problem was that the 19 REST queries ran sequentially, and as a result the page was too slow. I needed to make those requests concurrently. Fortunately, PHP supports exactly that — making multiple cURL requests in parallel — via its curl_multi_* functions.
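The pattern looks roughly like the sketch below. The endpoint URL, query parameters, and function names here are illustrative placeholders, not my exact production code against ECS:

```php
<?php
// Build one REST URL per browse node. The endpoint and parameter
// names are placeholders standing in for the real ECS query.
function build_urls(array $browse_nodes, $title)
{
    $urls = array();
    foreach ($browse_nodes as $node) {
        $urls[] = 'https://ecs.example.com/onca/xml?' . http_build_query(array(
            'Operation'  => 'ItemSearch',
            'BrowseNode' => $node,
            'Title'      => $title,
        ));
    }
    return $urls;
}

// Fire all requests concurrently via curl_multi and collect the bodies.
function fetch_all(array $urls)
{
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Drive all transfers until every one has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running && curl_multi_select($mh) === -1) {
            usleep(100); // avoid busy-waiting if select fails
        }
    } while ($running && $status === CURLM_OK);

    $results = array();
    foreach ($handles as $i => $ch) {
        $results[$i] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

The key difference from the sequential version is that the total wall time is roughly that of the slowest single request, not the sum of all 19.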

BAM. Using curl_multi dropped the page generation time down to 1.6 seconds: much more reasonable, and an 83.4% improvement. w00t.
