If, like me, you use the very useful CloudFlare service to speed up & protect your site(s), you may have noticed that your access logs seem to show a ton of visits from a very narrow range of IP addresses. This is because CloudFlare acts as a reverse proxy, and the IPs you are seeing belong to CloudFlare's network.
This is a bit sucky for analytics, since those IPs are not those of the actual visitors to your site(s). The original IP is still in the HTTP request headers when CloudFlare is enabled, though, and looks something like this sample request:
GET /blog/feed/ HTTP/1.0
Host: www.normyee.net
Accept-Encoding: gzip
CF-Connecting-IP: 220.127.116.11
CF-IPCountry: US
X-Forwarded-For: 18.104.22.168
Connection: close
Set-Keepalive: 0
Accept: */*
From: googlebot(at)googlebot.com
User-Agent: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
CloudFlare inserts a CF-Connecting-IP header containing the original requester's IP. In this case, the IP 22.214.171.124 is Google's web crawler paying me a visit, although the request was logged as coming from 126.96.36.199, one of CloudFlare's IPs. We of course want the original IP logged, and not CloudFlare's. Fortunately, there are quick solutions for both Apache and WordPress.
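If you need the real visitor IP inside your own PHP code (outside of Apache or WordPress), you can read that header directly. Here's a minimal sketch; client_ip() is just an illustrative helper name I made up, not part of any library:

```php
<?php
// Illustrative helper: prefer CloudFlare's CF-Connecting-IP header,
// which PHP exposes as $_SERVER['HTTP_CF_CONNECTING_IP'], and fall
// back to the raw REMOTE_ADDR when the header isn't present.
function client_ip(array $server): string
{
    if (!empty($server['HTTP_CF_CONNECTING_IP'])) {
        return $server['HTTP_CF_CONNECTING_IP'];
    }
    return $server['REMOTE_ADDR'] ?? '';
}
```

Usage would be `$ip = client_ip($_SERVER);`. In production you'd also want to verify the request actually came from one of CloudFlare's published IP ranges before trusting the header, since anyone can send it.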
For Apache, CloudFlare has an Apache module, mod_cloudflare, which you'll need to compile from source for your system. You can get more info and instructions here, and view the source on GitHub here (it's linked from the previous link as well). It's pretty straightforward, assuming you have shell access and the ability to run apxs (the APache eXtenSion tool).
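If your system has the Apache development tools installed, the build boils down to something like the following. Treat this as a sketch rather than exact instructions, since paths and the restart command vary by distribution:

```shell
# Compile, install, and activate the module in one step with apxs.
# mod_cloudflare.c is the source file from CloudFlare's repository.
apxs -a -i -c mod_cloudflare.c

# Restart Apache so the new module gets loaded.
apachectl restart
```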
For WordPress, you can simply download the CloudFlare WordPress plugin at wordpress.org to get the correct IPs back in WordPress. CloudFlare has a wiki page for the plugin as well, but the WordPress.org plugin page has all the info you need.
This screencast from Aptana better illustrates the new features. Be sure to check it out if you’re using Aptana Studio and/or were looking to give it a try!
I ran across this post which has some good jQuery tips.
Ext JS is quite verbose, unlike jQuery (unless you want the verbosity; note: if it's not obvious, that is an April Fools' joke), but I do like how Ext JS is a complete framework with a lot of included widgets that all work well together, no doubt because they are all part of the same distribution.
jQuery on the other hand is very compact and the syntax is similarly compact (which I love). The core distribution is kept as tiny as possible, and additional functionality is added via plugins such as tablesorter.
The existing implementation of my grid uses jQuery’s tablesorter, contextmenu, blockui, cluetip, and metadata plugins, whereas to achieve the same functionality in Ext JS and to add new features such as grouping & resizable columns, I am using the following Ext JS objects: Ext.grid.GridPanel, Ext.grid.GroupingView, Ext.data.GroupingStore, Ext.data.JsonReader, Ext.QuickTip, Ext.menu.Menu, Ext.MessageBox, Ext.Viewport.
It's not an apples-to-apples comparison, since some of the Ext JS objects listed above are used to load the JSON data into something the grid can use, whereas previously the JSON data was a JS object literal that I manually iterated over to build my table, which tablesorter then converted into a grid. Others, like grouping (via Ext.grid.GroupingView), are new features I wanted to implement that weren't available in tablesorter or any other jQuery plugin I could find at the time. If I were to break it down to just replicating my jQuery grid's functionality in Ext JS, it'd be the following objects: Ext.grid.GridPanel, Ext.QuickTip, Ext.menu.Menu, Ext.MessageBox.
Each framework has its pros and cons, but I am glad that they both work well together (both were designed with this in mind). That's a good thing, as I am using both: I've migrated the grid to use Ext JS' GridPanel, but am still using jQuery to do a couple of manipulations to the grid. I can have my cake and eat it, too (though I will see if I can migrate all of the grid functionality over to Ext JS to keep things simpler)!
It’s interesting watching the growth of Describe Me. When a user first adds the app, he or she is automatically tagged by one of the developers as “cool,” in order to illustrate how the app works.
Recently I became the person who tags new users, and I therefore get a ton of friend requests, pokes, or just "who the hell are you" emails from random people. At first it was mainly people in the U.S., but I started to get requests and pokes from South Africa, Sweden, Hong Kong, Indonesia, Egypt, the United Arab Emirates, Singapore, Malaysia, and London, to name a few of the more common locations (or rather, what they've indicated as their "network").
Pretty neat to see it grow beyond the U.S. since it was not something we considered when the app was being built. Thankfully the db is running in UTF-8 to handle the various character sets — I’ve got at least one tag in Chinese which I can’t read but at least it renders correctly heh.
Oh, and as of this post, Describe Me has 80,249 users. Can’t wait until 100k! We’re gonna need to throw a celebration or something.
One of my Facebook apps hits Amazon's E-Commerce Service (ECS) for item information via REST queries. I needed to process 19 separate queries searching for a title (basically searching 1 of 19 "browse nodes"), and render that data on the Facebook canvas.
Simple enough: I could just do a foreach loop to make each REST request. The only problem was that each iteration took, say, 500 milliseconds total. That's easy to believe when you consider all the steps: DNS resolution, performing the REST query, waiting for the response, receiving the response, and then parsing the XML (in this case, using the SimpleXML extension). At 19 requests, that's 9.5 seconds, which is way too slow; not to mention Facebook times out the request as well and returns a lovely error page.
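Sketched out, the naive sequential version looks something like this. The function name is illustrative, and the URL list stands in for the actual 19 ECS query strings:

```php
<?php
// Naive sequential version: one blocking HTTP request per iteration.
// At ~500 ms each, 19 of these stack up to ~9.5 seconds.
function fetch_sequential(array $urls): array
{
    $results = [];
    foreach ($urls as $key => $url) {
        $xml = file_get_contents($url);               // blocks until the response arrives
        $results[$key] = simplexml_load_string($xml); // parse the response with SimpleXML
    }
    return $results;
}
```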
For the test REST query I was benchmarking, the entire PHP script (performing all 19 REST queries and then formatting the output) averaged 9.65 seconds to complete. Simply switching from file_get_contents() to PHP's cURL functions dropped the average to 7.32 seconds, a roughly 24% improvement. Better, but still too slow, and Facebook was still timing out the pages.
The root of the problem was that the 19 REST queries ran sequentially, and as a result were too slow in aggregate. I needed to make those requests concurrently. Fortunately, PHP supports exactly that via its curl_multi_* functions, which drive multiple cURL requests at the same time.
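The pattern looks roughly like this. This is a hedged sketch assuming the cURL extension is loaded, not my exact production code, and fetch_all() is an illustrative name:

```php
<?php
// Sketch: fetch several URLs concurrently with curl_multi.
function fetch_all(array $urls): array
{
    $multi = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of echoing it
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't wait forever on a slow endpoint
        curl_multi_add_handle($multi, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until none are still running.
    do {
        $status = curl_multi_exec($multi, $running);
        if ($running) {
            curl_multi_select($multi); // block until there's activity, avoiding a busy-wait
        }
    } while ($running && $status === CURLM_OK);

    // Collect each response body and clean up the handles.
    $responses = [];
    foreach ($handles as $key => $ch) {
        $responses[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
    }
    curl_multi_close($multi);

    return $responses;
}
```

Each request still pays its own network cost, but the waits now overlap instead of stacking, which is where the big win comes from.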
BAM. Using curl_multi dropped the page generation time down to 1.6 seconds: much more reasonable, and an 83.4% improvement. w00t.