When I added the favorite photos feature to my photo album software, I wanted a way to randomly show a subset of said favorites on the album's display page. I initially thought about implementing my own means of doing this through PHP. Ultimately, I wanted random selection without replacement, so that viewers would not see multiple copies of the same image in the 'Favorites Preview' section. Thankfully, MySQL saved the day!

When sorting a MySQL query, you can opt to sort randomly:

SELECT {some columns} FROM {some tables}
WHERE {some condition} ORDER BY rand()

The rand() function in MySQL essentially gives you random selection without replacement for free! How great is that? It was an easy solution to a not-so-simple problem, and saved me a lot of programming time.

Update: I have since learned that the ORDER BY rand() call is horribly inefficient for large data sets. As such, it should ideally be avoided. There's a great article describing ways to work around these performance limitations.
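For what it's worth, the usual workaround (sketched here with made-up table and column names, not taken from the article linked above) is to pay the ORDER BY rand() cost only on the primary key, then join back to fetch the full rows:

```sql
-- Hypothetical schema: favorites(id PRIMARY KEY, path, ...).
-- Randomly ordering just the indexed ids is far cheaper than randomly
-- ordering (and copying) every column of every row in the table.
SELECT f.*
FROM favorites AS f
JOIN (SELECT id FROM favorites ORDER BY rand() LIMIT 5) AS picks
  ON f.id = picks.id;
```

You still get selection without replacement, since each id can match at most once.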

Reading With Franz

Mar 17, 2009

My dad stumbled upon an incredibly well-produced video entitled "Reading With Franz." In it, we learn how Franz, a puppet representing a person with a disability, is able to read books with a simple switch device and Tar Heel Reader. For those who may not know, Tar Heel Reader is a website my dad started a while back with an emphasis on providing books for beginning readers. There are over 3000 books on the website as of this writing, with more being added every day. Over 2200 visitors surf the site every week, with nearly 300,000 weekly page views. This map of readers shows that visitors are coming in from all over the world (a total of 80 countries so far). If you know a beginning reader, particularly one with a disability, be sure to check out the site.

About this time last year, I noted that our build machines at work were way out of sync in their respective local times. As a result, we were seeing a bunch of "clock skew" warnings when building our code. To fix the problem, I figured out how to use NTP on a private network. Imagine my surprise when, while performing a build today, I noticed more clock skew warnings! I checked our setup, and NTP was still functioning as expected. The problem, it turns out, was that some of our build machines had not yet changed over to Daylight Saving Time (DST), something NTP doesn't assist with. Only the oldest machines were affected, which wasn't surprising, seeing as Congress feels the need to change the DST rules every few years.

Thankfully, updating time zone information is easy to do. Here's how:

Step 1: Verify Your System Settings
Clearly, we should first check to see if we even need to update our system. To do this, we can issue this command, replacing 2009 with the appropriate year:
zdump -v /etc/localtime | grep 2009
The reported DST times should correspond with your local area. In my case, the reported date on the broken systems was April 5, not March 8. So this particular system needed updating. See the end of this article for a note on potential zdump problems on 64-bit systems.
Step 2: Obtain the Latest Time Zone Information
The latest time zone data can be obtained via the tz FTP distribution website. You'll want to get the tzdata{year}{version}.tar.gz file. In my case, the filename was tzdata2009c.tar.gz. Copy this file to the system to be updated, and unpack it in a temporary location (I put it in a subfolder in /tmp).
Step 3: Compile the New Time Zone Data
We now need to compile the new time zone data. This can be done through use of the handy zic command:
zic -d {temp_dir} {file_to_compile}
In my case, I used the name of zoneinfo for the {temp_dir} parameter, and I wanted to compile the northamerica file, seeing as that's where I live:
zic -d zoneinfo northamerica
Upon completing this compilation step, a new zoneinfo directory was created in the temporary location where I unpacked the time zone data.
Step 4: Copy the Newly Built Files
Now that the appropriate files have been built, we'll need to copy the specific region files to the right location. By default, Linux time zone information lives in the /usr/share/zoneinfo directory. Since I live in the Eastern time zone, I copied the EST and EST5EDT files to the aforementioned location (I didn't know which file I really needed, so I just grabbed both). These files will overwrite the existing versions, so you may want to back those old versions up, just to be safe. In addition to these 'global time zone' files, you'll want to copy the appropriate specific time zone data file to the right place. In my case, I copied the America/New_York file to the corresponding location in the /usr/share/zoneinfo directory. Again, you'll be overwriting an existing file, so make backups as necessary.
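Since overwriting files under /usr/share/zoneinfo requires root and is easy to fumble, here's a sketch of the back-up-then-copy dance. It uses two scratch directories so it can be run safely as-is; on a real system, SRC would be the directory holding your zic output from Step 3, and DEST would be /usr/share/zoneinfo.

```shell
# Sketch of Step 4 using scratch directories (so it runs without root).
# On a real system: SRC=<your compiled zoneinfo dir>, DEST=/usr/share/zoneinfo.
SRC=$(mktemp -d)
DEST=$(mktemp -d)
mkdir -p "$SRC/America" "$DEST/America"
echo "new data" > "$SRC/EST5EDT";          echo "old data" > "$DEST/EST5EDT"
echo "new data" > "$SRC/America/New_York"; echo "old data" > "$DEST/America/New_York"

# Back up each file we're about to overwrite, then install the new one.
for f in EST5EDT America/New_York; do
    cp "$DEST/$f" "$DEST/$f.bak"
    cp "$SRC/$f" "$DEST/$f"
done
```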
Step 5: Update the localtime Link in /etc
The file /etc/localtime should theoretically be a soft link to the appropriate specific time zone data file in the /usr/share/zoneinfo directory. On a few of the machines I had to update, this was not the case. To create the soft link, issue the standard command (the -f flag replaces any existing file or link at the destination, which would otherwise cause ln to fail):
ln -sf /usr/share/zoneinfo/{path_to_zone_file} /etc/localtime
Here's the command I issued for my example:
ln -sf /usr/share/zoneinfo/America/New_York /etc/localtime
Step 6: Validate the Changes
Now that we have installed the new time zone information file, we can verify that the data has been updated properly, again by using the zdump command:
zdump -v /etc/localtime | grep 2009
This time, the dates shown should be correct. If you issue a date command, your time zone should also now be correct.

There is one word of warning I can provide to you. On some older 64-bit systems, the zdump command will seg-fault when you run it. This is a bug with the installed glibc package. I found this RedHat errata page covering the issue (at least, it refers to the package version that fixes this issue). Thankfully, I was able to compile and install the new time zone information without having to update glibc (I simply validated my changes by issuing a date command). It seems that only the zdump command exhibits the seg-fault on those older systems. Your mileage may vary.

I ran into a weird problem in one of our build scripts at work today. We compile our tools across a number of platforms and architectures, and I ran across this issue on one of our oldest boxes, running RedHat 9. Here's the horrible error that I got when linking:

/usr/bin/ld: myFile.so: undefined versioned symbol name std::basic_string<char, std::char_traits<char>, std::allocator<char> >& std::basic_string<char, std::char_traits<char>, std::allocator<char> >::_M_replace_safe<char const*>(__gnu_cxx::__normal_iterator<char*, std::basic_string<char, std::char_traits<char>, std::allocator<char> > >, __gnu_cxx::__normal_iterator<char*, std::basic_string<char, std::char_traits<char>, std::allocator<char> > >, char const*, char const*)@@GLIBCPP_3.2
/usr/bin/ld: failed to set dynamic section sizes: Bad value

It seems as if the standard C++ libraries on this system were compiled with gcc 3.2, while the version we're using to build our tools is 3.2.3. Unfortunately, the 3.2 compiler isn't installed on the system, and I'm not sure where we would find it for RH9 anyway. Thankfully, I found a workaround for this problem. Our link step originally looked like this:

gcc -shared -fPIC -lstdc++ -lrt -lpthread -o myFile.so {list_of_object_files}

I found out that by moving the standard libraries to the end of the line, the problem disappeared. Here's the new link step:

gcc -shared -fPIC -o myFile.so {list_of_object_files} -lstdc++ -lrt -lpthread

I don't fully understand why ordering should matter during the link step, but by putting the standard libraries last, we were able to get rid of this error. If you understand the root cause of this, please leave a comment explaining. I'd love to know more about why changing the order makes a difference.

Burrito Filling

Mar 8, 2009
  • 1 can (15-oz) kidney beans
  • 1 can (15-oz) pinto beans
  • 1 can (15-oz) black beans
  • 1 large carrot
  • 1 stalk celery
  • 1 medium onion
  • 1 garlic clove
  • 2 green chili peppers (optional)
  • 1 tsp chili powder
  • 1 tsp cumin
  • 1 tsp vegetable seasoning
  • 1/4 tsp kelp
  • 1/8 tsp thyme
  • Dash of cayenne pepper
  • 1 can (5-oz) tomato juice

Place all three cans of beans into a colander; rinse and drain thoroughly. Pour the beans into a large skillet and mash them. Place the carrot, celery, onion, garlic, chili peppers, and tomato juice in a blender, and blend well. Pour the blended ingredients into the beans, mixing them together. Add the rest of the ingredients, again mixing well. Simmer over low heat for 10 to 15 minutes, or until the mixture is warmed to your liking, stirring occasionally.

In my recent post on analyzing bandwidth usage, I promised an update once February was done. Seeing as it's now March, it's time for said update. Here's the graph of my bandwidth usage for the month of February:

I didn't break the 40 GB barrier, but I wasn't far from it this month at 37 GB. The highest daily total was 3304 MB on February 2, though several other days came close to that total. This is the first month that I haven't noticed any interesting trends, but it's still enjoyable to chart my activity. As I predicted, my daily average seems higher this month, thanks to my Roku player and Netflix Watch Instantly. If I break the barrier in March, I'll be sure to let everyone know. It appears Time Warner has done their homework on their proposed upper limit...

Ground Zero

Feb 27, 2009

Gizmodo pointed me this morning to an oh-so-wrong yet oh-so-fun Google Maps mashup, that allows you to nuke the city of your choice. Simply search for your favorite (or least-favorite) city, select your weapon, and nuke it! It was interesting to compare the blast radius of the Little Boy and the more modern nuclear weapons. Suffice it to say that today's weapons are awfully scary.

My favorite, however, is the asteroid impact. Most. Destruction. Ever.

If I Ran the Oscars

Feb 22, 2009

If I ran the Academy Award ceremony:

  • The host would be a news reporter, chosen specifically for their inability to make lame jokes.
  • Said host would read the award category, the nominations, and the winner, without any pauses or cuts to montages of said nominations.
  • Award presentations that no one cares about (best sound editing, best art direction, best makeup, etc) wouldn't be televised.
  • Award winners would receive their award on a side stage with no podium or microphone, thereby removing their ability to give an acceptance speech.
  • The entire award ceremony would be 30 minutes long.
  • Nielsen ratings for the event would be at an all time high.

Hold your applause, please.

A PHP Include Pitfall

Feb 22, 2009

I ran into an interesting problem with the PHP include mechanism last night (specifically, with the require_once variant, but this discussion applies to all of the include-style functions). Suppose I have the following folder structure in my web application:

myapp/
 |-- includes.php
 +-- admin/
      |-- admin_includes.php
      +-- ajax/
           +-- my_ajax.php

Let's take a look at the individual PHP files in reverse order. These examples are bare bones, but will illustrate the problem. First, my_ajax.php:

<?php
// my_ajax.php
require_once("../admin_includes.php");

some_generic_function();
?>

Here's the code for admin_includes.php:

<?php
// admin_includes.php
require_once("../includes.php");
?>

And finally, includes.php:

<?php
// includes.php
function some_generic_function()
{
    // Do something here
}
?>

When I go to access the my_ajax.php file, I'll get a "no such file or directory" PHP error. This doesn't immediately make much sense, but a quick glance at the PHP manual clears things up:

Files for including are first looked for in each include_path entry relative to the current working directory, and then in the directory of the current script. If the file name begins with ./ or ../, it is looked for only in the current working directory.

The important part is in that last sentence: if your include or require statement starts with a ./ or ../, PHP will only look in the current working directory. So, in our example above, our working directory when accessing the AJAX script is "/myapp/admin/ajax." The require_once within the admin_includes.php file will therefore fail, since there's no '../includes.php' relative to the current working directory.

This is surprising behavior and should be kept in mind when chaining includes. A simple workaround is to anchor each include to the directory of the file that contains it (note the leading slash in the string, since dirname() returns the path without a trailing one):

require_once(dirname(__FILE__) . "/../../some/relative/path.php");

It's not the most elegant solution in the world, but it gets around this PHP annoyance.

TF2 Scout Update

Feb 21, 2009

It looks like I'll have a reason to get back into Team Fortress 2 next week: the official Scout update is nearly here! So far, Valve has released information on the following:

There are still two days of updates left to be unveiled. One of them, if I recall correctly, is a new payload map, and the other is undoubtedly the new primary unlockable weapon (replacing the scatter gun). Very exciting!

Watchmen Review

Feb 16, 2009

Reading Watchmen is, for me, akin to looking at the Mona Lisa. In my heart of hearts, I know it's a masterpiece, but I just don't like it. My main problem with Watchmen, and a problem I'm increasingly having with LOST (which I'm trying to catch up on), is that there's no hope for the characters. I have absolutely no reason to root for the characters in Watchmen; they're the saddest group of people in the world. The story is overly complex, the pacing is erratic, and the tone is way too preachy for my liking.

I know lots of folks out there adore this story, but I say 'skip it.'

Female Operator

Before I get to the actual point of this post, allow me to rant just a little. What's up with the increasing number, and more importantly the duration, of public radio/television pledge drives? Our local public television station, UNC-TV, will be starting their Festival drive in February, and it will last for more than a month (February 21 to March 29)! If this kind of thing happened just once a year, I wouldn't care so much. However, two months ago, the station had its Winterfest drive (November 30 to December 14). Occasionally, they'll even have a drive in August! Public television clearly needs commercials. I would suggest having commercials between the television shows they offer, so as to keep the 'commercial-free' feel of today. Just my 2 cents.

Back to the real topic. Driving home yesterday, I listened to a little bit of our local public radio station. They are currently in the midst of their pledge drive, so programming is light and begging for pledges is heavy. In the midst of their asking for donations, you often hear the sound of telephones in the background. And I'm talking old school telephones. Let's take a quick walk down memory lane and have a history lesson.

Back before the digital revolution, telephones had bells in them. Yes, physical bells. When someone called you, a small hammer oscillated between two of these bells, causing the telephone to 'ring' (hence the term 'ringing' someone). I haven't seen one of these telephones in probably 20 years or more. Yet, during these public entertainment pledge drives, you hear them ringing constantly.

The funniest circumstance of this is found during the public television pledge drive. Volunteers can be seen in the background sitting at computers with their operator-style headsets. No telephones can be seen during this time. And, occasionally, none of the operators are talking. Yet the ringing goes on. So where are those ringing sounds coming from? Are the computers synthesizing the sound? Or is it a gimmick being pulled from the control booth?

I like to think it's the latter. On my way home yesterday, while listening to the radio, I got thinking about this phenomenon. There must be a point at which this ringing trickery yields the greatest ROI, right? And someone must have figured this out. I'm no statistician, and I'm no psychologist, so the following logic is simply me thinking aloud. If the 'phones' were constantly ringing off the hook, with no breaks in between, it seems to me that listeners would be less likely to call in and pledge (why pledge, when everyone else is doing it for me?). Likewise, if the phones were too silent, listeners again might be less inclined to call (silence won't prompt the listener into action). So the answer certainly lies somewhere in between. I'm guessing that, if the ringing is indeed a trick, the frequency of said ringing is somewhere on the lower end of the spectrum. As a radio station, you want to sound needy, but not too needy. Others are supporting us; why won't you?

I'd love to know where the middle ground really is. Maybe an influential politician will happen upon this post and decide to funnel some of our country's economic stimulus package into a research program on this topic. Our nation's public media outlets might depend on it. ;-)

Time Warner Cable recently announced that it will be bringing bandwidth caps to more cities, after apparent success in their trial area of Beaumont, Texas. The upper bound on the cap is 40 GB, considerably lower than the 250 GB cap used by Comcast. Go over that amount, and TWC will charge you extra overage fees. I'm completely against this. If caps come to our area, I will seriously consider ditching TWC for some other means of internet access (perhaps the recently mentioned WISP network). Note to Verizon: start rolling out your FiOS service to the Triangle area; I will happily subscribe!

Anyways, while chatting with my dad about these caps, we got wondering about what our bandwidth usage rates really are. I recalled that my router (the oh-so-wonderful Linksys WRT54GL), which I flashed with the open-source DD-WRT firmware, supports bandwidth monitoring (beginning in v24). Happily, I flashed v24-SP1 right before I moved into my new house, so the data has been collecting ever since that time. There are some very interesting trends in the graphs, so let's take a look at them:

I moved in on September 30, so October of last year is the first month I had data for. As you can see, I transferred just under 31 GB for the entire month. Though it's not as apparent on this graph, the peak daily value was 4204 MB. One interesting trend in this graph is that you can see how busy I was with unpacking my stuff at the beginning of the month. As the month went on, I was online more and more.

The graph for November is a little misleading. Note that the units on the y-axis are on a different scale. This is thanks to the largest daily transfer (a clear outlier): 7223 MB on November 15. I purchased a game on Steam that day, which accounts for the majority of that bandwidth. Overall, this month was pretty light, though you can see that I was home at the end of the month (for the Thanksgiving holiday). November's bandwidth total was just over 30 GB.

I was home for over half of December, thanks to all the vacation I failed to take throughout 2008. As such, my daily bandwidth average was much higher, with a monthly total of 34.6 GB. I was out of town from the 27th to the 31st, which explains the lull in that period. The largest daily total was 4780 MB, on December 15.

January's graph is very interesting. Again, take note that the y-axis values are different. Can you identify the day I received my Roku player? It's pretty clear that January 17th is the beginning of a new bandwidth trend. I've been watching a bunch of stuff on Netflix Watch Instantly, which accounts for the daily spike in activity. The monthly total for January was 31.6 GB, with the largest daily peak of 2428 MB occurring on January 31.

All in all, these graphs are pretty interesting to analyze. Watching my bandwidth usage over the next few months should be an educational experience. Surprisingly, I have yet to break the theoretical 40 GB limit. However, I have not yet had my Roku player for a whole calendar month. After February has come to a close, I will post an update on my bandwidth usage. That should give me a better clue as to what my 'real' bandwidth totals will be going forward, seeing as I watch Netflix content more than I watch broadcast TV.

Do you track your bandwidth? If so, share your findings!

It appears that Microsoft is quietly slipping in a Firefox extension with updates to the .NET framework. The extension is named "Microsoft .NET Framework Assistant" and, based on the description, "Adds ClickOnce support and the ability to report installed .NET versions to the web server." According to reports, this extension:

  • Cannot be uninstalled through Firefox
  • Changes the Firefox user-agent string
  • Does God-knows-what else

Happily, people have figured out how to uninstall the extension. This move seems pretty dirty to me, but Microsoft has been pointed in this direction for some time now. If you find yourself 'infected' with this piece of malware, do yourself a favor and remove it.

Eye on Springfield

Feb 2, 2009

A week or two ago, I was introduced to the Eye On Springfield blog. For a Simpsons nut like myself, the site is pure enjoyment. According to the site's official description:

Eye On Springfield is a retrospective of Simpsons hilarity spanning from seasons 1 to 9, when it was still funny.

I would disagree with the "still funny" bit; there are plenty of classic episodes in seasons 10 through about 18, though the most recent seasons have definitely fallen off. Site posts range from scenes in an episode, sometimes with an accompanying quote, to sound clips. It's a great site, and I highly recommend it for a laugh.

I'm also glad to see that I'm not the only one who, in the words of Patrick Cassels, has:

that annoying habit of beginning half my sentences with, "Remember that Simpsons episode where..."

This meat sauce recipe makes for an excellent spaghetti sauce, sloppy joe filling, or "meat pie" filling.

  • 1 pound ground turkey or beef
  • 1 small onion
  • 1 small bell pepper (optional)
  • 1 can tomato soup
  • 1 or 2 tsp sugar
  • 1 tsp basil
  • 1 tsp oregano
  • 1 tsp dried parsley
  • 1 tsp Italian seasoning
  • 1/8 tsp garlic powder
  • 1 Tbsp olive oil
  • 1 Tbsp Worcestershire sauce
  • 1/2 tsp salt
  • Pepper
  • Ketchup
  • Grated Parmesan cheese

Mince both the onion and bell pepper. In a skillet, combine the olive oil, minced onion, bell pepper, and meat, and brown over medium-high heat. Once the meat is fully browned, reduce the heat to simmer. Add the can of tomato soup and, using the same can, 1/2 a can of water (don't add the water if you want a thicker sauce). Add the Worcestershire sauce, sugar, and a good squeeze (or dollop) of ketchup. Mix the ingredients together well. Now add the spices: basil, oregano, dried parsley, Italian seasoning, garlic powder, about 1/2 tsp salt, and ground pepper. Shake Parmesan cheese over the top of the sauce, and mix everything together well. Taste the sauce; more salt and pepper may be desired.

RIP Train Sim 2

Jan 28, 2009

As a part of its recent layoffs, Microsoft closed Aces Studio, the team behind Microsoft's stalwart Flight Simulator line of games, and the upcoming Train Simulator 2. According to the report, Train Simulator 2 is officially dead, and will not be revived. The Flight Simulator counterpart may be brought back at some point, but even its fate is undetermined at the moment.

I know I'm one of the only people on the planet who cares about it, but I was looking forward to Train Sim 2. The graphics looked great, and I was really looking forward to the migration to the Flight-Sim game engine. But, alas, it's not to be.

Onward and upward, I suppose.

The Ultimate Apple Ad

Jan 23, 2009

Twenty-five years ago today, the oh-so-epic 1984 ad from Apple debuted during the Super Bowl. This ad is as powerful today as it was back then. If there's one thing Apple can certainly do well, it's marketing. They have perfected the art of making their products cooler than the rest, something lots of other companies would love to learn how to do. If Microsoft had learned how to market as well as Apple, perhaps there would be no Apple at all. But alas, that was not to be (and we're all the better for it).

Here's to one of the best advertisements in the history of advertising!

Avatar Finale

Jan 21, 2009

I just completed the final season of Avatar: The Last Airbender, and all I can say is wow. This show is, without any doubt in my mind, one of the top 5 television shows I've ever seen. It ranks up there with The Simpsons and Pushing Daisies, in my opinion, and that's saying something (seeing as how big a Simpsons nut I am).

The series finale takes place across four episodes, each more epic than the one before it (and the episodes leading up to this finale are just as good). There are some excellent surprises throughout each episode, with the vast majority of the lingering mysteries finally being resolved. A surprising twist occurs in the finale's climax, and it's safe to say I was thoroughly surprised. The ending was fulfilling beyond anything I imagined.

If you haven't checked out this series, I cannot recommend it highly enough. I will definitely be purchasing the entire series on DVD (Season 1, Season 2, and Season 3).

There are only two things that could have made the series better. First, I was disappointed that we didn't learn more about the air bender society. If Aang is the only one left, what becomes of their heritage? Has this nation therefore truly become extinct? Second, there were still a few strings left dangling at the series end. I won't list them here so as to avoid spoilers, but I'd love to know what becomes of these threads. Another episode (or two ... or three) would have been really appreciated to see where everyone ends up. Perhaps the creators of the show will come back in the future to wrap up these threads.

All in all, I had an enjoyable time with this series. Even though I'm a little sad to say goodbye to it, I'm oh-so-glad I took the journey. I now have other shows to catch up on, both old and new, and that's what I'm off to do next. So long, Aang and company. It's been a wonderful ride.

Circuit City Closing

Jan 16, 2009

It seems that Circuit City is closing for good. Deep down, does anyone care? Part of me does, and part of me doesn't. I've been to Circuit City probably twice in my life; both times were to pick up games that Best Buy didn't have at the time. Since that time, I've come to hate stores like that (hence the part of me that doesn't care). At the same time, with Circuit City exiting the market, the market for certain things becomes that much smaller. That bothers me, especially since the competition is so lame. What do you think? Will you miss Circuit City?