Friday, November 07, 2008

Yes, You Can Dump MS Outlook

When you buy a new Windows system, you will likely have Outlook Express preinstalled (I don't know anyone who uses it these days). Worse, you may feel inclined to shell out $85 for MS Office Standard 2007 (which includes MS Outlook) because the Windows office productivity space is dominated by Microsoft products. If that's all your workplace supports (because they use Exchange servers), you have no choice. But what if you could dump all that for a free email, calendar, and task management solution? It's easy and attainable.

For email, install Mozilla Thunderbird. It does everything MS Outlook does as far as sending and receiving email, and I think it's a little smarter about organizing your mail, searching, and customizing. One caution: resist the temptation to import settings from other mail clients - it gets messy within Thunderbird.
You need the Lightning plugin to add task and calendaring features: download and save the .xpi, then from TB, install it using Tools | Add-ons. After TB restarts, voilà! You have a calendar.

Since I routinely work on more than 3 computers, I need them all to have my updated schedule at all times. For this, online calendars provide the needed synchronization link. If you use Google Calendar, you can set up Lightning to synchronize with it from Thunderbird, like this:
Download yet another plugin - the Provider for Google Calendar. After installing it as an add-on for Thunderbird and restarting:
  1. Open your Google Calendar from a web browser
  2. Click on Settings, then on the Calendars tab, click on your calendar.
  3. Scroll to the Private Address area (bottom) and right-click the XML link to copy its URL.
  4. In Thunderbird, select File | New | Calendar : On the network : Google Calendar (and paste the URL here) : provide credentials.
That's all. Since all your calendars now feed off the same online calendar, your computers always have an updated copy whenever you run Thunderbird. This is a totally winning solution that's free and effective. At this point I was ready to back up my messages in Outlook and uninstall it from all my computers. I haven't seen any issues in the couple of months since I did this.

Sunday, November 02, 2008

Launch OS Applications From Java

It's not a good idea to reinvent the wheel when programming for various platforms; you'll live happier when you can just use what the operating system already offers - straight from your Java application. This is especially helpful if the OS program you are interested in supports command-line invocation: this tip shows how to run those programs and capture the output they would produce back into your Java application. Then you can use regular expressions or other means to parse the output.
Here's all you need:
    public static List<String> runCommand(String... cmd) {
        List<String> output = new ArrayList<String>();
        try {
            // Create the process
            ProcessBuilder pb = new ProcessBuilder(cmd);
            pb.redirectErrorStream(true); // capture stderr along with stdout
            Process proc = pb.start();

            // Obtain the input stream
            InputStream is = proc.getInputStream();
            InputStreamReader isr = new InputStreamReader(is);
            BufferedReader br = new BufferedReader(isr);

            // Read what is returned by the command
            String line;
            while ((line = br.readLine()) != null) {
                output.add(line);
            }

            // Close the stream
            br.close();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }

        // Return the output
        return output;
    }
As a utility method, just pass it an array of strings used to build the command: first the fully-qualified name of the program to invoke, followed by the program arguments. For example, if I needed to run a Perl script and had the Perl interpreter at C:\Perl\perl.exe, the string array would be something like {"c:\\perl\\perl.exe", "c:\\security\\"} - notice the use of fully-qualified names everywhere; this process is cross-platform and doesn't look up environment variables to help resolve relative filenames.
The method returns the output - what you would have seen on the console - as a list, line by line. Easy stuff.
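Here's a minimal usage sketch of the utility above. The class name RunCommandDemo and the echo invocation are mine, for illustration only; on Windows you'd pass fully-qualified .exe paths as described above.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class RunCommandDemo {

    // Condensed copy of the utility method from the post
    public static List<String> runCommand(String... cmd) {
        List<String> output = new ArrayList<String>();
        try {
            ProcessBuilder pb = new ProcessBuilder(cmd);
            pb.redirectErrorStream(true); // fold stderr into stdout
            Process proc = pb.start();
            BufferedReader br = new BufferedReader(
                    new InputStreamReader(proc.getInputStream()));
            String line;
            while ((line = br.readLine()) != null) {
                output.add(line);
            }
            br.close();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
        return output;
    }

    public static void main(String[] args) {
        // "echo" is a stand-in command; on Windows you'd use fully-qualified
        // paths, e.g. {"c:\\perl\\perl.exe", "c:\\scripts\\report.pl"} (hypothetical)
        for (String line : runCommand("echo", "hello from the OS")) {
            System.out.println(line);
        }
    }
}
```

From here, each element of the returned list is one console line, ready for a regex or whatever parsing you prefer.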

Thursday, August 07, 2008

Facebook Browser History Management Bug In Firefox

I had mentioned this issue in a previous blog post, and I think I have identified the root cause. If you use Firefox to browse Facebook, at some point content will begin to fail to load. This is because of malformed URLs often generated by AJAX history-management libraries when they do URL replacement or add tags to the URL to help them keep track of history (essentially to manage what happens when you click the Back, Forward, or Refresh buttons on the browser). You can do two things about it - either reopen the link in a new tab (so there's no history), or edit the URL as I'll show you.

To reproduce:
(1) Fire up the fox and log into Facebook. Your URL should look something like this:

(2) Click on the "Messages" link. It should open your messaging area, defaulting into your inbox. The URL should look like this:

(3) Click on the "Home" link. It should bring you to your Facebook home page. The URL should look as in the image below. Notice how the URL looks a bit weird - and that's the beginning of the problem. The more you browse around, the more URL chunks are appended at the end of the current URL.
(4) Click on the "Profile" link. The URL looks like that in the image below. This will fail to load content most times, as I described in that blog post.

If you want that URL to work, remove the part between the domain and (usually) the last #, and it will load.

This is only one of many ways Firefox users will run into this issue. Almost any combination of browsing history will eventually lead to the same result, and it's not pretty. Facebook totally needs to fix this bug, or they risk frustrating Firefox users, who will think the site sucks when their favorite browser is used.

Tuesday, July 15, 2008

Facebook Not Displaying Content In Firefox

If you live on Facebook - like I do - and use the Firefox browser, you will soon run into an issue that goes something like this: after browsing around on Facebook for a while, clicking links no longer displays the desired content. The browser may have a 'busy' mouse cursor, but most times, it'll report 'Done' in the status bar.
To make sure I'm not going nutz, I did a few sanity tests:
(1) If I open the same link in a new tab (right-click on link -> open in new tab), all is great - the content displays just fine.
(2) If I close and reopen Firefox, then reenter the link, all is well.
(3) I have seen this on more than one computer - at work, home, on friends' machines, and elsewhere.
(4) It doesn't matter what plugins - or add-ons - you have installed. A bare FF install behaved the same way.
(5) It's a Facebook-only issue. You can immediately use the same tab for other websites just fine.
(6) There are no errors generated - whether they be scripting or browser exceptions.
(7) I haven't seen this issue in IE.
(8) If you clear the active cache (both memory and on disk) for the browser, it clears the problem.

I suspect a few things:
(1) AJAX history management, since it tends to happen after a lot of browsing on Facebook. Could it be cache/memory management in the browser?
(2) Timeouts while switching the various DIVs. I did capture a trace of traffic generated when a link is clicked, and all asynchronous requests are returning successfully, as far as I can see. The next thing would be to hide/unhide divs loaded with content - which doesn't seem to be happening in this issue.
(3) Browser's Javascript engine or DOM engine - something might be amiss. Haven't investigated.

Maybe someone else out there has seen this inconvenience and knows how to fix it - if it totally isn't a Facebook issue. I've tried FF 2.x.x.15 and FF3. Or maybe Facebook will investigate.

Friday, July 04, 2008

Trends In Web Development

Any interactive website you visit on the Internet these days has web applications running behind it. The applications do all the work and most content is generated on the fly - as opposed to the old model where static pages were served from a web server somewhere. Gone are the days when plain HTML was sufficient to give you a web presence - users demand (and deserve) a highly dynamic web experience now, complete with multimedia and personalized browsing.

How you make this happen is beyond basic HTML skills; it calls for pooling graphic design, database, object-oriented programming, and scripting skills to get even the simplest web applications up and running. I just finished a small AJAX webapp at work, but it used a variety of skills including HTML, Javascript, XML, DWR (or JSON-oriented client/server transport mechanisms), Java (with servlets, JSPs, and JavaBeans), SQL (database) and data access methodologies (DAOs, connection pooling, keep-alive), Perl (for the user scripting aspect), C/C++ (for access to OS and native libraries using JNI), and security (session and identity management). For such a small project, that sounds like a lot of skills to pour in - but this is the trend in the web development world. No longer can you survive on one particular skill; your technical worth is directly tied to how diverse and related your skill set is.

There's also an increased shift to developing web applications using regular software development cycles and models. Websites are now versioned in content management and versioning systems just like your typical enterprise product. Build cycles exist, as well as integration cycles and the like.

Web development tools also continue to take prominence. No one writes code from scratch these days - that's under-productive, like reinventing the wheel. Instead, serious developers maintain templates or use rules-based code generation engines. I particularly like tools that can take all my UML junk and produce code from it - all the classes and stubs I will need. So much time savings there.

In mentioning tools, you've got to consider IDEs and plugins. It is no longer sufficient to know how to use only one IDE because projects come in all shapes and sizes. I routinely use NetBeans and Eclipse on the same project, for example, because of the various benefits each offers (especially in the area of add-ons). Mind you, each skill you will need to use may have its own IDE, so you've got to stay on top of it all.

Building the application itself isn't sufficient - it has to be tested for performance, usability, security, and stress. No matter how good a developer you are, there will always be bugs in written code. I've stuck with Java for a long time because of JUnit and other testing frameworks. NetBeans has a profiler, and Eclipse has code optimization for various platforms. I often write a test suite that leverages Javascript's power for each web application I develop. Suites like this are good for regression testing whenever changes are made to the site.

Here's where experience does some good: on a few projects, we've been able to write large chunks of webapps as Java code only and have CSS, HTML, and Javascript generated on the fly - using rules. This is becoming a huge hit because it eliminates huge code bases: essentially the application server "knows" how to produce content at the point of request. It also greatly reduces coding errors and aids maintenance. Such systems are highly scalable because you can retask them, for example, by simply changing the rules. They are harder to design, though.

Something to always remember is that web code is no longer proprietary. It is so easily reverse-engineered (look, functionality, etc. - without legal consequences) that there's no point in trying to protect Javascript and other client-oriented features. I laugh at people who try to prevent downloading images, for example, by disabling browser menus. Do they forget that if you can see it on your computer, it IS on your computer? Waste of time. If you have to protect something, leave it at the server.

Finally, it's a can of worms when you consider personalized browsing. Everything from targeted ads, to personalized look & feel, to presentation of related content and session management fall under this umbrella. It's a sort of social engineering done in software, once again eased by rules-based computing. Beyond that, reporting strategies, tracking, data warehousing, and all that crap I don't do well - the web application isn't complete until most of this stuff is in place. Web development is no longer a piece of cake.

So those are just a few observations from my latest project. I'm actually relieved it's over - it was beginning to drain the life out of me.

Wednesday, May 28, 2008

NetBeans 6.1 EA for PHP

Nice work by the people at Sun - there really is a NetBeans specifically for PHP. While I complained about missing support for a PHP plugin in 6.1 that was there in 6.0, one of my blog readers alerted me to this early availability release. If you are used to NetBeans coming loaded with application/web servers, though, this version doesn't include any. But you can always install WAMP, which provides the database, web server, and PHP engine for you.

To get NetBeans working with WAMP, the steps couldn't be easier:
  1. Create a new project in NB.

  2. Set where files should be copied to when you run the project.

  3. Make sure the Copy To Folder path points to a web-accessible directory in WAMP, usually the www directory. That way, when you "run the project" in NB, files are copied directly into the web server.
  4. Start the WAMP stack, and run the project. It should be accessible at http://localhost/TestApp (substitute your project name).
That's all there is. I like that NetBeans maintains the simplicity of developing PHP in this IDE. It's lightweight and poised to become a great development environment for all PHP scripting. I honestly had some trouble finding a [free] IDE for PHP development that I liked, with no strings attached.


Thursday, May 22, 2008

NetBeans Pulls PHP Plugin

In April, I mentioned that you could develop PHP in NetBeans 6.1. But as of this writing, the required plugin has been quietly removed from the NetBeans update/plugin centers. The search I describe in the article will only yield a jMaki plugin. This is disappointing because PHP developers would have liked using the scripting plugin for its pure orientation to PHP, unlike jMaki, which is a Java-oriented feature.
The plugin is not available for 6.1, but may be found for 6.0. Here's one reason to keep an older IDE around!

Monday, May 19, 2008

"War Dance" Of Ugandan Children

I spoke at an elementary school last week, and one of the topics that interested the kids most was what kind of life kids their age have in Africa. Somehow the discussion came around to the plight of tens of thousands of kids in northern Uganda that have lost their innocence at such tender ages. It's hard to imagine any tough situation until you experience it somehow.

In northern Uganda, a war has raged for a couple of decades. It started along tribal lines, but has since escalated into a rebel movement that has rendered the region unstable. Somehow the Ugandan government has not been able to completely weed the rebels out - hard as it has tried. Whenever government soldiers are present, the region is quiet. But with the limited military budget most developing countries have, it's not feasible to keep soldiers deployed, so the rebels have learned to act only when the military withdraws. Adding insult to injury, this particular rebel group has sanctuary and supply bases in southern Sudan, out of reach of the Ugandan military. If you know what's going on in Darfur, you might understand why Khartoum hasn't been willing to help squash the uprising.

In this war, villagers are rounded up, kids are abducted and pressed into rebel ranks, and human trafficking and other crimes against humanity abound. The strangest thing is that the rebels are by and large attacking their own people - one of the reasons the conflict has remained focused in the north. The story of ceasefires and reconciliation and amnesty and peace talks is as untrustworthy as a morning sunset.

But out of the misery comes a small film called "War Dance" (review) that chronicles the lives of 3 youngsters. The feature does a good job shedding some light on the personal lives of people affected by the conflict, hardly anything glamorous. You might see the scars of conflict all over their faces, but notice glimmers of hope in their human spirit. I like how music is the last thing about a populace to die when everything else crumbles; in it, the kids are hopeful. It's what kept the people's spirit aflame in South Africa during the days of apartheid, for example, or that energized slaves on cotton farms in the south. There's something about music that breeds resiliency - and you will see it in this film, along with deeply authentic moments of experience our comfortable lives in America can't afford us.
[It is produced by Shine Global. (If you have Netflix, it is available for viewing online already). Production note: some scenes are staged, so if you are a purist for documentaries, relax the rules. But it's nicely done] ...

Finally, you might have heard about efforts by organizations such as Invisible Children and World Vision - their hard work in the region to tend to the plight of these young children, most of whom have known life only as orphans, refugees, or child soldiers. Please support them as much as you can - they are doing a great job in the region, helping restore a sense of normalcy. In all my travels, the Acholi land of northern Uganda has been one of my most desolate destinations - quite depressing and hopeless. I have no idea how life flourishes in such a place - with rumors of war and abduction plaguing villages. It's a desperate situation. It would be a shame for the world to ignore it any longer.

Thursday, May 15, 2008

Into Linux: Installation

As I explore migrating all my systems to Linux, I've decided to use 3 choices for my evaluation: Red Hat Enterprise Linux 5, Ubuntu 8.04 Server, and OpenSUSE 10.3. I'll be scoring them as I go along, and will ultimately stick with one that scores highest in my experience. My setup will be a system with 3 hard drives. The intent is to install all 3 OSes on this system, multi-booted.

Download experience: If you hope to just download RedHat's Linux from their site, you'll be frustrated to learn you need a subscription to get a basic ISO image. You'll also be tossed back and forth from the Fedora site to their JBoss stack site. This nudging is annoying - for an OS I believe is basically free. It's much more straightforward with Ubuntu and OpenSUSE - download links give you what you need. [RH=0, S=1, U=1].

Install media: RHEL5 comes on 6 CDs, although you may need only 2 or 3 for a basic install, depending on options selected. Ubuntu is 1 CD as is SUSE. The less the media needed for a reasonable install, the better. [RH=0, S=1, U=1].

Install experience: I felt a bit skeptical working with Ubuntu - it reminded me of text-mode installs on Windows NT 3.11. It does a good job detecting hardware, though, and installs faster than the other 2. RedHat takes the longest - up to 40 minutes. I felt most confident about SUSE and had an install in 25 minutes. Another dimmer for Ubuntu - it failed to detect an existing RHEL5 install and configure GRUB appropriately. When I pulled the drive with RHEL5, Ubuntu failed to boot. Its rescue mode saved the day nevertheless, but it scores low for this small issue. I even managed to get the famous "Error 22" after installing Ubuntu. The smoothest and most intelligent install is SUSE's, although I also liked how easy and straightforward the RedHat install went. [RH=1, S=1, U=0].

At this point, OpenSUSE is looking very likeable. Ubuntu follows a close second, after which RHEL5 is just painful. I'm scrapping the setup and reinstalling all 3 in this order: Ubuntu -> Redhat -> Suse. I thought of using the recently released Fedora 9, but that's 6 *.iso images I'd have to download and burn to CD, and as long as it's some Redhat clone, I can live with RHEL5 for this experiment.
Final score: OpenSUSE=3, Ubuntu=2, RedHat Enterprise Linux=1.

Friday, May 09, 2008

Plunge Into Linux ...

I've finally decided to switch most of my server operations to Linux. I happen to have a couple of old Pentium-II machines lying around, and although they won't run Windows (properly), I know they can run Linux without a problem. To prepare for this plunge, I have a collection of general Linux ebooks - including certification materials for the Red Hat Certified Engineer (RHCE) certification. I probably won't be taking the certification itself ($800 a pop?), but at the end of this exercise, I will have the same knowledge as those guys.
It was surprisingly hard to find a free version of RHEL 5. I know Linux itself is free, but all I kept finding on their site were subscription-based downloads. Forums everywhere suggest it should be free, but for hours I couldn't find a free download of the OS. I really just need a basic server build to do this "certification" on - my way of getting fully into the world of Linux.
The other motivation for interest in Linux at this time is the impending dumping of Windows XP support in 2009. If you didn't know, Microsoft will stop supporting and selling Windows XP next year - a setup that forces people to move to Vista. I don't want Vista (I've heard much about its headaches), so I am hoping that by then I'll have enough Linux proficiency to migrate all my crap to Linux when the need arises. I've heard a lot about Ubuntu being poised to penetrate the desktop market, so when I buy my next laptop (which will come with Vista - obviously), I want to be able to move to Linux quickly.
So much has changed in the world of Linux since I last worked on it. I probably should purge it from my resume until I am done with this exercise. I'm doing this in tandem with other projects I already had going - in fact, some might benefit from a migration to Linux servers.

Thursday, April 17, 2008

System Disk And Drive Letter In Perl

There's no easy way to discover the system disk and drive letter on Windows, so I came up with this tedious work-around. Basically, you use Windows Scripting Host (WSH) to write a file that Perl can then read. Someone may wonder why I'm not using Win32:: packages - well, they don't implement the whole CIMv2 spec, so you'll begin running into errors such as Win32::OLE(0.1707) error 0x80020009: "Exception occurred", smartly described as a "Generic Error". It's a nightmare to dig through MSDN to figure out what the heck an ambiguous error like this means.

So the simple workaround: write a VBScript file that contains this code:

Set objWMIService = GetObject("winmgmts:\\.\root\cimv2")
Set colDiskDrives = objWMIService.ExecQuery("SELECT * FROM Win32_DiskDrive")
For Each objDrive In colDiskDrives
    strDeviceID = Replace(objDrive.DeviceID, "\", "\\")
    Set colPartitions = objWMIService.ExecQuery _
        ("ASSOCIATORS OF {Win32_DiskDrive.DeviceID=""" & _
        strDeviceID & """} WHERE AssocClass = " & _
        "Win32_DiskDriveToDiskPartition")
    For Each objPartition In colPartitions
        If objPartition.BootPartition = True Then
            bootinfo = objPartition.DiskIndex
            Set colLogicalDisks = objWMIService.ExecQuery _
                ("ASSOCIATORS OF {Win32_DiskPartition.DeviceID=""" & _
                objPartition.DeviceID & """} WHERE AssocClass = " & _
                "Win32_LogicalDiskToPartition")
            For Each objLogicalDisk In colLogicalDisks
                bootinfo = bootinfo & " " & objLogicalDisk.DeviceID
            Next

            ' Write "[disk_number] [drive_letter]" where Perl can read it
            strOutputFile = "./theboot.txt"
            Set objFileSystem = CreateObject("Scripting.FileSystemObject")
            Set objOutputFile = objFileSystem.CreateTextFile(strOutputFile, TRUE)
            objOutputFile.WriteLine bootinfo
            objOutputFile.Close
            Set objFileSystem = Nothing
        End If
    Next
Next

There are a zillion ways to write this code to the filesystem in Perl; I leave that up to you. Execute the file (with a .vbs extension) from the system shell (DOS), and have Perl read the file theboot.txt (or whatever you call your file) - parse it for the physical disk number (the partition's DiskIndex in the script) and the drive letter (given by Win32_LogicalDisk.DeviceID). Associating the two WMI classes is weird, as the code above shows, but it works. In the code, the file is written in the format [disk_number] [drive_letter].

Note that this workaround loops over all drives into one output file, so if you have multiple boot disks, only the last one found will end up discovered.
Also, don't rely on DISKPART for this job, especially if you are scripting things. It works well interactively, but I've found it unreliable on systems with large drive counts - in such cases, the disk number won't always be the same.

This workaround essentially saved my day: reliance on diskpart had cost me a couple of lost OSes as it sometimes reported that there was no system partition or could not find drives whatsoever, especially if you had a lot of drives. The WMI/CIM solution seems bullet-proof - for now.

Saturday, April 12, 2008

Developing PHP On NetBeans

I'm finally making the move to developing in PHP - or at least I must for the time being because of an important project I am working on. To make things easy on myself, I need a familiar development environment - NetBeans 6.x is just perfect for that. Here's how you set up NetBeans to talk PHP.
  1. Ensure you don't have previous installations of PHP. Presence of such will confuse the hell out of your new installations.
  2. Download the Wamp server. This package contains an Apache HTTP web server, MySql database, and the PHP engine, along with some admin utilities and a Wamp server console. Believe me, this stack takes the nightmare out of configuring PHP on Windows.
  3. Download the PHP plugin for NetBeans: Tools | Plugins -> Available Plugins tab -> Search [php]. Install the PHP scripting plugin, which includes an editor, runtime, and documentation. We probably won't use the runtime that comes with this plugin, as we need to use Wamp and other debug tools that it does not support. NB will need to restart after this install.
After NB returns, test the setup by creating a simple PHP project:
  1. File | New Project ... -> PHP | PHP Project [Next] -> (set project properties) [Next].
  2. Web server configuration: click [Manage] -> Connection name = Wamp2, Local web server with file access [Next].
  3. Manual configuration [Browse] to %where_you_installed_wamp%\bin\apache\apache2.2.8\conf\httpd.conf for the Apache config file. [Next]
  4. Http server settings - just change the port number to something no other app is using. I chose 81 on localhost (remember to update httpd.conf to listen on port 81 as well).
  5. Finish up and close the dialogs. NB creates a new project and opens index.php for you to edit. You can start with basic HTML in a .php file. If you see a good page when you run the project, all is well.
Note that you won't need to configure the web server every time you create subsequent projects. Also, I'll leave debugging and other configurations to future posts.

Friday, March 28, 2008

Lots Of Free PC Utilities

I happen to also be a PC Service Technician (aside from everything else I do for work), and I survive on having the right utilities to service your computer whenever you call on me. PC Magazine is running an article that comprehensively reviews over 90 utilities you can use to tweak your Windows PC. I spent all evening pretty much checking out all the free ones (not evaluation copies) I didn't already have, and I'm quite impressed. Most are small downloads that install cleanly (no need to reboot) and are unobtrusive in your daily computing tasks.

Missing from the list are some on my wish list:
  1. A tool to convert movies from DVD, AVI or YouTube to iPod format easily. My current setup takes all night to rip a DVD - but it works.
  2. Networking tools such as one that can load-balance network bandwidth between multiple NICs on a system. Theoretically, if I use a wireless card and regular ethernet, I should have double the bandwidth, right?
  3. Program backup - a tool that can backup individual programs. Say, if I want to backup my MS Office, this tool should know which files, registry keys, DLLs, etc that are needed so that you can make programs more portable. Something that goes beyond System Restore and regular file-based backup/restore.
  4. A daemon to lock up your registry and startup folders (such as Adaware's Adwatch used to do). I hate having to manually clean up these areas once in a while.
  5. Optimization tools - anything that'll make your machine run faster.
  6. CD/DVD copyright restrictions cleanup. After a while, DVD players refuse to play encrypted CDs and DVDs because someone thoughtlessly decided that protection by region should be tracked.

Nonetheless, PCMagazine's list is a good start for someone wanting to milk more from their Windows system. There's just so much stuff out there that it couldn't all be covered in one article.
My personal favorites from that list:
  • IrfanView = See any image file and most digital videos, convert files to other formats and do some quick editing and annotation.
  • Microsoft PowerToys = Useful tools that don't come with Windows, especially SyncToy, ClearType Tuner, Open Command Window Here, Alt-Tab Replacement, Power Calculator, Virtual Desktop Manager, Taskbar Magnifier, and Webcam Timershot.
  • FreeCommander = Much better than Windows Explorer in controlling your files.
  • QTTabBar = Advanced, tabbed interface.
  • 7-Zip = No more expiration popups from WinZip and Winrar, and it does better zipping up stuff in a variety of formats.
  • TrueCrypt = For those files for your eyes only. I don't think Administrators can get past this. Good for USB thumb drives.
  • TeraCopy = If you are tired of single-file copying or having to start over when something goes wrong.
  • FileZilla = New favorite FTP client. Supports SSL too.
  • µTorrent = For the file sharing nerd in you, torrents and many P2p networks are your friends.
  • WinDirStat = Visual statistics of what's on your hard drives.
  • Xinorbis = Even better file statistics of what's on your hard drives.
  • System Information for Windows (SIW) = Their tagline of "everything you want to know about your PC" isn't wrong.
  • Absolute Uninstaller = If you ever want to uninstall multiple programs all at once - instead of one at a time.
  • Undelete Plus = Now you can recover files you didn't mean to delete. Watch for this program's cache, and empty it periodically.
So yea, I'm officially a geek and my computer has never had issues (not).

Saturday, February 23, 2008

Caching Nightmares

If you are a web developer, you will be very familiar with this problem: browsers and web servers cache stuff to help speed up response. The idea is great when things don't change much, but it ain't rosy when pages, scripts, code, and content are changing constantly and quickly.
Most browsers offer a "refresh" button that's supposed to get the latest content. Don't be surprised if that's not always the case. I think I relied too much on that feature and it screwed me over, causing me to hack at my own code before I realized it could be the browser's fault (and it was). I've seen instances where refresh can load an old (cached) script even when a newer version exists. Reducing the cache size to a few MB helped - my first clue. Usually when I suspect this kind of issue, I switch to another browser (IE does cache better) and see if things look the same.
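One common way around the stale-script problem (a sketch of mine, not something this post prescribes) is cache-busting: stamp a version or build id into each script URL, so every deployment produces a URL the browser has never seen and therefore can't serve from its cache. The class and names here are illustrative.

```java
public class CacheBuster {
    // Append a version token so "/js/app.js" becomes "/js/app.js?v=<token>";
    // handles URLs that already carry a query string by using "&" instead of "?".
    public static String versioned(String url, String buildId) {
        String sep = url.contains("?") ? "&" : "?";
        return url + sep + "v=" + buildId;
    }

    public static void main(String[] args) {
        // Hypothetical build id; in practice use a build number or content hash
        System.out.println(versioned("/js/app.js", "20080223"));
    }
}
```

In a JSP you might then emit the script tag as `<script src="<%= CacheBuster.versioned("/js/app.js", buildId) %>">`, where buildId (a hypothetical variable here) changes with each deployment.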
Web servers also cache stuff. The most dangerous are what they save when shutting down so that they can restore when they are back online. I've seen instances where old stuff was used instead of newly deployed code. Tomcat is notorious for this. I disabled this kind of caching a while back (don't recall how I did it, but suddenly it was gone as I tweaked around).
Compilers are the third and final source of caching issues. Most advanced IDEs have a "clean project" feature that helps. In a nutshell, if your files didn't change, some compilers won't recompile them when you build the project - even when dependencies may have changed. The result is a product with mixed code, old and new together. The surest sign of this is new features with old data - what I call "history playback".

Saturday, February 09, 2008

Liquid Templating Language ... Say What?

I pride myself on being a diligent programmer with enough experience to use just about any programming language under the sun - until I came across a small scripting language called Liquid. It's a server-side templating language (developed at Shopify) that some web applications use to generate dynamic content in reporting views. It is touted as a simple language that anyone can use with ease.

I don't really agree with all the hype about it. In terms of flexibility and robustness, this language doesn't come close to what modern scripting languages (both client-side and server-side) offer. To start with, it is poorly documented - an attempt to find full documentation and usage samples leads only to a Google Code page and an SVN repository where you can check out the source yourself. It doesn't seem like many people use it.
Having a small set of constructs might make it easy to learn and use, but it severely limits what the language can do. Consider the simple idea of named variables: it doesn't have them! By extension, there's no concept of code reuse, regular expressions, or arrays/hashes - constructs that are essential for processing text repeatedly. In working with this language, part of me wanted to implement a Javascript layer that would reprocess, at the client, whatever content Liquid produced, just to meet even the simplest requirements such as treating data with certain keywords specially.

The bottom line is this: if search engines know nothing about a language you choose to use in your application, chances are you will be one of a dying generation of that language's users, or the language is proprietary. Programming languages thrive on a huge user base in addition to a rich feature set - Liquid has neither. A time will come when applications written in obscure languages like this will need to be shredded in favor of simpler, fully-featured, modern languages. Consider Pascal and Cobol - they are still in use today because of the huge code base from ages ago, but there's a steady move away from them toward C/C++ because of progress in programming languages. Companies that don't keep up get stuck with antiquated and inefficient code - a long-term cost. Especially for web applications, where change is more rapid, I really don't see this language surviving more than 7 years. The more people find out about it, the more of its disadvantages they will see, and the less appealing it will seem for use in web projects.

Friday, January 25, 2008

OpenGL In Java With JOGL

OpenGL itself is written for C/C++ programmers, and is in fact most efficiently used from those languages. Serious games and animations are written that way. However, you can still do snazzy things with it from Java using the Java OpenGL (JOGL) library and the associated GLU utilities exposed via JNI. An encouraging fact is that JOGL is the result of a joint effort between Sun Microsystems (makers of Java) and SGI (originators of OpenGL) to produce Java bindings and extensions for OpenGL. I also looked at Maya (the top GUI-oriented 3D package out there) and various incarnations of JOGL and Java 3D (native to the JDK), but decided that to learn the most about OpenGL, I should code everything from scratch. I also considered DirectX, but I discourage it because it targets Windows systems and is thus not as portable as OpenGL.

To get started, you need the JOGL binaries (JARs), which you can download from the JOGL project site. Choose the appropriate package for your OS and extract it to a location on your computer. On Windows, copy all the DLLs into the %system32% directory, and the *.jar files into a location your IDE can access, to include in compile and build classpaths.
If you'll be using Eclipse, it claims to support OpenGL out of the box (since Callisto, v3.2), but I couldn't get a simple application going until I added the JOGL libraries to the project. If you use NetBeans, there is an OpenGL plugin available through the Update Center, but I couldn't get that going either. So I did things manually, i.e., copied the DLLs to the appropriate location, then added the JOGL libraries to the project path by hand.

Here's the source code of a template you can use to start OpenGL projects. Included is a class (JoglTester) that tests whether your DLLs load correctly and whether the core library is accessible. It also opens a window (JoglApp) with dots of color across the canvas, demonstrating how a canvas can be initialized for display. It is a good idea to have your own event handler that implements a GL listener (JoglGLEventListener) - most of your code changes will happen in the listener's overridden methods.
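To give a sense of what such a tester does, here is a minimal sketch - the class and method names here are mine, not the actual template code - that checks whether the JOGL classes and native library are reachable from a Java program:

```java
// Hypothetical sketch of a JOGL environment check (names are illustrative,
// not the template's actual code). It probes for the JOGL 1.x core class
// and the native library without crashing if either is missing.
public class JoglTester {
    /** Returns a human-readable report of whether JOGL is reachable. */
    public static String report() {
        StringBuilder sb = new StringBuilder();
        try {
            Class.forName("javax.media.opengl.GL");
            sb.append("JOGL classes found on the classpath.\n");
        } catch (ClassNotFoundException e) {
            sb.append("JOGL classes NOT found - add jogl.jar to the classpath.\n");
        }
        try {
            System.loadLibrary("jogl");
            sb.append("Native JOGL library loaded.\n");
        } catch (UnsatisfiedLinkError e) {
            sb.append("Native JOGL library NOT found - copy the DLLs into a directory on java.library.path.\n");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(report());
    }
}
```

Running this before any OpenGL calls makes "DLL not found" failures obvious up front instead of surfacing as cryptic link errors mid-render.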
This work is based on Gene Davis' article at Java World. He does a good job of maintaining his code samples there (see comments/updates at bottom of article). When you set up your projects, create them as "Java Applications".

Tuesday, January 22, 2008

Delete, Create, Format Drive Partitions ...

At work, I sometimes have the need to create usable operating system partitions quickly on 20 or so hard drives, and it gets absolutely boring deleting existing partitions in Windows Disk Management, creating new partitions, and formatting each one. So I came up with this script that does it all for me.
It's basically a Perl script that relies heavily on existing Windows features: diskpart - a disk management utility that comes with Windows XP/2003 - and Windows Script Host, the default scripting engine that handles the VBScript I use. WSH uses OLE and the WMI framework. This is not the most elegant implementation of the idea, but it absolutely works, so I don't care.
This script attempts to determine which disk is the system disk and excludes it from all operations. It then deletes all partitions on the other hard drives, creates a 4GB partition on each, and formats it with the NTFS file system. It runs in under a minute, depending of course on how many drives you have; I tested it with 23 (the most you can assign drive letters to in Windows). It doesn't work well on Windows 2000 because (1) Windows 2000 doesn't come standard with diskpart.exe, and (2) its WSH is ancient - on Windows 2000, it fails to format the partitions it creates.
Needless to say, don't be dumb and run it on a system with useful partitions. I don't give any warnings - I assume that in running the script, you really want to get rid of all partitions/volumes on your system except the system partition.
This script can be adapted in many ways: split the main features into their own modules or subroutines, add a mechanism to specify varying partition sizes and file systems, or choose between a quick format and a full format. You can even adapt it to operate on a remote machine (do you see how?).
Download the script.
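For illustration, here is a sketch (in Java rather than the script's Perl, and with a class name of my own invention) of the kind of command sequence the script feeds to diskpart via `diskpart /s <file>`. The select/clean/create/assign commands are standard diskpart; formatting itself happens in a separate step, as the script does through WSH:

```java
import java.util.List;

// Illustrative sketch (not the actual script): build a diskpart command file
// that wipes each non-system disk and creates one primary partition on it.
public class DiskpartScriptBuilder {
    public static String buildScript(List<Integer> dataDisks, int partitionSizeMB) {
        StringBuilder sb = new StringBuilder();
        for (int disk : dataDisks) {
            sb.append("select disk ").append(disk).append('\n');
            sb.append("clean\n");  // delete all partitions/volumes on the disk
            sb.append("create partition primary size=").append(partitionSizeMB).append('\n');
            sb.append("assign\n"); // give the new partition the next free drive letter
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Disks 1 and 2, 4GB each - the system disk (0) is deliberately excluded
        System.out.print(buildScript(java.util.Arrays.asList(1, 2), 4096));
    }
}
```

Save the output to a file and run `diskpart /s script.txt` from an elevated prompt; formatting each new volume with NTFS is then a per-drive `format` pass.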

Tuesday, January 15, 2008

Back To Language Basics

My friend Rachael speaks a language from Papua New Guinea as fluently as she does English, and that impressed me. For a while I hadn't been around people who spoke anything other than English and Spanish well, and I was getting bored, so this was refreshing - it inspired me to tap back into languages I used to know and to learn some new ones. At one time I spoke up to 6 languages well, but it has all eroded into a heavily accented English breeze. I tend to mumble a lot these days and speak at a very low volume (I swear it sounds really loud inside my head - that's why), and it affects how I communicate.

Some people like the accent (it sounds more British than anything, normally), but most others don't get it. Some place me as being from the islands (Caribbean), and others wonder soon enough where I am from when I speak. Thing is: my accent is every language I ever spoke rolled into one thick thing - the intonation, phonetics, and expressiveness of my voice are heavily influenced by the African languages I learned as a child. Depending on my mood (tired, stressed, happy, cheesy, etc.), you might hear something different. Sometimes I can be really hard to understand ... and other times I'm as clear as day. My family doesn't notice anymore, and for a long time I didn't care.

This year I'm doing something I enjoy very much: learning languages. I started English (??) this week with my friend Jessica's mom coaching me (which I think she does professionally). On the side, I am also working through a French series using software by Topics Entertainment (Instant Immersion French Deluxe v2.0). The software has speech recognition features and some pretty good exercises to get you up to speed. If you have never spoken or studied French, though, even the beginner level is not for you - you need some background for this software to be useful at all. I sound ridiculous now, repeating words and sentences many times over, but that's how I'll get back in the groove.
As I worked through the French series, I realized how much my experience as a musician helps me: I "sing"/sound out the words and sentences I hear, and that's the only way I get them right. If I can't hear it, I can't say it - even if I can read and understand it. I think I can imitate sounds quite well. Languages are my newest escape from the daily grind of computers, programming, music, and everything social.

After English and French, I'll get back to Swahili and Arabic (building on the little I knew), and then catch up with other African languages. Some new languages I'd like to learn: Spanish or German. I'm not messing with Chinese and the other Asian languages my friends Ben and Megan have successfully taken on. For my friend Rochelle and the huge collection of Brazilian music I have, Portuguese would be an interesting language to pick up, I think.
The key to keeping in shape linguistically is practice, practice, practice. Any chance I have, I'll speak these languages. A while back, I switched to reading my news from French RSS news feeds ... and I haven't missed anything worthwhile on the world scene. That's just a start.

Thursday, January 10, 2008

Into Perl ...

I'm on a journey to master Perl by the end of this month. As it turns out, this powerful scripting language is being positioned at work as THE scripting language for the automation framework. All test tools will need to be written or re-written for compatibility with a new test/automation framework based on Perl.
So far, learning Perl is a walk in the park. I see a lot of resemblance to programming languages I already know - Java and C++ - with a zing of more useful features. For an interpreted language (you need to install an interpreter on Windows; Linux comes with one standard), it is surprisingly fast. I like how it "has no limits but your system" and how "there's more than one way to do it". That can get confusing sometimes, but if you stick with a few ways, you'll get by just fine. My style is shaped by my experience as a Java programmer, so I use syntax that resembles Java as much as I can.
My biggest hope for Perl is to use it in my own testing - the development work I do for Strive, Ltd. I am developing a Javascript test framework for the webapps I build, and I can see now how Perl would greatly augment that package. It'll be interesting to integrate it into my Java development and deployment practices.
The best part of it all: I can learn this language for free. There are so many good tutorials and even free ebooks online that I wonder if stores can still sell actual books at all!

Tuesday, January 08, 2008

Sieve of Eratosthenes (Perl)

As a follow-up to a previous Java implementation (which Blogger botched), here is the Perl scriptage:

print "\n=======================\n";
print "=======================\n";
# Get the range
print "Minimum value of primes: ";
my $min = <STDIN>;
chomp $min;
print "Maximum value of primes: ";
my $max = <STDIN>;
chomp $max;
# Index $i represents the number $i; initially assume every number from 2 up is prime
my @primes = (0) x ($max + 1);
for (my $i = 2; $i <= $max; $i++) {
    $primes[$i] = 1;
}
# The sieve: cross out every multiple of each remaining prime
for (my $i = 2; $i * $i <= $max; $i++) {
    if ($primes[$i]) {
        for (my $j = $i; $j * $i <= $max; $j++) {
            $primes[$i * $j] = 0;
        }
    }
}
# Show the results
my @p = ();
for (my $i = $min; $i <= $max; $i++) {
    push @p, $i if $primes[$i];
}
my $size = @p;
print qq(Found $size primes:\n@p\n);

Sunday, January 06, 2008

Alternative PDF Reader

If you are growing tired of Adobe Acrobat's bulky reader, there's hope in Foxit Software's free PDF reader. This app is fast, renders content much better, and is not the resource hog Acrobat is. I had come across it 3 years ago, but at the time I still had Acrobat 5, which was moderately good. Then came 6.x and 7.x, and the angst grew, so I had been looking for an alternative to Acrobat Reader. The answer came in Foxit's extremely small application that does wonders! Consider this: Acrobat uses about 80MB of memory working with a 9MB PDF file, takes forever to install and start, causes Firefox to hang while closing, browses documents sluggishly, does a bad job rendering "weird" pages, and has a habit of calling home for updates. Foxit uses only 18MB for the same document, renders fonts and images much more crisply, and is a breeze to install and start. In fact, it is the fastest install I have ever seen - the installer itself is only 2.5MB. I'm impressed by this program.
The guys at Adobe must be scratching their heads over this: how does a relatively unknown company implement their idea better than they do? Performance-wise, Foxit is superior to Acrobat. I haven't used it long enough to know what other features and issues it has, but for my need - reading (huge) PDFs - it is the clear choice.

Thursday, January 03, 2008

Notes On Hashing

  • Technique that promises near-Θ(1) data lookup, compared with Θ(n) for linked lists or Θ(log n) for binary search trees.
  • Uses a hash function - a means of interpreting some object numerically (with an integer usually) to determine its position in some table (hashtable).
  • There's no order (e.g. sorting) to items stored in the hashtable - it all seems random and is the reason this process is sometimes called randomizing.
  • Implementations are prone to inefficient storage usage (when used slots are far-flung) and to high collision rates (when many entries end up with the same hash code).
  • There's no such thing as a perfect hashing algorithm - there will always be some collisions, and space will never be utilized with perfect efficiency - a good trade-off is needed.
  • Prime numbers are king when calculating hashes.
  • The JDK's String class has the simplest and yet effective hashing algorithm I have ever seen!
  • When a hashtable fills up, you need to resize it dynamically.
  • Two techniques exist that help reduce the need to resize hashtables: (1) probing, and (2) bucketing.
  • Linear probing = if a collision occurs, look for the next available slot (wrapping to the head of the table if necessary) to place the object in. As more and more objects are added, performance degrades toward linear/brute-force Θ(n).
  • Bucketing = maintain a collection at each hashtable slot so that each hashcode can have more than one object.
  • One way to determine when to resize a hashtable is to keep track of a load factor = the ratio of stored values to available buckets (for bucketing).
  • A load factor of about 75% is a fairly good trade-off between space and time, and works well for most implementations.
  • Bucketing is the more efficient of the two techniques. Not surprisingly, the JDK implementation uses bucketing with a default load factor of 0.75.
  • Modern hashing functions are based on serious mathematical analysis (beyond me). There are several websites I came across with some good hashing algorithms for everything from cryptography to applications in geometry: Arash Partow's site and Hash Algorithm Directory are just two of the many out there.
  • One article I came across discusses hashing algorithms in the context of encryption.
  • Most everyday programmers don't deal with hashing (designing and implementing algorithms) beyond using the data structures available in most APIs. I've never had to do more than use the Hashtable and HashMap classes provided in the JDK. It's always preferable to use one of the good GUID generators currently available (which probably use some sort of hashing internally); the advantage is that they guarantee a unique identifier for every object in the proper context.
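As an illustration of the String hashing point above, here is a sketch of the polynomial hash the JDK's String.hashCode() is specified to use (h = 31*h + c per character - the prime 31 at work), plus a helper of my own for mapping a hash code to a table slot:

```java
// Sketch of the JDK String.hashCode() algorithm and a slot-mapping helper.
// SimpleHash and slot() are my own illustrative names, not JDK API.
public class SimpleHash {
    /** Same polynomial hash String.hashCode() is documented to compute. */
    public static int hash(String s) {
        int h = 0;
        for (int i = 0; i < s.length(); i++) {
            h = 31 * h + s.charAt(i);  // multiply-and-add with the prime 31
        }
        return h;
    }

    /** Map a (possibly negative) hash code onto a bucket index in [0, tableSize). */
    public static int slot(String s, int tableSize) {
        return (hash(s) & 0x7fffffff) % tableSize;
    }
}
```

Because String.hashCode() is specified in the Javadoc, hash("anything") here matches "anything".hashCode() exactly; the slot() step is what a bucketing hashtable does with that value.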

Tuesday, January 01, 2008

It's In The Sorting, Dude!

I just finished consulting on an interesting project: a client's in-house web application had been working great until recently, when it seemed to take longer and longer to respond to requests. Since it is an SOA application that uses AJAX all over, speed is of the essence for a satisfactory user experience. A couple of times the IT guys had upgraded the hardware of the servers running the webapp, but the problem lingered. So the webapp itself was to blame - but where would the issue be in an application that had been relatively bug-free since its inception?
Long story short, the sorting algorithm used was to blame. I couldn't identify which one it was (proprietary, or more advanced than any I have learned?), but magic happened when I replaced the sorter with the well-known mergesort algorithm. In preliminary testing, the problem is pretty much gone for the size of live data the company typically works with. Amazing!

My real excitement, though, is in the fact that I got to use (so soon) knowledge from classes I had taken (Design and Analysis of Algorithms, and Data Structures). I didn't expect to be using ideas from class in a real-world environment yet, but I found myself doing real analysis of the client's code like we had done in assignments, and making a decision based solely on the results - and it worked. Sometimes class material doesn't make complete sense until you have to use it practically.

Mergesort is a divide-and-conquer, comparison-based algorithm that runs in Θ(n log n) computational time. Its main drawback is that it requires significantly more memory than in-place algorithms - Θ(n) extra space for the merge. Here's a Java implementation of mergesort:

import java.util.Comparator;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.List;

public class MergesortListSorter<T> {
    private final Comparator<? super T> comp;

    public MergesortListSorter(Comparator<? super T> comp) {
        assert (comp != null) : "Comparator cannot be null.";
        this.comp = comp;
    }

    public List<T> sort(List<T> list) {
        assert (list != null) : "List cannot be null.";
        if (list.isEmpty()) {
            return new LinkedList<T>();
        }
        return mergesort(list, 0, list.size() - 1);
    }

    // Merge two sorted lists into one sorted result in a single pass.
    // Assumes the lists contain no null elements.
    private List<T> merge(List<T> left, List<T> right) {
        List<T> result = new LinkedList<T>();
        Iterator<T> l = left.iterator();
        Iterator<T> r = right.iterator();
        T lv = l.hasNext() ? l.next() : null;
        T rv = r.hasNext() ? r.next() : null;
        while (lv != null || rv != null) {
            if (rv == null || (lv != null && comp.compare(lv, rv) <= 0)) {
                result.add(lv);
                lv = l.hasNext() ? l.next() : null;
            } else {
                result.add(rv);
                rv = r.hasNext() ? r.next() : null;
            }
        }
        return result;
    }

    // Recursively split list[s..e] in half, sort each half, then merge.
    private List<T> mergesort(List<T> list, int s, int e) {
        if (s == e) {
            List<T> result = new LinkedList<T>();
            result.add(list.get(s));
            return result;
        }
        int split = s + (e - s) / 2;
        List<T> left = mergesort(list, s, split);
        List<T> right = mergesort(list, split + 1, e);
        return merge(left, right);
    }
}

There are probably better implementations out there, but this variety worked well. The actual solution was a blend of mergesort and shellsort, because both perform well in average and worst-case scenarios. Java helps with the memory problem because collections store object references (as opposed to the objects themselves), so the memory overhead is only in the additional references that are created.
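For completeness, here is a small self-contained driver - the class name and the compact in-method merge are mine, using plain java.util collections rather than the client's API - showing mergesort in action with a Comparator:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// Minimal standalone mergesort demo (illustrative names, not the client's code).
public class MergesortDemo {
    public static <T> List<T> mergesort(List<T> list, Comparator<? super T> comp) {
        if (list.size() <= 1) return new ArrayList<T>(list);  // base case
        int mid = list.size() / 2;
        List<T> left = mergesort(list.subList(0, mid), comp);
        List<T> right = mergesort(list.subList(mid, list.size()), comp);
        // Merge the two sorted halves in one linear pass
        List<T> result = new ArrayList<T>();
        int i = 0, j = 0;
        while (i < left.size() && j < right.size()) {
            result.add(comp.compare(left.get(i), right.get(j)) <= 0
                    ? left.get(i++) : right.get(j++));
        }
        while (i < left.size()) result.add(left.get(i++));   // drain leftovers
        while (j < right.size()) result.add(right.get(j++));
        return result;
    }

    public static void main(String[] args) {
        List<Integer> data = Arrays.asList(5, 1, 4, 2, 3);
        System.out.println(mergesort(data, new Comparator<Integer>() {
            public int compare(Integer a, Integer b) { return a.compareTo(b); }
        }));  // prints [1, 2, 3, 4, 5]
    }
}
```

The Θ(n) extra memory shows up here as the `result` lists allocated at every merge level - the trade made for the guaranteed Θ(n log n) running time.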