About This Site

Introduction

I'd like to take a moment to point out aspects of this site that both show my respect for web design and illustrate my understanding of it. I consider myself an application developer and do not pretend to be a web interface designer. However, I do understand how crucial a role design plays in giving an application its character. That character is what engages users and provides the means of using your application.

I use my personal web site to showcase my knowledge of web standards and their implementation, since much of my development happens "behind closed doors" and is not easily represented visually. Whether you are a peer or a potential employer, you may have dug around in the source of this site to get a feel for its construction and thus a feel for me. This is good; it was my hope that you might do this. For those who did not, or haven't yet and ended up here, I will outline how this site came to be and the areas I've focused on.

Genesis

It took me a long time to get a personal site put together. Two things stood in my way: content and design. As mentioned, I am not a designer, but I needed a design to put a face on, well, me. Time provided the content; it also provided the acquaintances necessary to get a design scrounged up. A colleague of mine was not hired as a designer, but he sure had a knack for it. Eventually I was able to coax him into working on a design for me. The layout you see is the result; the implementation of that design was left to me.

When I first began development of the site in 2007, Internet Explorer 6 was still, statistically, king. So, as much as I disliked it, I ensured the design was reasonably compatible with Internet Explorer 6. Unfortunately, the "hacks" I had to employ pushed me out of technical standards compliance. It is now 2011 and IE6 is no longer on top, so I've removed all of the compatibility "hacks" and am truly 100% CSS3 compliant. I also took the additional, albeit controversial, step of detecting IE6 users and requiring them to obtain a more reasonable browser.

Highlights

Completely table-less design that is CSS3 compliant. I could have stayed within CSS2, but I chose to use text-shadow for the primary navigation. Perhaps a little progressive on this point.
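
To make that concrete, here is a minimal sketch of the effect; the #nav selector and colors are assumptions for illustration, not necessarily what this site's markup actually uses:

    /* Hypothetical primary-navigation rule; #nav is an assumed id. */
    #nav a {
        color: #fff;
        text-decoration: none;
        /* The CSS3 property in question: a soft dark shadow offset
           1px right and 1px down, blurred by 2px. */
        text-shadow: 1px 1px 2px #333;
    }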

XHTML 1.0 Strict. I really like XML and the extremely simple rules it employs to form an extensible means of writing and parsing metadata (markup). I will eventually use strictly XML and XSL with this site, but I don't want to get ahead of the game too much there.

Use of CSS "spriting" as a performance tactic. You may notice a lot of rounded edges on my site. I could use CSS3 to produce a few of those, but CSS3 is relatively new territory and I'm trying not to push my luck. Making those rounded edges by hand has the side effect of creating many small graphics, each requiring its own connection to fetch. Spriting is the practice of merging all of those small images into a single image in order to reduce the connection overhead and, in some cases, even reduce the combined file size. You then use CSS to pull the "sprite", the specific image you're after, out of the combined image during rendering. Because the sprite image is used repeatedly within the document, it is likely to already be sitting in memory during rendering. This can be a tedious process, and you need to weigh the effort against the pay-off. Given the number of small images on my site, it was a very good use of the tactic.
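
A minimal sketch of the technique follows; the file name and class names are made up for illustration. All of the small corner graphics live in one combined image, and each element shows only its own slice by shifting the shared background:

    /* One combined image holds several small graphics.
       "sprites.png" and these selectors are hypothetical. */
    .corner-top-left,
    .corner-top-right {
        background-image: url('sprites.png');
        background-repeat: no-repeat;
        width: 10px;
        height: 10px;
    }

    /* Each element positions the shared image so only its 10x10 slice shows. */
    .corner-top-left  { background-position: 0 0; }
    .corner-top-right { background-position: -10px 0; }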

Unification and minification. This has become a pretty normal practice among performance-conscious sites: not used as often as it should be, given the ease of implementation and near-zero risk, but common enough that I won't claim to be doing anything special. I first unify all of my CSS into a single file, then minify it. Even though on any given page perhaps 30% of that CSS goes unused, the CSS is 1) minified, 2) gzipped to the browser, and 3) then sitting in the browser's cache. So on my site, having ALL of the CSS downloaded on the first visit removes any further CSS requests; a worthwhile trade-off in this situation. There is, however, still enough JavaScript that differs from page to page to warrant keeping it separate, but it too is minified. Less common, but in use here, is minification of the outbound XHTML. If you tried looking at the source, you may have been met with a nasty single line of markup. That is because I minify the XHTML prior to display, earning just a few more brownie points with the performance monster. This should really only be done if you can hold the minified markup in a cache for a significant amount of time, which I can since this is all static content.
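
For a concrete, if made-up, illustration of the CSS side of this: the unified file is simply every rule concatenated together, and minification strips the comments and whitespace.

    /* Before: readable source CSS, possibly spread across several files. */
    #content p {
        margin: 0 0 1em 0;
        line-height: 1.4;
    }

    /* After unification and minification: the same rule as one compact line. */
    #content p{margin:0 0 1em 0;line-height:1.4}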

I've used 100% Portable Network Graphics (PNG) files for the layout images. They have been ridiculously optimized, admittedly, by Google's PageSpeed itself. I could not quite reproduce all of the optimization PageSpeed performs using OptiPNG and PNGCrush; the best I could do was within 5-9% of PageSpeed's files, so I used those. Either way, they are as small as possible. There is also the plus of the format being completely patent-free.

Content delivery networks (CDNs) (e.g. Akamai, Amazon CloudFront, Yahoo CDN) serve your content from locations closer to your users, physically speeding up delivery. As a side effect, they also reduce the bandwidth utilization at your own server (known as the origin). A further side effect is that they help browsers fetch the data they need faster, in a programmatic sense. A browser could open a lot of connections to a server it needs, say, 23 resources from, but if every browser were that greedy it would be hard on the servers, among other problems. So instead, browsers open around two connections to each domain they need resources from. The key here is "each domain": by utilizing a CDN you usually end up with a subdomain from which you reference some content, so the browser opens, say, four connections, two to each hostname, and grabs the content that much faster. Now, CDNs are good for high-traffic sites, but are hard to justify otherwise. For the sake of exhibition I have set up a fake CDN to, essentially, fool the performance tools into thinking I am using one. On my Visuals and Trips pages you'll see me using static.benlake.org for the JPEG images; a common practice is to put your heaviest content on the CDN. The "fake" part comes in because static.benlake.org is just an alias for benlake.org, but I think you get the point by now. Also, if you were thinking "that's not a good SEO practice", then at least know that I know.
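
Purely for illustration (the heavy JPEGs are actually referenced from the page markup, and these paths are invented), the idea looks like this in CSS: light layout graphics stay on the primary host, while heavy content comes from the static hostname, letting the browser open connections to both in parallel.

    /* Hypothetical: small layout image served from the primary domain. */
    body {
        background-image: url('http://benlake.org/images/bg.png');
    }

    /* Hypothetical: heavy content pulled from the "CDN" hostname; the browser
       treats it as a separate domain with its own pool of connections. */
    .photo {
        background-image: url('http://static.benlake.org/photos/trip.jpg');
    }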

I try to be a minimalist and only use technologies when they're needed. For example, databases tend to be used more often than necessary. It is very common to see sites running with a database back-end simply because they use a content management system or blog tool; in most cases, this is overkill. As a bit of a statement, I do not back my site with a database because there is no need. This is not to say my site is not "dynamic": it does a lot of things programmatically to ease maintenance and ensure consistency; you just don't need a database to do that!

All of these highlights culminate in a good user experience, validated by the fact that I have achieved a score greater than 90 on both Google PageSpeed and YSlow. At first glance that may seem simple for "a small site", but honestly it takes patience and work for any site. The scores are really closer to 96-98, but I'm providing some buffer depending on the page you rate me on. PageSpeed dings me for the unused CSS, but I'm OK with that trade-off as explained above. YSlow dings me for the lack of a CDN on pages not using static.benlake.org.

I think I've achieved the goal of exhibiting my understanding of the profession, and will continue to use this site as a showcase of that knowledge.