Fun with site usage stats, part two

Back in February, I wrote about web browser usage by visitors to my site. Some of the discussion over my recent redesign has prompted me to do it again. Here we go!

Web Browsers

[Chart: web browser share, October 21, 2009]

Compare to last time: Firefox has jumped from 34% to 47%. That gain has come at the expense of both Safari and IE, which have dropped from 33% to 27% and from 28% to 17%, respectively. (Note, of course, that I’m rounding the actual percentages to whole numbers because talking about “16.88%” makes me feel like Spock on Star Trek, and I’m enough of a geek without that.)

Also worth noting: Chrome. It is stuck in fourth place, but its share has jumped 4.1 points, from 1.44% to 5.54%. (OK, in this instance I needed to Spock it up a bit.)

Operating Systems

[Chart: operating system share, October 21, 2009]

Once again, as a Mac user who also (unfortunately, despite my feeble efforts at self-promotion) represents a hugely disproportionate amount of the total traffic, I’m skewing the results here a bit. Still, I have not significantly altered my own usage of the site since February, but in that time Windows has nonetheless dropped from 56% to just under 50% of my total traffic, while the Mac has gone from 29% to 43%. Interestingly, in February, iPhone/iPod represented over 12% of the traffic, but now they’re just over 4%. Linux has stayed pretty even, between 2% and 3%.

OS/Browser Combinations

[Chart: OS/browser combination share, October 21, 2009]

In February, IE/Windows was the dominant combination, at 28%. Now it has dropped to fourth place, at 17%. Firefox/Windows has gone from #2 to the top spot, even though it just inched up from 25% to 26%. Safari/Mac and Firefox/Mac each went up a spot as well, moving into second and third, and going from 21% to 24% and from 8% to 18%, respectively.

Conclusions

This is far too small and skewed a sample to say a whole lot about trends on the Internet as a whole, but what I’m seeing here overall is that Mac usage vs. Windows is up, and Firefox usage vs. anything else is also way up. Specifically I’m seeing a significant surge in Firefox/Mac… which may suggest, I suppose, that I have been visiting the site a lot more lately than I did in February. Or maybe not.

It’s also worthwhile to look at the raw totals in the traffic. In the time between then and now I’ve split up room34.com into a number of separate sites. The totals back in February were across the board on room34.com; for October we’re looking at stats strictly from blog.room34.com. The date range is the same: 30 days. (The original data was from January 19 to February 18; the new data is from September 20 to October 20.) Back in February, the data I analyzed represented 2,845 unique visits to my site. This month’s data represents 3,810 visits, an increase of 965, or 34%. Since the old stats included visits to a lot of pages that are now parts of other sites, the increase in blog traffic is even greater. So while it’s probably true that I’ve been spending more time looking at the blog myself in the past month than I did in February (considering I just did a redesign this weekend), the majority of the traffic increase is most likely not from me. In fact, my own percentage of the total traffic is quite likely a good deal smaller than it was in February.

Traffic here spiked on October 13-14, when I posted a reply to Derek Powazek’s blog on SEO. Visits to that single page, just on October 13, represent more than 10% of the total traffic the entire site saw all month.

Let’s take a look at the OS/browser breakdown for just that one day, October 13, 2009:

[Chart: OS/browser combinations for October 13, 2009]

The traffic from this one date was likely responsible for some overall skewing of the totals. Derek Powazek’s blog appeals most strongly to Mac users, which would explain why the Mac/Safari combination is in the top spot (Safari being far more popular in general on Macs than Firefox, for the same reason IE dominates Windows: it comes with the OS).

Lessons to be learned? Well, if I want traffic, I should write about SEO. The SEO bots (both human and software) seem to love it. But beyond that, I think there probably is some valid evidence here that there’s some real movement in the directions of both Mac and Firefox. Something that sits just fine with me!

Final Thought

What’s the deal with this “Mozilla Compatible Agent” on iPhone and iPod? I haven’t seen that before, but I assume it’s one of two things:

1. A Mozilla-derived alternative to Mobile Safari, available only on “jailbroken” iPhones.
2. An embedded client in an app like Facebook, which allows you to view web pages without leaving the app.

I’m inclined to guess that #1 is correct. I’d be surprised if any Apple-approved apps were running a Mozilla-based web browser; it seems far easier and more logical to develop legit apps using the official WebKit/Mobile Safari engine. I haven’t seen any hard numbers (nor do I think it would be possible to obtain them) on the percentage of iPhones in use that are jailbroken. But if this assumption is correct, the share of “Mozilla Compatible Agent” traffic on the iPhone/iPod platform would set a floor on the percentage of iPhones that are jailbroken, since presumably some jailbroken iPhone users still use Mobile Safari. And that floor would be staggeringly high.

However… given the fact that over 8% of the total traffic on October 13 came from this user agent, and I myself visited the site numerous times on that day from my (non-jailbroken) iPhone, to monitor and respond to comments, I suspect a much more innocuous explanation. But a brief yet concerted effort to find an explanation on Google turns up nothing. Anyone in-the-know out there care to shed some light on the situation?

Trying out a new look

I’m trying out another new look for this blog. This design will probably evolve over time, but I am excited about the new direction — most significantly, the new colors, and the custom fonts using @font-face in CSS. The fonts are from a site I just discovered and am very excited about: The League of Moveable Type (no relation to Movable Type, the blogging software).

Of course, Internet Explorer won’t support it, so the fonts degrade to more common, standard, and boring options.
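For the curious, here’s a minimal sketch of the technique. The font name and file path below are hypothetical stand-ins, not the actual files this site uses:

    <style type="text/css">
    /* Hypothetical font name and path, for illustration only. */
    @font-face {
      font-family: "LeagueExample";
      src: url("/fonts/league-example.ttf");
    }

    /* Browsers that can't load the custom font fall back down the
       rest of the stack to those more common, boring options. */
    h1, h2, h3 {
      font-family: "LeagueExample", Georgia, serif;
    }
    </style>

The fallback stack is the whole trick: a browser that doesn’t understand @font-face simply skips ahead to Georgia.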

Let me know what you think!

Top 5 Albums of 2009: The Contenders

[Image caption: I'm not necessarily saying Grizzly Bear's gonna win this year, but... well... infer what you will.]

I know I’m getting ahead of myself announcing contenders for this year’s top albums. After all, in some past years I haven’t even gotten around to this until July of the following year. There may be a few more best-of-the-year quality albums coming in the remaining two-and-a-half months of 2009, in which case I’ll post a hyphen-heavy-contenders-addendum follow-up entry.

But I was inspired to write this today as I fired up TV on the Radio’s Dear Science, an album I granted honorable mention in last year’s list since I hadn’t actually heard it at that point. I did eventually buy it this summer, and it is definitely good enough to have made the list last year.

And so, on that note, I present the year-to-date contenders for my Top 5 Albums of 2009. And once again, I’m presenting the current top four contenders (since I can’t decide on a fifth at this point) in bold. Last year, all of the preliminary contenders made the final list. Will that hold true this year as well? Time will tell.

  • Air: Love 2
  • The Bird and the Bee: Ray Guns Are Not Just the Future
  • Crystal Method: Divided by Night
  • The Decemberists: The Hazards of Love
  • Dream Theater: Black Clouds & Silver Linings
  • El Grupo Nuevo de Omar Rodriguez Lopez: Cryptomnesia
  • The Flaming Lips: Embryonic
  • Green Day: 21st Century Breakdown
  • Grizzly Bear: Veckatimest
  • Heartless Bastards: The Mountain
  • Hypnotic Brass Ensemble: Hypnotic Brass Ensemble
  • Jet: Shaka Rock
  • Dylan Leeds: Bit by Bit
  • The Mars Volta: Octahedron
  • Phish: Joy
  • Phoenix: Wolfgang Amadeus Phoenix
  • Pomplamoose: Videosongs
  • Tortoise: Beacons of Ancestorship
  • U2: No Line on the Horizon
  • Umphrey’s McGee: Mantis
  • Various Artists: Kind of Bloop
  • Zero 7: Yeah Ghost

For the first time, there are a couple of unsigned indies in the list here: Dylan Leeds and Pomplamoose. The Dylan Leeds album is excellent, certainly worthy of consideration alongside any commercial release this year. It’s available on Joshua Wentz’s Sidedown Audio boutique label. And Pomplamoose… well, I’ve already discussed them here. Their album is available on iTunes and elsewhere.

Last year in my contenders post I also provided some fun (?) statistics about the list. Let’s do it again!

22: albums in the list (last year: 28)

14: artists I had heard of before 2009 (last year: 18)

13: artists I already owned music from before 2009 (last year: 13)

4: purchased on CD (last year: 14)

4: purchased on iTunes (last year: 3 2/3)

14: purchased on Amazon MP3 (last year: 10 1/3)

2: unsigned independent artists (last year: 0)

Update: Oops, there are three indies in here. How could I forget about Kind of Bloop?

Update #2: Just realized I also forgot to mention Wilco (the album) in this list. I had some technical difficulties a couple of weeks ago and I needed to reformat my hard drive without being able to salvage some of the music on there — specifically, CDs I had ripped within the past 3 or 4 months. This was one of the few CDs I had bought in that time. I think it says a lot that it took me 5 days after originally writing this post to even remember it existed. Don’t expect it to make the cut.

Update #3: Here’s a new one: Flight of the Conchords’ I Told You I Was Freaky.

Search Engine Optimization (SEO): the good, the bad and the (mostly) ugly

Years ago I first encountered a mysterious acronym: SEO. I bristled when I learned what it meant: Search Engine Optimization. The term can be both innocuous and poisonous. In its innocuous form, it means, quite simply, presenting your site in a way that is most likely to lead to prominent placement in search results. In its poisonous form, it means deceiving the algorithms search engines use: tricking them into listing your site when they shouldn’t.

That the latter connotation has become the primary meaning of the term is unfortunate, as there is a legitimate role in web design and development for tuning your website for maximum effectiveness in search engine listings. Doing it the right way does not involve gaming the system. In fact, the principles of sound search engine optimization aren’t really about search engines at all: they’re simply rules of good design, ensuring that your site is well-formed, well-organized and intuitive. In short, the best honest ways of appealing to a search engine’s algorithms are going to be the same ways of appealing to the real target of your website: human users. After all, the goal of a search engine like Google is to deliver the most relevant results to its users. And if your site isn’t relevant to a particular user, it shouldn’t be coming up in their search results anyway.

Derek Powazek has an excellent blog entry called Spammers, Evildoers and Opportunists that pulls no punches in criticizing the dark side of SEO. So much so, in fact, that one questions whether there is any other side to it. Ultimately, maybe not. The question then is what to call the best practices in web design and development that just happen to also be the most effective legitimate ways to optimize your site for search engine placement. I don’t have an answer, but I have to admit that after reading his blog, I’m reluctant to use the term “Search Engine Optimization” any longer.

Some background here: for the past year or so I’ve been including a brief section in all proposals I’ve created for new clients, entitled “Search Engine Optimization,” wherein I talk about these best practices, criticize unscrupulous SEO tactics, and give my recommendations for how best to build a website (in ways that also just happen to be good for search engine placement). I give this information away for free. I do, however, charge my clients for work I do to these ends. It’s not smoke and mirrors, and it’s not snake oil. But it is actual work, it does take time, and if it’s not something the client can or will do for themselves, then it’s something I need to bill them for. Powazek says:

Look under the hood of any SEO plan and you’ll find advice like this: make sure to use keywords in the headline, use proper formatting, provide summaries of the content, include links to relevant information. All of this is a good idea, and none of it is a secret. It’s so obvious, anyone who pays for it is a fool.

Right on, brother. But here’s the thing: while I will gladly share this information with any client for free, there is still work involved to implement these ideas. And if I’m the one doing the work, I bill for it, just like any other work I do. I believe what he’s really criticizing is the practice of charging simply for sharing this information. Much like the late-night infomercials that promise riches in real estate, the real get-rich-quick scheme is in selling the information itself; the person who’s going to get rich is the one selling training books and videos, not land.

Let the information be free. Here, word for word, is the information I include in every proposal I write:

Search Engine Optimization

“Search Engine Optimization” (SEO) is a common buzzword today, but what does it really mean? Many web consultants will offer “advanced SEO techniques” and submission to thousands of search engines. But most of these techniques are dubious at best, and most of the thousands of search engines are irrelevant to directing significant traffic to a website.

Ultimately there are a few simple principles that, when implemented on a website, will help to ensure the site receives proper placement in the search results of the most popular search engines, like Google, Yahoo! and MSN. Because the principles are so basic, and correspond so closely with the principles of simple, clean, well-organized web design in general, Room 34 offers these recommendations, free of charge, as a standard part of all website proposals:

Title Bar

The web browser’s title bar is easy to ignore, but a well-structured page title is one of the most important ways to ensure that your site is listed prominently in search engine results. The title should be clear, relevant, detailed, and specific. Each page of the site should have a title that accurately reflects what is on the page. The page title should begin with this specific information, followed by general information that is the same for every page: your business name, the nature of your business, and if relevant, your city and state.
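To make that concrete, here’s a hedged example; the business and location are invented:

    <!-- Specific page content first, then the general business info. -->
    <title>Chocolate Layer Cakes | Betty's Bakery, Custom Cakes in Minneapolis, MN</title>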

Meta Tags

Meta tags do not appear anywhere on the web page, but they are included in the HTML header of the page to assist search engines in identifying the relevance of a web page if its textual content does not fully reflect its purpose. There are two primary meta tags used by search engines: keywords and description. The keywords tag is a comma-separated list of words or phrases that describe the content of your page. The description is a sentence or two that can be used in search engine results to summarize the content of your page. Meta tags should be as concise and accurate as possible. Excessive repetition of terms, or content that does not accurately reflect what is on the page, will hurt search engine rankings rather than help them.
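Continuing the invented bakery example, the two tags might look like this:

    <head>
      <meta name="keywords" content="chocolate cake, layer cake, custom cakes, bakery, Minneapolis" />
      <meta name="description" content="Betty's Bakery bakes custom chocolate layer cakes, made to order in Minneapolis, Minnesota." />
    </head>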

Semantic HTML

Semantic HTML means HTML that is built to reflect the logical structure of a web page document, with visual presentation separated into CSS (Cascading Style Sheets) rather than embedded within the HTML. Fonts, colors and visual layout elements should be restricted to the CSS. HTML tables should be used for tabular information only, not layout and positioning. The content of the page within the HTML should be organized such that the page is logical and readable with CSS turned off. Also, it is increasingly important that documents be formatted with valid XHTML rather than older HTML specifications. Pages should be checked against an XHTML validator (http://validator.w3.org) to ensure accuracy.
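A hedged before-and-after sketch (the class name and styles are invented):

    <!-- Presentational markup, best avoided: -->
    <font size="5" color="#663300"><b>Our Cakes</b></font>

    <!-- Semantic markup, with the presentation moved into the stylesheet: -->
    <h2 class="section-title">Our Cakes</h2>

    <style type="text/css">
    .section-title { font-size: 1.5em; color: #663300; }
    </style>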

Accessibility

Building web pages with proper accessibility for visually impaired visitors also helps to ensure a semantic HTML structure that will improve search engine rankings. All images and other visual content should include “alt” text. Content that requires Flash, JavaScript or other browser plug-ins should also include a standard fallback version to allow it to “degrade gracefully” for screen readers, browsers without these add-on features, mobile devices, and search engines. By organizing features like site navigation into standard HTML unordered lists instead of elaborate table layouts or Flash elements, pages will be both more widely accessible and more relevant in search engine results.
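For instance, navigation as a plain unordered list, with alt text on images (all names and paths here are hypothetical):

    <!-- A plain unordered list: screen readers, mobile devices and
         search engines can all make sense of it. -->
    <ul id="nav">
      <li><a href="/">Home</a></li>
      <li><a href="/cakes/">Our Cakes</a></li>
      <li><a href="/contact/">Contact</a></li>
    </ul>

    <!-- Every image gets meaningful alt text. -->
    <img src="/images/storefront.jpg" alt="Betty's Bakery storefront on Lake Street" />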

Relevant Links

Most modern search engines like Google use cross-site links as an indication of a site’s popularity and relevance in a particular field. By exchanging meaningful links with relevant sites in a particular field, a site can improve its search engine results. There may be a temptation here to exchange links with sites that are simply aggregators of links. This might provide a temporary boost to search engine placement, but ultimately if the links are not on sites that offer real live users a meaningful web experience, they will not provide long-term benefit. Before exchanging links with another site, consider whether or not it is a site you would visit and trust as a resource. If not, it is probably not worth the effort.

No Magic Bullet

There is no secret weapon to ensure top search engine placement. Many promises of search engine optimization rely on short-term “gaming” of a search engine’s relevance ranking algorithms. But just as the “gamers” evolve their tactics, the search engines are constantly being enhanced to counteract them. Ultimately the best way to ensure long-term relevance within search engine listings is to stick to the principles of well-organized, validated XHTML documents and meaningful content.