Internet Explorer and transparent CSS layers

Here’s the scenario: I’m working on an image gallery for a client site, sort of a slideshow concept. The client wants the user to be able to click on the left half of the current image (“slide”) to move to the previous image, and on the right half of the current image to move to the next.

OK, no problem.

In the days of yore, I might have handled this with an imagemap. But my solution now, drenched in jQuery goodness, involves a pair of empty <div> tags, with set dimensions, absolute positioning and z-index in the CSS. These transparent <div>s are overlaid on the image, and the jQuery code triggers a refresh of the HTML code in the slideshow <div> itself, underneath, that contains the <img> tag for the current “slide.”

If you’re not a web developer, that last paragraph is probably little more than a steaming pile of (to follow John Hodgman’s lead) “bullroar,” but the point is that this allowed me to stick to semantic HTML (more or less; there is a chunk of JavaScript code directly in the page, though I may move that out soon), and keep most of the presentation in the CSS and the functionality in a separate JavaScript file, the way it should be.
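For the web developers in the audience, the idea looks roughly like this. (This is a simplified sketch, not the client’s actual code: the element IDs, dimensions and image paths are invented, and it assumes jQuery is already loaded on the page.)

```html
<style type="text/css">
  /* The slideshow container establishes the positioning context */
  #slideshow { position: relative; width: 600px; height: 400px; }

  /* Two empty, transparent click targets overlaid on the slide */
  #prev, #next {
    position: absolute;
    top: 0;
    width: 300px;   /* half the slide width */
    height: 400px;
    z-index: 10;    /* stack above the slide image */
    cursor: pointer;
  }
  #prev { left: 0; }
  #next { left: 300px; }
</style>

<div id="slideshow">
  <div id="slide"><img src="slides/1.jpg" alt="" /></div>
  <div id="prev"></div> <!-- left half: previous slide -->
  <div id="next"></div> <!-- right half: next slide -->
</div>

<script type="text/javascript">
  // Invented slide list; the real one comes from the gallery itself
  var slides = ['slides/1.jpg', 'slides/2.jpg', 'slides/3.jpg'];
  var current = 0;

  function showSlide(i) {
    current = (i + slides.length) % slides.length;
    // Refresh the HTML inside the slide <div>, underneath the overlays
    jQuery('#slide').html('<img src="' + slides[current] + '" alt="" />');
  }

  jQuery(function ($) {
    $('#prev').click(function () { showSlide(current - 1); });
    $('#next').click(function () { showSlide(current + 1); });
  });
</script>
```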

Only it doesn’t work in Internet Explorer.

“It doesn’t work in Internet Explorer” is nothing new to me, and in fact I wasn’t totally surprised by that. My first hunch was that IE didn’t like the empty <div>s, even though they have dimensions specified in the CSS, so I gritted my teeth and stuck &nbsp; inside each one. Didn’t help.

Next I decided to check whether my dimensions were being interpreted properly, since it seemed clear that they weren’t: when I say it didn’t work in Internet Explorer, that’s only a half-truth. It didn’t work properly in Internet Explorer, but there were clickable areas on the slide; they were just really small and not in the right places.

So I set the background of the “previous” area to red and the background of the “next” area to blue. Reloaded in IE. Yep, the dimensions looked perfect — my photo was now covered precisely by equal-sized red and blue rectangles. And, curiously, the links worked properly now. I thought maybe it was just because I had added some CSS code for these <div>s in my IE-specific stylesheet (yeah, I do that… get over it). So I changed them both to transparent within the IE-specific stylesheet, and the problem reappeared.

And then, it hit me. No… could it be?

Yes, it could: IE was effectively ignoring those z-indexed <div>s, letting clicks fall straight through them, simply because their backgrounds were transparent. This is probably a “known issue” with IE, but I had never had to deal with it before, and regardless, it sure as hell didn’t make any sense.

My solution? I went with a play straight out of the web design playbook circa 1997: a one-pixel transparent GIF. Except I used a PNG, but same thing. I made a completely transparent image, and in the IE CSS file, I used that as the background for these <div>s.
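In other words, something along these lines in the IE-only stylesheet (using the same made-up IDs as the sketch above; the image path is a placeholder too):

```css
/* IE-only stylesheet: give the transparent click overlays a background image
   so IE will register clicks on them. blank.png is a 1x1, fully transparent
   PNG; the selector and path are placeholders, not the actual client code. */
#prev, #next {
  background: url(images/blank.png);
}
```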

Problem solved. But as is so often the case with IE issues, the solution is so bitterly unsatisfying, so fundamentally wrong, that there is no joy in its discovery. Just the impetus to blog about it.

The growing problem of registration spam in WordPress

Now, this is odd.

A few months back I wrote a plug-in for WordPress called RegisTrap. It’s beyond basic, and has one simple purpose: to block registration spam on my WordPress-based website.

Registration spam, for those of you who don’t know, is when a “bot” (a computer program written to seek out and exploit poorly written web forms to send floods of spam email messages) signs up as a “user” on your site. These “users” can’t really do anything on the site, but they clutter up your database nonetheless.

I had a feeling that RegisTrap was really only going to work reliably if I kept it to myself. And I was right: after I submitted it to the official WordPress plug-in repository, it eventually stopped working, as the bots adapted to avoid its “traps.” It might have happened anyway, but I’m sure that the publicity it received from being in the repository, and the hundred or so people who downloaded it (many of whom, I suspect on reflection, were probably bot developers looking to dissect its workings), accelerated its demise.

As I announced here a few days ago, I turned RegisTrap off on my site, and I also turned off registration altogether. But that hasn’t stopped the flood of new bot registrations. There are 14 of them sitting there in my database right now (well, there were before I just deleted them), all added after I turned off the ability to register altogether.

I suppose that since the bots don’t actually visit the site and fill in the form (they just submit the right data directly to the right URL, whether it’s “browsable” or not), it doesn’t really matter whether your site is set up to reject registrations. Still, it’s a bit dismaying that WordPress is processing those registrations even with registration turned off. Apparently it stops at making the registration page inaccessible via links; it doesn’t actually turn off the code that processes registrations. Boo. Perhaps that should be my next plugin: “Stop All Registrations 4 Realz.”

But maybe I won’t call it that.
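If I do end up writing it, the gist would be something like the following: a rough, untested sketch (not a finished plugin) that hooks WordPress’s registration_errors filter, which runs while a registration is being processed, and rejects every attempt, even one POSTed directly to the registration URL.

```php
<?php
/*
Plugin Name: Stop All Registrations (sketch)
Description: Untested sketch: reject every registration attempt at the
             processing stage, even when a bot POSTs directly to the
             registration URL instead of using the (now hidden) form.
*/

// WordPress applies the 'registration_errors' filter while it processes a new
// user registration; adding an error here makes the registration fail.
function sar_block_all_registrations( $errors ) {
	$errors->add( 'registrations_closed', 'Registration is disabled on this site.' );
	return $errors;
}
add_filter( 'registration_errors', 'sar_block_all_registrations' );
```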

Fun with site usage stats

OK, “fun” may be an exaggeration, but it is interesting to look at these stats for room34.com courtesy of Google Analytics.

The usage statistics that are always of the most interest to web designers and developers are the web browser and operating system breakdowns among site visitors. “Conventional wisdom” is that Windows makes up about 90-95% of most sites’ users (with Mac OS X making up almost all of the rest), and that among Windows users, Internet Explorer is at about 80-90%, with Firefox making up the bulk of the rest, while on the Mac about 90-95% are using Safari and the rest are on Firefox.

The stats for my site paint a much different picture. Now, granted, I am probably by at least a couple of orders of magnitude the most frequent visitor to my site. I can accept that. So that means Mac OS X/Safari should skew high in the results. But just how high? Let’s take a look.

The following are room34.com stats from the past month, January 19 to February 18.

Web Browsers

Here’s the breakdown of web browser usage among my site’s visitors:

Site Usage: Web Browsers

Firefox appears to be winning this war, with Safari close behind and Internet Explorer strong but decisively in third place. Chrome trails far behind in fourth place, but I get a twisted pleasure from seeing Opera disappear into irrelevance.

Operating Systems

And now for the operating systems:

Site Usage: Operating Systems

Well, how about that? There are enough other people looking at my site that Windows manages to still be the most widely used OS, though its 56% share is far below the roughly 92% share it (supposedly) holds among the general populace of computer users. And what do you know, the iPhone is third! Actually, iPhone and iPod should be identified together, since they run the same OS. I’m not sure why Google breaks them out (but doesn’t break out something much more useful: the different versions of Windows). Look at #7: the Wii! Sweet. Those were not from me. I must confess I’ve never heard of Danger Hiptop, but it’s obviously a mobile OS. Perhaps I should care, at least 0.04% of the time. (That works out to only about 17 minutes a month. Considering the average time on my site is about 3 minutes, one could [carelessly] deduce that Danger Hiptop users like to spend nearly six times the average amount of time per visit!)

OS/Browser Combinations

And now, all together:

Site Usage: OS/Browser Combinations

It’s no surprise that the Windows/IE combination manages to land the top spot, but it is surprising that the combo’s share is less than 29%. I’m a little surprised that Windows/Firefox also edges out Mac/Safari, although I should be glad that I represent, at most, about 1/5 of the visits to my own site. (I’m sure it’s actually only about half that!) Fully 12% of visitors to my site are coming to it on an iPhone or iPod touch. That’s incredible. And almost none of those are me. I guess it’s time to make sure I’ve optimized for that platform! I think this represents a turning point in the viability of mobile browsers, and we web designers and developers had best take notice.

RegisTrap seems to be losing its effectiveness

I suspected this might happen once I released RegisTrap to the public. I had four new spam user registrations on my site when I checked it today (having last checked it maybe two or three days ago). Previously I’d only see about one a week with RegisTrap running.

It was bound to happen. The rules RegisTrap employs are fairly simple, and the “bots” are constantly being modified. I have no idea how many registrations RegisTrap has blocked in the time it’s been running — perhaps my next step in developing it is to add a logging feature. If there were only four (or even maybe a dozen or so) spam attempts on my site during this time period, then RegisTrap seems pretty ineffectual. But if it actually blocked a ton (metaphorically speaking) of spam registrations, then four sneaking through doesn’t seem so bad.
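For what it’s worth, the lightest-weight logging I can imagine would just bump a counter stored in a single WordPress option each time a registration gets trapped, something like this untested sketch (the function and option names are invented):

```php
<?php
// Hypothetical: call this from wherever RegisTrap rejects a registration.
// It keeps a running count of blocked attempts in a single WordPress option.
// Even this minimal approach means writing to the database.
function registrap_log_blocked_registration() {
	$count = (int) get_option( 'registrap_blocked_count' ); // false becomes 0
	update_option( 'registrap_blocked_count', $count + 1 );
}
```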

If anyone out there is using RegisTrap and cares to comment on ways I could improve it, let me know! Meanwhile, as time allows, I am going to pursue the logging functionality, if only for my own edification. As valuable as it would be, though, it goes against the spirit of simplicity inherent in the plugin: I really don’t want to write anything to the database or filesystem.