2005-11-30: firefox 1.5 released - if you can find it

You'd think it would be easy to find Firefox 1.5 just after its release, but at the moment getfirefox.com forwards to mozilla.com, which links back to getfirefox.com... mozilla.org has a big banner telling you to go to mozilla.com, but it does at least link to "Firefox 1.5 released" in its news area. That takes you to a page with the release candidate, but if you click "Firefox 1.0" it will take you through to the actual Firefox 1.5 release page.

Mozilla's flagship product has a major new release, yet their web presence is a mess. Whoops!

So anyway, if you still care about Firefox 1.5 after all that, head over to www.mozilla.com/firefox and grab a copy :)

All jokes aside, I'm not seeing much in the Firefox 1.5 release notes which explains why this wasn't just a point release to 1.1. Yes, I know it's just marketing hype and not a real version number; but it still irks me when people treat ".5" like a major version. The only notable feature addition is automated updates, which should definitely make life easier. The rest... well, the rest isn't going to impress Opera users, who've been enjoying features like 'clear private data' and drag-and-drop tab reordering for a long time now. Good to see them added to Firefox though.


2005-11-28: browser makers work together, hell chilly

Browser Makers Band Together Against Phishers - Yahoo! News: Developers speaking for Internet Explorer, Firefox, Opera, and Konqueror met in Toronto last week to hash over ideas on how their browsers could better identify trusted and suspicious Web sites. Additionally, they talked about changes to browser pop-ups that would make it more difficult for scammers to spoof sites or trick users into divulging personal information such as bank or credit card account numbers and passwords.

Alternatively, people worried about phishing could just use Opera, which already gives you a one-click method to restore the browser chrome! :) Seriously though, it's good news for the industry that the major players can have a sit-down and work this sort of stuff out in a coordinated manner. Particularly good that Microsoft are putting aside their usual attitude and accepting that their product is not the only one in the market.

That said... I hope they're thorough in their actual implementation of these ideas. For example, using different colours to indicate levels of security is less than ideal: you shouldn't use colour alone to convey meaning, so they'll need to add a text equivalent. It also means that if anyone figures out a way to spoof the colour system, users could be hit twice as hard, since they'll trust the fraudulent site.

Really though, that's just details. The big point is to see the vendors putting the users first and working together towards an extremely important goal. Maybe next time they could talk about coordinated rendering standards :)

For more information, check out the posts from the various people involved.


history, but no news

Tantek's Thoughts | Pandora's Box (Model) of CSS Hacks And Other Good Intentions: Well it's been 7 years since REC-CSS2 ... But we don't have any fully compliant CSS 2.1 browsers yet.

Tantek gives us a history lesson about hacks; but at the end he's forced to wrap up the same way we've been wrapping up this issue for quite some time now: avoid hacks, validate your work, don't rely on defaults and wait for the browser vendors to get up to speed. I think Tantek's patience is greater than mine - 7 years is past the stage of asking the waiter where your food is. After 7 years we should be chasing the chef up the street with flaming torches and pitchforks.

Well ok, maybe not. But it is frustrating to be endlessly waiting for the day when we can actually build to standards, as opposed to building to standards and then throwing in a bunch of hacks to cater for all the browser quirks.


simple things that still need saying

Some articles are groundbreaking. Some are good reminders about simple things. Some illustrate how far we haven't progressed by remaining relevant years later (in IT that should never really happen!). Still, these are all well worth reading :)

2005-11-20: 301 redirections guarantee nothing

I've seen a few SEO consultants recommend using 301 HTTP redirections to forward traffic from an old location to a new location. It seems logical enough, since that's what it's for (301 - Moved Permanently). So I tried it out when I moved a personal website. I can now say that based on my own experience, using 301 HTTP redirections guarantees nothing :)
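For anyone who hasn't poked at raw HTTP: a 301 response simply tells the client (browser or search bot) that the resource has moved permanently, and where it now lives. The exchange looks roughly like this, with made-up example.com URLs standing in for the real ones:

GET /old-page.html HTTP/1.1
Host: www.example.com

HTTP/1.1 301 Moved Permanently
Location: http://www.example.com/new-page.html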

why try this?

I decided to try using 301s after one consultant in particular insisted that it was absolutely necessary to avoid losing whatever PageRank the old location had, and that it would "ensure the site did not lose standing" in search indexes.

Well, I know consultants don't actually have some secret arrangement with the search engine companies, so they're going on guesses and anecdotal evidence. Maybe they're educated guesses and maybe they're total crap. Depends on how ethical the consultant is.

So I decided to try it out myself, since I'm not getting paid to say it works. I didn't have a specific expectation, I just decided to See What Happens™.

the scenario

For quite some time, the site in question was the third or fourth result on Google for its most relevant keyword. Unfortunately I wasn't checking the public Google PageRank before the move.

The site uses some basic, ethical SEO techniques, including meta keywords (for what they're worth) and a couple of extra keywords in the <title>. Otherwise it's just accessible XHTML.
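For illustration, the relevant markup looked something like this - the keyword values here are invented, not the site's real ones:

<head>
  <!-- example values only, not the site's real title or keywords -->
  <title>Widget Reviews - widgets, widget comparisons</title>
  <meta name="keywords" content="widgets, widget reviews, comparisons" />
</head>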

the first move

The site had been moved once before, using nothing more involved than leaving pages behind with links to the new location. The user had to actually click through. There was no noticeable dip in traffic to the site during or after the move.
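Those click-through pages were nothing fancier than this sort of thing (with an invented URL standing in for the real one):

<!-- example URL only - the real page pointed at the site's new host -->
<p>This site has moved to
<a href="http://www.example.com/newhome/">www.example.com/newhome/</a>.
Please update your bookmarks!</p>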

the second move

Eventually I decided to move it again, which is when I decided to use 301 HTTP forwarding, like the SEO guys tell people to do. Not having access to the server config, I just plunked a .htaccess file into the directory. In that .htaccess, each old file had a forward to its specific new home.
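The .htaccess contents were along these lines - one mod_alias Redirect per page, with made-up filenames and domain standing in for the real ones:

# 301 each old page to its specific new home (example URLs only)
Redirect 301 /index.html http://www.example.com/newsite/
Redirect 301 /about.html http://www.example.com/newsite/about.html
Redirect 301 /gallery.html http://www.example.com/newsite/gallery.html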

the result

The new page (targeted by the 301 redirections) bombed out of Google, dropping from the top ten to the bottom of the top 100 - the tenth page of results, aka oblivion :) PageRank dropped to 1 or 2, depending on when and where you checked.

Eventually I removed the .htaccess file and just put up a click-through forwarding page. A few weeks on, it's back in the top ten to fifteen on Google and its PageRank has rallied to 4 (which seems to be about average for any site with even a little traffic).

what went wrong?

Well, this was hardly a scientific test so there are a few things which may have gone wrong.

  • I may not have set up the .htaccess properly
    • Maybe specifically sending each page to its new URL is bad.
    • There were three variations of the old index filename forwarding to the new index URL - maybe that's bad.
  • Using 301 in any manner may trigger a bad ranking.
  • Using 301 to forward an entire site to a different domain may be a bad idea; perhaps you should only use them within the same domain.
  • Other factors may have been to blame and the 301 was coincidental.

No matter what, it seems reasonable to say that using 301 HTTP forwards does not guarantee you'll keep your old search ranking. Even though using 301 is correct according to the HTTP specification, search engines don't simply swap in the new URL and keep the old URL's rank and index position. My guess is that the new site either starts from scratch or - more likely in this case, I think - gets ranked lower than the old one and has to recover from the penalty.

Considering the page went back to the top ten basically right after the 301s were removed, I think the page was getting a negative flag due to the 301s. If I'm right, it's really a pity that unethical usage has ruined yet another useful technology on the 'net.

so should you use 301?

Maybe. But I would only recommend you do it if the usability benefit for your target audience outweighs the risk of dropping in search rankings. I'd guess that using it internally (ie. within your existing domain) is probably fine. Just don't think it will guarantee you won't lose search ranking!


2005-11-16: OpenOffice 2.0

I recently installed OpenOffice.org 2.0 (or OOo, as people call it in the forums) and after using it for a little while I can say this: your average user would probably never notice the difference between OpenOffice 2.0 and MS Office. The interface is neat and tidy, and the keyboard shortcuts are consistent with user expectations. The only obvious omission is revision tools, for that very small proportion of people who actually use them. Be honest: most people use the B and I buttons, the font face dropdown and the font-size dropdown... and not much else.

can it PDF?

A big benefit of OOo, especially for a free product, is its ability to export PDFs. Given the hefty price tags of MS Office and Adobe Acrobat, this should be a big consideration both for the OOo developers and prospective users.

Unfortunately this is yet another application which has PDF tagging disabled by default - score yet another one for Dumb Defaults I Have Seen. As a quick test I enabled tagging and exported a simple document (a couple of headings, some paragraphs and an image with alternative text) using the default job options. The only accessibility warning in Acrobat was the lack of a document language; tags and alternative text were all ok.

The language failure is really annoying since - at this stage - OOo doesn't let you set a document language (correct me if I'm wrong!). Initially I thought I just couldn't find the option; but the response from the OOo user forums was that, no, you can't set the document language in OOo yet. Which is very weird, since the OASIS ODF format includes a language setting; I would have thought supporting the format would include basics like letting the user specify the language of what they're sticking into these documents.

I managed to add a language value to the same document created in MS Word, which meant Word's all-default export had a slight edge over the OOo PDF. Mind you, OpenOffice's image handling is far better than Word's - I found it much easier to add alternative text in OOo.

performance

A lot of people have reported very slow start-up times for OOo, particularly for the previous version. On my beefy new WinXP work machine it flies - I can't see any speed difference between OOo and MS Office; which is interesting when you consider that OOo can't build anything into the operating system ;) My home machine is Win2k and doesn't really provide a good testbed, since it has some issues. Feel free to comment with your own experiences with OOo 2.0.

compatibility

OOo has been able to open nearly everything I've thrown at it; the only exception was a macro-laden spreadsheet that only just works in Excel (my timesheet, what joy). Word documents, average spreadsheets, PowerPoint slideshows: they all open with no trouble.

The real hassles only begin when you collaborate closely with users who don't have OOo. Round trips via MS Word don't work very well at all, and I've had a lot of trouble with bullet lists losing their bullets when exported to MS Word. I have no doubt this is why Microsoft is resisting adding ODF support: if you could round-trip the documents, I wouldn't have needed to install MS Office on my work machine.

For a stand-alone home user, no worries. On those rare occasions you have to give one of your files to someone else, spit out a Word doc, PDF or even HTML. In fact, in an office environment you'd still be ok if you don't have to do lots of back-and-forth editing with someone else.

conclusion?

Still some work to be done before it can take over the corporate workplace, but for a home user... save your money, give the finger to Microsoft and use Open Office!

2005-11-14: google pimps firefox?

Apparently Google is going to start paying US$1 per referral to get users to download and install Firefox: InformationWeek > Google > Google Gets Closer To Firefox > November 9, 2005: [Google] is providing its ad publishers with a set of buttons that Web site visitors can use to download Firefox with the Google Toolbar ... Site operators using the buttons will be paid $1 each time someone installs the browser and toolbar. I'm yet to find anything on Google's public website about this, but that could just be due to Google's website being crap. Update: Google does mention the general situation at https://www.google.com/adsense/referrals, although you probably have to sign up to AdSense before they'll admit the actual dollar amount.

I guess the benefit to Google is getting more users running their toolbar, so the toolbar must collect data worth more (to Google) than $1 per user. I wonder how much we can genuinely trust Google?

No matter, I use Opera anyway ;)

2005-11-05: cognitive disabilities and online education


Update 2011.07.24: Since this was originally published, the case against the university was dismissed; and the university counter-sued for defamation. As part of the settlement the student has issued an apology and been ordered to remove posts made in a variety of places - including this page's comments.

The university's legal team brought this to my attention and requested I assist by removing the comments posted by the student. Since several replies didn't make sense without the original posts, I have removed all comments which directly discussed the legal fight.


The Chronicle: 8/12/2005: Lawsuit Charges Online University Does Not Accommodate Learning-Disabled Students (that's a terrible headline which translates to "online university gets sued by student with a learning disability").

A former Capella University student has filed a lawsuit against the online institution, claiming that it violated the Americans With Disabilities Act by using technology that does not accommodate his learning disabilities. ... Some experts say the student may have trouble winning the lawsuit because few clear guidelines exist dictating what assistive technologies colleges must provide to students with learning disabilities.

Murky to say the least. Learning disabilities are extremely varied and relatively hard to define, which poses significant problems for web developers. If a person can't see or can't hear, you know relatively clearly what you are dealing with. Screens can be magnified, colours can be changed, sounds can be cleaned up and amplified. The industry knows what it needs to do.

But what do you do, definably and repeatably, for people with learning disabilities? What standard can you lay down? What will work for each of ten different students? There is no alt="" for memory or cognition. If it's possible for clear standards to be created - standards which would form a W3C specification for example - the people who know what to do have not been able to communicate it widely enough yet. The web industry does not know what to do.

What little I have seen involves content-based strategies: for example, writing 'simplified but not dumbed down' versions of documents, which is not the web developer's role; they are not the subject expert, they are the technical facilitator. Content authors need to be trained in advanced writing techniques which are compatible with advanced web development techniques. Even then, it's hard to find any hard data about whether it actually works.

All this and we haven't even touched on feasibility of implementation, budget constraints and the reality of modern tertiary education.

The student who filed the lawsuit, Jeffry La Marca, says his learning disabilities include short-term memory loss, which is recognized by the federal law he cited.

...

After he completed one quarter at the university, in 2004, the administration installed a new software system, made by WebCT, for managing online courses. Mr. La Marca says he found the new setup confusing and difficult to work with. "It was just a navigational nightmare," he says. "It made it impossible for me to study."

...

He complained to university officials and asked them to switch back to the old software, which they said they could not do. Mr. La Marca then asked for more time to complete his course work, and discussed with the officials how much more time he should be given. But when the discussions became heated, he said, he was suspended from the university. Mr. La Marca claims that his suspension was in retaliation for his complaints about the software.

Greg Thome, general counsel at Capella, says Mr. La Marca's suspension was for inappropriate behavior in online courses and had nothing to do with his accusations.

I don't think they were lying about the software. They almost certainly couldn't afford to roll back to the previous system, no matter what the reason. The investment of time and money involved in implementing large applications is generally a 'no return' decision for a large organisation. That's it, they're committed. No going back. They can't afford to change for X many years, nor can they afford to uninstall it and go back to the old system. The money is gone.

Ultimately, the system should - theoretically - benefit the majority without unfairly disadvantaging any minority. The reality is that current technology at most universities has not reached perfection, or even anything close. None of the big commercial software apps are seriously standards-compliant. With that as a given - at least for the moment - there should be enough majority benefit to justify using the system, provided the minority can still access the material in order to complete their studies.

In the meantime if the technology fails, then the people must step up and cover the shortfall. Which does involve some cooperation from the student - they have to communicate their needs and work with university staff who are trying to help them.

There's an awful lot to be done in this area of web-delivered content.

IE7: sane and rational?

mezzoblue § IE7 Conditional Comments recounts a question-and-answer session with Chris Wilson and Brian Goldfarb of Microsoft about IE7 and conditional comments, transcribed by Molly Holzschlag (she has also posted about the rest of the meeting: molly.com » WaSP Microsoft Task Force Update: Upcoming Products, XAML, Acid2, SXSW, and IE7 Revealed).

One particular quote really has me stumped.

Why not implement conditional comments in CSS syntax, so we can move our filtering into external files and keep our markup clean?

Chris replies:

"I think it would be great if we provided a mechanism like conditional comments inside CSS. We’ve [Microsoft] thought about it, we’re not going to do it in IE7 because we want to do it in a sane, rational way. You’ll want conditional comments to be backward and forward compatible. Tough to design that into CSS so that it will actually work."

Microsoft wants us to do this to all our documents, to allow for IE6:

<!--[if IE]>
...link to IE-specific CSS file...
<![endif]-->

As I've said already, I think it would have been much better to provide conditional comments in the CSS rather than the page source. All they had to add to IE7 was the ability to parse something like this in a CSS file:

/* [if IE 7] 
...css...
[endif] */

...which both validates and works in other browsers (as in, it's ignored). So why on earth was it not a "sane and rational" approach?

ARGH.
