Asking Permission First

Tim Bray has an interesting take on the use of AJAX: rather than have your server do the data processing, use AJAX to grab the data and then have the clients do the work:

A server’s compute resources are usually at a premium, because it’s, you know, serving lots of different client computers out there. Lots of different computers, you say; and how busy are they? Not very; your average Personal Computer usually enjoys over 90% idle time. So you’ve got a network with a small number of heavily-loaded servers and a huge number of lightly-loaded clients. Where do you think it makes sense to send the computation?

The thing is, you know what's happening on your server, but you don't know what's happening on each individual client machine. In addition, you don't know what each client can and cannot support. You don't even know if your client has JavaScript turned on to allow you to do the processing.

I agree that we can do some interesting work with Ajax and grabbing data from the server and processing it on the clients. Perhaps we need to explore some newer uses of JavaScript and RDF in light of the new server-client interoperability.
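
For illustration only, the kind of client-side processing Tim describes might look something like this rough sketch. The URL, the data format, and the renderPhotoList function are all made up for the example, and older versions of IE would need the ActiveX equivalent of XMLHttpRequest:

    // Ask the server for raw data once; the visitor's machine does the rest
    var req = new XMLHttpRequest();
    req.open("GET", "/data/photos.xml", true);
    req.onreadystatechange = function () {
        if (req.readyState == 4 && req.status == 200) {
            // All of the filtering, sorting, and page building happens here,
            // on the client, instead of on the busy server
            var photos = req.responseXML.getElementsByTagName("photo");
            renderPhotoList(photos); // hypothetical function that writes the list into the page
        }
    };
    req.send(null);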

However, a developer's first priority is to do no harm to those using their application. Their second priority is to ensure their pages are accessible to their target audience. If we start making assumptions that the client's machine is ours to do with what we will, we won't need hackers to inject harm into our scripts; we'll do a fine job of it ourselves.

Shelley

About this entry:

Author: Shelley Powers (email)
Published: April 7, 2006 at 8:59 am
Categories: Technology, SemanticWeb
Comment Status: closed with 26 Comments

Comments
[ 1 ]   11:40 am 4/7/2006

The nice part of working primarily on intranet applications is that I know exactly who my target audience is and what they have on their machines. Of course, that’s also the down side.

Scott Reynen

[ 2 ]   1:25 pm 4/7/2006

The thing is, you know what's happening on your server, but you don't know what's happening on each individual client machine. In addition, you don't know what each client can and cannot support. You don't even know if your client has JavaScript turned on to allow you to do the processing.

Well sure you do. Or at least you can infer it. We used meta refresh with XmlHttpRequest a while back to check and see if the user had JavaScript turned on. You set a refresh to a "sorry, you need javascript" page and then make an XmlHttpRequest using JS back to the server. If you get a query from the client to your server (ours was something like "{nameofPage}?clientscript=yes", but at the time we were using classic ASP and just used document.location.href instead of XmlHttpRequest), then you send back a JS redirect sending them to the same page but w/o the meta refresh. In addition, most server-side scripting languages have a rich browser detection/user agent sniffing framework.
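
Roughly, the top of the page looked something like this (the page names here are just placeholders, not what we actually used):

    <!-- Fallback: if no script ever runs, send the visitor to a "you need JavaScript" page -->
    <meta http-equiv="refresh" content="3;url=needjs.html">

    <script type="text/javascript">
    // If this runs, script is on: bounce back to the server with a flag,
    // and the server returns the same page without the meta refresh
    document.location.href = "index.asp?clientscript=yes";
    </script>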

Scott

[ 3 ]   1:26 pm 4/7/2006

The old is new again. I used this technique about 10 years ago for a metasearch engine I wrote, only it used Java instead of JavaScript. Perhaps it's time to bring it back again.

Sean Conner

[ 4 ]   4:57 pm 4/7/2006

Scott R, I hear you.

Scott, the point I was making is we don’t know if a person will have JS or not before we put all our functionality into the JS. I’m not overfond of “You need to have JS to have access to all this neat stuff.”

Sean, the old is new again. Look what's happened to DHTML. You should bring it back.

Shelley

[ 5 ]   5:42 pm 4/7/2006

I’m not overfond of “You need to have JS to have access to all this neat stuff.”

Incidentally, Shelley, your pages act like shit in my Lynx browser.

Seth Russell

[ 6 ]   5:48 pm 4/7/2006

your pages act like shit in my Lynx browser.

"Please consider upgrading your browser to Internet Explorer for the full Web 2.0 experience!!!"

Arthur

[ 7 ]   5:55 pm 4/7/2006

Please consider upgrading your browser to Internet Explorer for the full Web 2.0 experience!!!

… can't, the screen reader keeps saying "grafix" "graphix" or something … we can't upgrade our household until everyone has equal access. What's Web 2.0? Is that where you can see pictures? Who needs it, we're all blind here.

Seth Russell

[ 8 ]   6:19 pm 4/7/2006

Seth, everyone’s pages look like shit in Lynx.
But my pages are readable if you eliminate the CSS. The page validates. A person doesn't have to have JS enabled. And they're accessible, except I don't always label images, though I try. It 'looks' good in a browser reader. I know, I've tried it.

But if you want a text only page, send me an email and I’ll give you the username/password for the fullcontent atom feed. Then you won’t have to visit with Lynx and look at shitty pages.

Shelley

[ 9 ]   6:49 pm 4/7/2006

Shelley, the point is that at some point, you have to break free of the older technology. I was still supporting Lynx (because of my blind customers) back in '99 when everybody else had cut it loose. But there is a tipping point when you say "enough is enough". It's a judgment call. Me, i think it's time to rely on JS. But i can accept that your mileage may vary.

Seth Russell

[ 10 ]   6:53 pm 4/7/2006

I wouldn’t say your site looks like shit in lynx. It looks mostly fine. You have a minor problem where your navigation links at the top run together, which is not great for blind users; I would probably try to put some space or a bullet or something between the items, then use CSS to hide whatever character you use there. No impact on the visual design, but helps accessibility. (What can I say, I’ve been on an accessibility kick lately….)
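
Something roughly like this, say (the class name is just an example):

    <a href="/">Home</a><span class="navsep"> | </span>
    <a href="/archives">Archives</a><span class="navsep"> | </span>
    <a href="/about">About</a>

    /* and in the stylesheet: CSS-aware browsers hide the separator,
       while Lynx ignores the stylesheet, so the links stay separated there */
    .navsep { display: none; }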

ralph

[ 11 ]   9:34 pm 4/7/2006

Seth, it's only been in the last year that there have been serious JS security breaks. It's not unusual for people to turn off JS. Sites can use as much JS as they want, as long as they realize they may make the site inaccessible to some of their readers. Notice that Tim Bray used it for his photo? It's a nice addition to his site, but not essential.

Ralph, thanks for that. I added a space.

Shelley

[ 12 ]   10:21 pm 4/7/2006

Great topic!

Public cost, private profit. Sound familiar? Then there are other issues. Myself, I see similarities with the fashionably efficient marketer, lawyer, and accountant hegemony that's strangling public, business, and political life. A little assumption, built around an ill-defined shared goal, can escalate a slice at a time into something else. More complicated and resource-hungry systems that keep developers and manufacturers in business? One could think so.

I don't buy Seth's a priori argument that you've got to break free of old technology. Sure, drop something that's going nowhere or causing more harm than good. I can agree with that. Where I draw the line is repeating mistakes like the broken or unsupported driver syndrome that was so common only a few years ago. Arrogance and neglect on the product level were magnified across whole communities, and no marketing man has ever stood up and admitted that.

Somewhere between Shelley's and Seth's positions is, I think, something closer to sanity. Change from worse to better, whether it's via increased simplicity, usability, or usefulness, isn't a bad thing. Running experiments that disrupt people's lives and cause them to pick up the tab is something else. Systems developers who use these tools need to stop and think. Customers need to stop and think. If they don't, history will repeat itself all over again. Short-term gain, long-term pain.

My take on the whole deal is that this is just another technical and marketing jolly. The lessons developers and customers will be learning from this will be difficult, expensive, and drawn out. Progress will be hampered by running before we can walk. Reputations will be made and lost. However, for those who seize the opportunity to learn about what's delivered and talked about, as well as how that affects them, I think it will lead to useful and positive outcomes.

On a more personal level, I’ve only just started using RSS feeds for a few sites. It’s not that I don’t understand this stuff or how it works. Heck, I could design, code, and market something like this myself, if I had the inclination. What put me off was the hype element to it, the mad dash, and how different sites handled things in different ways. I didn’t just have to get my head around one thing, I had to process a million different things.

Looking at programming languages like C and C++, or products like mobile telephones, you can see how something that's initially badly designed can develop a life of its own. Each stage of development is weirder than the last and can be rushed out before it's ready. Too much freedom, or too much power in the wrong hands, leads to poor products and design, and a poor experience for those who use them as tools of the trade, like developers, or in their personal lives, like customers.

I think ignorance and selfishness have a lot to answer for, not just for the creations and people we see around us, but for what we do and how we deal with things. The wheel of misery turns from us, here, now. That's why, I think, it's a good idea to develop better intellectual and emotional strategies that more accurately balance us across the line of correct action that Buddhism advocates. Digging beneath knowledge and skills, which are a commodity, the substance of our character matters. And that's why, ironically, one must embrace these difficulties.

Sorry, just gone a bit Dave Rogers on everyone.

Charles E. Hardwidge

[ 13 ]   11:27 pm 4/7/2006

Charles, i didn't mean to imply that there should be some general imperative "to break free of old technology". I'm just saying that i have recognized a particular turning point here. Now it is practical to create applications where we have program control at both ends of the wire — client and server. This is a big change. Thanks to the support of a standardized DOM in modern browsers, most surfers can make practical use of applications like Gmail and Virtual Earth. It's not hype to support the JavaScript end of this newly deployed technology and shout it from the computer tops that IT WORKS. It seems to me that Shelley is reluctant to fully embrace this change in our environment. Well … come back in a year and ask her what she thinks.

… i don't know if this makes it any clearer, but it was fun to draw. Since i don't know Shelley's avatar, i chose a hero from her Flickr.

Seth Russell

[ 14 ]   12:34 am 4/8/2006

Re: lynx:

I've found that if the markup is mostly semantic, with no presentational tables, and the layout is applied via CSS, then the result will be excellently usable with text-only browsers almost as a side effect.
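
A bare-bones sketch of what I mean (the ids and widths are arbitrary):

    <!-- Content in source order, no layout tables -->
    <div id="content">
        <h1>Post title</h1>
        <p>Post text goes here.</p>
    </div>
    <div id="sidebar">
        <p>Navigation, archives, and so on.</p>
    </div>

    /* The stylesheet supplies the visual layout; Lynx never sees it
       and simply presents the content in source order */
    #content { float: left; width: 70%; }
    #sidebar { float: right; width: 25%; }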

Aristotle Pagaltzis

[ 15 ]   8:23 am 4/8/2006

Seth, if you have comments about what I say, you will address them to me rather than to others in the third person.

I have written books on JavaScript, I was in the room when Scott Isaacs and Microsoft first introduced DHTML to authors, I wrote one of the first books on it, and I've been using it since its inception. My current book is on JavaScript.

I am aware of its power and capabilities. I am also aware that people can get caught up in the "Gee whiz! Look at what we can do!" factor, and make some bad errors in judgment when it comes to JS. Including limiting access to their sites.

Tim’s idea is interesting. Where does it fail? Because we don’t have any–any–control of the client side of the application, other than what we’re given by the JS engine. What the client machines can handle is largely unknown. And it’s a piss poor developer who throws her code into the unknown.

Shelley

[ 16 ]   10:13 am 4/8/2006

Seth, if you have comments about what I say, you will address them to me rather than to others in the third person.

Yes, ma'am … it will never happen again.

Tim’s idea is interesting. Where does it fail? Because we don’t have any–any–control of the client side of the application, other than what we’re given by the JS engine. What the client machines can handle is largely unknown. And it’s a piss poor developer who throws her code into the unknown.

I thought the DOM standardized what the client side can do … and it seems to me it does a pretty good job of it. What did i miss? I think a person who turns off JS, or runs with a substandard browser, has already decided they don't care to see these applications. Sorry, Shelley, it does not make any sense for me not to deploy this technology for reasons like that.

Here is a case in point: my cartoon maker has some problems in IE 6 because of a bug in z-index on that browser … but my friend at Microsoft assures me that has been fixed in IE 7. My solution: advise people who want to draw my cartoons to use FF. What is your solution … don’t deploy the cartoon maker at my site?

The Internet is still experimental. Things in this explosion are not always going to work. With all due respect, methinks you are a bit too cautious.

Seth Russell

[ 17 ]   12:20 pm 4/8/2006

Well, Seth. Thanks for extracting some meaning from the waffle.

Myself, I think, the real issue isn't knowledge or skill, of which technology is an implementation or perspective. The real problem is attitude. As Shelley suggests, the problem is pushing too hard with the wrong stuff. You yourself admit the internet is experimental. What producers and consumers need to wake up to, more consciously, is the fact that ambitions have soared and delivery has plummeted. The gap between the two creates misery for both.

You get exactly the same phenomenon in politics and law. With both, like the internet, the tipping point you talk about was reached before it started. None of these things solve clearly defined problems in a competent way. Rather than rushing forward with grand statements and armfuls of legislation, like the mad dash to Web 2.0, I would prefer we dug deeper into the simple fundamentals instead of building more legacy implementations for the future.

When politics, law, or the internet becomes something you do for the sake of it, like bad religion and marketing, you lose track of the absolutes and start building on previous missteps. As you increasingly lose sight of what it’s all about, you end up becoming the exact opposite of what you’re supposed to be. The majestic towers you create today are the ruins of tomorrow. War hasn’t changed since time immemorial, only the price of participation has gone up.

The point I'm trying to make here is that the problem isn't Web 2.0. The problem is the character of individuals and society, the stresses between the two, and what the overall structure and balance should look like. At the moment, I think, industries, companies, and other assorted special interests are putting their own ambitions before the good of the many. It's fundamental errors like this that contribute to economic and social costs. Technology and technologists need to recognise the social component of their work.

Is Web 2.0 an improvement? Yes and no. Should we hold back? Yes and no. It's here, and something we'll all have to deal with, but it remains a stepping stone, an experiment. In many ways it is a step in the right direction, but it's a small and imperfect step. By striking the balance between Shelley's circumspection and Seth's enthusiasm, I hope, people can develop better-quality ideas about where to go next. Do it right and everyone can win.

Charles E. Hardwidge

[ 18 ]   3:31 pm 4/8/2006

None of these things solve clearly defined problems in a competent way. Rather than rushing forward with grand statements and armfuls of legislation, like the mad dash to Web 2.0, I would prefer we dug deeper into the simple fundamentals instead of building more legacy implementations for the future.

I wonder about the wisdom of $100.00 laptops being delivered to children throughout the world. What “problem” do they solve, and is a $100.00 laptop the best solution?

We just implicitly trust the virtue and utility of technology built with good intentions. Then we just sort of shrug when the unintended consequences come along.

Some skepticism, some deliberation, some sobriety is probably in order at some point.

dave rogers

[ 19 ]   3:42 pm 4/8/2006

Seth and Charles, I think that you’re both laboring under a misunderstanding of what I wrote. I use JavaScript at my site. I use JS at most sites. I have created DHTML games and photo shows and all sorts of applications based on JS.

I think that Flickr’s use of JS is wonderful. What I think is best about Flickr’s use of JS is that the site degrades gracefully. And you can see what I mean by this phrase by searching on this term at my site.
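
To make that concrete, here's a rough sketch of what I mean by degrading gracefully; the slideshow function is hypothetical:

    <!-- The plain link works with JS off; script upgrades it in place -->
    <a href="photos.html" id="showLink">View the photos</a>

    <script type="text/javascript">
    // Only attach the fancy behavior if the browser supports what we need
    if (document.getElementById && document.getElementById("showLink")) {
        document.getElementById("showLink").onclick = function () {
            startSlideshow(); // hypothetical DHTML slideshow
            return false;     // cancel the normal page load
        };
    }
    </script>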

Web sites that base all of their activity on JS, or AJAX, or DHTML are telling those who can't or won't enable JS: we don't want you. That's the site's choice, but I think doing the "We don't want you, come back when you're using the right browser or have JS turned on" routine is not the mark of a smart page developer. A clever one, perhaps; but not a smart one.

As for Tim, I think he's introduced an interesting idea. But his suggestion of pushing some of the server-side processing onto the front end, just because the client's capacity is sitting idle, is not something I would recommend many page developers adopt as a design philosophy.

But then, I also believe in stripping out certain characters from user input before sending it to the database, and that attempting to bypass the sandbox security set on the clipboard is one of the scariest security risks I've seen in some time–but hey! Perhaps my web page readers like to live on the edge without any choice.
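
By way of example only, the sort of stripping I mean looks something like this; the character list is only illustrative, and parameterized queries are the stronger defense where the platform offers them:

    // Remove characters commonly abused in script and SQL injection
    // before the value goes anywhere near the database
    function cleanInput(value) {
        return value.replace(/[<>"';\\]/g, "");
    }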

Shelley

[ 20 ]   3:55 pm 4/8/2006

“Technology and technologists need to recognise the social component of their work.”

Charles, I can agree without reservation that technologists need to recognise the social component of our work. I also agree with you, Dave, that many new technologies are greeted as if they are all good.

What concerns me is that too often when a person expresses reservation about a new tech or a new idea, they (we) are branded as Luddites, the enemy, the killjoy, or the person who takes all color out of life (I think that was a recent description).

But how are non-techs going to know if there is potential harm associated with a tech if we don’t verbalize our concerns?

As for those 100.00 laptops–I’ve not seen that having access to the internet has increased tolerance, created greater equality for minority faiths and races, or helped women realize the potential of making up 50% of the planet.

Look at India. Thanks to modern medicine, baby girls are being aborted in many areas of the country because the value of women in the society is so much less than that of men. This in a country that is now a favorite outsource location for many of the technology companies in this country and Europe. The irony of the situation is that the country is heavily overpopulated. If the trend continues with a higher male birth rate, overpopulation may end up a thing of the past.

The point is, any technology can be used for ‘harm’. Most can be used for good. Most times, the only difference between the two is intention.

Shelley

[ 21 ]   4:09 pm 4/8/2006

PS Seth, apologies for snapping at you. And I don't see anything at all wrong with informing people that your drawing tool only works in certain browsers. It's not a critical component of your site. Even if it were, it's your choice.

Charles, your writing isn’t waffling at all. You’ve written MUCH to think on that I believe goes beyond this post. And inspired me to think of new posts. So thank you.

I appreciate when people do a “Dave Rogers” in my comments. Even when the person doing so is Dave Rogers.

Shelley

[ 22 ]   7:02 pm 4/8/2006

Some scepticism, some deliberation, some sobriety is probably in order at some point.

We all know what happens when the top pushes too hard and the bottom doesn’t push hard enough. You end up growing social and economic problems over time, which leads to degrees of revolution or a crash. Like the manic depressive, this produces a lot of action but it’s a loser in the long haul. My conclusion, like your own, is that improvement begins at home.

What concerns me is that too often when a person expresses reservation about a new tech or a new idea, they (we) are branded as Luddites, the enemy, the killjoy, or the person who takes all color out of life (I think that was a recent description).

Lessons from history, science, and politics do provide interesting examples to learn from. For instance, the Romans always preferred to work around a city rather than be dragged into a siege. Buddhist philosophy suggests a more indirect focus and patience have benefits. Shinto advocates building on positive consensus. How we handle things matters.

As for those 100.00 laptops–I’ve not seen that having access to the internet has increased tolerance, created greater equality for minority faiths and races, or helped women realize the potential of making up 50% of the planet.

Challenges, knowledge, and skills abound. What’s missing is character. To succeed well and in a qualitative sense takes vision, action, and persistence. The strong create the weak, and the weak create the strong. Both sides have to be held to account, and that’s where, like other issues, developing a better character is important to delivering better results.

You’ve written MUCH to think on that I believe goes beyond this post.

I think of little else. It’s the doing part I find hard.

Charles E. Hardwidge

[ 23 ]   7:06 pm 4/8/2006

Seth writes: "It's not hype to support the JavaScript end of this newly deployed technology and shout it from the computer tops that IT WORKS."

True story: last Wednesday I was visiting an old friend. He had a new Dell laptop, very nice, very expensive, fully loaded. He himself is not very technical but his girlfriend is. I think his girlfriend got him the computer as a gift and set it up for him.

Anyway, he asked what I'd been up to lately. I was feeling proud of a music site I worked on this winter: it had a lot of nice little tricks, some JavaScript stuff.

I tried to show it to him. The site looked wrong. The JavaScript trick we were using to hook an MP3 player to the music files was simply not there on the screen. Mind you, we were using Firefox 1.5 on a new laptop from Dell.

Turns out his girlfriend (or perhaps Dell?) tried to protect him by putting in a JavaScript blocker. I'm not sure of the details of how it works, but it deactivated a good bit of the JavaScript on the site. It was like a pop-up blocker, but it did more than just block pop-ups.

In the end, we realized there was an icon down in the system tray which, when we clicked on it, allowed us to say "Make an exception for this site." So then he got to see the site as it is supposed to be seen. But if I hadn't been there to tell him that something was wrong, he would never have known to make an exception and allow JavaScript on this site. He himself didn't seem aware that his girlfriend (or maybe Dell) had installed the JavaScript blocker.

For me, it was a reminder of how much one cannot rely on JavaScript.

Lawrence Krubner

[ 24 ]   9:26 am 4/9/2006

Lawrence, for me that is an example of trying to make the Internet "safe" by using a tool which throws out the baby with the bathwater. I'll bet it was a free tool with a marketing plan behind it too. Making an inherently wild environment "safe" is a losing proposition. It's like trying to bring your baby up in a germ-free environment … all you end up doing is giving them a deficient immune system. For me the pop-up blockers that come with the modern browsers are sufficient. They eliminate most of the pestilence, but not all. I wish this was as simple as Charles wants it to be and we could all just "Do it right", and that would solve all of our problems with one stroke … but i think we had better do an environmental impact study of that meme before it gets out.

Shelley, where'd the Specks go?

Seth Russell

[ 25 ]   12:11 pm 4/9/2006

Same discussion, different day, eh?
The players' names may have changed, but the teams remain the same: the wowsers agin the infonauts.

To recap, the wowsers are those folks who insist that we all have to get new machines, at least as good as theirs, so we can see all the wunnerful stuff they are producing and make sure that we tell everybody how kewl they are.

the infonauts are folks who are more concerned with getting information out here, and are not insisting that there is only one true way to code.

This is the next chapter in the same tired story.
If this 'technology' and these 'techniques' are so mainstream, why do the wowsers continue to call them 'tricks'?

the head lemur

[ 26 ]   2:19 pm 4/9/2006

I think, I understand the Tao much better after this discussion.

This unintended consequence has been very illuminating.

Less a tipping point, more a confirmation.

Charles E. Hardwidge



Thanks to all those who have contributed to the discussion. Current policy at this site closes comments automatically after a certain period of time has passed. Please feel free to contact the author directly with additional comments and/or questions.