While surfing through Blogdex, I found a page about fighting with rubber bands. Not that I condone such behavior (especially since, as a nerd, I never was very good at shooting rubber bands in grade school), but the page has a certain nerdish appeal in that it describes the techniques and technology in exhaustive detail.
I may come back to tweak this essay repeatedly in the coming months. If, in this work in progress, I get a good groove going, I may move those parts of the essay into my developer section to exist as permanent documents.
I hate it when page designers use raw URLs as link text. The rationale is that if a visitor decides to hardcopy your page, they'll still be able to read the links. To me this is lame, and it's really the fault of the browser being used. Ideally the browser, before sending the page to a printer, would extract all the URLs hiding behind the link text (including anchors!) and format the page so these appear in the printed document. Lynx does this. And with a print/paged-media style sheet you can do this yourself (although it still doesn't quite work in IE6).
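The paged-media trick can be sketched like this (a minimal example; it relies on CSS2 generated content, which is exactly the part IE6 doesn't render):

```css
/* Print stylesheet: append each link's destination URL after its text. */
@media print {
  a[href]:after {
    content: " (" attr(href) ")";
    font-size: smaller;
  }
}
```

Browsers that honor it print the full URL in parentheses after every link, which is roughly what Lynx has done all along.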
How come most of the major news sites don't put links into the bodies of their articles? They have a myriad of navigation bars, tables of contents and frames, most of which are useless compared to a good site map or search function, but their actual content, the articles themselves, is just mildly formatted text. Again, what's the point of putting a document on the Web if you don't use hypertext the way it was intended?
Another thing I'd like to see in browsers is a way to warn you that the link you're about to invoke will open another browser window (or, in Opera's case, another window inside the browser). The link should be formatted differently somehow, or perhaps there should be a note in the status bar warning the user. Some site designers set their links this way because they think it's helpful, or because they don't want you to forget about their pages, and sometimes they're right. But often it's annoying, and I think it breaks the Web metaphor: a link flips the user to a new page or a section in the current page. It doesn't cause a whole other book to appear. Or it shouldn't, without warning the user first that it's going to do that.
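For links marked up with a `target` attribute, a style sheet can at least flag them visually without waiting for browser vendors (a sketch only; it won't catch windows opened from JavaScript, and it assumes the site uses the common `_blank`/`_new` values):

```css
/* Flag links that will open in a new window. */
a[target="_blank"]:after,
a[target="_new"]:after {
  content: " [new window]";
  font-size: smaller;
}
```

A user style sheet with these rules gives the warning even on sites whose designers never thought to provide one.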
I hate it when major sites break up an article into separate web pages. Invariably, if the option is provided, I invoke the link that fuses the article into one page so I can scroll from top to bottom. If this causes problems in hardcopying the page, that’s a browser design problem, not mine. I have a theory about the motivations behind breaking up long articles into separate pages: This forces you to view more advertisements.
Jake still complains that some people dislike scrolling through long outlines of links (see point six in his Top 10 Mistakes), but I think this is because links by themselves in a nested, bulleted list aren't informative. Each bulleted point, in addition to pointing to a page and having descriptive link text, should also have a short paragraph describing the page in more detail. This makes well-designed site maps worth reading, and people are willing to scroll through them. The site map itself becomes content, not merely navigation. Readers can see how you imagine your site to be structured. They may not agree with you, but at least they can still find things. Site maps must become content and always be kept up to date.
Server-side scripts and page generation
Server scripts should only be used for Web I/O against a database or flat file, and that means only for changing content. They shouldn't be wasted on generating page layout unless it's all pre-fabricated before being uploaded to the Web directory. Why waste CPU cycles changing the page layout when you have CSS and client-side scripts? Why load the server down in real time when you can pre-fabricate all the alternative page layouts as static HTML files beforehand?
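The pre-fabrication idea can be sketched in a few lines: render every article in every layout to static HTML once, offline, so the server only ever hands out flat files. The template and article names here are hypothetical, just to show the shape of the loop:

```python
# Sketch: pre-fabricate every article/layout combination as static HTML,
# so no layout work happens on the server at request time.
from string import Template

LAYOUTS = {
    "plain": Template("<html><body>$body</body></html>"),
    "boxed": Template("<html><body><div class='box'>$body</div></body></html>"),
}

ARTICLES = {"hello": "<p>Hello, Web.</p>"}

def prefabricate():
    """Return {filename: html} for every article rendered in every layout."""
    pages = {}
    for slug, body in ARTICLES.items():
        for layout_name, layout in LAYOUTS.items():
            pages[f"{slug}.{layout_name}.html"] = layout.substitute(body=body)
    return pages

if __name__ == "__main__":
    # Write the flat files that would then be uploaded to the Web directory.
    for name, html in prefabricate().items():
        print(name)
```

The trade-off is disk space for CPU: every layout variant exists as a file, and the server does nothing but serve them.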
One reason people may not use links as much as they should is the legal issues that arise. If I point to a site with questionable material, then in some cases, in some countries, I may be guilty of illegality myself. If I make illegal material easy to find, I could be in trouble. This seems stupid to me. Search engines often link to the same sites. It may require a little digging, but you can find links to DeCSS in Google, DMOZ and so on. Is someone going to sue them? Or is this really just to intimidate the little people?
Tim Berners-Lee is still not happy with the Web. He thinks it can be better. Ted Nelson, or at least the staff on Project Xanadu, have often complained that the World Wide Web is an incomplete implementation of the hypertext idea.
Just sent a dispatch off to Ms. Carlysle (whose nom de guerre is Laughing Cow Cheese) about investment in nanotech. The resulting letter was so witty I figured it would make a good entry here.
What spurred the letter was some news yesterday that Lucent had managed a breakthrough in molecular transistors and, I guess, some earlier research that Hewlett-Packard did. Never mind the decade of work that IBM has done.
While looking into the Lucent story I cited yesterday (I notice there are a lot of gratuitous photos of the young, freshly scrubbed and photogenic researchers over there at Bell*), I came across a site that claims to be investment news geared towards nano.
The trouble is that investing in nano now is sort of like investing, at the turn of the twentieth century, in Edison Electric (which, after a byzantine series of buyouts, mergers and other financial chicanery, would eventually become the enormous multinational known as GE), Siemens (still a chemical lab run by some German academics trying to make money out of a nineteenth-century toxic waste called coal tar) and Mitsubishi (still privately owned by a family of post-Meiji, ex-samurai capitalists) in hopes that you might make some big coin on a vaguely imagined future of consumer electronics and the potential of radio (having heard of the tuned-rod experiments of Hertz, Helmholtz and Marconi).
In other words: Total blue sky.
* It makes me wonder if they have what it takes. If their science geeks looked like real nerds (dandruff, glasses, don't-give-a-damn haircuts and bad skin) maybe I'd be more assured.