Tag: scripts

  • Those Little Interconnected Things

    Ok. Now I’ve had my six days of fame.[1] Better get back to the regular blogging programming and routine. 😛 But let me ask you first: how do a Web event, an impending increase in domain name prices, browser incompatibilities, and advertising limitations result in my having to think of making a new WordPress theme for my site?

    During the time before the 2nd CSS Naked Day, I decided to make a plugin for WordPress that would strip every piece of stylesheet information from a Web page. It was somewhat successful: I was receiving only 50–100 unique visitors a day before creating the plugin, and my statistics plugin jumped to 200–300 unique visitors a day afterwards. In addition to the plugin, Dustin’s pun resulted in a lot more SERP referrals. More visitors mean higher rankings; my Alexa rank went from above 3 million to just above 700 thousand in 10 days.
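
    For the curious, the heart of such a plugin is tiny. Here is a minimal sketch of the idea, assuming output buffering started from the template_redirect hook; the function names are hypothetical, not the plugin’s actual code:

        <?php
        /*
        Plugin Name: CSS Naked Day (sketch)
        */

        // Buffer the whole page, then strip stylesheet information on flush.
        add_action('template_redirect', 'naked_day_start_buffer');

        function naked_day_start_buffer() {
            ob_start('naked_day_strip_css');
        }

        function naked_day_strip_css($html) {
            // Drop <link rel="stylesheet" ...> elements.
            $html = preg_replace('#<link[^>]+rel=["\']stylesheet["\'][^>]*>#i', '', $html);
            // Drop embedded <style> blocks.
            $html = preg_replace('#<style[^>]*>.*?</style>#is', '', $html);
            // Drop inline style="..." attributes.
            $html = preg_replace('#\sstyle=("[^"]*"|\'[^\']*\')#i', '', $html);
            return $html;
        }
        ?>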

    Since Text Link Ads considers Alexa rank as one basis for accepting ad publishers, I thought the improvement would work in my favor. And with the impending increase in .com domain name prices, I’m starting to think I really need the money. But I still haven’t had ad placements since I reinstalled TLA on my theme,[2] so I think it would be better to go back to Google AdSense [or at least serve it alongside TLA], which I used even before TLA. I was just frustrated that AdSense won’t serve XML-compatible scripts, or at least a <noscript> fallback, for those who don’t want to or cannot run scripts of the document.write kind.
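
    The root of the problem is that document.write() is simply unavailable in documents served as application/xhtml+xml, so an ad script built around it dies silently. The usual workaround is to insert the script element through the DOM instead; a hypothetical sketch, not AdSense’s actual code:

        // Under application/xhtml+xml there is no document.write(), so a
        // script element has to be built through the DOM instead.
        // Hypothetical sketch; this is not AdSense's actual loader.
        function insertScript(containerId, src) {
            var xhtmlNS = 'http://www.w3.org/1999/xhtml';
            var script = document.createElementNS(xhtmlNS, 'script');
            script.setAttribute('type', 'text/javascript');
            script.setAttribute('src', src);
            document.getElementById(containerId).appendChild(script);
        }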

    Since it’s just as hard to modify a theme to contain ad spaces as it is to make one from scratch, I thought it would be better to move the site to a Version 4. And because WordPress has deprecated some functions since 2.1, and WordPress 2.2 is just around the corner, I think I’m better off making a new one.

    I then thought of the need to create a theme served only as Content-Type: text/html, since Windows Internet Explorer 7 still has no intention of accepting true XHTML. But I have doubts about doing so, since I’ve been a fan of the XML rules[3] imposed on HTML ever since I learned them. I’ve also read articles on how to use AdSense with true XHTML pages.[4] So I will most probably stick to my current content negotiation scheme.
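
    That scheme boils down to inspecting the Accept header before choosing a MIME type. A minimal sketch of the idea; a full implementation would also honor the header’s q-values:

        <?php
        // Serve real XHTML only to browsers that claim to accept it;
        // MSIE (IE7 included) never sends application/xhtml+xml.
        $accept = isset($_SERVER['HTTP_ACCEPT']) ? $_SERVER['HTTP_ACCEPT'] : '';

        if (strpos($accept, 'application/xhtml+xml') !== false) {
            header('Content-Type: application/xhtml+xml; charset=utf-8');
        } else {
            header('Content-Type: text/html; charset=utf-8');
        }
        ?>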

    I actually started making a template with a Web 2.0-ish theme a week ago, based on some tutorials I found on the Web. But upon showing it to Shari, she told me it was a bit too bright. So I guess it will have to be redesigned, since I don’t want my handful of regular readers straining their eyes, or looking at [or rather, getting distracted by] the design more than they do at the content of my articles.

    So, I guess you’ll just have to wait for the next version of this site. I am finally going to a pool to swim tomorrow, so don’t expect it to be that soon. *excited* 😛

    Footnotes:

    1. ^ April 1–6 recorded ~2500 hits from human visitors alone—more than half of each previous month’s worth of page views, even without Bad Behavior blocking robots.
    2. ^ Maybe because of the irrelevant keywords? IDK. I just hope not.
    3. ^ Must be well-formed, lowercase, etc. Therefore, cleaner and more readable.
    4. ^ One from Keystone Websites, and another from CSSplay thanks to Sir Regnard.
  • On Nofollow, Spam and Plugins

    When the search engine giant Google announced that it would implement the rel="nofollow" directive in its crawlers, most people hoped it would be the end of comment spam, especially when search competitors Yahoo! and MSN expressed support for the microformat as well.
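
    In markup terms, the directive is just an attribute value telling crawlers not to count a link as an endorsement:

        <a href="http://example.com/" rel="nofollow">a comment author's link</a>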

    But as the years passed, even with WordPress supporting the rel="nofollow" attribute from the directive’s inception, comment spam attacks on AjaLapus.com increased sharply. The most probable cause is that my homepage’s PageRank rose to 6 last 29th of January, rendering it more visible on SERPs. Growing from 50 spams a day to as many as 200, these spammers cost my server precious bandwidth and processing, and cost me time checking for false positives. The spammers could just be turning a blind eye to rel="nofollow", as spamming costs almost—if not absolutely—nothing to spread.

    In the words of Ben Hammersley:

    If the playing field is levelled by rel="nofollow", then everyone involved will be forced to try all the harder to get their links out there. The blogosphere will be hit all the harder because of the need to maximise the gains.

    Besides, these spammers are not only aiming to be displayed on SERPs; they are trying to get clicked by human visitors as well. And even if 99% of the blogs out there used rel="nofollow", the remaining 689,000[1] blogs that don’t could easily be found by spambots merely crawling every link they encounter. Why bother scanning for the use of rel="nofollow" when you could just post the spam anyway? These spammers affiliate with porn, pill and casino advertisers that earn thousands of dollars of revenue from clicks and visits by real people, and receive commissions in turn—providing the motivation for more spamming.

    But has this initiative from Google done its job? Many people do not think so. Aside from Ben, others regard it as an utter failure.

    As Dylan Tweney puts it:

    Worse, nofollow has another, more pernicious effect, which is that it reduces the value of legitimate comments.

    Nofollow would also reduce the motivation to comment on blogs: there’s no way we could benefit from reacting to someone else’s blog entry if our links are regarded as nonexistent. So much for Web 2.0 and Web interaction. I know I have experienced this hesitation many times before, though it has somewhat dissipated with these realizations.

    Jeremy Zawodny has a better angle on this matter:

    I’ve seen that first hand. The “psychology of linking” did change in a fairly obvious way after nofollow started.

    ….

    Look. Linking is part of what makes the web work. If you’re actually concerned about every link you make being counted in some global database of site endorsements, you’re probably over-thinking just a bit.

    Straight to the point. So what do I do now, since WordPress has no way of deactivating the addition of rel="nofollow" on comment URIs short of hacking the source code? I’ve looked through Andy Beard’s Ultimate List of DoFollow Plugins and found two different plugins that suit my taste: Kimmo’s DoFollow and Link Love.
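
    Under the hood, such a plugin is essentially a one-filter affair. Roughly, and this is only a sketch from memory rather than either plugin’s actual code, it hooks the filter WordPress runs on comment author links and strips the attribute back out:

        <?php
        // WordPress hard-codes rel='external nofollow' into comment
        // author links; a dofollow plugin filters the attribute back
        // out of the generated markup. Sketch only.
        add_filter('get_comment_author_link', 'my_remove_nofollow');

        function my_remove_nofollow($link) {
            return preg_replace('#\s*rel=["\'][^"\']*nofollow[^"\']*["\']#i', '', $link);
        }
        ?>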

    I currently use Kimmo’s DoFollow, as it was the first one that got me interested. But I think I need input from you guys: which of the two do you think would better motivate commenters on my blog? The one in which they know their links will eventually be followable [DoFollow], or the other in which they have to reach a somewhat obtrusive number of comments[2] across the whole site before their links become followable [Link Love]?

    If you’re thinking that I may then be vulnerable to spam comments gaining ranking from my site: I wouldn’t worry, since Akismet has done a good[3] job of screening spam for me. I think Dougal Campbell made me realize this.

    And I am planning to add another plugin that automatically closes comments on the older entries that most spammers tend to target. I know such plugins exist; I just can’t find them right now. Do you know any? How long should I keep entries commentable? I still receive legitimate comments on older entries occasionally—a reason why I haven’t decided on this kind of plugin yet. Maybe you could help me.
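
    Whatever plugin I end up with, the mechanism should be simple enough. A sketch of the idea, with an arbitrary 60-day cutoff standing in for whatever limit I finally decide on:

        <?php
        // Close comments on entries older than $days_open by filtering
        // the value that comments_open() returns. Sketch only.
        add_filter('comments_open', 'my_close_old_comments', 10, 2);

        function my_close_old_comments($open, $post_id) {
            $days_open = 60; // arbitrary cutoff
            $post = get_post($post_id);
            $age_in_days = (time() - strtotime($post->post_date_gmt)) / 86400;
            return ($age_in_days > $days_open) ? false : $open;
        }
        ?>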

    Oh, by the way, there are also 11 reasons against nofollow from a German site dedicated to opposing the use of rel="nofollow". And more reasons from Loren Baker, which could be what you really need to understand that nofollow is not the answer.

    Notes:

    1. ^ as Technorati currently tracks 68.9 million blogs
    2. ^ 10 comments by default—a somewhat large number for an infrequently updated Web log like this
    3. ^ not great, though—about 0.1% false positives have occurred
  • Cross-browser testing, yet again

    I read about the Internet Explorer 7: Beta 2 Preview the day after it was released [20th of March]. The developers say it is rendering-behavior complete, so I downloaded it to test my pages for rendering issues, but it seemed to have trouble detecting my Internet connection, so I ditched it immediately, hoping it would be better in the final release. Also, my father has always had trouble dealing with new software, especially web browsers; he still uses IE6 instead of the system’s default Firefox browser [again, because of his lack of adaptive skills with browsers], so if IE7 were installed it would effectively be a new browser he’d have to get used to. That’s why I did not consider letting it stay installed on my PC.

    Just earlier this morning, I discovered a way to use IE7 as a standalone browser. I was happy that I would not have to install it again to replace IE6, thereby enabling me to cross-browser test with four different browsers and four different rendering engines [i.e., Firefox, Opera, IE6 and IE7].

    But now that I have the new MSIE7 [prefixed for unambiguity] with its improved native rendering features, testing it on my web site proved it still doesn’t come near what Dean Edwards has done with JavaScript in his IE7 “plugin” for browsers predating the real MSIE7. Needless to say, I still have to hack my CSS, or at least get the content-generating module of Dean’s IE7 to work in MSIE7, since this web site uses a lot of CSS-generated content to improve readability, especially with lists.
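
    Generated content is exactly the part MSIE7 left out: it still ignores the :before and :after pseudo-elements, so rules like the following render as nothing in it. The selector names here are illustrative, not this site’s actual rules:

        /* Works in Firefox and Opera, and in Dean's script-based IE7,
           but real MSIE7 renders none of it. */
        ul.references { counter-reset: item; }
        ul.references li:before {
            counter-increment: item;
            content: counter(item) ". ";
        }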

    *sigh* So much for the hack-free Web that every standards advocate dreams will come very soon.
