The “dot-com bubble” was a speculative bubble covering roughly 1995–2001 during which stock markets in Western nations saw their value increase rapidly from growth in the new Internet sector and related fields. The period was marked by the founding (and in many cases, spectacular failure) of a group of new Internet-based companies commonly referred to as dot-coms. A combination of rapidly increasing stock prices, individual speculation in stocks, and widely available venture capital created an exuberant environment in which many of these businesses dismissed standard business models, focusing on increasing market share at the expense of the bottom line. The bursting of the dot-com bubble marked the beginning of a relatively mild yet rather lengthy early 2000s recession in the developed world.
I read a Digg post today, which somehow made it to the front page, that just made me laugh. The author of the post was obviously very ignorant in his approach and lazy on the research that went into his blog post.
See if anybody can find the Big Save As Button in the picture below.
As one commenter on this article put it:
floppyllamadigg: We’ve met the one user on the planet that can’t function without Clippy.
I tend to agree with floppyllamadigg: any user who cannot find the huge circle button with the Windows logo in it, or isn’t at least curious enough to click that huge target, probably should stick with Notepad or the more advanced WordPad.
Update: (17:30 EST) And the lemmings follow along.
Update: (2007-4-21) And the good times keep rolling over at Fumbled.org. The new post on Fumbled.org seems to be more of a backhanded apology. However, as I have said before: when you make a mistake, fess up. There is nothing more refreshing in today’s world than that. So kudos to R Lee Creasy, whom I featured above in the lemmings update.
Last week all the major search engine providers announced that they were going to support a new specification at sitemaps.org that allows them to auto-discover your sitemap without you having to submit it:
Yahoo did a good job of summing up the advantages of putting your sitemap location in the robots.txt file.
All search crawlers recognize robots.txt, so it seemed like a good idea to use that mechanism to allow webmasters to share their Sitemaps. You agreed and encouraged us to allow robots.txt discovery of Sitemaps on our suggestion board. We took the idea to Google and Microsoft and are happy to announce today that you can now find your sitemaps in a uniform way across all participating engines.
If you want to see my implementation of this for my sitemap go to http://www.coderjournal.com/robots.txt. Further details about this can be found at http://sitemaps.org/protocol.htm or for your convenience I have included them below.
Specifying the Sitemap location in your robots.txt file
You can specify the location of the Sitemap using a robots.txt file. To do this, simply add the following line:

Sitemap: &lt;sitemap_location&gt;
The <sitemap_location> should be the complete URL to the Sitemap, such as: http://www.example.com/sitemap.xml
This directive is independent of the user-agent line, so it doesn’t matter where you place it in your file. If you have a Sitemap index file, you can include the location of just that file. You don’t need to list each individual Sitemap listed in the index file.
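To make the discovery mechanism concrete, here is a minimal sketch of what a crawler does on its end: scan a robots.txt body line by line for Sitemap directives, which (as noted above) can appear anywhere in the file regardless of user-agent sections. The `find_sitemaps` helper and the sample robots.txt text are my own illustration, not part of the sitemaps.org announcement.

```python
def find_sitemaps(robots_txt: str) -> list[str]:
    """Return every Sitemap URL declared in a robots.txt body."""
    sitemaps = []
    for line in robots_txt.splitlines():
        # The directive name is matched case-insensitively, and its
        # position relative to any User-agent block does not matter.
        if line.strip().lower().startswith("sitemap:"):
            # Split only on the first colon so the "http://" in the
            # URL survives intact.
            sitemaps.append(line.split(":", 1)[1].strip())
    return sitemaps

robots = """User-agent: *
Disallow: /admin/
Sitemap: http://www.example.com/sitemap.xml
"""
print(find_sitemaps(robots))
# prints ['http://www.example.com/sitemap.xml']
```

In practice a crawler would fetch robots.txt over HTTP first; this sketch only shows the parsing step, which is the part the new specification standardizes.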