Protecting against DDoS attacks

With Distributed Denial of Service (DDoS) attacks very much in the news, I’m glad to be hosting this guest article by someone who for now wishes to remain anonymous – Martin.

Until a few months ago I had never seen a DDoS attack, and I had no idea what could be done about one, if anything at all. Not because I wasn’t interested; I was. But most of the papers on the net that deal with the subject are academically abstract and hardly useful in a concrete “I’m in deep shit, WTF do I do now?” situation. So here’s a very basic list of what you can reasonably do before you get attacked, assuming that you have no reason to expect an attack.

Wiki-isation of a static web site

From a presentation at HE Academy Technical Away Day, Newcastle, 7 February 2007

Most of our site content is in databases or in dedicated applications such as a blog or wiki. However, we still have a lot of content in static XHTML pages. We occasionally need to make corrections to this archive material, but I can’t justify the effort of slurping it all into a CMS, with attendant information architecture/URL design issues. This page describes a small project to allow authorised people to edit that content in a wiki-like way.

Requirements

  • Make changes immediately from the browser
  • Restrict editing to the page content only, not to other parts of the page such as the breadcrumb trail or Server Side Includes
  • Word-processor-style editing, but with option to edit source directly
  • Ability to paste from Word
  • Minimal effect on source formatting
  • Live spell-check
  • Multiple users with logging of their activity
  • Deployable on multiple sites
  • Secure
  • No money – must use only free software
  • Relatively small amount of new code: this is primarily supposed to be a time-saving exercise
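The second requirement is the awkward one. One common way to meet it (a sketch only – the excerpt doesn’t say how this project actually does it, and the marker names below are hypothetical) is to fence the editable region of each static page with comment markers, so the editor only ever sees and saves that slice, while breadcrumbs and SSI directives stay untouched:

```python
import re

# Hypothetical markers -- the post doesn't name the real ones
START = "<!-- begin-editable -->"
END = "<!-- end-editable -->"

_REGION = re.compile(re.escape(START) + r"(.*?)" + re.escape(END), re.DOTALL)

def extract_editable(html):
    """Return only the region between the markers; breadcrumbs,
    Server Side Includes etc. outside the markers are never
    exposed to the editor."""
    m = _REGION.search(html)
    if m is None:
        raise ValueError("no editable region found")
    return m.group(1)

def replace_editable(html, new_content):
    """Splice the edited content back in, leaving the rest of the
    page source byte-for-byte untouched."""
    return _REGION.sub(lambda _: START + new_content + END, html, count=1)

page = ('<!--#include virtual="/nav.html" -->'
        + START + "<p>Old text</p>" + END
        + "<p>footer</p>")
print(extract_editable(page))   # <p>Old text</p>
```

Keeping the markers in the saved file means the round trip is repeatable, which also helps the “minimal effect on source formatting” requirement.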

So can you measure the number of unique users on your website?

I received an email the other day from a colleague asking the following:

“I was surfing for info on Unique Users and came across this – I’d really appreciate your views on it if you’ve got time to have a look. Do you know anything about this company and would it be worth getting involved with them?”

http://www.thinkmetrics.com/New-methods-of-web-analysis.ph

I read the article, and suggest you do too. Hmmm… sounds like a miracle cure, doesn’t it?

After four years supporting the WebTrends web log analysis software in the 1990s, I’m used to the limitations of log- and cookie-based analysis, and there are several assumptions in the method described in this article that I would take issue with…


Setting Canonical Domain with Apache

An experiment in search engine optimization:
My work site, http://www.economicsnetwork.ac.uk (or economicsnetwork.ac.uk if you’re intimate) is also known by three other domain names, because of past re-branding. My problem? How to tell search engines that these are the exact same site, so they know that an external link to, say, http://www.economics.ltsn.ac.uk is to count as a link to http://www.economicsnetwork.ac.uk (and boost my site’s Google ranking, goddamit!). Establishing a canonical domain name like this should also help consistency of brand (i.e. helping the user know what site they are on and what to call it).

For a long time I had a <base href="…"> tag in my home page to set the canonical domain. This is dumb. It only ensures that a user sees the domain once they’ve come to the home page and then clicked a link. A check shows that www.economics.ltsn.ac.uk, www.economics.heacademy.ac.uk and econltsn.ilrt.bris.ac.uk still exist in the Google index as separate sites. A serious fix requires a few lines of Apache config:

RewriteEngine on
# If the requested hostname contains one of the legacy strings...
RewriteCond %{HTTP_HOST} (ltsn|heacademy) [NC]
# ...send a permanent (301) redirect to the canonical domain, keeping the path
RewriteRule (.*) http://www.economicsnetwork.ac.uk$1 [R=301,L]
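The effect of that pair of rules can be modelled in a few lines of Python (an illustration of the matching logic only, not of Apache itself): any hostname containing one of the legacy strings gets a 301 to the canonical domain, with the request path preserved.

```python
import re

CANONICAL = "http://www.economicsnetwork.ac.uk"

def redirect_target(host, path):
    """Mimic the RewriteCond/RewriteRule pair: return the 301 target
    for a legacy hostname, or None when no redirect fires."""
    if re.search(r"ltsn|heacademy", host, re.IGNORECASE):
        return CANONICAL + path
    return None

print(redirect_target("www.economics.ltsn.ac.uk", "/teaching/"))
# -> http://www.economicsnetwork.ac.uk/teaching/
print(redirect_target("www.economicsnetwork.ac.uk", "/teaching/"))
# -> None
```

The 301 (permanent) status is what tells Google to fold the old hostnames into the canonical one, rather than treating the redirect as temporary.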

