Good corporate citizenship is a concept so established that some large companies even have someone whose job is to ensure they comply with their “corporate social responsibility”, usually as a small but important part of their “corporate governance”. In short, everyone tries to do the right thing in a way that does not hamper (much) the way they go about their business. That usually means avoiding any activity that borders on illegality (kickbacks, harassment), not being unduly pushy against particular interests, and finding a way to “give back” to the surrounding society some part of the profit. That’s the short version; if you want the long version, I’m proud to say my father consults and teaches on the subject of CSR and business ethics, and he would be pleased to expand.
But to the point. On the web, there are a large number of grey or dark activities that can’t happen without the participation of third parties. A paedophile can’t send his pictures without an email account, nor host them for sharing without a file-hosting service. A site promoting odious ideologies or activities can’t work without web hosting, or without someone providing DNS addressing for it. A pirate can’t distribute proprietary content without someone building and running a network, and without websites where those wares are advertised and even hosted (which, again, rely on third-party services). A spammer can’t work without… well, you get the idea.
Of course, there is the issue of viability. It is not viable to vet and analyze every user and customer. That doesn’t mean it is not possible; it means there is no remotely profitable way to do it for large volumes of users.
Viable ways to do the right thing online
The viability problem has two answers: automatic filtering, and exception handling.
Some companies, like CafePress.com, have an image-analysis system that automatically flags any image that remotely resembles the logos or “trade dress” of established companies (uncomfortable for their customers, but healthy for the owners of those images); in many other instances, it is quite viable to install a system that finds and recognizes telltale signs of illegal or unhealthy behaviour.
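To make the idea of automatic filtering concrete, here is a minimal, purely illustrative sketch of a telltale-sign screen. It is not CafePress’s actual system (which analyzes images); this hypothetical version just matches upload descriptions against a watch list of protected marks and suspicious terms, and routes only the flagged items to human reviewers.

```python
# Hypothetical watch list of protected marks and suspicious terms
# (illustrative only, not any real company's list).
WATCHLIST = {"acme corp", "acme logo", "warez", "keygen"}

def flag_upload(description: str) -> list[str]:
    """Return the watch-list terms found in an upload's description."""
    text = description.lower()
    return sorted(term for term in WATCHLIST if term in text)

def review_queue(uploads: list[str]) -> list[tuple[str, list[str]]]:
    """Pass only flagged uploads on to human reviewers."""
    return [(u, hits) for u in uploads if (hits := flag_upload(u))]
```

The point of the design is the one the paragraph makes: the machine does the cheap, exhaustive first pass, so the expensive human attention is spent only on the small flagged fraction.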
Regarding exception handling, it has long been an established feature of email services and hosting companies to offer an “abuse@” email address where spammers and other abusers can be reported. Making such an address clearly visible, and responding to it fast, is the very minimum level of corporate netizenship. For instance, TinyPic.com doesn’t do any filtering (and thus hosts loads of indecent, stolen or illegal content), but it has a very responsive system: once they are told, they remove the offending piece. Others have harder-to-find contacts but answer fast and well.
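The exception-handling side can be sketched just as simply. This is a hypothetical abuse-desk model, not any provider’s real workflow: reports come in (say, via abuse@), queue up, and are removed once staff process the queue; the whole business value lies in how quickly that last step happens.

```python
from dataclasses import dataclass, field

@dataclass
class AbuseDesk:
    """Toy model of an abuse@ desk: file reports, then act on them."""
    reports: list[str] = field(default_factory=list)
    removed: set[str] = field(default_factory=set)

    def report(self, content_id: str) -> None:
        """A whistle-blower flags a piece of content."""
        if content_id not in self.removed:
            self.reports.append(content_id)

    def process(self) -> list[str]:
        """Staff review the queue and take down the reported items."""
        handled, self.reports = self.reports, []
        self.removed.update(handled)
        return handled
```

A TinyPic-style responsive service is simply one where `process()` runs within hours of `report()`; the InterNIC-style failure described below is the same queue left unprocessed for a week.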
The alternative to these measures, when you offend too many people, is lawsuits such as those brought against YouTube. Or (worse) direct legislation, such as now governs the handling of user data and direct mail in many jurisdictions, imposing costs regardless of their effect on business models.
Some companies neither filter nor react to whistle-blowers. This is particularly common with DNS services; InterNIC comes to mind: they recently took over a week to react to urgent demands for action against child pornography, and then argued that they could no longer find the offending picture…
Others ignore their responsibility. Some ISPs are cheeky enough to argue that “housing” an offending site (i.e. providing service and internet connection to a box owned by the customer), as opposed to “hosting” (renting the box to the customer too), means they bear no responsibility for the content they’re shipping out to the world or for the behaviour of the users of that service. A certain Dutch ISP comes to mind (no, I don’t want to send them more sleazy business by advertising their lack of moral fibre).
Last but not least, quite a few companies run online services or sites that positively depend on illegal or at least “grey” behaviour on the part of users. For instance, the plethora of sites that host “torrent” links to pirated software or content. Or the variety of porn forums that depend on stolen content. Or forum and blog hosts (some belonging to well-known newspapers) that deliberately ignore abuse, defamation or worse in some of their published content. Again, names are advertising.
All these make profits on the backs of the people hurt by their customers’ and users’ behaviour, and don’t provide the minimum measures to staunch it. In economic terms, they live because they don’t have to pay for the damage their business wreaks on the rest of us (externalities). In ecological terms, they contaminate like crazy and pay no penalty for it.
It can be done
As we’ve seen, there are ways to do things better, and there are even more. Forums like Macuarium.com’s positively prohibit and effectively prevent any public hint of piracy, defamation or odious behaviour, through constant monitoring and active exception handling. It is now perfectly possible to scan images to detect telltale signs of porn, or of registered trademarks. Having a good “abuse@” desk is among the simplest measures in the book.
As the internet becomes civilised, more and more of those measures will be legally mandated. Even sooner, the industry will find it worthwhile to start behaving as responsible netizens… even if maverick irresponsible people continue to help others hurt the online citizenry.
If you are involved in any of those businesses (hosting, content, forums), lose the fear. There are viable things you can do to minimize the damage done, and some of them make sense for business as well.